1 00:00:05,200 --> 00:00:07,400 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff 2 00:00:07,400 --> 00:00:18,560 Speaker 2: Mom Never Told You, a production of iHeartRadio. And 3 00:00:18,640 --> 00:00:24,480 Speaker 2: once again we are so happy to be joined by friend, colleague, talented, 4 00:00:24,880 --> 00:00:26,840 Speaker 2: tenacious Bridget Todd. 5 00:00:27,240 --> 00:00:30,080 Speaker 3: Welcome, Bridget. Oh, I got the double G. I love 6 00:00:30,160 --> 00:00:32,560 Speaker 3: the alliteration, and that is impressive. 7 00:00:32,800 --> 00:00:33,680 Speaker 1: Thank you for having me. 8 00:00:34,360 --> 00:00:39,600 Speaker 2: We always love having you. Thank you for the acknowledgment of 9 00:00:39,680 --> 00:00:40,279 Speaker 2: my alliteration. 10 00:00:40,479 --> 00:00:41,000 Speaker 1: I try. 11 00:00:42,240 --> 00:00:43,760 Speaker 2: How have you been, Bridget? 12 00:00:44,040 --> 00:00:45,160 Speaker 1: I'm doing well. 13 00:00:45,880 --> 00:00:48,199 Speaker 3: Right before we got on, I'm, like, knee deep in... 14 00:00:48,720 --> 00:00:51,040 Speaker 3: do you all watch Love Island? I'm, like, knee deep 15 00:00:51,120 --> 00:00:54,640 Speaker 3: in Love Island takes. So my head is like... 16 00:00:55,000 --> 00:00:57,240 Speaker 3: do you know when you watch way too much TikTok, 17 00:00:57,320 --> 00:00:59,640 Speaker 3: or way too much short form content about one subject, 18 00:00:59,680 --> 00:01:01,080 Speaker 3: and then it's like all you can think about, and you're 19 00:01:01,120 --> 00:01:02,840 Speaker 3: kind of... it's almost like a sugar high. 20 00:01:02,920 --> 00:01:03,720 Speaker 1: That's where I'm at. 21 00:01:03,800 --> 00:01:06,640 Speaker 3: So if I seem out of sorts, I've been 22 00:01:06,840 --> 00:01:09,319 Speaker 3: watching nonstop Love Island and Love Island takes. 23 00:01:10,360 --> 00:01:13,480 Speaker 4: I actually made mention to Annie that I have never 24 00:01:13,640 --> 00:01:17,119 Speaker 4: watched this show. We actually were talking about, like, slurs 25 00:01:17,200 --> 00:01:18,520 Speaker 4: and all these things, and I'm sure you know what we're 26 00:01:18,520 --> 00:01:21,919 Speaker 4: talking about, but I'm being fed Love Island content against 27 00:01:21,959 --> 00:01:23,760 Speaker 4: my will, much like a lot of other stuff. Like, 28 00:01:23,840 --> 00:01:27,280 Speaker 4: I have never in my life ever watched this show, 29 00:01:27,400 --> 00:01:30,200 Speaker 4: to the point that I really thought it was just 30 00:01:30,280 --> 00:01:33,600 Speaker 4: a spoof, like they talked about in 30 Rock, you know, 31 00:01:33,720 --> 00:01:38,360 Speaker 4: MILF Island, that I was like, is this a thing? So, 32 00:01:38,720 --> 00:01:41,360 Speaker 4: but, like, TikTok is like, no, you will definitely enjoy this. 33 00:01:41,440 --> 00:01:43,360 Speaker 4: I'm like, I definitely don't want to be a part 34 00:01:43,400 --> 00:01:45,720 Speaker 4: of this, because I am very nervous, and this makes 35 00:01:45,760 --> 00:01:49,480 Speaker 4: me really sad in terms of, like, why are men 36 00:01:49,600 --> 00:01:50,000 Speaker 4: so bad? 37 00:01:50,400 --> 00:01:52,640 Speaker 1: Yeah, it's not... especially if you haven't seen it.
38 00:01:52,760 --> 00:01:55,000 Speaker 3: It's not like... like, this was a bad season, 39 00:01:55,280 --> 00:01:59,639 Speaker 3: and I think it's a season that illustrates how social 40 00:01:59,720 --> 00:02:04,720 Speaker 3: media has really impacted reality TV. And, right, yeah, and 41 00:02:04,760 --> 00:02:07,360 Speaker 3: it's not just one castmate who was thrown off for 42 00:02:07,480 --> 00:02:10,960 Speaker 3: using an anti-Asian slur. Another castmate was not thrown off, 43 00:02:11,000 --> 00:02:13,799 Speaker 3: but was revealed to have made, like, a 44 00:02:14,040 --> 00:02:15,520 Speaker 3: skit making fun of Asians. 45 00:02:15,560 --> 00:02:17,120 Speaker 1: So it's... it's interesting. I was like, wow, what are 46 00:02:17,120 --> 00:02:19,840 Speaker 1: the odds? Two different people on the show. What are 47 00:02:19,919 --> 00:02:20,400 Speaker 1: the odds? 48 00:02:20,960 --> 00:02:22,680 Speaker 4: Well, I feel like that's another episode. We should 49 00:02:22,680 --> 00:02:26,200 Speaker 4: talk about reality TV and social media, so you know 50 00:02:26,200 --> 00:02:27,440 Speaker 4: I'm gonna put that on your plate too. 51 00:02:27,480 --> 00:02:28,720 Speaker 1: Oh my god, this is something, I mean. 52 00:02:28,760 --> 00:02:31,720 Speaker 3: I make no apologies for the fact that I watch 53 00:02:31,960 --> 00:02:34,680 Speaker 3: way too much reality TV, so that is very much 54 00:02:34,720 --> 00:02:35,520 Speaker 3: in my wheelhouse. 55 00:02:36,120 --> 00:02:37,959 Speaker 4: When I was on cable TV, I definitely had a 56 00:02:38,000 --> 00:02:41,080 Speaker 4: lot of Bravo shows come up in my world. I mean, 57 00:02:41,320 --> 00:02:44,639 Speaker 4: Million Dollar Matchmaker. I don't know why I watched that. 58 00:02:44,720 --> 00:02:46,239 Speaker 4: There was no reason for me to watch that. I 59 00:02:46,400 --> 00:02:48,560 Speaker 4: was not gonna be on that show, but yet I 60 00:02:48,720 --> 00:02:49,400 Speaker 4: was fascinated. 61 00:02:50,000 --> 00:02:50,560 Speaker 1: It's so good. 62 00:02:50,639 --> 00:02:53,480 Speaker 3: I don't know where Patti Stanger, the Million 63 00:02:53,560 --> 00:02:56,320 Speaker 3: Dollar Matchmaker, is now, but I did enjoy her show quite a bit. 64 00:02:57,080 --> 00:02:58,320 Speaker 4: Can't help it anyway. 65 00:02:58,400 --> 00:03:02,480 Speaker 2: Sorry, back to... no. Oh, we should definitely come back 66 00:03:02,520 --> 00:03:04,120 Speaker 2: and talk about that. I would love to. I will 67 00:03:04,160 --> 00:03:07,600 Speaker 2: share my story of when I was on Impossible Kitchen 68 00:03:07,680 --> 00:03:10,120 Speaker 2: and didn't realize I was on the, like, before part, 69 00:03:11,080 --> 00:03:12,680 Speaker 2: where I was supposed to be shocked. 70 00:03:13,200 --> 00:03:14,400 Speaker 1: What is Impossible Kitchen? 71 00:03:15,440 --> 00:03:17,919 Speaker 2: Oh, it's just like a, you know, they go in 72 00:03:18,639 --> 00:03:22,560 Speaker 2: and they find a restaurant that's terrible. It's probably... it's 73 00:03:22,560 --> 00:03:25,160 Speaker 2: Restaurant Impossible. Sorry, I got the title wrong.
74 00:03:25,240 --> 00:03:29,000 Speaker 2: That probably helps. They go in and the restaurant's terrible, 75 00:03:29,480 --> 00:03:32,000 Speaker 2: and the guy tries to fix it. But they bring 76 00:03:32,080 --> 00:03:35,839 Speaker 2: in people to taste the restaurant's food before and after, 77 00:03:36,040 --> 00:03:38,280 Speaker 2: so you're supposed to be like, this is so gross, 78 00:03:38,480 --> 00:03:41,600 Speaker 2: and then after, like, oh wow, improvement. But I was 79 00:03:41,640 --> 00:03:43,000 Speaker 2: a college kid at the time, and I was like, 80 00:03:43,080 --> 00:03:44,520 Speaker 2: this is free fried chicken. 81 00:03:44,480 --> 00:03:44,560 Speaker 4: So. 82 00:03:46,560 --> 00:03:52,640 Speaker 1: It tastes great. No issues here, unless it's raw. Thank you. 83 00:03:52,800 --> 00:03:53,200 Speaker 1: My gosh. 84 00:03:53,320 --> 00:03:55,320 Speaker 3: This is such a non sequitur, and feel free 85 00:03:55,360 --> 00:03:56,760 Speaker 3: to, like, not include this in the show. 86 00:03:56,880 --> 00:03:59,800 Speaker 1: But here where I live... I live in 87 00:04:00,200 --> 00:04:01,280 Speaker 1: Takoma Park, Maryland, and there 88 00:04:01,240 --> 00:04:05,720 Speaker 3: was a very popular bar called Piratz Tavern that was 89 00:04:05,760 --> 00:04:08,200 Speaker 3: a pirate-themed bar that, like, when I 90 00:04:08,240 --> 00:04:10,840 Speaker 1: lived in Maryland, we would... you would... the food 91 00:04:10,960 --> 00:04:11,480 Speaker 1: was terrible. 92 00:04:11,520 --> 00:04:14,080 Speaker 3: I got food poisoning there multiple times, but, like, you 93 00:04:14,200 --> 00:04:17,040 Speaker 3: would go for a laugh, because the staff dressed like pirates. 94 00:04:17,120 --> 00:04:20,080 Speaker 3: The owner clearly really cared a lot about pirates, like... 95 00:04:20,480 --> 00:04:23,240 Speaker 3: they... like, they had a pirate band, and one of 96 00:04:23,279 --> 00:04:25,320 Speaker 3: the songs that the pirate band would perform was the 97 00:04:25,440 --> 00:04:27,040 Speaker 3: FreeCreditReport.com 98 00:04:27,040 --> 00:04:28,320 Speaker 1: song, you know, the pirate version. 99 00:04:28,440 --> 00:04:31,320 Speaker 3: Yes, this place was wild, and it was on the 100 00:04:31,440 --> 00:04:35,840 Speaker 3: show Bar Rescue, where that guy, I don't know his name, 101 00:04:35,839 --> 00:04:37,800 Speaker 3: he's a huge Trumper, but I forget his name, he 102 00:04:37,880 --> 00:04:39,360 Speaker 3: comes in and he tells them what's wrong with it, 103 00:04:39,680 --> 00:04:42,920 Speaker 3: and his feedback was to not make it pirate. He 104 00:04:43,080 --> 00:04:46,720 Speaker 3: was like, oh, you're in a downtown corporate environment, you 105 00:04:46,760 --> 00:04:49,240 Speaker 3: should make the theme be corporate. So they 106 00:04:49,279 --> 00:04:52,320 Speaker 3: turned it from a pirate bar into, like, almost like 107 00:04:52,400 --> 00:04:56,200 Speaker 3: a Dilbert-themed bar, and it totally tanked it. It 108 00:04:56,279 --> 00:04:59,080 Speaker 3: turned into this huge local scandal, where the pirates were, 109 00:04:59,200 --> 00:05:03,400 Speaker 3: like, burning an effigy of this guy. Like, it was wild. 110 00:05:03,520 --> 00:05:04,920 Speaker 1: I don't know why I'm sharing this. It's just a 111 00:05:04,960 --> 00:05:05,760 Speaker 1: story that I love. 112 00:05:06,440 --> 00:05:10,600 Speaker 4: That's amazing. Well, that's the reality of reality TV.
Like, 113 00:05:10,720 --> 00:05:13,000 Speaker 4: you think it's all going well, and they 114 00:05:13,040 --> 00:05:15,160 Speaker 4: were like, no, definitely, this is the worst idea, and 115 00:05:15,240 --> 00:05:17,640 Speaker 4: then they fixed it and it comes back, and it's like, no, 116 00:05:17,839 --> 00:05:20,279 Speaker 4: this is... this is worse. Like, I love the renovation 117 00:05:20,400 --> 00:05:22,680 Speaker 4: shows where the people come in after the fix and 118 00:05:22,720 --> 00:05:24,320 Speaker 4: they're like, what the hell did you do in my place? 119 00:05:24,440 --> 00:05:24,600 Speaker 2: Yes? 120 00:05:24,720 --> 00:05:31,240 Speaker 1: I hate it... no, but my god, this is 121 00:05:31,279 --> 00:05:35,160 Speaker 1: the worst and I hate it. Samantha, like, were 122 00:05:35,240 --> 00:05:36,840 Speaker 1: you there for the Trading Spaces era? 123 00:05:37,080 --> 00:05:40,640 Speaker 3: Literally, it would be like, we glued straw to the 124 00:05:40,760 --> 00:05:43,599 Speaker 3: walls of your living room, or, like, we put your 125 00:05:43,680 --> 00:05:45,840 Speaker 3: furniture on the ceiling to make it an upside-down room. 126 00:05:45,960 --> 00:05:48,600 Speaker 3: These are not exaggerations. These things actually happened on the show. 127 00:05:48,760 --> 00:05:51,120 Speaker 3: You need to find it on YouTube, 'cause 128 00:05:51,120 --> 00:05:51,560 Speaker 3: you would... 129 00:05:51,400 --> 00:05:53,680 Speaker 4: Like, I'm gonna have to search for those, because they 130 00:05:53,680 --> 00:05:54,479 Speaker 4: would make me happy. 131 00:05:54,920 --> 00:05:56,960 Speaker 2: I have a relative who was on, I don't know 132 00:05:56,960 --> 00:05:59,480 Speaker 2: if it was that show, but was on some kind 133 00:05:59,520 --> 00:06:01,440 Speaker 2: of "we're going to go in and fix up your place," 134 00:06:01,839 --> 00:06:03,760 Speaker 2: and I think she actually lived in Washington, D.C. 135 00:06:04,960 --> 00:06:11,560 Speaker 2: And they painted her walls pink and then got a 136 00:06:12,520 --> 00:06:16,240 Speaker 2: paintball gun with, like, lime green balls and shot the 137 00:06:16,360 --> 00:06:18,920 Speaker 2: wall, so it had these, like, green polka dots on it. 138 00:06:19,880 --> 00:06:21,520 Speaker 1: She was delighted. 139 00:06:22,000 --> 00:06:23,000 Speaker 2: She was not. 140 00:06:27,400 --> 00:06:28,120 Speaker 4: I love it. 141 00:06:30,160 --> 00:06:33,000 Speaker 3: It's like, oh yeah, you're on this show. We had 142 00:06:33,040 --> 00:06:35,559 Speaker 3: a crew come in and just destroy your home. Aren't 143 00:06:35,560 --> 00:06:37,520 Speaker 3: you so happy? And then you're on camera, so you 144 00:06:37,600 --> 00:06:39,800 Speaker 3: can't really be like... you can't have the reaction that 145 00:06:39,839 --> 00:06:42,240 Speaker 3: you probably want to have, and you're forced to restrain yourself. 146 00:06:43,320 --> 00:06:44,480 Speaker 4: Yeah, staring at everybody. 147 00:06:46,320 --> 00:06:48,200 Speaker 1: Incredible. Oh my goodness. 148 00:06:48,400 --> 00:06:52,159 Speaker 2: Anyway, well, tabling that for a future episode. 149 00:06:53,360 --> 00:06:56,160 Speaker 1: What a rabbit hole. I'm sorry. No, actually, it 150 00:06:56,279 --> 00:06:56,839 Speaker 1: is my fault. 151 00:06:57,160 --> 00:06:59,200 Speaker 4: I do this often, but yeah, keep going to, uh, 152 00:06:59,279 --> 00:07:01,040 Speaker 4: the actual...
153 00:07:01,680 --> 00:07:04,360 Speaker 2: The actual thing that we're talking about, which might be... 154 00:07:04,720 --> 00:07:07,160 Speaker 2: it might give you the same sensation. It kind of 155 00:07:07,240 --> 00:07:10,920 Speaker 2: does. Walking into your old home and being like, wow, 156 00:07:11,720 --> 00:07:15,120 Speaker 2: this is not really where I belong anymore. What's going 157 00:07:15,200 --> 00:07:18,520 Speaker 2: on here? At least at the superficial, I-don't-know- 158 00:07:18,640 --> 00:07:22,960 Speaker 2: much-about-this level. So, Bridget, what are we talking 159 00:07:22,960 --> 00:07:23,520 Speaker 2: about today? 160 00:07:23,840 --> 00:07:24,119 Speaker 1: Today? 161 00:07:24,160 --> 00:07:29,000 Speaker 3: We are talking about... also, masterful segue there, Annie, you're 162 00:07:29,080 --> 00:07:32,600 Speaker 3: really a pro. Today we are talking about the intersection 163 00:07:32,880 --> 00:07:36,720 Speaker 3: of AI, ChatGPT, and pregnancy. This actually was a 164 00:07:36,760 --> 00:07:39,040 Speaker 3: story that was flagged to me by a longtime 165 00:07:39,120 --> 00:07:42,280 Speaker 3: Sminty listener, Susie, "Susie to go" on social media. If 166 00:07:42,280 --> 00:07:45,040 Speaker 3: you're listening, hey Susie, thanks for flagging this, because it 167 00:07:45,160 --> 00:07:46,240 Speaker 3: was super interesting. 168 00:07:46,480 --> 00:07:47,320 Speaker 2: Right, thank you. 169 00:07:48,040 --> 00:07:50,920 Speaker 4: I'm very grateful, Bridget, that you took this, because as 170 00:07:51,000 --> 00:07:52,840 Speaker 4: soon as we looked at this headline, I was like, oh 171 00:07:52,920 --> 00:07:55,560 Speaker 4: dear God, yeah, I don't know what to do with this. 172 00:07:55,720 --> 00:07:57,600 Speaker 4: I'm all, wait for a second, let me sit with this. 173 00:07:57,920 --> 00:07:59,480 Speaker 4: And then I was like, of course, Bridget's on top 174 00:07:59,520 --> 00:08:00,840 Speaker 4: of this. Even better. 175 00:08:01,160 --> 00:08:04,040 Speaker 3: It's a pretty weird topic. So Susie flagged this piece 176 00:08:04,080 --> 00:08:07,760 Speaker 3: in Rolling Stone called "ChatGPT Is Helping Women Get Pregnant," 177 00:08:07,960 --> 00:08:09,520 Speaker 3: which sounds like a 178 00:08:10,280 --> 00:08:11,200 Speaker 1: very wild concept. 179 00:08:11,640 --> 00:08:14,720 Speaker 3: A lot of times when I'm reporting on women and AI, 180 00:08:14,960 --> 00:08:18,040 Speaker 3: it's because AI is being used to do something absolutely 181 00:08:18,240 --> 00:08:21,800 Speaker 3: horrifying to women, like AI nudify apps, or, like, 182 00:08:21,960 --> 00:08:24,960 Speaker 3: AI non-consensual deepfakes, things like that. So it 183 00:08:25,040 --> 00:08:28,000 Speaker 3: actually was nice to spend a little bit of time 184 00:08:28,080 --> 00:08:32,800 Speaker 3: thinking about the sort of non-completely-horrifying, though interesting 185 00:08:32,920 --> 00:08:36,679 Speaker 3: and maybe unexpected, use cases for the way AI is 186 00:08:36,679 --> 00:08:39,680 Speaker 3: showing up for folks, like pregnancy. So for this piece, 187 00:08:39,679 --> 00:08:42,400 Speaker 3: I'll give you just a brief summary. It's in Rolling Stone, 188 00:08:42,480 --> 00:08:46,480 Speaker 3: by Fortesa Latifi.
Basically, the piece argues that people who 189 00:08:46,520 --> 00:08:49,280 Speaker 3: are hoping to become pregnant are turning to the chatbot 190 00:08:49,440 --> 00:08:53,280 Speaker 3: ChatGPT for advice, tips, and information during their pregnancy journeys, 191 00:08:53,320 --> 00:08:57,640 Speaker 3: and, importantly, they are actually changing their conception plans based 192 00:08:57,679 --> 00:09:01,199 Speaker 3: on whatever advice ChatGPT gives them. They're even taking 193 00:09:01,280 --> 00:09:03,760 Speaker 3: it so far as to, like, ask ChatGPT 194 00:09:03,840 --> 00:09:06,080 Speaker 3: to psychically channel their future child. 195 00:09:06,520 --> 00:09:08,280 Speaker 1: And even though there are lots 196 00:09:08,080 --> 00:09:11,200 Speaker 3: of privacy concerns about this kind of sensitive use case, 197 00:09:11,240 --> 00:09:11,880 Speaker 3: which we'll talk 198 00:09:11,720 --> 00:09:15,640 Speaker 1: about, they're really getting benefit out of this. 199 00:09:26,320 --> 00:09:29,280 Speaker 4: There's a lot of questions I have in this conversation, 200 00:09:29,400 --> 00:09:31,240 Speaker 4: but, you know, I feel like the last time you 201 00:09:31,280 --> 00:09:35,520 Speaker 4: were here, we definitely hit on how people were using 202 00:09:36,160 --> 00:09:40,640 Speaker 4: AI to propose. Essentially, they've got... you know, that dude 203 00:09:40,679 --> 00:09:45,120 Speaker 4: proposed to his AI bot instead of his girlfriend, which 204 00:09:45,200 --> 00:09:48,199 Speaker 4: is one conversation. And then, like, I've definitely read a 205 00:09:48,320 --> 00:09:52,160 Speaker 4: lot of headlines about people using it for therapy, essentially 206 00:09:52,280 --> 00:09:55,679 Speaker 4: turning to these chatbots to be like, help me in 207 00:09:55,800 --> 00:10:01,839 Speaker 4: these situations. Which is both... okay, I like that at least 208 00:10:01,880 --> 00:10:06,840 Speaker 4: they're seeking something, but it's also concerning, because Internet, right? 209 00:10:07,480 --> 00:10:09,440 Speaker 1: You really summarized it quite well. 210 00:10:09,920 --> 00:10:14,160 Speaker 3: You know, when people report using AI for things like therapy, 211 00:10:15,120 --> 00:10:16,760 Speaker 3: it's sort of... I mean, I feel two ways about it. 212 00:10:16,840 --> 00:10:19,600 Speaker 3: On the one hand, therapy can be expensive, it can 213 00:10:19,640 --> 00:10:22,440 Speaker 3: be inaccessible for a lot of folks, and so I'm 214 00:10:23,120 --> 00:10:26,920 Speaker 3: happy that people are finding ways to get 215 00:10:26,960 --> 00:10:31,160 Speaker 3: around that. However, big however, we know a lot about 216 00:10:31,280 --> 00:10:36,640 Speaker 3: how AI works, and we've seen reports of AI really 217 00:10:37,160 --> 00:10:39,480 Speaker 3: presenting some challenges for people who are really struggling. Right? So, 218 00:10:39,520 --> 00:10:42,920 Speaker 3: if you're struggling with things like mania or delusion, AI 219 00:10:43,360 --> 00:10:45,920 Speaker 3: is the last place you should be going to find 220 00:10:46,000 --> 00:10:49,800 Speaker 3: comfort, because we know that AI has a tendency to 221 00:10:49,920 --> 00:10:52,000 Speaker 3: just spit back what you want to hear. There are 222 00:10:52,080 --> 00:10:57,520 Speaker 3: pretty shocking reports of people, you know, experiencing worsening mania 223 00:10:57,600 --> 00:11:00,319 Speaker 3: and delusions because of their interactions with AI.
224 00:11:00,760 --> 00:11:03,360 Speaker 1: Not to mention all of the, like, privacy stuff 225 00:11:03,480 --> 00:11:05,920 Speaker 1: that we'll get into. But, like, so it is. And 226 00:11:06,000 --> 00:11:07,040 Speaker 1: I think that the 227 00:11:07,080 --> 00:11:10,800 Speaker 3: conversation about the role that chatbots like ChatGPT can 228 00:11:10,840 --> 00:11:14,560 Speaker 3: play in pregnancy is the very same thing, right? Where, yes, 229 00:11:14,760 --> 00:11:17,160 Speaker 3: I am happy that people who are trying to get pregnant, 230 00:11:17,200 --> 00:11:21,400 Speaker 3: who are embarking on this sometimes incredibly fraught, complicated, anxiety- 231 00:11:21,520 --> 00:11:26,640 Speaker 3: inducing journey, have a resource that they self-report makes 232 00:11:26,679 --> 00:11:29,199 Speaker 3: them feel better. But that resource is 233 00:11:29,280 --> 00:11:33,839 Speaker 3: not without concerns, concerns that we should really understand in a 234 00:11:33,960 --> 00:11:35,920 Speaker 3: clear-eyed way, right? 235 00:11:36,400 --> 00:11:40,559 Speaker 4: And I guess, in actuality, when it comes to asking questions, 236 00:11:40,840 --> 00:11:43,920 Speaker 4: I mean, it's become a descriptor, "Google this," if you 237 00:11:44,040 --> 00:11:46,319 Speaker 4: want information. So we've always kind of turned to the 238 00:11:46,400 --> 00:11:50,480 Speaker 4: Internet, and now a lot of, like, Google in itself... 239 00:11:50,600 --> 00:11:53,640 Speaker 4: and I keep saying that, I'm not trying to take aim 240 00:11:53,679 --> 00:11:57,120 Speaker 4: at Google, but they have AI as the first thing, 241 00:11:57,520 --> 00:11:59,880 Speaker 4: like the first suggestion, this may be what you're 242 00:12:00,160 --> 00:12:02,160 Speaker 4: looking for, and it gives you that summary. So it is 243 00:12:02,280 --> 00:12:04,839 Speaker 4: not too big of a stretch that we're turning from 244 00:12:04,880 --> 00:12:08,040 Speaker 4: Internet to chatbot, right? 245 00:12:08,160 --> 00:12:09,719 Speaker 3: And, like, it kind of makes sense when you think about it, 246 00:12:09,800 --> 00:12:12,960 Speaker 3: because when you are experiencing some sort of a symptom, 247 00:12:13,040 --> 00:12:15,640 Speaker 3: or you have a question, the first place most people 248 00:12:15,679 --> 00:12:17,480 Speaker 3: go to is the internet, right? And so that's where 249 00:12:17,520 --> 00:12:19,520 Speaker 3: you would be going to answer these questions. But 250 00:12:20,800 --> 00:12:24,400 Speaker 3: the pregnancy Internet can be confusing. There's a ton of information. 251 00:12:24,840 --> 00:12:27,480 Speaker 3: A lot of that information, as you just astutely pointed out, 252 00:12:27,520 --> 00:12:30,959 Speaker 3: Sam, is being filtered through AI if you're googling. And 253 00:12:31,080 --> 00:12:33,719 Speaker 3: then even if you go to sites that are kind 254 00:12:33,720 --> 00:12:36,480 Speaker 3: of human-curated, like message boards like The Bump or 255 00:12:36,559 --> 00:12:39,120 Speaker 3: What to Expect, that can be a little bit confusing. 256 00:12:39,480 --> 00:12:40,480 Speaker 1: I just checked them out. 257 00:12:41,040 --> 00:12:43,640 Speaker 3: I'm not pregnant, but, like, I went, to prepare for 258 00:12:43,720 --> 00:12:45,439 Speaker 3: this episode, to look at some of them, and even that was confusing.
They're full of acronyms 259 00:12:48,920 --> 00:12:52,640 Speaker 3: like FTM. Do you guys know what that means? That's 260 00:12:52,720 --> 00:12:56,839 Speaker 3: "first-time mom." Or TTC, "trying to conceive." And so 261 00:12:57,520 --> 00:13:02,000 Speaker 3: it can be confusing to use the Internet to answer 262 00:13:02,160 --> 00:13:06,280 Speaker 3: questions about fertility and pregnancy and health just in general. 263 00:13:06,679 --> 00:13:10,719 Speaker 3: And so, yeah, it makes sense that ChatGPT might 264 00:13:10,840 --> 00:13:13,000 Speaker 3: be easier to navigate. It might be easier to just 265 00:13:13,120 --> 00:13:16,320 Speaker 3: ask a question, to have a chatbot spit a clear 266 00:13:16,480 --> 00:13:18,760 Speaker 3: answer back at you. That does make sense to me. 267 00:13:20,720 --> 00:13:25,360 Speaker 2: Yeah, I mean, I completely agree with what Samantha was saying. 268 00:13:25,400 --> 00:13:28,400 Speaker 2: I used to joke that Google is my therapist, because 269 00:13:29,960 --> 00:13:31,959 Speaker 2: I just have a question and I need it answered. 270 00:13:32,120 --> 00:13:34,400 Speaker 2: Can you just please give me an answer? And especially 271 00:13:34,400 --> 00:13:39,160 Speaker 2: if there's medical stuff involved, or something like pregnancy, where 272 00:13:39,160 --> 00:13:40,920 Speaker 2: there is a lot of confusion, like, should I go 273 00:13:41,000 --> 00:13:42,719 Speaker 2: to the hospital? How serious is this? 274 00:13:43,000 --> 00:13:43,520 Speaker 4: What is this? 275 00:13:44,640 --> 00:13:49,400 Speaker 2: Don't laugh at me, Samantha, I did go to the hospital. 276 00:13:50,520 --> 00:13:54,640 Speaker 2: But, you know, it is because, unfortunately, in our system, 277 00:13:54,920 --> 00:13:57,040 Speaker 2: it's very expensive to go to the hospital. So you 278 00:13:57,559 --> 00:14:02,920 Speaker 2: want something, something to give you the answer of, 279 00:14:03,760 --> 00:14:05,599 Speaker 2: am I okay? What's going on? 280 00:14:06,760 --> 00:14:07,520 Speaker 1: What do I need to do? 281 00:14:07,960 --> 00:14:14,199 Speaker 2: And there still is so much misinformation and misconception, and 282 00:14:14,360 --> 00:14:17,280 Speaker 2: just not a lot of information about pregnancy. 283 00:14:18,440 --> 00:14:22,800 Speaker 3: Absolutely, and it's just... it's hard to access reliable 284 00:14:22,880 --> 00:14:26,800 Speaker 3: information about pregnancy, especially now, where, I mean, I don't 285 00:14:26,800 --> 00:14:30,840 Speaker 3: want to get political, but we have people in charge 286 00:14:31,160 --> 00:14:34,960 Speaker 3: of, you know, government organizations that are ostensibly there to 287 00:14:35,040 --> 00:14:37,280 Speaker 3: help us get information about our health, who might not 288 00:14:37,360 --> 00:14:40,800 Speaker 3: be super trustworthy or informed, right? And so we have 289 00:14:41,480 --> 00:14:45,200 Speaker 3: influencers who get lots and lots of platforms on social 290 00:14:45,280 --> 00:14:47,440 Speaker 3: media who might not know what they're talking about. So 291 00:14:47,680 --> 00:14:51,200 Speaker 3: I think, especially now, the Internet can be a difficult 292 00:14:51,280 --> 00:14:56,200 Speaker 3: place for people to find clear, reliable, accurate information to 293 00:14:56,280 --> 00:14:58,320 Speaker 3: help them make decisions about their bodies and their health.
295 00:14:58,360 --> 00:15:01,080 Speaker 3: It just is the reality of trying to find information 296 00:15:01,160 --> 00:15:04,560 Speaker 3: in our information ecosystem in twenty twenty five. And in 297 00:15:04,840 --> 00:15:07,000 Speaker 3: the piece, the journalist spoke to a woman who used 298 00:15:07,040 --> 00:15:09,840 Speaker 3: ChatGPT on her pregnancy journey, who said, "I was 299 00:15:09,880 --> 00:15:11,920 Speaker 3: able to ask my dumb questions in a bunch of 300 00:15:11,960 --> 00:15:14,480 Speaker 3: different ways that helped me understand the science, which helped 301 00:15:14,520 --> 00:15:17,360 Speaker 3: me to have a baby. I found understanding the science 302 00:15:17,440 --> 00:15:20,600 Speaker 3: of conception way easier this way than just navigating through 303 00:15:20,640 --> 00:15:23,360 Speaker 3: books and the crappy articles that Google gives you these days." 304 00:15:23,440 --> 00:15:24,840 Speaker 1: Right. And so it's 305 00:15:24,720 --> 00:15:27,440 Speaker 3: exactly what you were saying, that it can just be 306 00:15:28,320 --> 00:15:32,040 Speaker 3: hard to navigate. And even... I mean, I think it 307 00:15:32,080 --> 00:15:35,240 Speaker 3: really says something about our information ecosystem that ChatGPT, 308 00:15:35,600 --> 00:15:39,400 Speaker 3: a platform that we know is prone to misinformation and hallucinations, 309 00:15:39,400 --> 00:15:43,040 Speaker 3: which are just, like, inaccurate facts, that people are like, well, 310 00:15:43,040 --> 00:15:46,000 Speaker 3: that's still better than just, like, me left up to my 311 00:15:46,080 --> 00:15:47,480 Speaker 3: own devices on the internet these 312 00:15:47,440 --> 00:15:47,840 Speaker 1: days, you know. 313 00:15:48,880 --> 00:15:54,360 Speaker 2: Yeah. Unfortunately, sometimes you just want an answer, even if 314 00:15:54,440 --> 00:15:58,400 Speaker 2: you know this might not be correct. But I just 315 00:15:58,600 --> 00:16:02,400 Speaker 2: really want something, some kind of comfort, or some kind 316 00:16:02,440 --> 00:16:04,600 Speaker 2: of "this is what I should do," some kind of 317 00:16:04,720 --> 00:16:05,560 Speaker 2: guide map. 318 00:16:05,920 --> 00:16:09,600 Speaker 4: Right. I mean, definitely, I see, like, the going back 319 00:16:09,600 --> 00:16:12,000 Speaker 4: and forth and asking different ways, because just trying to 320 00:16:12,080 --> 00:16:14,360 Speaker 4: google my own symptoms, and, like, forgetting things that I 321 00:16:14,400 --> 00:16:16,440 Speaker 4: meant to tell the doctor, that I wouldn't tell the 322 00:16:16,520 --> 00:16:18,240 Speaker 4: doctor, and coming back and being like, oh, but I 323 00:16:18,280 --> 00:16:20,880 Speaker 4: forgot to add this, I forgot to add this. So 324 00:16:21,000 --> 00:16:23,720 Speaker 4: when, like, that person was talking about asking several different ways, 325 00:16:24,160 --> 00:16:26,120 Speaker 4: that makes sense, because you don't have the time to... 326 00:16:26,360 --> 00:16:28,280 Speaker 4: or the doctor doesn't have the time for you to 327 00:16:28,360 --> 00:16:31,680 Speaker 4: send them, oh, and also this, and also this, and also
328 00:16:31,560 --> 00:16:34,680 Speaker 3: this. Yeah, exactly. So that is exactly what the 329 00:16:35,000 --> 00:16:37,840 Speaker 3: research suggests. And not only is it what the research suggests, 330 00:16:37,880 --> 00:16:40,840 Speaker 3: but it's also what people who use AI to answer 331 00:16:40,840 --> 00:16:43,160 Speaker 3: their medical questions are self-reporting. Because you might be 332 00:16:43,200 --> 00:16:45,720 Speaker 3: thinking, like, why on earth would someone turn to Chat 333 00:16:45,800 --> 00:16:49,840 Speaker 3: GPT for medical advice? But people self-report feeling more 334 00:16:49,960 --> 00:16:53,560 Speaker 3: open when they're asking AI a medical question as opposed 335 00:16:53,560 --> 00:16:56,040 Speaker 3: to a human doctor. In twenty twenty three, a study 336 00:16:56,120 --> 00:17:00,480 Speaker 3: called "Comparing Physician and Artificial Intelligence Chatbot Responses to Patient 337 00:17:00,560 --> 00:17:04,119 Speaker 3: Questions Posted to a Public Social Media Forum" actually found 338 00:17:04,119 --> 00:17:08,040 Speaker 3: that people rated ChatGPT higher than human doctors 339 00:17:08,240 --> 00:17:11,359 Speaker 3: when it came to things like being empathetic, and actually 340 00:17:11,480 --> 00:17:14,480 Speaker 3: the results were, like, not even close. For nearly eighty 341 00:17:14,560 --> 00:17:17,680 Speaker 3: percent of the answers, ChatGPT was considered better 342 00:17:17,840 --> 00:17:20,760 Speaker 3: than the physicians. Specifically, when it came to giving good 343 00:17:20,920 --> 00:17:24,439 Speaker 3: or very good quality answers, ChatGPT received high ratings 344 00:17:24,520 --> 00:17:27,399 Speaker 3: for seventy-eight percent of responses, while physicians only did 345 00:17:27,440 --> 00:17:29,680 Speaker 3: so for twenty-two percent of responses. And when it 346 00:17:29,760 --> 00:17:33,600 Speaker 3: came to giving empathetic or very empathetic answers, Chat 347 00:17:33,600 --> 00:17:37,240 Speaker 3: GPT scored forty-five percent and human physicians four point 348 00:17:37,320 --> 00:17:43,119 Speaker 3: six percent. Now, a few important caveats about this study. Importantly, 349 00:17:43,280 --> 00:17:46,520 Speaker 3: this particular study was not designed to answer two key questions. 350 00:17:46,800 --> 00:17:50,840 Speaker 3: Do AI responses offer accurate medical information and improve patient 351 00:17:50,960 --> 00:17:52,640 Speaker 3: health while also avoiding confusion 352 00:17:52,720 --> 00:17:55,200 Speaker 1: or harm? The study did not answer that question. And 353 00:17:55,400 --> 00:17:58,480 Speaker 1: will patients accept that questions they pose to their doctor 354 00:17:58,760 --> 00:18:00,679 Speaker 1: might be answered by a bot? The study also did 355 00:18:00,720 --> 00:18:03,320 Speaker 1: not look at that question. But another thing to 356 00:18:03,359 --> 00:18:08,680 Speaker 3: note is that ChatGPT gave much longer answers than 357 00:18:08,800 --> 00:18:14,080 Speaker 3: the human physicians. Human physicians averaged fifty-two-word answers, 358 00:18:14,119 --> 00:18:16,960 Speaker 3: while ChatGPT gave two-hundred-and-eleven-word answers.
359 00:18:17,560 --> 00:18:19,040 Speaker 3: Kind of goes back to what you were saying, like, 360 00:18:19,240 --> 00:18:22,360 Speaker 3: that's probably because human doctors are busy, and ChatGPT 361 00:18:22,760 --> 00:18:24,600 Speaker 3: has all the time in the world for you to 362 00:18:24,680 --> 00:18:27,240 Speaker 3: ask the question a thousand different ways, and, you know, 363 00:18:27,320 --> 00:18:30,680 Speaker 3: it's not going to, quote, "judge you," because it's not human. 364 00:18:30,720 --> 00:18:32,639 Speaker 3: Where it's like, you don't want to waste a human 365 00:18:32,720 --> 00:18:36,720 Speaker 3: doctor's time. Maybe if you're a woman or a marginalized person, 366 00:18:36,760 --> 00:18:39,040 Speaker 3: you might be thinking, like, oh, I don't want this 367 00:18:39,160 --> 00:18:41,280 Speaker 3: doctor to get some idea in their head about me, 368 00:18:41,359 --> 00:18:43,080 Speaker 3: about what kind of person I am, if I'm, like, 369 00:18:43,200 --> 00:18:46,080 Speaker 3: annoying, or, like, thinking that I'm a hypochondriac or some 370 00:18:46,200 --> 00:18:49,000 Speaker 3: sort of a nuisance patient. It sucks that these are 371 00:18:49,040 --> 00:18:52,359 Speaker 3: the kind of questions that oftentimes women have to have 372 00:18:52,520 --> 00:18:54,399 Speaker 3: in their head when they're speaking to a human doctor. 373 00:18:54,840 --> 00:18:57,800 Speaker 3: When you're talking to AI, you don't necessarily feel that 374 00:18:57,920 --> 00:18:59,680 Speaker 3: you have to keep that in mind, because the AI 375 00:18:59,760 --> 00:19:01,439 Speaker 3: has all the time in the world to answer your 376 00:19:01,480 --> 00:19:02,680 Speaker 3: questions one hundred different ways. 377 00:19:03,119 --> 00:19:03,239 Speaker 2: Right. 378 00:19:03,560 --> 00:19:06,280 Speaker 4: I will throw out there too, again, because, Bridget, as 379 00:19:06,320 --> 00:19:08,720 Speaker 4: you know, Annie and I have recently decided to be 380 00:19:08,760 --> 00:19:11,080 Speaker 4: adults and see doctors, or at least attempt to see 381 00:19:11,119 --> 00:19:14,159 Speaker 4: doctors and medical professionals. My experience, even with 382 00:19:14,320 --> 00:19:16,920 Speaker 4: one specific doctor, and it was telemedicine, and she wasn't 383 00:19:17,000 --> 00:19:18,680 Speaker 4: based out of Atlanta, but it was the quickest I 384 00:19:18,680 --> 00:19:20,320 Speaker 4: could get, that was kind of like forty-five minutes 385 00:19:20,320 --> 00:19:23,000 Speaker 4: outside of Atlanta... I don't know, she was googling half 386 00:19:23,040 --> 00:19:26,639 Speaker 4: the answers I was asking. Are you kidding? And she 387 00:19:26,760 --> 00:19:30,240 Speaker 4: told me. I could hear her googling, and then she 388 00:19:30,280 --> 00:19:32,400 Speaker 4: would call out to the receptionist and be like, can 389 00:19:32,480 --> 00:19:34,560 Speaker 4: you look up this? And they were just giving me 390 00:19:34,680 --> 00:19:36,760 Speaker 4: information that I had already looked up myself. But I 391 00:19:36,840 --> 00:19:39,400 Speaker 4: didn't want to be a jerk and be like, yeah, 392 00:19:39,720 --> 00:19:43,640 Speaker 4: I was hoping to... oh, cool, going back to the chatbot. 393 00:19:43,640 --> 00:19:45,040 Speaker 4: Like, it literally was 394 00:19:45,080 --> 00:19:47,080 Speaker 1: one of those moments. Well, what's funny, 395 00:19:47,160 --> 00:19:50,000 Speaker 3: so my late mother, God rest her soul, was a physician.
396 00:19:50,160 --> 00:19:53,000 Speaker 3: And the thing I remember a lot about her experience 397 00:19:53,119 --> 00:19:56,080 Speaker 3: as a physician was so many medical books. When I 398 00:19:56,080 --> 00:19:57,760 Speaker 3: would go in, when I would visit her at her office, 399 00:19:57,920 --> 00:19:59,840 Speaker 3: she was always looking at a medical book. And so, 400 00:20:00,359 --> 00:20:02,080 Speaker 3: I mean, I'm not a doctor, so, like, people who 401 00:20:02,119 --> 00:20:04,800 Speaker 3: are doctors can write in, but, like, I almost wonder 402 00:20:05,440 --> 00:20:07,920 Speaker 3: if your doctor googling 403 00:20:08,040 --> 00:20:10,120 Speaker 1: it is the twenty twenty five version of my mom 404 00:20:10,240 --> 00:20:10,560 Speaker 1: going to 405 00:20:10,640 --> 00:20:13,840 Speaker 3: a book. Right, but owning and validating that it doesn't 406 00:20:13,840 --> 00:20:16,639 Speaker 3: feel great to see your doctor googling, when you're like, 407 00:20:16,680 --> 00:20:18,160 Speaker 3: why, I could have done that, right? 408 00:20:18,520 --> 00:20:20,240 Speaker 4: I mean, it was really one of those funny moments, 409 00:20:20,280 --> 00:20:22,800 Speaker 4: because she was giving me information that I already had 410 00:20:22,840 --> 00:20:25,000 Speaker 4: looked up, but I decided to go see an in- 411 00:20:25,080 --> 00:20:27,159 Speaker 4: person doctor, because I'm like, I don't need to do 412 00:20:27,280 --> 00:20:29,920 Speaker 4: this to myself, blah blah blah. I also need referrals, because 413 00:20:29,920 --> 00:20:32,320 Speaker 4: you know how the system works. But it just made 414 00:20:32,359 --> 00:20:35,040 Speaker 4: me laugh, because I definitely understand they have to keep 415 00:20:35,119 --> 00:20:36,920 Speaker 4: up with the times. I would rather you be a 416 00:20:37,000 --> 00:20:41,359 Speaker 4: doctor who often relearns or goes back to learn about 417 00:20:41,480 --> 00:20:43,639 Speaker 4: the new things, new procedures, new ways, because we know, 418 00:20:44,040 --> 00:20:47,000 Speaker 4: as time goes, hopefully there are better ways to do things, 419 00:20:47,040 --> 00:20:49,400 Speaker 4: safer ways to do things, so that is an automatic, 420 00:20:49,560 --> 00:20:52,200 Speaker 4: of course we want that to occur. But 421 00:20:52,400 --> 00:20:55,280 Speaker 4: it was really, really, uh, ironic, I guess, that I 422 00:20:55,359 --> 00:20:57,720 Speaker 4: was in there like, no, but I already... but 423 00:20:57,880 --> 00:21:01,320 Speaker 4: this is good, okay, cool, you know. Like, 424 00:21:01,400 --> 00:21:03,280 Speaker 4: it was one of those moments, watching her doing this, 425 00:21:03,440 --> 00:21:05,320 Speaker 4: as well as knowing that a lot of this was 426 00:21:05,359 --> 00:21:08,959 Speaker 4: getting filtered through the AI, Gemini, which is again not a sponsor, 427 00:21:09,440 --> 00:21:10,840 Speaker 4: was part of the reason she was giving me some 428 00:21:10,920 --> 00:21:13,200 Speaker 4: of the answers, because I could read what she was saying. 429 00:21:13,840 --> 00:21:16,160 Speaker 1: Yeah, I mean, I will... I have this big caveat: 430 00:21:16,320 --> 00:21:16,800 Speaker 1: I wouldn't. 431 00:21:16,840 --> 00:21:19,040 Speaker 3: I don't trust Gemini. Gemini will have you 432 00:21:19,119 --> 00:21:21,920 Speaker 3: putting glue in your pizza recipe and telling you 433 00:21:21,960 --> 00:21:24,080 Speaker 3: all kinds of stuff.
And what's funny is, 434 00:21:24,240 --> 00:21:26,960 Speaker 3: not to, like, dunk on Google, but to dunk on 435 00:21:27,000 --> 00:21:29,919 Speaker 3: them a little bit: yeah, Google Search has, I think 436 00:21:29,960 --> 00:21:33,359 Speaker 3: that people at Google would say this, gotten worse. 437 00:21:34,000 --> 00:21:35,920 Speaker 3: Like, the version of Google you and I are using 438 00:21:35,960 --> 00:21:38,520 Speaker 3: in twenty twenty five, in July, today, is worse than 439 00:21:38,560 --> 00:21:40,240 Speaker 3: the version we were using just a few years ago, 440 00:21:40,640 --> 00:21:43,560 Speaker 3: because of the way that they have really pushed forward 441 00:21:43,640 --> 00:21:48,600 Speaker 3: AI integration and have, like, really made some changes. And yeah, 442 00:21:49,000 --> 00:21:52,000 Speaker 3: like, I think... and they actually talk about this 443 00:21:52,080 --> 00:21:56,000 Speaker 3: in the article, that women who say they're using ChatGPT 444 00:21:56,119 --> 00:21:59,159 Speaker 3: and AI to help answer their fertility questions, they know 445 00:21:59,440 --> 00:22:03,239 Speaker 3: that ChatGPT is prone to incorrect answers, and they say, like, well, 446 00:22:03,280 --> 00:22:05,240 Speaker 3: I know that, but I could be finding those same 447 00:22:05,280 --> 00:22:07,920 Speaker 3: incorrect answers even if I wasn't using AI, and 448 00:22:08,000 --> 00:22:11,280 Speaker 3: it's prompting me, pointing me in the right direction 449 00:22:11,440 --> 00:22:14,000 Speaker 3: to find the actual answer myself. So even if I'm like, 450 00:22:14,000 --> 00:22:16,600 Speaker 3: well, that doesn't sound right, it's giving me the opportunity 451 00:22:16,640 --> 00:22:19,880 Speaker 3: to, like, go look into it myself, and where I should 452 00:22:19,880 --> 00:22:21,200 Speaker 3: be looking, which I thought was interesting. 453 00:22:31,800 --> 00:22:35,919 Speaker 2: Another thing that Samantha and I have experienced in our 454 00:22:36,080 --> 00:22:40,960 Speaker 2: ongoing attempts to go to doctors is, like, the wait 455 00:22:41,119 --> 00:22:46,760 Speaker 2: times, and honestly a real difficulty accessing healthcare once we 456 00:22:46,840 --> 00:22:51,280 Speaker 2: were actually trying to do it. And I can only 457 00:22:51,320 --> 00:22:56,280 Speaker 2: imagine, when you're talking about people who are pregnant, those 458 00:22:56,359 --> 00:23:00,000 Speaker 2: wait times... especially if we've got, you know, these abortion 459 00:23:00,160 --> 00:23:02,240 Speaker 2: laws in place, like, all of these things, that 460 00:23:02,359 --> 00:23:08,120 Speaker 3: that timing really, really matters, absolutely. So this actually comes 461 00:23:08,200 --> 00:23:10,359 Speaker 3: up in the piece too. If you're trying to get pregnant, 462 00:23:10,359 --> 00:23:12,280 Speaker 3: there's what's known as the two-week wait, which is 463 00:23:12,320 --> 00:23:15,040 Speaker 3: the time between ovulation and either the start of your 464 00:23:15,080 --> 00:23:17,960 Speaker 3: period or the confirmation of your pregnancy.
And so the 465 00:23:18,080 --> 00:23:19,840 Speaker 3: journalist spoke to people who were trying to get pregnant, 466 00:23:19,840 --> 00:23:22,119 Speaker 3: who would put their symptoms into ChatGPT to 467 00:23:22,160 --> 00:23:24,479 Speaker 3: get a sense of, like, whether or not they more 468 00:23:24,560 --> 00:23:29,280 Speaker 3: align with pregnancy or starting their period, and they would 469 00:23:29,280 --> 00:23:32,840 Speaker 3: actually change up their behavior based on whatever ChatGPT said. 470 00:23:32,840 --> 00:23:35,159 Speaker 3: And this person lives in a rural part of 471 00:23:35,280 --> 00:23:37,800 Speaker 3: Michigan, where she said, like, oh, it might 472 00:23:37,880 --> 00:23:40,920 Speaker 3: take six months to even see a doctor, and so 473 00:23:41,600 --> 00:23:46,360 Speaker 3: ChatGPT is just more accessible in this time-sensitive use 474 00:23:46,440 --> 00:23:48,920 Speaker 3: case, where it's like, you know, what's going to happen 475 00:23:48,960 --> 00:23:50,720 Speaker 3: in these next few days really matters. 476 00:23:51,160 --> 00:23:52,960 Speaker 1: ChatGPT is right there and accessible. 477 00:23:53,480 --> 00:23:56,280 Speaker 3: It might take six months of waiting to actually see 478 00:23:56,280 --> 00:23:58,719 Speaker 3: a doctor in real life. And so, again, a lot 479 00:23:58,760 --> 00:24:01,000 Speaker 3: of this does come down to the need to combat 480 00:24:01,160 --> 00:24:05,200 Speaker 3: accessibility issues for rural communities, so maybe you can't always 481 00:24:05,240 --> 00:24:06,960 Speaker 3: get to a doctor, or for anybody who might not be 482 00:24:07,000 --> 00:24:08,960 Speaker 3: able to get to a doctor in a timely way 483 00:24:09,040 --> 00:24:09,720 Speaker 3: in real life. 484 00:24:10,359 --> 00:24:13,000 Speaker 2: Yeah, and going back to the amount of time a 485 00:24:13,119 --> 00:24:17,880 Speaker 2: physician might be able to spend with you: recently, when 486 00:24:17,960 --> 00:24:21,119 Speaker 2: I was in the hospital, I 487 00:24:21,320 --> 00:24:23,000 Speaker 2: was shocked at how short the time was. 488 00:24:23,080 --> 00:24:25,000 Speaker 2: And I know they were busy, but I was like, 489 00:24:25,040 --> 00:24:28,440 Speaker 2: I didn't even get to finish, like, talking about what's 490 00:24:28,520 --> 00:24:32,920 Speaker 2: going on here, and you've moved on. And afterwards I 491 00:24:33,040 --> 00:24:35,680 Speaker 2: was kind of left with this sensation of, I really 492 00:24:35,760 --> 00:24:39,840 Speaker 2: just want to talk to someone about this. And that 493 00:24:40,080 --> 00:24:43,520 Speaker 2: is something people have reported as well, right? 494 00:24:44,040 --> 00:24:46,520 Speaker 1: Yeah, I mean, I know how that feels. 495 00:24:46,600 --> 00:24:49,800 Speaker 3: Like, my dad, before his death, was quite ill, and 496 00:24:49,840 --> 00:24:51,119 Speaker 3: so I was the person who was, like, in the 497 00:24:51,160 --> 00:24:53,119 Speaker 3: hospital with him, and he was hospitalized for, like, a 498 00:24:53,240 --> 00:24:59,040 Speaker 3: very long time. And essentially, how 499 00:24:57,760 --> 00:24:59,560 Speaker 1: it worked in this hospital anyway, and they 500 00:24:59,560 --> 00:24:59,920 Speaker 1: were great, 501 00:25:00,000 --> 00:25:01,480 Speaker 3: I don't... I don't want to disparage them, but, 502 00:25:01,600 --> 00:25:04,520 Speaker 3: like, the nurses were always there and they were wonderful.
503 00:25:05,040 --> 00:25:07,719 Speaker 3: The team of doctors working with him would do their 504 00:25:07,880 --> 00:25:10,280 Speaker 3: rounds, and so they would come by once a day, 505 00:25:10,640 --> 00:25:13,239 Speaker 3: usually in the morning, and, like, take a look at him. 506 00:25:13,600 --> 00:25:15,560 Speaker 1: And I kind of got into the 507 00:25:15,680 --> 00:25:17,439 Speaker 3: rhythm of, like, needing to be in the room if 508 00:25:17,480 --> 00:25:21,160 Speaker 3: I had questions, and my father was very, very ill, 509 00:25:21,560 --> 00:25:24,520 Speaker 3: and they would... they would spend, like, maybe ten minutes. 510 00:25:24,840 --> 00:25:27,720 Speaker 3: Now, I know... doctors, they have a 511 00:25:27,760 --> 00:25:30,000 Speaker 3: whole hospital of sick people to see, whose 512 00:25:30,080 --> 00:25:31,880 Speaker 3: lives they are trying to save, so I'm not... I'm 513 00:25:31,920 --> 00:25:35,600 Speaker 3: not belittling them. But it's... it's this weird situation where, 514 00:25:35,720 --> 00:25:38,000 Speaker 3: like, even if they spent two 515 00:25:37,960 --> 00:25:39,680 Speaker 1: hours with me, that would not have been enough time. 516 00:25:39,800 --> 00:25:42,040 Speaker 3: Like... like, it's an emotional thing, where, like, nothing they 517 00:25:42,119 --> 00:25:44,440 Speaker 3: could have done was going to be good enough, because 518 00:25:44,440 --> 00:25:46,560 Speaker 3: it's like, this is my opportunity to ask every question 519 00:25:46,680 --> 00:25:48,840 Speaker 3: I want, and, like, I want you to fix my dad, 520 00:25:48,960 --> 00:25:52,000 Speaker 3: and you just come in with this, like, expectation that 521 00:25:52,200 --> 00:25:55,760 Speaker 3: no human doctor could meet, even a doctor with all 522 00:25:55,800 --> 00:25:58,720 Speaker 3: the time in the world, but especially a doctor who 523 00:25:58,960 --> 00:26:01,600 Speaker 3: has a million patients to see, and they're... they're trying 524 00:26:01,680 --> 00:26:04,359 Speaker 3: to save everybody's life, right? Like... and so I can 525 00:26:04,480 --> 00:26:08,439 Speaker 3: understand how AI would just create a different emotional dynamic, 526 00:26:08,520 --> 00:26:09,320 Speaker 3: because it's... it's true. 527 00:26:09,359 --> 00:26:12,040 Speaker 1: I mean, like, human doctors don't have a ton of time. 528 00:26:12,080 --> 00:26:13,560 Speaker 3: They don't... they don't get to spend a lot of 529 00:26:13,600 --> 00:26:15,280 Speaker 3: time with you. And, as a matter of fact, I 530 00:26:15,320 --> 00:26:17,639 Speaker 3: guess I would say that, like, there is not... 531 00:26:17,680 --> 00:26:19,120 Speaker 3: there is not an amount of time that would feel 532 00:26:19,119 --> 00:26:20,880 Speaker 3: like enough time when your loved 533 00:26:20,680 --> 00:26:22,400 Speaker 1: one is unwell. 534 00:26:24,359 --> 00:26:26,280 Speaker 4: I think there's a lot, too, about how, you know, 535 00:26:26,359 --> 00:26:29,040 Speaker 4: we have a rotation of doctors, too, that you have 536 00:26:29,160 --> 00:26:32,879 Speaker 4: to constantly repeat yourself to, so you never get the new 537 00:26:33,000 --> 00:26:37,280 Speaker 4: question out anyway. Which, like... I know... recently, 538 00:26:37,760 --> 00:26:40,200 Speaker 4: my father and my mother, we had a whole, like, 539 00:26:40,200 --> 00:26:41,639 Speaker 4: six months there where they just kept going in and 540 00:26:41,720 --> 00:26:42,520 Speaker 4: out of the hospital, and, like, what
541 00:26:42,600 --> 00:26:43,159 Speaker 1: are we doing here? 542 00:26:43,520 --> 00:26:45,359 Speaker 4: But when we would be there for more than a 543 00:26:45,480 --> 00:26:47,399 Speaker 4: day or so, the fact that we had to, like, 544 00:26:47,720 --> 00:26:50,119 Speaker 4: go around and get a new doctor or a 545 00:26:50,200 --> 00:26:52,359 Speaker 4: new... and we know they have to go home, we 546 00:26:52,400 --> 00:26:55,360 Speaker 4: want them to be rested, but the frustration of having 547 00:26:55,359 --> 00:26:57,960 Speaker 4: to repeat the same things over and over again and 548 00:26:58,040 --> 00:27:01,080 Speaker 4: feeling like no one's listening, when you're like, no, but 549 00:27:01,160 --> 00:27:03,760 Speaker 4: this is a really big concern and y'all are not addressing it. Instead, 550 00:27:03,800 --> 00:27:06,159 Speaker 4: you're just looking at test results and not talking to us. 551 00:27:06,560 --> 00:27:09,439 Speaker 1: So what do we do here? I will be here 552 00:27:09,480 --> 00:27:12,760 Speaker 1: all day. Well, let me just say, like, I know 553 00:27:12,960 --> 00:27:14,040 Speaker 1: exactly how you feel. 554 00:27:14,400 --> 00:27:16,720 Speaker 3: When my dad was hospitalized, for a big chunk of that 555 00:27:16,840 --> 00:27:21,120 Speaker 3: he was nonverbal, and so the only way that information 556 00:27:21,240 --> 00:27:23,320 Speaker 3: could be communicated was me saying it. And, 557 00:27:24,080 --> 00:27:27,800 Speaker 1: again, doctors should get to go home. I want them rested, 558 00:27:28,320 --> 00:27:28,720 Speaker 1: all of that. 559 00:27:29,160 --> 00:27:31,680 Speaker 3: But one of the things that made it very frustrating 560 00:27:31,720 --> 00:27:34,080 Speaker 3: for me was having to explain what was, like, a 561 00:27:34,720 --> 00:27:38,640 Speaker 3: very traumatic situation multiple times a day to multiple people, 562 00:27:38,760 --> 00:27:39,639 Speaker 3: multiple specialists. 563 00:27:42,480 --> 00:27:45,720 Speaker 1: It was like, yes... I'll just say, yes, it 564 00:27:45,920 --> 00:27:48,240 Speaker 1: really sucks, and it's frustrating. 565 00:27:48,320 --> 00:27:50,919 Speaker 3: On top of the fact that you're... you're already 566 00:27:50,960 --> 00:27:53,600 Speaker 3: having a bad time, there's feeling like you 567 00:27:53,680 --> 00:27:55,560 Speaker 3: need to repeat yourself over and over and over again, 568 00:27:55,600 --> 00:27:57,600 Speaker 3: which you do. They need the information, they need to 569 00:27:57,600 --> 00:28:01,000 Speaker 3: hear the story a million times to understand. But yeah, 570 00:28:01,040 --> 00:28:04,040 Speaker 3: in those situations, it does not surprise me 571 00:28:04,160 --> 00:28:10,520 Speaker 3: that people would turn to AI when that feels so frustrating. 572 00:28:10,640 --> 00:28:13,080 Speaker 4: Right, and again, it kind of comes back to that 573 00:28:13,160 --> 00:28:17,440 Speaker 4: whole level of trying to find someone who can be 574 00:28:17,520 --> 00:28:21,680 Speaker 4: empathetic at a time of such deep anxiety and stress. 575 00:28:22,080 --> 00:28:25,080 Speaker 4: There's so much in this conversation when we talk about 576 00:28:27,960 --> 00:28:30,720 Speaker 4: bedside manner. I mean, that's definitely one of those big 577 00:28:30,760 --> 00:28:33,600 Speaker 4: conversations that comes up a lot. I've seen TikToks about
I've seen tiktoks about 578 00:28:33,600 --> 00:28:37,600 Speaker 4: it repeatedly about doctors and like people going after healthy 579 00:28:37,640 --> 00:28:39,680 Speaker 4: and professionals, not because they're not doing their job, but 580 00:28:39,760 --> 00:28:42,200 Speaker 4: because of the lack of empathy. It feels like even 581 00:28:42,200 --> 00:28:44,840 Speaker 4: though it's professionalism to be one thing for another, Like 582 00:28:44,960 --> 00:28:47,760 Speaker 4: there's this level of like needing someone to understand you 583 00:28:48,080 --> 00:28:50,400 Speaker 4: and that anxiety instead of just being dismissive of you 584 00:28:51,240 --> 00:28:53,760 Speaker 4: in so many ways, right, I can only imagine, especially 585 00:28:53,840 --> 00:28:57,720 Speaker 4: with pregnancy in general, the level of anxiety all the 586 00:28:58,280 --> 00:29:00,560 Speaker 4: like all of the horror stories, like I have chosen 587 00:29:00,600 --> 00:29:03,160 Speaker 4: to not have a child. One of the conversations that 588 00:29:03,360 --> 00:29:05,880 Speaker 4: was because I don't know my body well enough to 589 00:29:05,960 --> 00:29:09,160 Speaker 4: understand if I did try and something would go wrong, 590 00:29:09,600 --> 00:29:13,120 Speaker 4: how would I handle this? You know? And there's so 591 00:29:13,280 --> 00:29:15,680 Speaker 4: much to this in that level of again, like with 592 00:29:15,800 --> 00:29:18,920 Speaker 4: pregnant people trying to find people to understand that fear 593 00:29:18,960 --> 00:29:21,400 Speaker 4: and especially if it does happen, if something does happen 594 00:29:21,440 --> 00:29:24,120 Speaker 4: where you lose something that you were so excited about, 595 00:29:24,640 --> 00:29:28,160 Speaker 4: that level like in trying to figure out how to 596 00:29:28,320 --> 00:29:30,840 Speaker 4: get someone to hear you and understand you and at 597 00:29:30,920 --> 00:29:33,560 Speaker 4: least give you good advice, like not just be like 598 00:29:33,640 --> 00:29:35,480 Speaker 4: I'm so sorry for your loss and walk away, you 599 00:29:35,560 --> 00:29:36,040 Speaker 4: know what I mean. 600 00:29:36,400 --> 00:29:40,280 Speaker 3: Yeah, And one of the way the article doesn't really 601 00:29:40,360 --> 00:29:42,400 Speaker 3: get into this, but when I was doing research, I 602 00:29:42,440 --> 00:29:46,360 Speaker 3: did find there's a subreddit called the Pregnancy after Loss 603 00:29:46,400 --> 00:29:52,640 Speaker 3: subreddit where so many people self report chat GPT doing 604 00:29:52,760 --> 00:29:56,479 Speaker 3: exactly that, just being the empathetic ear that they can 605 00:29:56,600 --> 00:29:59,000 Speaker 3: unload on. You know, these are people who have a 606 00:29:59,040 --> 00:30:02,840 Speaker 3: lot of understand toable anxieties about pregnancy after experiencing a 607 00:30:02,880 --> 00:30:05,800 Speaker 3: pregnancy loss, and chat Schipt has become like a trusted 608 00:30:06,080 --> 00:30:09,960 Speaker 3: that that trusted empathetic ear. One editor wrote, I used 609 00:30:09,960 --> 00:30:12,520 Speaker 3: it like a journal that talks back with my anxiety 610 00:30:12,680 --> 00:30:15,520 Speaker 3: about loss, and I have found it so helpful. I 611 00:30:15,640 --> 00:30:17,920 Speaker 3: needed to talk so much about the anxiety I was feeling, 612 00:30:18,160 --> 00:30:20,200 Speaker 3: more than I could ask my husband or friends to 613 00:30:20,320 --> 00:30:21,880 Speaker 3: listen to. 
And it was such a help to be 614 00:30:21,960 --> 00:30:24,480 Speaker 3: able to unload on ChatGPT as much as I needed. 615 00:30:24,760 --> 00:30:26,880 Speaker 3: And I think that's exactly what you're talking about, this 616 00:30:27,080 --> 00:30:30,480 Speaker 3: feeling of, you know, it's good to have a 617 00:30:30,520 --> 00:30:33,640 Speaker 3: community when you're going through anything, pregnancy or any other 618 00:30:33,720 --> 00:30:37,800 Speaker 3: kind of tough medical thing. But I understand the feeling 619 00:30:38,000 --> 00:30:41,240 Speaker 3: of, I've been venting to my friends, to my husband, 620 00:30:41,560 --> 00:30:44,560 Speaker 3: to my doctor, so much. But I have so much 621 00:30:44,640 --> 00:30:46,520 Speaker 3: to say. I need to say things out loud. I 622 00:30:46,560 --> 00:30:48,440 Speaker 3: need to process the way I'm feeling. I need somebody 623 00:30:48,520 --> 00:30:52,640 Speaker 3: to talk back to me. And it kind of 624 00:30:52,760 --> 00:30:55,400 Speaker 3: breaks my heart, but I get it, that people feel 625 00:30:55,440 --> 00:30:57,880 Speaker 3: like the humans in their life, the humans in their community, 626 00:30:57,920 --> 00:31:00,440 Speaker 3: they have so much that they cannot ask the humans 627 00:31:00,480 --> 00:31:03,680 Speaker 3: to hold all of it. And so on the one hand, 628 00:31:04,720 --> 00:31:07,160 Speaker 3: that makes me sad. But on the other hand, I'm 629 00:31:07,200 --> 00:31:10,240 Speaker 3: happy they have resources. Like I wouldn't say, you know, 630 00:31:10,320 --> 00:31:12,600 Speaker 3: you shouldn't be using ChatGPT for this if you 631 00:31:12,720 --> 00:31:16,160 Speaker 3: feel like that is genuinely helping you process in a 632 00:31:16,200 --> 00:31:18,080 Speaker 3: way that you need, because we all, we all need 633 00:31:18,200 --> 00:31:21,080 Speaker 3: to get it out, you know, let it out. 634 00:31:21,240 --> 00:31:21,840 Speaker 1: I don't like that. 635 00:31:22,040 --> 00:31:24,960 Speaker 3: I don't like the idea that you could be, that 636 00:31:25,080 --> 00:31:27,680 Speaker 3: you could be too much for your community, have too 637 00:31:27,760 --> 00:31:30,640 Speaker 3: much inside. But I understand it, because, like, nobody wants 638 00:31:30,680 --> 00:31:33,120 Speaker 3: to be that friend that is like always wanting 639 00:31:33,160 --> 00:31:34,520 Speaker 3: to vent at you. It's, you know 640 00:31:34,560 --> 00:31:36,280 Speaker 3: what I'm saying. I don't really have the answer, but 641 00:31:36,480 --> 00:31:37,040 Speaker 3: I get it. 642 00:31:37,600 --> 00:31:41,360 Speaker 4: I think if you've been friends with women long enough, 643 00:31:41,960 --> 00:31:45,400 Speaker 4: especially in an era where many of them were so 644 00:31:45,640 --> 00:31:49,040 Speaker 4: excited to be a mother, to have a child. I mean, 645 00:31:49,120 --> 00:31:52,240 Speaker 4: I've had many a friend, even really young, like getting 646 00:31:52,320 --> 00:31:55,520 Speaker 4: pregnant and having what is a common miscarriage, and you know, 647 00:31:55,640 --> 00:31:57,880 Speaker 4: we've talked about this, that we need to normalize 648 00:31:57,880 --> 00:32:00,320 Speaker 4: it and have conversation, but that's still a loss. That's 649 00:32:00,360 --> 00:32:03,080 Speaker 4: still a loss to so many people who can get pregnant.
650 00:32:03,520 --> 00:32:05,000 Speaker 4: And I think that's one of the things that we 651 00:32:05,240 --> 00:32:08,600 Speaker 4: kind of forget in this normalizing, that it's still personal 652 00:32:08,920 --> 00:32:11,200 Speaker 4: for a lot of people; that for some, we need 653 00:32:11,240 --> 00:32:14,400 Speaker 4: to normalize it so that it does not get criminalized, 654 00:32:15,000 --> 00:32:19,080 Speaker 4: but again, it's still a moment where there was, especially 655 00:32:19,200 --> 00:32:21,560 Speaker 4: for those who had long term plans and have been 656 00:32:21,640 --> 00:32:25,720 Speaker 4: trying and really have fertility issues, or, like, questions about 657 00:32:25,720 --> 00:32:28,840 Speaker 4: that, and were so excited about it, and, like, losing 658 00:32:29,600 --> 00:32:31,560 Speaker 4: what they thought would be a new part of their 659 00:32:31,640 --> 00:32:34,920 Speaker 4: family at like six weeks is traumatic. It's a 660 00:32:35,000 --> 00:32:38,600 Speaker 4: traumatic moment, even though people will say it's common. And 661 00:32:38,720 --> 00:32:42,160 Speaker 4: I think being able to actually have something that understands 662 00:32:42,400 --> 00:32:45,360 Speaker 4: loss in that way, instead of hearing just like, oh, 663 00:32:45,680 --> 00:32:48,000 Speaker 4: it's okay, it's not your fault, which, it's true, 664 00:32:48,040 --> 00:32:50,560 Speaker 4: it's not your fault, it is common, that's not necessarily 665 00:32:50,600 --> 00:32:52,520 Speaker 4: what's going to help at that moment. And as 666 00:32:52,600 --> 00:32:55,680 Speaker 4: well intentioned as all of us are, including therapists, social workers, 667 00:32:55,720 --> 00:33:01,200 Speaker 4: all that, sometimes we're saying those things 668 00:33:01,320 --> 00:33:04,080 Speaker 4: without knowing that that's not the right thing, and having 669 00:33:04,160 --> 00:33:06,480 Speaker 4: maybe ChatGPT, in the way you described, understand 670 00:33:06,520 --> 00:33:08,959 Speaker 4: a little more of what you need, that can 671 00:33:09,080 --> 00:33:11,360 Speaker 4: make a lot of sense as to why a lot 672 00:33:11,400 --> 00:33:13,280 Speaker 4: of people who get pregnant or who have had a 673 00:33:13,400 --> 00:33:15,720 Speaker 4: pregnancy loss would have turned to that, to want to 674 00:33:15,760 --> 00:33:18,640 Speaker 4: be able to validate some of that loss again, to 675 00:33:18,720 --> 00:33:21,840 Speaker 4: go through that. Again, why subreddits and Reddit were so important, 676 00:33:21,840 --> 00:33:24,320 Speaker 4: why message boards can be so important. It is that level. 677 00:33:24,440 --> 00:33:27,960 Speaker 4: But again, humans err, they err on the side of 678 00:33:28,040 --> 00:33:30,200 Speaker 4: what they have gone through. So that's what they say, 679 00:33:30,480 --> 00:33:33,920 Speaker 4: which could be a good or bad thing. And I 680 00:33:33,960 --> 00:33:37,040 Speaker 1: mean, that's what AI is good at. 681 00:33:37,160 --> 00:33:40,240 Speaker 3: Like, I hate to say it, the 682 00:33:40,360 --> 00:33:44,440 Speaker 3: thing that makes AI dangerous for somebody who is experiencing 683 00:33:44,480 --> 00:33:47,040 Speaker 3: delusions, because it understands what it is that you want 684 00:33:47,320 --> 00:33:50,600 Speaker 3: or need to hear.
That is the same thing that 685 00:33:50,760 --> 00:33:53,720 Speaker 3: might make it a good place for someone to vent, 686 00:33:53,880 --> 00:33:56,560 Speaker 3: because it understands, like, what it is that you want 687 00:33:56,760 --> 00:33:58,600 Speaker 3: or need to hear, and it's good at spitting that 688 00:33:58,760 --> 00:34:01,000 Speaker 3: back out at you. And there are use cases where 689 00:34:01,040 --> 00:34:03,719 Speaker 3: that might be a good thing, if you need reassurance, 690 00:34:03,800 --> 00:34:05,640 Speaker 3: if you need just something. 691 00:34:06,440 --> 00:34:08,440 Speaker 1: I'm not going to say someone, because 692 00:34:08,160 --> 00:34:10,839 Speaker 3: I have a bad habit of anthropomorphizing, which is also 693 00:34:10,880 --> 00:34:11,800 Speaker 3: a word I cannot pronounce. 694 00:34:12,160 --> 00:34:13,960 Speaker 1: I have to say it a lot in my work, and 695 00:34:14,200 --> 00:34:19,320 Speaker 1: I can never say the word. Yeah, I've never tried. 696 00:34:19,440 --> 00:34:22,680 Speaker 3: Let's just not rewind and break down how I said it. 697 00:34:22,760 --> 00:34:26,440 Speaker 1: Let's just move on. But like, you know, the 698 00:34:26,760 --> 00:34:28,480 Speaker 1: thing that makes it good at understanding. 699 00:34:28,680 --> 00:34:30,800 Speaker 3: AI is very good at understanding what people want and 700 00:34:30,920 --> 00:34:32,799 Speaker 3: saying what people want to hear and spitting that back 701 00:34:32,840 --> 00:34:34,800 Speaker 3: out at them. And so there are times when that 702 00:34:34,800 --> 00:34:37,000 Speaker 3: would be useful, and there are times when that would 703 00:34:37,000 --> 00:34:39,120 Speaker 3: be dangerous. And it sounds like what people 704 00:34:39,120 --> 00:34:41,680 Speaker 3: are saying is that when you are trying to get 705 00:34:41,719 --> 00:34:45,480 Speaker 3: pregnant after experiencing pregnancy loss, or just going through pregnancy loss, 706 00:34:45,520 --> 00:34:47,240 Speaker 3: it might be a use case where it's helpful. 707 00:34:49,360 --> 00:34:52,879 Speaker 2: Yes. And before we get into sort of the more 708 00:34:53,000 --> 00:34:57,320 Speaker 2: negative aspects, perhaps, there is an instance you have in 709 00:34:57,400 --> 00:35:01,800 Speaker 2: here of it being helpful and saving someone's life. 710 00:35:02,360 --> 00:35:06,239 Speaker 3: Yes, so this was wild to me: asking 711 00:35:06,239 --> 00:35:09,279 Speaker 3: ChatGPT might have actually saved a pregnant woman's life. A 712 00:35:09,360 --> 00:35:12,520 Speaker 3: woman in her third trimester felt a tightness in her 713 00:35:12,640 --> 00:35:15,600 Speaker 3: jaw before she went to bed, and almost just for 714 00:35:15,719 --> 00:35:18,680 Speaker 3: a lark, she asked ChatGPT in this casual way, 715 00:35:19,160 --> 00:35:20,319 Speaker 3: why does my jaw feel tight? 716 00:35:20,680 --> 00:35:23,839 Speaker 1: ChatGPT was like, check your blood pressure. And then 717 00:35:23,880 --> 00:35:25,480 Speaker 1: she did, and she was like, oh, my blood pressure 718 00:35:25,560 --> 00:35:27,959 Speaker 1: is very high. And within thirty minutes, ChatGPT 719 00:35:28,239 --> 00:35:30,759 Speaker 1: was urging her to call an ambulance. She did. 720 00:35:31,160 --> 00:35:33,520 Speaker 3: She went to the hospital. Her blood pressure had climbed 721 00:35:33,560 --> 00:35:36,880 Speaker 3: to two hundred over one forty-six.
Doctors were like, 722 00:35:37,040 --> 00:35:39,680 Speaker 3: we need to deliver your baby now. And she had, 723 00:35:40,440 --> 00:35:42,520 Speaker 3: what's it called, preeclampsia, I think 724 00:35:42,520 --> 00:35:46,000 Speaker 3: that's what it's called, like high blood pressure. And doctors 725 00:35:46,120 --> 00:35:48,320 Speaker 3: said that had she gone to sleep that night, 726 00:35:48,960 --> 00:35:53,600 Speaker 3: she could have died. And so ChatGPT helped her 727 00:35:53,800 --> 00:35:56,880 Speaker 3: make the decision to take this seriously and was like 728 00:35:56,960 --> 00:35:59,360 Speaker 3: the deciding factor in deciding to call an ambulance. 729 00:35:59,600 --> 00:36:02,719 Speaker 1: Importantly, ChatGPT did not diagnose her. 730 00:36:03,120 --> 00:36:06,000 Speaker 3: It helped her make the decision to call an ambulance, 731 00:36:06,040 --> 00:36:09,880 Speaker 3: where human doctors at the hospital diagnosed her and saved 732 00:36:09,880 --> 00:36:12,000 Speaker 3: her life. And so I want to say that because 733 00:36:12,840 --> 00:36:16,080 Speaker 3: I don't want someone to hear, ChatGPT can diagnose 734 00:36:16,160 --> 00:36:18,759 Speaker 3: me with something. I think the better way to think 735 00:36:18,800 --> 00:36:21,120 Speaker 3: about it is, ChatGPT can help me get the 736 00:36:21,200 --> 00:36:24,200 Speaker 3: information to make the decision to connect with medical professionals, 737 00:36:24,280 --> 00:36:26,600 Speaker 3: who then can diagnose me and help save my life. 738 00:36:27,800 --> 00:36:32,120 Speaker 2: Yes, which is a great distinction, and something, again, I 739 00:36:32,200 --> 00:36:34,200 Speaker 2: think a lot of us have done, where you go online, 740 00:36:34,640 --> 00:36:38,560 Speaker 2: you look up, how serious is this? Sometimes it can 741 00:36:38,600 --> 00:36:42,800 Speaker 2: be hard to separate from your own panic and perhaps 742 00:36:43,000 --> 00:36:46,240 Speaker 2: hyperbolic reading of it, but you know, kind of getting 743 00:36:46,280 --> 00:36:48,400 Speaker 2: a lay of the land of, like, what could this be? 744 00:36:49,520 --> 00:36:52,959 Speaker 2: Should I be really concerned? And then going to seek 745 00:36:53,560 --> 00:36:56,439 Speaker 2: professional help if the Internet's like, oh yeah, that's really 746 00:36:56,520 --> 00:37:02,000 Speaker 2: bad. Exactly. Yes. But that does speak to some of 747 00:37:02,239 --> 00:37:04,200 Speaker 2: the downsides of this whole thing. 748 00:37:04,760 --> 00:37:08,400 Speaker 3: Yeah, because it sounds like I'm like, oh, definitely, you 749 00:37:08,440 --> 00:37:10,960 Speaker 3: don't even need a doctor, y'all, you've got ChatGPT. 750 00:37:11,120 --> 00:37:12,360 Speaker 1: That is not what I'm saying. 751 00:37:12,480 --> 00:37:15,719 Speaker 3: So let's pump the brakes, because using ChatGPT for 752 00:37:15,800 --> 00:37:18,279 Speaker 3: pregnancy advice is not without risk. 753 00:37:18,440 --> 00:37:18,879 Speaker 1: First of all, 754 00:37:19,040 --> 00:37:23,160 Speaker 3: as I said, AI is notorious for spitting out misinformation, 755 00:37:23,800 --> 00:37:26,600 Speaker 3: what are called hallucinations, which is just flat-out incorrect 756 00:37:26,600 --> 00:37:30,200 Speaker 3: information stated as fact. Like when I asked ChatGPT 757 00:37:30,320 --> 00:37:35,520 Speaker 3: about guests that I have interviewed.
It told me once 758 00:37:35,600 --> 00:37:38,319 Speaker 3: that I was known for having 759 00:37:38,400 --> 00:37:41,959 Speaker 3: interviewed Monica Lewinsky. Monica Lewinsky is one of my white 760 00:37:42,040 --> 00:37:44,719 Speaker 3: whale podcast guests, and I must have said it on 761 00:37:44,800 --> 00:37:46,920 Speaker 3: a podcast. I think somebody asked me, like, who are 762 00:37:46,920 --> 00:37:48,520 Speaker 3: the two guests you would most want to interview, and 763 00:37:48,520 --> 00:37:50,640 Speaker 3: I think I said Monica Lewinsky and Missy Elliott, and 764 00:37:51,080 --> 00:37:53,960 Speaker 3: ChatGPT has, like, spoken this into existence. 765 00:37:54,040 --> 00:37:56,759 Speaker 1: Only one problem: it never happened. But I love that 766 00:37:56,840 --> 00:37:57,840 Speaker 1: it's manifesting for me. 767 00:37:58,000 --> 00:38:02,360 Speaker 3: But that's a hallucination, just a flat-out incorrect detail 768 00:38:02,480 --> 00:38:03,200 Speaker 3: stated as 769 00:38:03,280 --> 00:38:06,000 Speaker 1: fact. And so ChatGPT is notorious for that. 770 00:38:06,200 --> 00:38:09,000 Speaker 3: One study last year found that fifty-two percent of 771 00:38:09,120 --> 00:38:13,800 Speaker 3: ChatGPT's answers contained some level of misinformation or incorrect information. So, 772 00:38:14,560 --> 00:38:17,080 Speaker 3: especially when you're talking about your health, that is something 773 00:38:17,160 --> 00:38:21,560 Speaker 3: to really, really keep in the forefront of your mind, 774 00:38:21,719 --> 00:38:25,480 Speaker 3: that we simply don't know if the information we're getting 775 00:38:25,560 --> 00:38:26,320 Speaker 3: is accurate or not 776 00:38:26,560 --> 00:38:27,800 Speaker 1: when you're using ChatGPT. 777 00:38:28,360 --> 00:38:30,600 Speaker 3: And the second is what we talked about the last 778 00:38:30,640 --> 00:38:33,719 Speaker 3: time we all were hanging out, which is privacy. I 779 00:38:33,800 --> 00:38:37,920 Speaker 3: think, Annie, you alluded to this, given how surveilled pregnancy 780 00:38:38,080 --> 00:38:40,640 Speaker 3: is right now in the United States and how criminalized 781 00:38:40,680 --> 00:38:43,359 Speaker 3: pregnancy is in the United States. You know, we've 782 00:38:43,480 --> 00:38:47,799 Speaker 3: had reports of women being criminalized for miscarriage and pregnancy loss. 783 00:38:48,440 --> 00:38:50,759 Speaker 3: That is something to keep in mind if you are 784 00:38:50,920 --> 00:38:55,080 Speaker 3: using ChatGPT as a place to ask medical questions. 785 00:38:55,239 --> 00:38:56,759 Speaker 3: My opinion is that they are not trustworthy, so, like, I 786 00:38:56,800 --> 00:38:59,840 Speaker 3: wouldn't trust ChatGPT with anything that is sensitive or 787 00:39:00,040 --> 00:39:00,760 Speaker 3: intimate about myself. 788 00:39:01,440 --> 00:39:05,160 Speaker 4: So we know, I believe, that they're one 789 00:39:05,200 --> 00:39:07,720 Speaker 4: of the companies that may have a contract with the government. 790 00:39:07,840 --> 00:39:10,600 Speaker 1: So that's correct. And anything like that, 791 00:39:10,920 --> 00:39:14,680 Speaker 4: it should be recognized immediately as, yeah, this is not 792 00:39:14,760 --> 00:39:15,880 Speaker 4: going to be good for privacy. 793 00:39:16,320 --> 00:39:16,560 Speaker 1: Yeah. 794 00:39:16,600 --> 00:39:19,759 Speaker 3: I mean, I personally do not trust OpenAI.
I 795 00:39:19,880 --> 00:39:22,960 Speaker 3: don't trust the way they move, I don't trust their leadership. 796 00:39:23,000 --> 00:39:24,920 Speaker 3: When they say things in public, 797 00:39:25,080 --> 00:39:27,600 Speaker 3: it sets off a million warning bells for me, the 798 00:39:27,680 --> 00:39:30,440 Speaker 3: way that they move. And so even the stuff that 799 00:39:30,560 --> 00:39:32,680 Speaker 3: they have said in public, I have a hard time with. 800 00:39:32,880 --> 00:39:34,080 Speaker 1: I just don't trust it. 801 00:39:34,120 --> 00:39:36,239 Speaker 3: It's a company I don't trust, and so it does 802 00:39:36,400 --> 00:39:41,120 Speaker 3: suck that when people think about AI, the first company 803 00:39:41,200 --> 00:39:43,520 Speaker 3: they usually think of is OpenAI, because they're thinking 804 00:39:43,520 --> 00:39:45,800 Speaker 3: about ChatGPT, and they definitely take up this, like, 805 00:39:46,000 --> 00:39:49,880 Speaker 3: large footprint in the AI landscape. And they are not 806 00:39:49,960 --> 00:39:53,879 Speaker 3: a company that I would say I trust, and I'll 807 00:39:53,920 --> 00:39:56,799 Speaker 3: just leave it at that. Like, I could 808 00:39:56,840 --> 00:39:58,840 Speaker 3: talk a lot about this, and a lot of it, 809 00:39:58,960 --> 00:40:00,960 Speaker 3: a lot of it, is what they have said publicly and 810 00:40:01,000 --> 00:40:03,719 Speaker 3: also just, like, what I glean from the way that 811 00:40:03,760 --> 00:40:04,840 Speaker 3: they move publicly. So. 812 00:40:05,440 --> 00:40:09,480 Speaker 2: Yeah, yes. And also other people have spoken out about 813 00:40:10,480 --> 00:40:13,080 Speaker 2: the concerns around privacy, correct? 814 00:40:13,280 --> 00:40:16,400 Speaker 3: So in the article, the journalist spoke to Tom Subak, 815 00:40:16,440 --> 00:40:18,920 Speaker 3: who is the founder of the Reimagination Lab and the 816 00:40:19,000 --> 00:40:21,960 Speaker 3: former chief strategy officer at Planned Parenthood, who said that 817 00:40:22,000 --> 00:40:24,680 Speaker 3: there are very real privacy risks when entrusting the details 818 00:40:24,719 --> 00:40:28,319 Speaker 3: of your reproductive health to ChatGPT. Subak says, could 819 00:40:28,360 --> 00:40:30,719 Speaker 3: the data be used by a hostile prosecutor who is 820 00:40:30,760 --> 00:40:34,640 Speaker 3: going on a fishing expedition for women who have had miscarriages? Absolutely. 821 00:40:34,920 --> 00:40:37,320 Speaker 3: As is the case with almost any free online platform, 822 00:40:37,440 --> 00:40:39,759 Speaker 3: you are trading some level of your anonymity and most 823 00:40:39,800 --> 00:40:44,400 Speaker 3: personal health details for use of that app. So just 824 00:40:44,480 --> 00:40:46,120 Speaker 3: another thing to keep in mind, you know. 825 00:40:46,480 --> 00:40:51,439 Speaker 2: Yeah, yeah. And as we've discussed multiple times on the show, 826 00:40:51,960 --> 00:40:56,440 Speaker 2: it's really unfortunate, because we do live in this online 827 00:40:56,480 --> 00:41:02,080 Speaker 2: space, and a lot of these free programs that 828 00:41:02,200 --> 00:41:05,680 Speaker 2: so many of us use have those huge contracts where 829 00:41:05,800 --> 00:41:09,759 Speaker 2: you were like, yes, I agree.
And also, sometimes, 830 00:41:09,840 --> 00:41:11,840 Speaker 2: I try not to do this anymore, but there was 831 00:41:11,880 --> 00:41:13,520 Speaker 2: a time when it was like, do you just want 832 00:41:13,560 --> 00:41:16,560 Speaker 2: to use your Google email just to sign in and 833 00:41:16,680 --> 00:41:19,600 Speaker 2: you can get this article? And I was like, yeah, sure. 834 00:41:20,680 --> 00:41:22,480 Speaker 2: And now I look back and I'm like, oh, yeah, 835 00:41:22,600 --> 00:41:25,520 Speaker 2: they were definitely getting a lot of my data, and 836 00:41:25,600 --> 00:41:29,160 Speaker 2: I was just trading the ease of doing that versus 837 00:41:30,680 --> 00:41:34,080 Speaker 2: the reality of what it would cost. 838 00:41:35,840 --> 00:41:36,040 Speaker 1: Yeah. 839 00:41:36,239 --> 00:41:39,160 Speaker 3: I think that you really put that so well, because 840 00:41:39,200 --> 00:41:42,480 Speaker 3: I think that is the dance that we're all navigating. 841 00:41:43,080 --> 00:41:43,520 Speaker 1: I think that 842 00:41:45,000 --> 00:41:48,120 Speaker 3: we all need to understand the trade-offs that we 843 00:41:48,280 --> 00:41:53,560 Speaker 3: make for convenience, and really in a deep 844 00:41:53,680 --> 00:41:55,800 Speaker 3: way ask ourselves, is it worth it? 845 00:41:56,040 --> 00:41:59,720 Speaker 1: Are we getting something that is worth what we are losing? 846 00:42:00,080 --> 00:42:03,080 Speaker 3: And I was the same way. Like, I probably 847 00:42:03,160 --> 00:42:06,080 Speaker 3: had some of the worst digital hygiene practices of anybody 848 00:42:06,160 --> 00:42:08,600 Speaker 3: I know in the early days of the internet, because 849 00:42:08,600 --> 00:42:10,920 Speaker 3: I was just thinking, like, I mean, remember when Facebook 850 00:42:11,040 --> 00:42:14,360 Speaker 3: came to your college? Like, it was excitement. 851 00:42:14,440 --> 00:42:20,160 Speaker 3: I could not conceptualize that this could have a negative outcome. 852 00:42:20,200 --> 00:42:22,120 Speaker 1: I just, like, wasn't there yet. 853 00:42:22,200 --> 00:42:25,040 Speaker 3: And one of the reasons I care so deeply about 854 00:42:25,080 --> 00:42:26,800 Speaker 3: this is that I think back to that and I'm like, 855 00:42:27,239 --> 00:42:29,920 Speaker 3: I was misled, I was lied to, I was hoodwinked, 856 00:42:29,960 --> 00:42:33,680 Speaker 3: and like, I gave away so much without even asking, 857 00:42:33,760 --> 00:42:35,799 Speaker 3: because I wanted to put pictures of our nights out 858 00:42:35,880 --> 00:42:38,400 Speaker 3: on the internet easily, right? Like I gave up so 859 00:42:38,640 --> 00:42:41,120 Speaker 3: much without even asking the question. And that's one of 860 00:42:41,160 --> 00:42:43,239 Speaker 3: the reasons why this is like a hill I will 861 00:42:43,320 --> 00:42:46,080 Speaker 3: die on, that we should at least be asking the question: 862 00:42:46,400 --> 00:42:48,400 Speaker 3: is this trade-off worth it? Is what I am 863 00:42:48,480 --> 00:42:51,719 Speaker 3: getting in return from these tech companies benefiting me enough 864 00:42:51,880 --> 00:42:56,840 Speaker 3: to make trading away these intimate things about my body, 865 00:42:57,360 --> 00:43:01,239 Speaker 3: my relationships, my life, the things that make us human, 866 00:43:01,440 --> 00:43:02,000 Speaker 3: worth it? 867 00:43:02,200 --> 00:43:03,319 Speaker 1: Like, that really is the question.
868 00:43:14,120 --> 00:43:18,080 Speaker 2: It's very fascinating to be in this part of the 869 00:43:18,160 --> 00:43:20,799 Speaker 2: timeline, where we had Facebook that was so exciting and 870 00:43:20,840 --> 00:43:25,920 Speaker 2: so new, and now we have ChatGPT, which feels 871 00:43:26,200 --> 00:43:28,120 Speaker 2: like a whole different beast in a way, but it 872 00:43:28,400 --> 00:43:30,160 Speaker 2: does have that same vibe of, like, what is this 873 00:43:30,280 --> 00:43:33,759 Speaker 2: going to be? What can I learn from it? 874 00:43:33,920 --> 00:43:34,800 Speaker 1: What can I ask of it? 875 00:43:34,880 --> 00:43:37,719 Speaker 2: What is the good and the bad? And that is 876 00:43:37,760 --> 00:43:41,359 Speaker 2: something that we're still wrestling with, even within 877 00:43:41,520 --> 00:43:43,680 Speaker 2: this question. That's right. 878 00:43:43,840 --> 00:43:45,640 Speaker 3: So after reading this article 879 00:43:45,680 --> 00:43:49,880 Speaker 3: that Susie posted to me on Blue Sky, I 880 00:43:50,040 --> 00:43:52,160 Speaker 3: was like, okay, so this is what people are using 881 00:43:52,280 --> 00:43:54,920 Speaker 3: ChatGPT for: pregnancy and fertility information. 882 00:43:55,160 --> 00:43:57,440 Speaker 1: Got it. But then I was like, is this a 883 00:43:57,600 --> 00:44:00,080 Speaker 1: good thing? Like, again, I mean, I 884 00:44:00,040 --> 00:44:01,960 Speaker 3: feel like I wrestled with this in the conversation that 885 00:44:02,000 --> 00:44:05,040 Speaker 3: we've had today, of, like, I'm happy that people 886 00:44:05,600 --> 00:44:07,840 Speaker 3: have a platform that offers them a resource that they 887 00:44:07,880 --> 00:44:10,160 Speaker 3: self-report helps them out, but is this a good 888 00:44:10,280 --> 00:44:11,960 Speaker 3: thing long term? So I took a look at an 889 00:44:12,040 --> 00:44:15,480 Speaker 3: article called ChatGPT: A Reliable Fertility Decision-Making Tool? 890 00:44:15,920 --> 00:44:19,240 Speaker 3: in the journal Human Reproduction. The authors were wrestling 891 00:44:19,320 --> 00:44:22,600 Speaker 3: with the question that, you know, yes, people are saying 892 00:44:22,680 --> 00:44:25,480 Speaker 3: that they feel more empowered by using ChatGPT on 893 00:44:25,560 --> 00:44:27,360 Speaker 3: their pregnancy journey, but is that a good thing? And 894 00:44:27,480 --> 00:44:30,760 Speaker 3: basically they found exactly what you might expect: that compared 895 00:44:30,840 --> 00:44:32,640 Speaker 3: to just googling stuff randomly, 896 00:44:33,040 --> 00:44:34,000 Speaker 1: yes, it is better. 897 00:44:34,239 --> 00:44:38,239 Speaker 3: The piece says, freely available versions of generative AI are 898 00:44:38,280 --> 00:44:40,720 Speaker 3: a faster and equally accurate means of gaining an answer 899 00:44:40,760 --> 00:44:44,040 Speaker 3: to a straightforward question. Furthermore, the chatbot has been shown 900 00:44:44,160 --> 00:44:48,000 Speaker 3: to equalize user search performance across different education levels, promoting 901 00:44:48,040 --> 00:44:50,640 Speaker 3: equity of access to the broad range of health literacy 902 00:44:50,680 --> 00:44:54,400 Speaker 3: capabilities that exist within the general population.
So it might, 903 00:44:54,520 --> 00:44:57,479 Speaker 3: if you're someone who, like me, doesn't know anything about 904 00:44:57,800 --> 00:45:01,000 Speaker 3: medical stuff, it might be a way to, like, make 905 00:45:01,080 --> 00:45:03,720 Speaker 3: that information more accessible and more equitable. 906 00:45:03,840 --> 00:45:04,359 Speaker 1: That's great. 907 00:45:05,160 --> 00:45:07,840 Speaker 3: But importantly, they say that because we don't know what 908 00:45:08,080 --> 00:45:13,120 Speaker 3: information ChatGPT was trained on, and because of arguments around falling into 909 00:45:13,160 --> 00:45:16,920 Speaker 3: the hype of ChatGPT, healthcare providers should be using this 910 00:45:17,040 --> 00:45:20,080 Speaker 3: information to invest in the generation of their own models 911 00:45:20,200 --> 00:45:23,440 Speaker 3: using the emerging technology and trusted data, a sentiment the 912 00:45:23,480 --> 00:45:26,960 Speaker 3: authors of that article strongly support, which I support too. 913 00:45:27,000 --> 00:45:27,879 Speaker 1: I think that makes sense. 914 00:45:27,920 --> 00:45:31,759 Speaker 3: I think that the idea of, oh well, healthcare and 915 00:45:31,880 --> 00:45:35,160 Speaker 3: health information is inaccessible for people, so they're just going 916 00:45:35,200 --> 00:45:35,880 Speaker 3: on ChatGPT, 917 00:45:36,080 --> 00:45:36,600 Speaker 1: that's fine, 918 00:45:37,360 --> 00:45:41,600 Speaker 3: I think that the next step should be that actual 919 00:45:41,800 --> 00:45:45,919 Speaker 3: medical professionals should be making their own models to give 920 00:45:46,040 --> 00:45:49,200 Speaker 3: people access to health information that is accurate, in a 921 00:45:49,239 --> 00:45:51,480 Speaker 3: way that is more accessible. They should be learning from 922 00:45:51,480 --> 00:45:53,800 Speaker 3: the fact that people are turning to ChatGPT, not 923 00:45:54,160 --> 00:45:58,279 Speaker 3: just having that be the outcome, like, okay, they're getting 924 00:45:58,320 --> 00:46:00,600 Speaker 3: what they need. Because I actually don't think they're getting 925 00:46:00,600 --> 00:46:00,960 Speaker 3: what they need. 926 00:46:01,040 --> 00:46:04,560 Speaker 1: They need actual medical information that they can trust. 927 00:46:06,560 --> 00:46:10,360 Speaker 4: I wonder how soon we'll have healthcare companies try to 928 00:46:10,400 --> 00:46:11,880 Speaker 4: take that as a money-making scheme. 929 00:46:12,800 --> 00:46:15,080 Speaker 3: Well, I mean, I can answer that question for you, 930 00:46:15,200 --> 00:46:18,279 Speaker 3: which is, girl, how soon is now? Because they're 931 00:46:18,360 --> 00:46:19,759 Speaker 3: doing it. That's happening 932 00:46:19,480 --> 00:46:20,200 Speaker 1: as we speak. 933 00:46:20,600 --> 00:46:23,560 Speaker 4: Yeah, and what are they doing? Should I be wary? 934 00:46:24,080 --> 00:46:25,919 Speaker 1: I mean, this is a little bit above 935 00:46:26,000 --> 00:46:26,640 Speaker 1: my pay grade, 936 00:46:27,000 --> 00:46:29,839 Speaker 3: but I do know that, like, before my mom's death, 937 00:46:29,880 --> 00:46:32,839 Speaker 3: she was talking about how, like, medical professionals 938 00:46:32,320 --> 00:46:35,160 Speaker 3: are really using AI a lot.
939 00:46:35,360 --> 00:46:37,680 Speaker 3: Like, one of the aspects of this conversation that 940 00:46:37,960 --> 00:46:39,960 Speaker 3: I cut, but we can talk about, is how, you know, 941 00:46:40,040 --> 00:46:43,040 Speaker 3: we're talking about using ChatGPT as a way to 942 00:46:44,000 --> 00:46:48,320 Speaker 3: give people information on their pregnancy journey. But AI, not ChatGPT, 943 00:46:48,520 --> 00:46:51,200 Speaker 1: other forms of AI, is actually helping people to get 944 00:46:51,040 --> 00:46:53,799 Speaker 3: pregnant in a literal way as well. Because doctors at 945 00:46:53,840 --> 00:46:57,400 Speaker 3: Columbia reported what they were calling the first pregnancy 946 00:46:57,560 --> 00:47:01,759 Speaker 3: using AI, using this new AI to help with 947 00:47:02,239 --> 00:47:06,800 Speaker 3: low sperm count. And so apparently male fertility challenges account 948 00:47:06,800 --> 00:47:09,759 Speaker 3: for forty percent of infertility in the US, and specifically 949 00:47:09,920 --> 00:47:12,640 Speaker 3: low sperm count is responsible for ten percent of those cases. 950 00:47:12,680 --> 00:47:14,600 Speaker 3: And it used to be that if that was the problem, 951 00:47:14,920 --> 00:47:16,799 Speaker 3: there was not much that they could do. But they 952 00:47:16,840 --> 00:47:20,719 Speaker 3: had this new AI procedure called STAR, Sperm Tracking and Recovery, 953 00:47:21,200 --> 00:47:25,680 Speaker 3: where doctors used an AI algorithm to detect sperm using 954 00:47:25,760 --> 00:47:29,120 Speaker 3: a fluidic chip that passed the semen sample through a 955 00:47:29,200 --> 00:47:32,840 Speaker 3: tiny tube on a plastic chip. And if the AI detected sperm, 956 00:47:33,400 --> 00:47:37,000 Speaker 3: even a teeny tiny amount, they could use that AI 957 00:47:37,120 --> 00:47:40,160 Speaker 3: detection to collect it and actually fertilize an egg. And 958 00:47:40,760 --> 00:47:43,360 Speaker 3: a couple that had been trying to conceive for a 959 00:47:43,480 --> 00:47:46,600 Speaker 3: very long time was able to actually conceive a child 960 00:47:46,880 --> 00:47:49,520 Speaker 3: using this AI model. And so I don't want to 961 00:47:49,560 --> 00:47:54,080 Speaker 3: make it seem like the intersection of AI and pregnancy 962 00:47:54,640 --> 00:47:58,600 Speaker 3: is all super sketchy, because it sounds like they 963 00:47:58,640 --> 00:48:01,000 Speaker 3: are using it in some ways that are interesting. 964 00:48:01,560 --> 00:48:05,640 Speaker 3: But I just know that when it comes to health 965 00:48:05,880 --> 00:48:09,360 Speaker 3: information and technology, not every actor out there is a 966 00:48:09,400 --> 00:48:13,280 Speaker 3: good actor, and it just makes me concerned. 967 00:48:13,360 --> 00:48:15,080 Speaker 1: Yeah, I do have concerns. 968 00:48:16,120 --> 00:48:20,560 Speaker 2: That is a lot to take in. Glad that you 969 00:48:20,760 --> 00:48:23,200 Speaker 2: shared it. I had not heard of that, but wow. 970 00:48:24,320 --> 00:48:26,440 Speaker 2: I mean, I'm happy it's helping people. It just sounds 971 00:48:26,480 --> 00:48:27,319 Speaker 2: so futuristic. 972 00:48:27,560 --> 00:48:29,239 Speaker 1: Yeah, the future is now.
973 00:48:30,000 --> 00:48:34,120 Speaker 2: The future is now. And you know, this isn't the 974 00:48:34,200 --> 00:48:41,000 Speaker 2: same thing as AI, but I have experienced something along 975 00:48:41,040 --> 00:48:44,440 Speaker 2: the lines of, I'm trying to get help for whatever, 976 00:48:45,120 --> 00:48:47,240 Speaker 2: and it's always like, have you talked to our chatbot? 977 00:48:47,520 --> 00:48:51,080 Speaker 2: Have you talked to our chatbot? And the chatbot is 978 00:48:51,200 --> 00:48:54,400 Speaker 2: not helping me out. It's just like, have you tried 979 00:48:55,640 --> 00:49:01,400 Speaker 2: looking on the internet? And I'm like, yes. So, you know, 980 00:49:01,560 --> 00:49:05,800 Speaker 2: there are ways we can move forward and improve, 981 00:49:06,040 --> 00:49:08,279 Speaker 2: and there are still some things that we need to 982 00:49:08,320 --> 00:49:11,360 Speaker 2: worry about when it comes to integrating this kind of 983 00:49:11,400 --> 00:49:15,480 Speaker 2: technology in a medical space. Exactly. And that study I 984 00:49:15,600 --> 00:49:18,239 Speaker 2: was referencing in the journal Human Reproduction really makes 985 00:49:18,280 --> 00:49:20,200 Speaker 2: this clear. They said that we need to be aware 986 00:49:20,560 --> 00:49:25,560 Speaker 2: of and tackle the common ethical concerns about AI, including privacy, trust, accountability 987 00:49:25,600 --> 00:49:29,040 Speaker 2: and responsibility, and bias, and that healthcare professionals have an 988 00:49:29,160 --> 00:49:32,160 Speaker 2: edge here when compared to ChatGPT, because their training 989 00:49:32,280 --> 00:49:34,719 Speaker 2: and practice makes them well versed in the delivery of 990 00:49:34,840 --> 00:49:38,520 Speaker 2: best practice care, something reserved for sentient beings, at least 991 00:49:38,600 --> 00:49:39,640 Speaker 2: for now, right? And so 992 00:49:41,120 --> 00:49:44,560 Speaker 1: it does seem true that humans might have the edge 993 00:49:44,640 --> 00:49:45,440 Speaker 1: on this one, and that 994 00:49:45,520 --> 00:49:48,560 Speaker 3: I don't think that we should be replacing human 995 00:49:49,400 --> 00:49:53,680 Speaker 3: doctors or humans in anything when it comes to AI. Like, ultimately, 996 00:49:54,840 --> 00:49:57,200 Speaker 3: you don't want to explain your problems to a chatbot. 997 00:49:57,280 --> 00:49:59,280 Speaker 3: There needs to be a human in the mix there somewhere. 998 00:50:00,440 --> 00:50:02,800 Speaker 3: As you said, Annie, you don't want to trust a 999 00:50:02,960 --> 00:50:05,640 Speaker 3: chatbot when you're really in the thick of it. I 1000 00:50:05,719 --> 00:50:07,600 Speaker 3: think people don't want to be going back and forth 1001 00:50:07,600 --> 00:50:09,200 Speaker 3: about what they know with a chatbot.
1009 00:50:41,560 --> 00:50:44,440 Speaker 3: I would say, like, the bottom line is that we 1010 00:50:44,560 --> 00:50:48,640 Speaker 3: are navigating how AI is changing these foundational things of 1011 00:50:48,719 --> 00:50:51,040 Speaker 3: what it means to be human, and that includes pregnancy 1012 00:50:51,080 --> 00:50:53,320 Speaker 3: and childbirth. And so we've got to go into it 1013 00:50:53,800 --> 00:51:01,000 Speaker 3: with a clear understanding of the ethical, safety, privacy, environmental, and bias concerns, 1014 00:51:01,120 --> 00:51:04,279 Speaker 3: like all of these different risks that come with the 1015 00:51:04,360 --> 00:51:07,000 Speaker 3: way that AI is changing this. We should really, really 1016 00:51:07,160 --> 00:51:09,520 Speaker 3: be grappling with that instead of just getting lost in 1017 00:51:09,920 --> 00:51:12,839 Speaker 3: AI hype, that this is going to be automatically good 1018 00:51:12,920 --> 00:51:14,920 Speaker 3: for everybody and this is the future, right? So I 1019 00:51:15,000 --> 00:51:18,239 Speaker 3: think as long as we're proceeding with a clear understanding 1020 00:51:18,280 --> 00:51:20,400 Speaker 3: of that, like, that is the way to 1021 00:51:20,480 --> 00:51:22,040 Speaker 3: go forward in the future, in my opinion. 1022 00:51:24,760 --> 00:51:28,160 Speaker 2: Yes, agreed, and thank you so much for tackling this. 1023 00:51:29,640 --> 00:51:33,640 Speaker 2: Thank you so much to Susie to Go for suggesting it. 1024 00:51:35,000 --> 00:51:37,759 Speaker 2: We always love talking to you, Bridget. Where can the 1025 00:51:37,840 --> 00:51:38,759 Speaker 2: good listeners find you? 1026 00:51:39,120 --> 00:51:41,680 Speaker 3: Well, you can find me at my podcast, There Are No Girls on the Internet, all about 1027 00:51:41,680 --> 00:51:44,600 Speaker 3: the intersection of tech and identity and social media. So 1028 00:51:44,600 --> 00:51:46,600 Speaker 3: if you want to have more conversations like this, check 1029 00:51:46,719 --> 00:51:49,560 Speaker 3: us out. You can find me on Instagram at Bridget 1030 00:51:49,640 --> 00:51:52,000 Speaker 3: Marie in DC. You can find me on TikTok at Bridget 1031 00:51:52,080 --> 00:51:54,759 Speaker 3: Marie in DC. You can find me on YouTube at There 1032 00:51:54,800 --> 00:51:57,960 Speaker 3: Are No Girls on the Internet. Yeah, hang out with me. 1033 00:51:58,920 --> 00:52:02,120 Speaker 3: That sounded pathetic, but you know what I mean, it's 1034 00:52:02,200 --> 00:52:02,840 Speaker 3: not pathetic. 1035 00:52:03,120 --> 00:52:04,680 Speaker 2: Bridget is lovely, and I 1036 00:52:04,640 --> 00:52:05,560 Speaker 4: would love to hang out with you. 1037 00:52:06,360 --> 00:52:08,640 Speaker 2: We always, yeah, I'd love to hang out with you. 1038 00:52:08,960 --> 00:52:10,600 Speaker 1: Oh, the pleasure is always all mine. 1039 00:52:12,280 --> 00:52:15,360 Speaker 2: Well, until next time, listeners, go check out all of 1040 00:52:15,400 --> 00:52:17,440 Speaker 2: that stuff if you haven't already. If you would like 1041 00:52:17,480 --> 00:52:19,520 Speaker 2: to contact us, you can; email us at hello at 1042 00:52:19,520 --> 00:52:21,360 Speaker 2: stuff we never told you dot com.
You can find us 1043 00:52:21,400 --> 00:52:23,640 Speaker 2: on Blue Sky at mom stuff podcast, or on Instagram and 1044 00:52:23,719 --> 00:52:25,680 Speaker 2: TikTok at stuff we never told you, or find us 1045 00:52:25,800 --> 00:52:27,200 Speaker 2: on YouTube, and we have a book you can get 1046 00:52:27,239 --> 00:52:29,160 Speaker 2: wherever you get your books. Thanks as always to our 1047 00:52:29,200 --> 00:52:31,879 Speaker 2: super producer Christina, our executive producer Maya, and our contributor Joey. 1048 00:52:32,040 --> 00:52:34,440 Speaker 2: Thank you, and thanks to you for listening. Stuff Mom Never Told 1049 00:52:34,480 --> 00:52:36,160 Speaker 2: You is a production of iHeartRadio. For more podcasts from 1050 00:52:36,160 --> 00:52:37,600 Speaker 2: iHeartRadio, you can check out the iHeartRadio 1051 00:52:37,640 --> 00:52:39,920 Speaker 2: app, Apple Podcasts, or wherever you listen to your favorite shows.