Speaker 1: I started to realize that not being an expert isn't a liability, it's a real gift. If we don't know something about ourselves at this point in our life, it's probably because it's uncomfortable to know.

Speaker 2: If you can die before you die, then you can really live. There's a wisdom at death's door. I thought I was insane. Yeah, and I didn't know what to do because there was no internet.

Speaker 1: I don't know, man. I'm like, I feel like everything is hard.

Speaker 3: Hey, y'all, my name is Kat. I'm a human first and a licensed therapist second. And right now I'm inviting you into conversations that I hope encourage you to become more curious and less judgmental about yourself, others, and the world around you.

Speaker 2: Welcome to You Need Therapy.

Speaker 1: Hi guys, and welcome to a new episode of the You Need Therapy podcast. My name is Kat, I am the host, and if you are new to You Need Therapy, I always like to remind people, before we get into the meat of what we're talking about, that although this podcast is called You Need Therapy, I am a licensed therapist.
Speaker 1: This does not serve as a replacement or a substitute for any actual mental health services, but we always hope that it can help you somehow, some way, wherever you are on your journey to wherever you're going. And this episode is a little bit of a different one. It's just me. You get to just hear my voice the whole episode, and it's also a little bit of a wild ride, or it was a wild ride for me. I was a little bit scared. There was a lot of fear. There was a lot of shock that came over me as I did some of the research for this episode, and at one point I had to say, okay, enough is enough. I've gone down the black hole far enough, and it's time for me to work my way back up to the top, or I don't know what's going to happen. And this episode is a prerequisite to, honestly, what I came to realize could be a docuseries.
Speaker 1: Now, I'm not going to be the one that does that, but I really think, or I really wonder, and I was wondering as I was doing this research, if this topic is going to be the subject of a docuseries on a Netflix or a Hulu or one of those streaming services in the future, because I think those are getting more popular, and this is such a popular thing right now that I just, I don't know. Maybe it's because I just watched the Hillsong documentary on Hulu and I'm like, huh, never would have seen that coming ten years ago, but now I do see those things coming. If you haven't seen that, it's very good. Now, I will say, before we get into it: usually I like to come on here, and I like this podcast to be more of a kind of platform to spread hope and share helpful information and tools that promote a more positive outlook on some of the complexities of life. And I like to offer insight that might be helpful for y'all and could help you understand yourselves and the world better. That isn't exactly what we're going to do today.
Speaker 1: A couple of years ago, I read a book that some of you may remember me talking about, because I talked about it a lot, and I did a whole episode on some of the stuff I learned from it. The book was called Cultish and was written by a woman named Amanda Montell, who actually hosts a podcast called Sounds Like a Cult. It's very good, and the book was incredible. I cannot recommend it enough. The basis of the book was how language shapes our experiences and how cults and cult-like groups use language to create their followings. Since the term cult is actually quite relative, and there isn't really objective criteria to identify what is and what isn't a cult, what the book talked about is the varying levels of groups, from the ones that come to your mind when you think "cult" to the ones that you might be involved in that you just think are cool. It talked about the varying levels of groups and followings that reached these fanatic levels and how they ended up getting there. And this book added to
my already somewhat jaded, I think, outlook on social media helpers. You guys, if you've been here for a while, you know how I feel about that, me being one of those people. And this led me to having a little bit of a crisis of my own. I was contemplating: how do I promote the information that I think is helpful, while also maintaining a business and staying within my ethical walls? When helping others is mixed with money and business and power, from what I have found for myself and a lot of the things that I've seen, it's nearly impossible to maintain any sense of altruism. So you really have to be careful to continue to check yourself and evaluate yourself, to make sure you're staying in your morality and staying within your value system. And when helping others can be monetized, and then we can scale that monetization, there tends to be a slippery slope that can lead to not only ethical dilemmas but, in a lot of places, malpractice. And we saw this with the rise of telehealth companies like Cerebral, and I don't know if you guys have heard of that.
Speaker 1: It grew really fast, especially when COVID hit. And it started as a seemingly innocent way for people to get medical attention and get what they need, like prescriptions, and to see telehealth doctors easily and affordably. It started innocently, and then it turned into a company that was way more focused on marketing than on the quality of their clinical work and actually helping others. And this resulted in the hiring of hundreds of clinicians without proper training and credentials, and the overprescribing of stimulants, which, interestingly enough, surpassed opioids as the largest drug class of illicit drug transactions, which is a big deal. And this company was prescribing them like they were, you know, sour gummy worms. I actually was reading an article in the Wall Street Journal recently that quoted one of their potential investors, who passed on the opportunity to invest, as saying, "We wanted to back a team in mental health committed to clinical quality. Ultimately, we perceived their focus to be more in marketing," which ended up in a lot of controversy. And you can look that up.
Speaker 1: That's another dark hole that we might explore another day. But I was just using that as an example of how, when money is mixed with helping and scaling and capitalism, it really makes you have to constantly check your ethics and your morality and your values, because it's easy to get really motivated by money, and that bends and shapes and moves and distorts the view of what you're actually doing and your mission. Now, back to Cultish. One of the names mentioned in this book was one that I had heard a lot, but I never really gave him much attention. And this person's name was Joe Dispenza. Now I'm sure some of you are like, oh my gosh, I know that guy. I love that guy. His meditations are awesome. Oh my gosh, he's wonderful. What are you gonna say about him? That makes sense. He's extremely popular right now, and I can see from the outside why and how he is so popular. So why would he be in this book?
Speaker 1: And I think where I want to start is explaining, to those of you who were like me and might have heard of his name but didn't really know who he was, who this guy is. So Joe Dispenza is a self-proclaimed researcher of epigenetics, quantum physics, and neuroscience, and he currently, I just checked this, has two point seven million followers on Instagram, which is a lot of followers. He sounds very fancy when you read those words: epigenetics, quantum physics, neuroscience. My brain goes, wow, he's so smart and educated. I wonder where he was educated. Well, this is interesting. If you go to his website, or the internet in general, you will be very hard-pressed to find any information about Joe himself, which I find odd, so odd that I felt like I was missing something. Was I being dumb? I'm like, of course there has to be something on his website that gives a bio, or an about page, that tells us where this guy came from and why we should trust him. Instead,
all I could find on his website, and I'm open to being wrong still, but I really spent a lot of time on that website, was a tab at the top titled "Proof" that included a lot of, in quotes, "research articles" that were created by Joe or his team themselves, or summaries of research, but not a lot of real academic or peer-reviewed anything. It was just pretty general information that looked really legit even when I read it. It just didn't really say a lot. And on the mission page of his website, you also won't find much about Joe. What you will find is these stories of transformation that look more like clickbait for the vulnerable than actual scientific evidence or explanation of what Joe is selling. Some of the titles were "She Went All In and Her Depression Disappeared," "In a Selfless Act of Love, Her Lung Disease Was Healed," and "He Regenerated His Jawbone and Himself," which, again, very clickbaity, right? And I want to read one of the stories of transformation on Joe's website.
Speaker 1: This is quoted from the website: "After suffering with many illnesses throughout her life, Yvonne was already deep into Dr. Joe's work when she received the toughest diagnosis of all: metastatic ovarian cancer. When her doctor performed surgery, though, he was mystified; the results didn't match the diagnosis. He wanted to pursue further treatment, but Yvonne was convinced she had found the answer to healing and she would continue the work on her own. Two years later, her oncologist had to agree that the cancer was gone, and as she recounts her story three years later, Yvonne is still cancer free." So this doesn't really say much. It basically makes you think that whatever this person did with Joe Dispenza cured her cancer. "Doctor" Joe Dispenza, which we will get to later. And Joe hosts a lot of workshops and retreats. It's one of the main selling points on his website; it's what he's known for. And in these retreats, he claims to heal genetic disorders through the power of belief, and at one convention, he claims to have helped a woman, this woman named Petra, regain her eyesight.
Speaker 1: He was quoted saying she could do surgery and drugs, but it wouldn't really change her gene expression. Instead, he believed that by believing her vision could return, this woman was able to change her genetic makeup in a way that would allow her to regain her sight. Now, I read about this in an article, and I was like, okay, well, haters of Joe could say anything on the internet, so let me go do some research. Well, I found the video on YouTube of this woman at the convention, along with Joe, talking about this experience. What I didn't find was any research about this. I didn't find any clinical trials about what he's doing. I couldn't find anything other than this video where he and this woman claimed that this had been done. Which is confusing, because if this happened, why wouldn't we want everybody to know about it, because we could help other people? But also, why should we believe what you're saying? Like, why wouldn't you want to give us some more information on this?
Speaker 1: And I watched the video, like I said, and he explained how this can be done to a large group of people, and I'm including myself in this group even though I wasn't there in the video, that don't have the education or the real understanding of gene expression, genetics, the medical world. So he explains it in a way that makes the average person able to understand what he's saying, but he's not really saying anything at all. There were a lot of just, like, fancy words, something that I like to call word salad, that don't really mean anything. But at the same time, as I'm listening, it's so charismatically presented, and he's such a good speaker in that way, that I'm like, oh, okay, that sounds legit, that sounds like science, that sounds like something. Although actually, at the end of it, I was like, okay, now I'm just confused. But for somebody who wants to believe this, I can very easily understand why I would listen to that and say, there's the proof right there. He just explained it to
us. And what he was saying, as my brain was trying to do gymnastics to understand it, was that she was born again in the same life, during this meditative magnetic field that his followers had created during this retreat, and that's how she regained her eyesight: because she was born again in the same life. Now, this is very curious to me, because, like I said before, if this man can heal this woman's blindness, why are we not talking about this? Why are we not doing trials on this? Why are we not trying to get this information to legitimate doctors? Why are we not using this, if this is what he says it is? If this is true and real, I just don't know why we aren't curing everybody with cancer in this way, because if it's that easy, I would love to be able to help some people in my life that have some of these diseases that they go through mainstream medicine to find healing from.
Speaker 1: Now, Joe Dispenza also charges thousands of dollars for people to come to these retreats, which, from what I can see, appear more like performances and shows, if you will. And they seem to take advantage of the average person's ability to understand science and their vulnerability toward the claims they make. Maybe there are a lot of people who have been burned by mainstream medicine, or other things haven't worked, and so they turn to this as a last resort. And I do believe, because I don't want people to think that I'm just hating on him to hate him, I really wish what he was saying were true, because it would be awesome for the world. I do believe there is power in positivity. There is power in believing something. If we don't believe something is able to work, it really does hurt our chances of that actually working. So I do believe that our attitude does shape some of our suffering, some of our healing. I do believe meditation is helpful. I mean, we have evidence to prove that, just not to the extent that this man is proclaiming.
Speaker 1: And I know that I'm one to be skeptical, so I want you guys to know that this is coming from somebody who is known to be skeptical. However, my skepticism here, I believe, comes with good reason, because just because someone says something is true in a really charismatic way, it doesn't mean I should trust them. And this is something that really gets me. If you want to find Joe's credentials, you will need a lot of patience and a web browser with a magnifying glass, because they aren't something that it appears Joe wants you to see. And I find this very curious. I have my degrees hanging on the wall in my office. I have my background listed on my website in plain sight. I want people to know where I went to school. I want people to know that I'm licensed. I want people to know that I'm regulated. I want people to know that I did the things that I needed to do and studied the things that I needed to study in order to do the work that I'm doing. I think that is very helpful for people that might become clients, and it's just helpful to create safety in our field.
Speaker 1: It helps people understand what gives you the right to do the work that you're doing. It builds trust, and it's something that I believe we are allowed to be proud of, too. I worked really hard to get to where I am. What I could find on Joe is that he holds a Bachelor of Science degree with an emphasis in neuroscience. I found that in some places, but some places have just said Bachelor of Science degree. And he is a Doctor of Chiropractic, for which he went to this place called Life University. There's a lot of controversy around that university, partly because they lost their accreditation because of their illegitimate teaching practices at some point. I didn't do a lot of research on that, but it did come up in several articles that I read. But this is what I'm finding: he has a Bachelor of Science degree, and he is a chiropractor. His, in quotes, "postgraduate training," and I don't really know what that term exactly means, includes the fields of neuroscience, neuroplasticity, quantitative electro...
I don't even know how to say this word: electroencephalogram measurements, epigenetics, mind-body medicine, and brain/heart coherence. I don't know what that means. I don't know what he did for his training. He could have just read a couple of books; that could mean anything, is my point. And he used a lot of fancy words. Again, I could read some books on neuroscience and say that I did postgraduate training in neuroscience as well. So why wouldn't we want to, why wouldn't he want us to know the legitimacy of that? Because what I can gather is that what he has done does not equal neuroscientist. Also, back to his Bachelor of Science degree: I don't know what it is in. It could be in business, it could be in fashion merchandising, it could be in nutrition. I don't know what it is in, and I find it odd that he wouldn't want us to know that. So let's go back to some of what Joe does along with these retreats, which, again, cost thousands of dollars. If that is what you want to spend your money on, we'll get to that. But that's one thing that he promotes and is known for.
335 00:18:59,280 --> 00:19:03,399 Speaker 1: He also teaches an online course at the Quantum University, 336 00:19:03,600 --> 00:19:06,400 Speaker 1: which I had never heard of before. There's another man 337 00:19:06,480 --> 00:19:11,360 Speaker 1: named Bruce Lipton that also teaches at this university, and 338 00:19:11,920 --> 00:19:15,760 Speaker 1: he's a biologist who believes cells are reprogrammable through the 339 00:19:15,800 --> 00:19:16,520 Speaker 1: power of God. 340 00:19:16,760 --> 00:19:18,399 Speaker 2: And this university is very confusing and 341 00:19:18,480 --> 00:19:26,280 Speaker 1: advertises bachelor's, master's, PhD, and doctorate programs in holistic, natural, 342 00:19:26,960 --> 00:19:31,480 Speaker 1: and integrative medicine. Now, none of this is actually accredited 343 00:19:31,520 --> 00:19:34,840 Speaker 1: by any agency that's recognized in the United States. 344 00:19:34,920 --> 00:19:37,280 Speaker 2: So that means you can 345 00:19:37,160 --> 00:19:39,960 Speaker 1: go to the school and get this bachelor's degree, but 346 00:19:40,160 --> 00:19:44,280 Speaker 1: it doesn't really mean anything to anybody other than this 347 00:19:44,440 --> 00:19:48,240 Speaker 1: school and the boards that are created around what the 348 00:19:48,280 --> 00:19:49,040 Speaker 1: school teaches. 349 00:19:49,280 --> 00:19:49,800 Speaker 2: Let's say I 350 00:19:49,760 --> 00:19:52,919 Speaker 1: went to the University of Alabama, and I did a 351 00:19:53,000 --> 00:19:56,119 Speaker 1: year of training there or a year of school there, 352 00:19:56,160 --> 00:19:58,480 Speaker 1: and then I said, I want to transfer to the University 353 00:19:58,480 --> 00:20:02,399 Speaker 1: of Tennessee. Your credits most likely are going to transfer 354 00:20:02,520 --> 00:20:04,800 Speaker 1: over to that other school because it is an 355 00:20:04,800 --> 00:20:11,040 Speaker 1: accredited school recognized by the United States. 
Nothing is recognized 356 00:20:11,080 --> 00:20:13,240 Speaker 1: at this school. So if I go and I study here, 357 00:20:13,600 --> 00:20:16,200 Speaker 1: it doesn't really mean anything. I would have to start over. 358 00:20:16,359 --> 00:20:20,520 Speaker 1: And again, it does not hold any value to anybody 359 00:20:20,960 --> 00:20:22,120 Speaker 1: other than the people that are 360 00:20:22,080 --> 00:20:23,600 Speaker 2: associated with this school. 361 00:20:23,760 --> 00:20:27,520 Speaker 1: At this school, both Joe Dispenza and this guy, Bruce 362 00:20:27,560 --> 00:20:31,080 Speaker 1: Lipton, teach their students that DNA is controlled through the 363 00:20:31,119 --> 00:20:33,920 Speaker 1: power of thought and that each of us is able 364 00:20:33,920 --> 00:20:38,200 Speaker 1: to alter our genetics through our minds. Now, again, it'd 365 00:20:38,240 --> 00:20:40,639 Speaker 1: be pretty cool if this was true, and if this 366 00:20:40,760 --> 00:20:43,000 Speaker 1: is true, I really want to listen and I want 367 00:20:43,040 --> 00:20:46,080 Speaker 1: more information about this. I would love to read and 368 00:20:46,119 --> 00:20:48,399 Speaker 1: hear and learn about this, but I can't seem to 369 00:20:48,440 --> 00:20:53,639 Speaker 1: find where anybody's getting this from. The "scientific research," and 370 00:20:53,640 --> 00:20:56,159 Speaker 1: I'm doing air quotes around that, on his website is 371 00:20:56,200 --> 00:20:59,080 Speaker 1: either studies that he did himself with his own team, 372 00:20:59,720 --> 00:21:03,480 Speaker 1: or just things that don't really say much. There's 373 00:21:03,640 --> 00:21:08,680 Speaker 1: one that said that meditation does aid in the progression 374 00:21:08,720 --> 00:21:13,639 Speaker 1: of certain illnesses and mental health struggles. Well, yeah, we 375 00:21:13,760 --> 00:21:16,199 Speaker 1: know that. 
Well, I know that, and I believe that, 376 00:21:16,520 --> 00:21:19,240 Speaker 1: but that doesn't mean that we can reprogram. We're going 377 00:21:19,240 --> 00:21:21,360 Speaker 1: from A to Z, and I want to know what's 378 00:21:21,400 --> 00:21:23,240 Speaker 1: in the middle. So this brings me back to the 379 00:21:23,320 --> 00:21:26,560 Speaker 1: question that I posed earlier. 380 00:21:26,640 --> 00:21:29,119 Speaker 1: Why was his name mentioned in this book? Well, I 381 00:21:29,160 --> 00:21:32,040 Speaker 1: assume most of you guys could take a really good 382 00:21:32,160 --> 00:21:36,240 Speaker 1: educated guess now based on what I have just said 383 00:21:35,960 --> 00:21:37,080 Speaker 2: in explaining who he is. 384 00:21:37,720 --> 00:21:39,560 Speaker 1: But the reason he's in this book is because he 385 00:21:39,640 --> 00:21:43,480 Speaker 1: has created a cult-like following of people by creating 386 00:21:43,560 --> 00:21:48,920 Speaker 1: an illusion of intelligence and credibility through co-opting scientific language, 387 00:21:49,440 --> 00:21:51,920 Speaker 1: which is becoming more and more of a problem in 388 00:21:52,000 --> 00:21:57,040 Speaker 1: our communities. And some of that is actually the 389 00:21:57,760 --> 00:22:01,440 Speaker 1: motivator in the series that Tara Booker and I did 390 00:22:01,840 --> 00:22:04,399 Speaker 1: and will continue to do, where we talk about the 391 00:22:04,400 --> 00:22:08,120 Speaker 1: difference between real mental health terms and how they are 392 00:22:08,800 --> 00:22:12,280 Speaker 1: shifted in pop culture. People take these terms and they 393 00:22:12,280 --> 00:22:16,280 Speaker 1: distort them to use as a basis of credibility. They make 394 00:22:16,359 --> 00:22:18,159 Speaker 1: you sound smart, and then people are like, ooh, I 395 00:22:18,160 --> 00:22:22,240 Speaker 1: want to follow that person. 
So a woman named Nicole 396 00:22:22,320 --> 00:22:27,240 Speaker 1: Karlis interviewed the author of the book Cultish, Amanda Montell, 397 00:22:27,480 --> 00:22:31,640 Speaker 1: and in this interview, I'm going to take a quote 398 00:22:31,880 --> 00:22:35,240 Speaker 1: from Amanda, the author of Cultish, and just read it 399 00:22:35,280 --> 00:22:38,199 Speaker 1: to you because it says so much good stuff in it. 400 00:22:38,720 --> 00:22:42,240 Speaker 2: She said, and this is Amanda, the author of Cultish, speaking: 401 00:22:42,359 --> 00:22:45,880 Speaker 1: Co-opting technical terms from scientific fields and giving them 402 00:22:45,960 --> 00:22:50,399 Speaker 1: new metaphysical meanings is something that all of history's most 403 00:22:50,480 --> 00:22:55,600 Speaker 1: notorious New Age leaders, from Marshall Applewhite to L. Ron Hubbard, 404 00:22:55,680 --> 00:22:59,280 Speaker 1: have done. This is what New Age groups have always done. 405 00:22:59,400 --> 00:23:03,320 Speaker 1: They combine scientific language or language from the DSM, and 406 00:23:03,400 --> 00:23:06,640 Speaker 1: the DSM is the Diagnostic and Statistical Manual of Mental 407 00:23:06,680 --> 00:23:12,600 Speaker 1: Disorders, like psychological language, with spiritual, mystical, metaphysical language 408 00:23:12,640 --> 00:23:15,960 Speaker 1: in order to create this impression that they are tapped 409 00:23:16,000 --> 00:23:20,520 Speaker 1: into a power higher than science. And what someone like 410 00:23:20,600 --> 00:23:25,119 Speaker 1: Joe Dispenza does, which is particularly grating and harmful, is 411 00:23:25,160 --> 00:23:29,120 Speaker 1: he will co-opt terms from astrophysics. He'll talk about 412 00:23:29,200 --> 00:23:32,080 Speaker 1: quantum fields. And again, his credential is he has a 413 00:23:32,080 --> 00:23:36,919 Speaker 1: degree in chiropractic from a university called Life University. 
He 414 00:23:37,040 --> 00:23:40,280 Speaker 1: is a joke, but he will basically use really complex 415 00:23:40,400 --> 00:23:44,520 Speaker 1: terms that are above the average follower's head. And because 416 00:23:44,520 --> 00:23:48,000 Speaker 1: he uses them with such confidence, and he's the picture 417 00:23:48,080 --> 00:23:50,679 Speaker 1: of that type of person that we would expect to 418 00:23:50,720 --> 00:23:55,120 Speaker 1: know about astrophysics, aka a middle-aged, balding white man, 419 00:23:55,440 --> 00:23:58,120 Speaker 1: the average follower is not going to fact-check 420 00:23:58,200 --> 00:24:01,320 Speaker 1: that. You're just scrolling through Instagram or you're just 421 00:24:01,359 --> 00:24:04,720 Speaker 1: surfing the internet with this overload of information that you're 422 00:24:04,760 --> 00:24:08,040 Speaker 1: getting, and you're not going 423 00:24:08,080 --> 00:24:11,320 Speaker 1: to fact-check every little thing. It would take you all day. 424 00:24:11,640 --> 00:24:14,040 Speaker 1: It would be like a full-time job. And study 425 00:24:14,080 --> 00:24:17,560 Speaker 1: after study shows that misinformation spreads more quickly on the 426 00:24:17,600 --> 00:24:22,000 Speaker 1: Internet than true stories, especially on Twitter, and it's really 427 00:24:22,000 --> 00:24:26,520 Speaker 1: difficult to differentiate. False information feels more novel, 428 00:24:26,800 --> 00:24:29,840 Speaker 1: and we're more likely to spread or retweet or reshare 429 00:24:29,920 --> 00:24:33,720 Speaker 1: information that feels new because it makes us feel, again, 430 00:24:34,000 --> 00:24:37,080 Speaker 1: like we are accessing something special, that we're in the know, 431 00:24:37,800 --> 00:24:41,919 Speaker 1: and sometimes true information feels boring. 
So yeah, the combination 432 00:24:42,080 --> 00:24:46,480 Speaker 1: of metaphysical language and science language is really dangerous because 433 00:24:46,480 --> 00:24:50,399 Speaker 1: it devalues actual science. And in an era like the one we 434 00:24:50,480 --> 00:24:53,040 Speaker 1: are in now, where there is a civil war 435 00:24:53,119 --> 00:24:57,240 Speaker 1: over disinformation, when people think that science is a conspiracy, 436 00:24:57,600 --> 00:25:01,879 Speaker 1: when people have such mistrust in the healthcare system, in academia, 437 00:25:01,960 --> 00:25:07,240 Speaker 1: that becomes incredibly dangerous, especially because people like Joe Dispenza, 438 00:25:07,359 --> 00:25:10,120 Speaker 1: and he's a dime a dozen, by the way, aren't spreading 439 00:25:10,200 --> 00:25:13,320 Speaker 1: this ideology because they really think it's going to help people. 440 00:25:13,800 --> 00:25:16,600 Speaker 1: They're spreading it to make a buck. The roster of 441 00:25:16,640 --> 00:25:19,800 Speaker 1: products and services that Joe Dispenza has for sale would 442 00:25:19,800 --> 00:25:23,480 Speaker 1: blow your mind. It's really like the metaphysical Disney Store. 443 00:25:24,080 --> 00:25:27,359 Speaker 1: And so he's reaching for attention. That's what he wants. 444 00:25:27,440 --> 00:25:29,800 Speaker 1: He doesn't want to help people; he wants attention and 445 00:25:29,880 --> 00:25:32,879 Speaker 1: followers and money. How do you get attention? You spread 446 00:25:32,880 --> 00:25:34,879 Speaker 1: the news that's going to feel most novel to people, 447 00:25:35,320 --> 00:25:37,280 Speaker 1: and the news that's going to feel the most novel 448 00:25:37,320 --> 00:25:40,200 Speaker 1: to people is probably going to be false, and that's 449 00:25:40,400 --> 00:25:43,000 Speaker 1: just destructive for so many reasons. And that's the end 450 00:25:43,040 --> 00:25:45,239 Speaker 1: of that quote. 
I am going to post all of 451 00:25:45,280 --> 00:25:47,720 Speaker 1: the, or a lot of, the articles that I got 452 00:25:47,720 --> 00:25:50,439 Speaker 1: the information for this episode from in the show notes. You 453 00:25:50,440 --> 00:25:53,920 Speaker 1: can read that whole interview with Amanda, but you can 454 00:25:53,960 --> 00:25:56,880 Speaker 1: see a lot already of why it is so important 455 00:25:56,920 --> 00:26:01,000 Speaker 1: to talk about and peel apart these people. And 456 00:26:01,119 --> 00:26:03,560 Speaker 1: like she said, people like Joe are a dime a dozen that are 457 00:26:03,600 --> 00:26:07,239 Speaker 1: doing this, because we are going to follow things that 458 00:26:07,280 --> 00:26:11,760 Speaker 1: feel most novel, and we also are in a space 459 00:26:11,800 --> 00:26:14,080 Speaker 1: where the world feels a little broken. We have a 460 00:26:14,240 --> 00:26:18,480 Speaker 1: distrust of certain agencies, and there's so many conspiracy stories, 461 00:26:18,520 --> 00:26:21,760 Speaker 1: and we want to trust something, and so we go 462 00:26:21,840 --> 00:26:24,040 Speaker 1: to the new thing because the other thing has burned 463 00:26:24,119 --> 00:26:27,280 Speaker 1: us in some way. And I also read a study 464 00:26:27,600 --> 00:26:30,160 Speaker 1: done by a group of researchers at USC in their 465 00:26:30,240 --> 00:26:33,000 Speaker 1: School of Business, and they looked into why fake news 466 00:26:33,040 --> 00:26:36,040 Speaker 1: spreads so fast on social media, and they found, which 467 00:26:36,200 --> 00:26:39,919 Speaker 1: was not surprising to me, that social media has a 468 00:26:39,960 --> 00:26:43,440 Speaker 1: reward system that encourages users to stay on their accounts 469 00:26:43,760 --> 00:26:47,440 Speaker 1: and keep posting and sharing. 
So users who post and 470 00:26:47,480 --> 00:26:52,280 Speaker 1: share frequently, especially sensational, eye-catching information, are likely to 471 00:26:52,320 --> 00:26:55,800 Speaker 1: attract attention, which is huge for Joe Dispenza's claims. And 472 00:26:55,800 --> 00:26:58,960 Speaker 1: that's what Amanda's saying when she's saying, if you want attention, 473 00:26:59,119 --> 00:27:04,480 Speaker 1: you're going to share information that feels most novel. And 474 00:27:04,520 --> 00:27:08,000 Speaker 1: the information that feels most novel, that gets people's attention, 475 00:27:08,560 --> 00:27:11,760 Speaker 1: is so often false, because the truth sometimes can be 476 00:27:11,840 --> 00:27:21,040 Speaker 1: really boring. So at this point you might be thinking, 477 00:27:21,080 --> 00:27:23,800 Speaker 1: we get it. It seems like you don't like this person, 478 00:27:24,440 --> 00:27:27,960 Speaker 1: and it seems like you think he's a joke. Why 479 00:27:28,359 --> 00:27:31,520 Speaker 1: are we talking about him on You Need Therapy? And my 480 00:27:31,600 --> 00:27:35,000 Speaker 1: answer to that is because of what Amanda Montell said 481 00:27:35,000 --> 00:27:38,239 Speaker 1: in that quote, where she said people like Joe are 482 00:27:38,280 --> 00:27:41,800 Speaker 1: a dime a dozen and the average person can't and 483 00:27:41,880 --> 00:27:45,080 Speaker 1: won't fact-check everything we see on the internet. There 484 00:27:45,119 --> 00:27:49,280 Speaker 1: is science, and there is ethical research conducted to support 485 00:27:50,040 --> 00:27:53,760 Speaker 1: or disprove the theories that individuals develop in that science. 486 00:27:54,400 --> 00:27:59,119 Speaker 1: There's also pseudoscience, something that can easily pass as legitimate 487 00:27:59,560 --> 00:28:01,800 Speaker 1: but does not have to go through the same regulations to 488 00:28:01,840 --> 00:28:05,160 Speaker 1: get to your eyes. 
What science does is it accepts 489 00:28:05,240 --> 00:28:09,920 Speaker 1: the inevitability of error and it sets out to find 490 00:28:10,000 --> 00:28:17,600 Speaker 1: and eliminate it. Pseudoscience really starts from a desire to 491 00:28:18,680 --> 00:28:23,600 Speaker 1: prove something right. They start with a commitment to their 492 00:28:23,680 --> 00:28:30,359 Speaker 1: views that they don't want to falsify. And pseudoscience 493 00:28:30,400 --> 00:28:33,600 Speaker 1: typically tries to pass for science; that's what it does. 494 00:28:33,680 --> 00:28:36,320 Speaker 1: It wants to look legitimate. There's nobody that's going to 495 00:28:36,359 --> 00:28:38,920 Speaker 1: post something or write an article and be like, this 496 00:28:39,000 --> 00:28:40,760 Speaker 1: is fake science and I want everybody to know. They're 497 00:28:40,760 --> 00:28:43,040 Speaker 1: going to do it in a way that passes for science. 498 00:28:43,360 --> 00:28:48,320 Speaker 1: It can be hard to differentiate, and that affects our 499 00:28:48,360 --> 00:28:53,320 Speaker 1: ability as a viewer, as the public, to make decisions 500 00:28:53,360 --> 00:28:56,160 Speaker 1: based in science. Like, we want to make decisions based 501 00:28:56,200 --> 00:28:59,880 Speaker 1: on good research, based on education, based on what we know, 502 00:29:00,120 --> 00:29:03,320 Speaker 1: but that's hard to do when we have fake science 503 00:29:03,360 --> 00:29:06,239 Speaker 1: that looks like real science that we're making decisions off of. 504 00:29:06,840 --> 00:29:09,520 Speaker 1: So to me, this is a right-now and 505 00:29:09,600 --> 00:29:13,760 Speaker 1: a future health crisis. 
An article that was posted 506 00:29:13,800 --> 00:29:17,560 Speaker 1: by NPR titled "What Is Pseudoscience?" from twenty seventeen stated, 507 00:29:18,080 --> 00:29:22,560 Speaker 1: one reason that differentiating science from pseudoscience matters is because 508 00:29:22,600 --> 00:29:26,800 Speaker 1: many individual and institutional decisions depend on our best understanding 509 00:29:26,800 --> 00:29:30,600 Speaker 1: of the natural world, an understanding that science is uniquely 510 00:29:30,640 --> 00:29:35,440 Speaker 1: poised to provide. Social and natural sciences inform medical decisions, 511 00:29:35,640 --> 00:29:38,880 Speaker 1: legal decisions, and public policy, not to mention our own 512 00:29:38,960 --> 00:29:42,240 Speaker 1: decisions about what to eat, how to manage illness, and 513 00:29:42,280 --> 00:29:46,080 Speaker 1: how to lead our lives. If pseudoscience is an unreliable 514 00:29:46,080 --> 00:29:49,720 Speaker 1: basis for making these decisions, it is important to draw 515 00:29:49,760 --> 00:29:54,160 Speaker 1: a line between science and alternatives that purport to offer 516 00:29:54,200 --> 00:29:58,120 Speaker 1: the same level of authority. So here's where my brain 517 00:29:58,160 --> 00:30:01,200 Speaker 1: starts to panic. How do we know who to trust? 518 00:30:01,240 --> 00:30:03,960 Speaker 1: And how do we know who to look into more? 519 00:30:04,640 --> 00:30:08,760 Speaker 1: And what my flight response in my nervous system tells 520 00:30:08,800 --> 00:30:10,880 Speaker 1: me to do is trust no one. And that's where 521 00:30:11,320 --> 00:30:15,120 Speaker 1: a lot of my skepticism has come in. But also 522 00:30:15,320 --> 00:30:18,600 Speaker 1: that has helped me do the fact-checking. It's helped 523 00:30:18,640 --> 00:30:21,480 Speaker 1: me look at things more critically. 
And what I don't 524 00:30:21,520 --> 00:30:23,920 Speaker 1: want us to do is just ignore any information that 525 00:30:23,920 --> 00:30:26,520 Speaker 1: comes at us because we're afraid we're going to get duped, 526 00:30:26,920 --> 00:30:28,920 Speaker 1: because there are plenty of good humans out there and 527 00:30:28,960 --> 00:30:31,680 Speaker 1: lots of good health information out there that these people are 528 00:30:31,680 --> 00:30:34,160 Speaker 1: providing as well. So how can we learn to tell 529 00:30:34,160 --> 00:30:37,760 Speaker 1: the difference between actual science and pseudoscience? We know that 530 00:30:37,800 --> 00:30:42,360 Speaker 1: fake scientists purposely deceive their audience by inflating their credentials, 531 00:30:42,480 --> 00:30:47,800 Speaker 1: making themselves look more legitimate without actually being legitimate, which 532 00:30:47,880 --> 00:30:50,560 Speaker 1: is what Joe Dispenza has done. He calls himself a 533 00:30:50,600 --> 00:30:54,640 Speaker 1: doctor even though, and I believe chiropractors have a 534 00:30:54,880 --> 00:30:57,680 Speaker 1: purpose, that is a whole thing, they go 535 00:30:57,720 --> 00:31:00,440 Speaker 1: to school, they do all that, a chiropractor is not a medical doctor. 536 00:31:00,760 --> 00:31:03,360 Speaker 1: And there is a difference between those. The same thing 537 00:31:03,360 --> 00:31:06,720 Speaker 1: when somebody has a doctorate of education 538 00:31:06,720 --> 00:31:10,320 Speaker 1: or has a PhD in clinical psychology; that's 539 00:31:10,360 --> 00:31:14,160 Speaker 1: not the same as a psychiatrist or a 540 00:31:14,360 --> 00:31:18,560 Speaker 1: gastroenterologist or any of those kinds of medical doctors. And 541 00:31:18,600 --> 00:31:23,320 Speaker 1: so we need to look at the differences between those. 
So, 542 00:31:24,280 --> 00:31:26,520 Speaker 1: like I was saying, these fake scientists, what they do 543 00:31:26,560 --> 00:31:29,000 Speaker 1: is they inflate their credentials. So if you notice somebody's 544 00:31:29,080 --> 00:31:34,040 Speaker 1: hiding something, red flag. And they make claims that 545 00:31:34,280 --> 00:31:37,400 Speaker 1: might be false, or they might not even be false, but we don't 546 00:31:37,400 --> 00:31:39,040 Speaker 1: really have anything to base 547 00:31:38,800 --> 00:31:39,360 Speaker 2: them off of. 548 00:31:39,480 --> 00:31:43,040 Speaker 1: They're just guesses that haven't been looked at any further, 549 00:31:43,800 --> 00:31:46,760 Speaker 1: based on a bias that they might have, and they 550 00:31:46,920 --> 00:31:50,240 Speaker 1: disguise them as being based in scientific evidence, and they 551 00:31:50,320 --> 00:31:53,000 Speaker 1: profit from the despair of people who have been disappointed 552 00:31:53,320 --> 00:32:00,240 Speaker 1: by science, the mainstream medical world, mainstream anything. So we can 553 00:32:00,320 --> 00:32:03,800 Speaker 1: identify some things to look for before we as a 554 00:32:03,920 --> 00:32:07,240 Speaker 1: public hit the reshare or subscribe buttons on some of 555 00:32:07,280 --> 00:32:12,360 Speaker 1: this information. I found a good starter checklist for telling science 556 00:32:12,360 --> 00:32:15,840 Speaker 1: from pseudoscience in this article, The Rise of Fake 557 00:32:15,880 --> 00:32:19,000 Speaker 1: Science, from Ness Labs, which is an online learning community, 558 00:32:19,600 --> 00:32:23,120 Speaker 1: and they posted these couple of red flags to look for. 559 00:32:23,400 --> 00:32:25,640 Speaker 1: So the first one is if the person has no 560 00:32:25,760 --> 00:32:30,000 Speaker 1: peer-reviewed research. 
And what's funny is I remember being 561 00:32:30,200 --> 00:32:33,560 Speaker 1: so annoyed by having to read all these peer-reviewed 562 00:32:33,560 --> 00:32:35,880 Speaker 1: research articles and do this and do that in school, 563 00:32:36,040 --> 00:32:38,200 Speaker 1: and now I'm like, oh, I get why this is 564 00:32:38,200 --> 00:32:41,720 Speaker 1: so important, because this legitimizes what we were doing. But 565 00:32:41,800 --> 00:32:45,080 Speaker 1: if this person has no peer-reviewed research to back 566 00:32:45,160 --> 00:32:48,800 Speaker 1: up their claims, that research is not very regulated and 567 00:32:48,880 --> 00:32:53,640 Speaker 1: anybody can say anything, so something might pass as research, 568 00:32:53,680 --> 00:32:57,400 Speaker 1: but we have to look for peer-reviewed research. Two, 569 00:32:57,680 --> 00:33:00,680 Speaker 1: all the materials about their work were produced by themselves 570 00:33:00,760 --> 00:33:04,520 Speaker 1: or their teams. So this is kind of similar. If 571 00:33:04,600 --> 00:33:08,400 Speaker 1: everything that they are talking about comes from just themselves, 572 00:33:09,240 --> 00:33:12,440 Speaker 1: that is a red flag. There are a lot of 573 00:33:12,480 --> 00:33:15,000 Speaker 1: people that I really trust in the community of research, 574 00:33:15,080 --> 00:33:18,480 Speaker 1: and particularly psychology, because that is what I do for 575 00:33:18,480 --> 00:33:22,000 Speaker 1: a living, and often they look at what other people 576 00:33:22,040 --> 00:33:25,280 Speaker 1: have done and they will grow from that, or they'll 577 00:33:25,400 --> 00:33:28,440 Speaker 1: use that to corroborate the hypotheses that 578 00:33:28,440 --> 00:33:32,560 Speaker 1: they're creating. 
And that feels really helpful and hopeful because 579 00:33:32,840 --> 00:33:35,479 Speaker 1: it's a team of people working together based on the 580 00:33:35,520 --> 00:33:38,120 Speaker 1: community of what we've found together, versus I have an 581 00:33:38,160 --> 00:33:41,400 Speaker 1: idea and I'm going to go with it blindly and 582 00:33:41,440 --> 00:33:44,280 Speaker 1: then just convince you that I'm right with this information 583 00:33:44,360 --> 00:33:49,000 Speaker 1: that just came from me. Number three, look for a 584 00:33:49,240 --> 00:33:53,280 Speaker 1: vague biography that leaves out important information, such as their 585 00:33:53,280 --> 00:33:53,840 Speaker 1: alma mater. 586 00:33:54,840 --> 00:33:55,960 Speaker 2: Where did they go to school? 587 00:33:56,680 --> 00:34:01,960 Speaker 1: Actually, I don't think I've found anywhere that says where he went to undergrad. 588 00:34:02,640 --> 00:34:05,240 Speaker 1: I did read somewhere that he took some classes at Rutgers, 589 00:34:05,280 --> 00:34:07,200 Speaker 1: but it didn't say that he graduated from there, so 590 00:34:07,680 --> 00:34:12,080 Speaker 1: that's interesting. And then, number four, made-up scientific terms 591 00:34:12,160 --> 00:34:16,040 Speaker 1: not used by anybody else in the scientific community. And 592 00:34:16,360 --> 00:34:19,359 Speaker 1: this one was really interesting. They gave that example. This 593 00:34:19,400 --> 00:34:23,120 Speaker 1: person named, I'm going to pronounce this name wrong, 
Idriss 594 00:34:23,800 --> 00:34:27,640 Speaker 1: Aberkane is known to have coined the term neuro wisdom 595 00:34:28,239 --> 00:34:31,440 Speaker 1: in his book Free Up Your Mind, which was described 596 00:34:31,520 --> 00:34:38,000 Speaker 1: by neurology researcher Sebastian Dieguez as an uninterrupted succession of 597 00:34:38,120 --> 00:34:45,080 Speaker 1: isolated facts, pointless detours, anecdotes, personal opinions, elementary mistakes, 598 00:34:45,239 --> 00:34:50,120 Speaker 1: debunked theories, truisms, hyperboles, and aphorisms, which do 599 00:34:50,120 --> 00:34:52,600 Speaker 2: not make for good science education. 600 00:34:53,920 --> 00:34:57,359 Speaker 1: So look for what I mentioned earlier that I call 601 00:34:57,480 --> 00:35:01,200 Speaker 1: word salad, which is not a scientific term and not a 602 00:35:01,280 --> 00:35:03,839 Speaker 1: term that I've coined. I actually heard that. I think 603 00:35:03,840 --> 00:35:06,640 Speaker 1: the first time I heard that was in the book Cultish. 604 00:35:06,680 --> 00:35:08,280 Speaker 2: But it's just an 605 00:35:08,120 --> 00:35:13,080 Speaker 1: idea that there are a lot of these fancy, cool-sounding 606 00:35:13,200 --> 00:35:18,239 Speaker 1: words that really mean nothing. Now, if you look deeper into 607 00:35:18,280 --> 00:35:22,160 Speaker 1: the argument of what makes science science, you will 608 00:35:22,160 --> 00:35:27,279 Speaker 1: find information about a philosopher named Carl Popper and his 609 00:35:27,320 --> 00:35:31,440 Speaker 1: attempt to separate empirical science and pseudoscience. The main 610 00:35:31,520 --> 00:35:34,839 Speaker 1: takeaway that you will see is this idea of falsifiability. 
In 611 00:35:34,880 --> 00:35:37,360 Speaker 1: real science, something has to have the ability to be 612 00:35:37,400 --> 00:35:40,360 Speaker 1: proven false to be science, which is interesting because a 613 00:35:40,360 --> 00:35:43,080 Speaker 1: lot of pseudoscience you will see focuses only on the 614 00:35:43,080 --> 00:35:46,319 Speaker 1: information that corroborates their story. They aren't looking to see 615 00:35:46,320 --> 00:35:49,040 Speaker 1: if it doesn't make sense. They're only looking into and 616 00:35:49,120 --> 00:35:52,479 Speaker 1: talking about what does. And in an article titled Drawing 617 00:35:52,520 --> 00:35:56,360 Speaker 1: the Line Between Science and Pseudoscience, the author Janet D. 618 00:35:56,760 --> 00:36:00,719 Speaker 1: Stemwedel writes, the big difference Popper identifies between science and 619 00:36:00,719 --> 00:36:04,520 Speaker 1: pseudoscience is a difference in attitude. While a pseudoscience is 620 00:36:04,640 --> 00:36:07,480 Speaker 1: set up to look for evidence that supports its claims, 621 00:36:07,800 --> 00:36:10,799 Speaker 1: Popper says, a science is set up to challenge its 622 00:36:10,840 --> 00:36:13,920 Speaker 1: claims and to look for evidence that might prove it false. 623 00:36:14,400 --> 00:36:20,000 Speaker 1: Pseudoscience seeks confirmations; science seeks falsifications. Now, he doesn't 624 00:36:20,040 --> 00:36:23,799 Speaker 1: think that we should dismiss pseudoscience as utterly useless, uninteresting, 625 00:36:23,920 --> 00:36:24,440 Speaker 1: or false. 626 00:36:24,760 --> 00:36:26,440 Speaker 2: It's just not science. 627 00:36:27,080 --> 00:36:30,560 Speaker 1: Now, assume that science has this kind of power. 
628 00:36:31,320 --> 00:36:33,440 Speaker 1: One of the problems with pseudoscience, and again, this is the 629 00:36:33,480 --> 00:36:36,040 Speaker 1: point that we've been making throughout this, is that it 630 00:36:36,080 --> 00:36:39,759 Speaker 1: gets an unfair credibility boost by so cleverly mimicking the 631 00:36:39,840 --> 00:36:43,080 Speaker 1: surface appearance of science. So the reason this is so 632 00:36:43,160 --> 00:36:47,040 Speaker 1: important is that pseudoscience gets the benefit of looking scientific 633 00:36:47,480 --> 00:36:51,160 Speaker 1: and has the unfair advantage that it doesn't have to 634 00:36:51,280 --> 00:36:54,719 Speaker 1: look for falsifiability. It just has to prove itself right. 635 00:36:55,120 --> 00:36:57,560 Speaker 1: People like Joe Dispenza make a lot of claims, and they 636 00:36:57,560 --> 00:37:00,200 Speaker 1: tell their side of the story that includes enough corroborating 637 00:37:00,280 --> 00:37:03,600 Speaker 1: evidence to support their, in quotes, theories. But this is 638 00:37:03,719 --> 00:37:06,000 Speaker 1: very important. All of these claims are just that: 639 00:37:06,080 --> 00:37:09,680 Speaker 1: theories, and theories are not facts. And as a neuroscientist, 640 00:37:09,800 --> 00:37:12,600 Speaker 1: one would think you would be testing this theory over 641 00:37:12,680 --> 00:37:15,640 Speaker 1: and over and over in reputable ways. But for so 642 00:37:15,800 --> 00:37:20,200 Speaker 1: many like him, that's not happening. Kitschy clickbait turns 643 00:37:20,320 --> 00:37:24,200 Speaker 1: into you spending five thousand dollars plus on a program, a retreat, 644 00:37:24,239 --> 00:37:27,279 Speaker 1: a workshop, a membership, or something that might just be 645 00:37:27,320 --> 00:37:29,919 Speaker 1: a bunch of words that sound important when said next 646 00:37:29,960 --> 00:37:32,640 Speaker 1: to each other. 
And what I really love about being 647 00:37:32,640 --> 00:37:36,360 Speaker 1: a therapist is that our ethical guidelines rest on us 648 00:37:36,400 --> 00:37:40,240 Speaker 1: owning that we are not the knowers, we are the helpers. 649 00:37:40,520 --> 00:37:42,880 Speaker 1: It is why we can offer insight with ease, but 650 00:37:42,920 --> 00:37:46,240 Speaker 1: are somewhat slow to give direct advice or tell someone 651 00:37:46,320 --> 00:37:50,120 Speaker 1: with certainty something about themselves. We use phrases like, it 652 00:37:50,200 --> 00:37:53,759 Speaker 1: sounds like, I wonder, I'm curious about, etc. But I 653 00:37:53,800 --> 00:37:58,000 Speaker 1: would be cautious to give someone a recipe for guaranteed success. 654 00:37:58,760 --> 00:38:01,440 Speaker 1: And I make a lot of assumptions. But I 655 00:38:01,480 --> 00:38:04,160 Speaker 1: state that in some of those phrases that I use: 656 00:38:04,239 --> 00:38:07,640 Speaker 1: I wonder, it sounds like, I have this idea that. 657 00:38:09,000 --> 00:38:11,440 Speaker 1: None of that is fact. I know a lot of 658 00:38:11,480 --> 00:38:16,360 Speaker 1: things that aren't helpful and do cause harm, and I 659 00:38:16,400 --> 00:38:19,120 Speaker 1: know a lot of things that are helpful, but those 660 00:38:19,160 --> 00:38:22,839 Speaker 1: things vary person to person, case to case. We can 661 00:38:22,880 --> 00:38:26,040 Speaker 1: make observations and we can come up with ideas. We 662 00:38:26,080 --> 00:38:30,239 Speaker 1: are trained to know and notice our limitations. Now does 663 00:38:30,280 --> 00:38:32,480 Speaker 1: that mean everyone does that and does it well one 664 00:38:32,560 --> 00:38:35,520 Speaker 1: hundred percent of the time? No. I've made mistakes before. 665 00:38:35,920 --> 00:38:38,600 Speaker 1: But there's also a system set up to regulate, the 666 00:38:38,640 --> 00:38:41,640 Speaker 1: best it can, 
That's how this category of helpers is helping. 667 00:38:42,360 --> 00:38:45,440 Speaker 1: People like Joe Dispenza don't have that. He doesn't have 668 00:38:45,480 --> 00:38:50,319 Speaker 1: a regulating body. Again, he's a chiropractor acting essentially as a 669 00:38:51,000 --> 00:38:54,640 Speaker 1: medical professional who's curing cancer and blindness. He's acting 670 00:38:54,680 --> 00:38:58,200 Speaker 1: as an oncologist, he's acting as an optometrist. He's acting 671 00:38:58,239 --> 00:39:01,040 Speaker 1: on all these things. But there's no license, so he 672 00:39:01,080 --> 00:39:05,720 Speaker 1: can't really be sued for malpractice. I wonder, that's actually interesting, 673 00:39:05,760 --> 00:39:08,399 Speaker 1: I wonder if he's even still licensed as a chiropractor, 674 00:39:09,320 --> 00:39:12,000 Speaker 1: because I can be sued for malpractice if I act 675 00:39:12,000 --> 00:39:14,759 Speaker 1: out of my scope. Let's just say I had a 676 00:39:14,760 --> 00:39:20,080 Speaker 1: PhD in some kind of psychology, whatever. If I used 677 00:39:20,200 --> 00:39:24,600 Speaker 1: that term doctor to legitimize myself and then talked about 678 00:39:24,640 --> 00:39:27,319 Speaker 1: something that I actually wasn't trained in, I could be 679 00:39:27,320 --> 00:39:31,200 Speaker 1: sued for malpractice, because we would be acting outside of 680 00:39:31,239 --> 00:39:33,880 Speaker 1: our scope. But with what Joe is doing, he's acting in 681 00:39:33,920 --> 00:39:37,640 Speaker 1: a world that isn't regulated by these licensing boards. It 682 00:39:37,680 --> 00:39:40,160 Speaker 1: sounds like they don't have to play by any rules 683 00:39:40,160 --> 00:39:42,879 Speaker 1: because there aren't any rules for them to follow. And 684 00:39:43,160 --> 00:39:46,320 Speaker 1: you know, sometimes I really really hate some of the 685 00:39:46,440 --> 00:39:49,719 Speaker 1: rules that we have to follow as licensed therapists.
It's 686 00:39:49,719 --> 00:39:54,200 Speaker 1: a lot of tedious stuff. However, when I've broken them, 687 00:39:54,360 --> 00:39:57,759 Speaker 1: I'm very quick to understand why they're there. And it 688 00:39:57,800 --> 00:40:00,400 Speaker 1: makes me think of the show Shrinking. I was kind of 689 00:40:00,440 --> 00:40:04,360 Speaker 1: jealous watching that because the main character just went rogue. 690 00:40:04,440 --> 00:40:07,240 Speaker 1: He was a therapist and he just started doing whatever 691 00:40:07,280 --> 00:40:10,040 Speaker 1: he wanted to do, breaking all these ethical 692 00:40:10,040 --> 00:40:15,800 Speaker 1: guidelines, and all of his professional boundaries were basically non 693 00:40:15,600 --> 00:40:18,400 Speaker 2: existent, and I was like, oh man, he's so bold. 694 00:40:18,400 --> 00:40:19,279 Speaker 2: I wish I could do that. 695 00:40:19,640 --> 00:40:22,399 Speaker 1: Why do I have to have so many boundaries and 696 00:40:22,520 --> 00:40:25,719 Speaker 1: this and that? I should be cooler. But then as 697 00:40:25,719 --> 00:40:29,120 Speaker 1: you continued to watch the show, you saw very clearly 698 00:40:29,280 --> 00:40:32,080 Speaker 1: why those boundaries are needed. Things started to get messy, 699 00:40:32,120 --> 00:40:35,759 Speaker 1: and things started to get very unsafe. And as I 700 00:40:35,800 --> 00:40:37,879 Speaker 1: was preparing for this episode, one thing that I kept 701 00:40:37,920 --> 00:40:40,680 Speaker 1: asking myself, because I was sucked into this hole and 702 00:40:40,719 --> 00:40:42,960 Speaker 1: I had a lot of feelings, is why do I 703 00:40:43,000 --> 00:40:45,719 Speaker 1: care so much? Why does my blood boil when I 704 00:40:45,800 --> 00:40:49,280 Speaker 1: hear about this kind of stuff? Because again, this happens 705 00:40:49,320 --> 00:40:51,560 Speaker 1: over and over. There are so many people like Joe Dispenza.
706 00:40:51,600 --> 00:40:53,080 Speaker 1: He's not the only one. He's just the one I'm 707 00:40:53,120 --> 00:40:56,399 Speaker 1: using as the example. But why can't I just use 708 00:40:56,440 --> 00:40:59,360 Speaker 1: the Mel Robbins tactic, "let them," and just go on 709 00:40:59,480 --> 00:41:02,080 Speaker 1: living my life? And in this moment, the most 710 00:41:02,120 --> 00:41:05,600 Speaker 1: true answer feels like: I get really angry when people 711 00:41:05,680 --> 00:41:09,799 Speaker 1: are taken advantage of, especially through the vulnerability of their 712 00:41:09,840 --> 00:41:13,399 Speaker 1: unmet needs. It's the justice part of me, the part 713 00:41:13,400 --> 00:41:16,239 Speaker 1: that made my therapist wonder if I'm actually an eight 714 00:41:16,280 --> 00:41:19,319 Speaker 1: on the Enneagram and not a seven. I want this 715 00:41:19,400 --> 00:41:21,440 Speaker 1: episode to be a wake-up call for all of us, 716 00:41:21,520 --> 00:41:24,680 Speaker 1: for those listening and for those helping. I want us 717 00:41:24,719 --> 00:41:27,239 Speaker 1: to get more curious about what we're listening to, and 718 00:41:27,320 --> 00:41:30,400 Speaker 1: I even mean what you're listening to me say, because 719 00:41:30,480 --> 00:41:33,000 Speaker 1: I'm not an expert in everything. I'm not an expert 720 00:41:33,000 --> 00:41:36,080 Speaker 1: on the difference between science and pseudoscience. I went and 721 00:41:36,120 --> 00:41:38,160 Speaker 1: did some research on it. But there might be some 722 00:41:38,200 --> 00:41:40,640 Speaker 1: of you listening that are like, huh, I want to 723 00:41:40,719 --> 00:41:41,520 Speaker 1: learn more about that. 724 00:41:41,640 --> 00:41:44,480 Speaker 2: Fact check me. I would love that. Please fact check me. 725 00:41:45,080 --> 00:41:47,000 Speaker 1: Maybe I can learn something more from you guys even 726 00:41:47,040 --> 00:41:49,359 Speaker 1: doing that.
But I really want this to be a 727 00:41:49,360 --> 00:41:53,080 Speaker 1: call for us to be listening, but listening more intently, 728 00:41:53,640 --> 00:41:58,719 Speaker 1: being curious, but also being skeptical. I can be optimistic 729 00:41:58,840 --> 00:42:02,160 Speaker 1: and skeptical at the same time. I truly 730 00:42:02,239 --> 00:42:05,280 Speaker 1: mean it when I say I want curing cancer 731 00:42:05,320 --> 00:42:09,200 Speaker 1: to be as easy as Joe Dispenza says it is. However, 732 00:42:09,320 --> 00:42:11,640 Speaker 1: I have to be skeptical of that because I don't 733 00:42:11,680 --> 00:42:14,360 Speaker 1: know that it's actually true, and that could cause a 734 00:42:14,360 --> 00:42:17,279 Speaker 1: lot of harm to me and the people around me 735 00:42:17,560 --> 00:42:20,400 Speaker 1: if it is affecting the choices that we're making. I 736 00:42:20,440 --> 00:42:24,040 Speaker 1: also want us to be slower to reshare things 737 00:42:24,400 --> 00:42:30,240 Speaker 1: that sound really cool and exciting and clickbaity and novel. 738 00:42:30,360 --> 00:42:34,160 Speaker 1: I want us to maybe settle into the idea that 739 00:42:34,440 --> 00:42:38,360 Speaker 1: the truth can be boring sometimes, and we're allowed to 740 00:42:38,520 --> 00:42:41,560 Speaker 1: regulate our systems and get used to that a little bit. 741 00:42:42,080 --> 00:42:45,839 Speaker 1: When we're always looking for the most exciting, new, fantastical thing, 742 00:42:46,560 --> 00:42:51,040 Speaker 1: then even exciting things can feel really boring. So maybe 743 00:42:51,120 --> 00:42:57,719 Speaker 1: our system of attention has been skewed because we're 744 00:42:57,760 --> 00:43:01,360 Speaker 1: constantly looking for the thing that's going to trump the 745 00:43:01,480 --> 00:43:05,480 Speaker 1: last really exciting thing that we heard or learned. So 746 00:43:06,040 --> 00:43:08,279 Speaker 1: that's all I have for you today.
If you have 747 00:43:08,320 --> 00:43:12,279 Speaker 1: any feedback, questions, stories, anything about this or anything else, 748 00:43:12,320 --> 00:43:15,000 Speaker 1: you can always email me, Katherine at You Need Therapy 749 00:43:15,120 --> 00:43:16,560 Speaker 1: Podcast dot com. 750 00:43:16,920 --> 00:43:18,600 Speaker 2: You can follow me at You Need 751 00:43:18,520 --> 00:43:22,600 Speaker 1: Therapy Podcast and at Kat dot Defatta. And until I 752 00:43:22,640 --> 00:43:24,479 Speaker 1: talk to you guys next time, I hope you guys 753 00:43:24,520 --> 00:43:26,640 Speaker 1: have the day you need to have.