Speaker 1: Good day, team.
Speaker 2: It's the two champions, it's the two standout superstars of the You Project. And that doesn't include Tiff, even though Tiff's editing this, so I feel like I could get in trouble. Shout out to you anyway, Tiff. David James Gillespie, Craig Anthony Harper, here at your service, folks. So firstly, hi mate, how are you? Good?
Speaker 3: How are you?
Speaker 1: How are you?
Speaker 2: It's good. I'm going to sound like a little whinger for a moment, but it's not the worst whinge. It's forty four degrees in Melbourne, so we are pretty hot, and there are fires all over Victoria at the moment. What's the temperature up there in the Sunshine State?
Speaker 4: I think it's about thirty, but it was pretty hot yesterday. It was like thirty eight or something yesterday. But yeah, when I used to live in Victoria, I do remember not liking that particular aspect of it: most of the time it's pretty nice, but when it gets hot, it really goes off.
Speaker 1: Yeah, it's... yeah.
Speaker 2: Well, the predicted top... well, it's still going to get hotter. They reckon it's going to get to forty six at about six o'clock, but the prediction, well, the forecast, was forty seven.
Speaker 1: I'm like, who does forty seven? Anyway, we're here.
Speaker 2: As long as everyone's safe, and all the animals and forests are not getting burnt and devastated, that's all we care about. So I sent you a message before, and I said, well, what are we talking about? What have you written? And you went, nothing, put on your thinking cap. And I went, you know what I'm doing? I'm getting AI to put on its thinking cap. Oh no. So I said to Chatters... and if this is terrible, everyone, don't blame me. It couldn't possibly be me. It's either David or ChatGPT. So, you're welcome.
Speaker 2: I asked Chatters, because if I just ask it stuff it'll put itself in my place, so I said: if you, not me, were interviewing author, researcher and lawyer David Gillespie, what would you ask him? And then, I don't know why, it brought up Sweet Poison, like you've only ever done one thing. There are a few others, there are some deviations, but if I need to modify, I will. Craig, this is what it said to me, he or she: if I were interviewing David Gillespie, I'd aim for questions that go beyond what he thinks and dig into how he thinks, why he changed his mind, and where people most misunderstand him. Here's how I'd structure it: origin story and epistemology. I'm like, thanks, Chatters. Subtitle: how a lawyer ends up waging war on sugar and big narratives. If these are shit, we'll bin them, but we'll see how we go. You were trained as a lawyer, not a nutritionist or a neuroscientist. How did that shape the way you evaluate evidence compared to experts trained in those fields?
Speaker 1: It's not bad.
Speaker 3: Well done, Chatters. Yeah, it's not bad, it's not bad.
Speaker 1: Good, Chatters.
Speaker 4: Well, I guess the important thing to know is that I don't have a dog in the fight, and lawyers rarely do. You know, there's that old saying: if you represent yourself, you've got an idiot for a client. And that's kind of one of, I think, the most valuable things I bring to anything I look at. It's also why I never really write about law. I can't be fired for anything I write, because I'm not employed as a dietitian or a psychologist or anything like that. I don't have a dog in the fight. I'm coming to the question and I'm saying, what does the evidence show us? And I look at the evidence on both sides of the argument and see what it says.
And I think that's the skill I'm bringing to the table here. A so-called expert in the area, say in dietetics or nutrition or something, has a dog in the fight. They have a job, and keeping that job usually means they have to stick with the consensus, the company line on this stuff. So if the consensus is fat makes you fat, then those are the glasses they will have on when they look at any of the evidence. Whereas I don't care. I just want to know what the evidence says. And if the evidence said fat makes you fat, then that's what I would have said, and nobody would have published a book written by me, because I would have been saying the same thing as everybody else.
Speaker 3: But I don't care about that either.
Speaker 4: In the sense that, as you well know as a fellow author, nobody gets rich writing books, and certainly nobody gets rich writing books in Australia. So all I'm doing is writing down, in terms that people who are not experts in the area, like me, can understand, what the evidence actually says. And this is probably the only useful thing that lawyers are trained to do: they are asked to understand the evidence in all manner of disciplines. They have to understand the evidence in a car accident, or the evidence in a bridge failure, or the evidence in a construction litigation, or the evidence in whatever you name it. It's all different, and they're trained that in all of it they have to understand the evidence and be able to clearly state what the evidence says. And even more than that, and this is a side of lawyers that most people don't see, they have an obligation to the court to present evidence which contradicts their case, an obligation not to mislead the court.
So if there is evidence that they are aware of that contradicts something they are saying to the court, they must point it out. So I guess, in that sense, I'm the best possible person to look at the questions I'm writing about, exactly because I'm not an expert in the area and I am trained in assessing evidence.
Speaker 2: That is such a good answer, mate. I hate paying you a compliment, but it's very good. I think also... because, firstly, you're possibly our most popular guest, and if not, you're definitely in the top three, right? I mean, there are probably a couple of people who don't like you. There are thousands who don't like me. Who cares, right? But the thing about you is that you're not emotional, and, as you said, you don't have a dog in the fight. For me, you don't have any financial incentive to lean one way or the other. Whatever the data says, wherever the evidence goes, whatever the buppa bappa bah, fill in the blank, it just is what it is. The next thing I want you to clarify, which I know you know the answer to clearly: when you say the evidence for this stuff, just clarify, for our listeners who are not scientists or researchers or gatherers of evidence in this sense, where do you get your evidence? I'm guessing not YouTube or Instagram.
Speaker 4: No, but mostly because those are media that people are using to summarize evidence. Hopefully they're not really presenting the original evidence, because it would bore people rigid and no one would watch their YouTube channel. So what I mean by the evidence is: I go to the source, to the best quality of evidence that is available. Often I'll see an article online, and maybe it's even an Instagram post, I'm not sure, but I'll see something that makes a claim. I'll give you an example. I saw one today. There was an article, I think it was on Instagram, that's been doing the rounds.
It says AI makes you dumber. The one I saw on Instagram had a sort of picture of a brain scan of someone who had apparently been using AI next to a picture of a brain scan of someone who hadn't been using AI, and that one was much more colorful, so their brain was working better, apparently, and the other person's was just a dull gray. And so I thought, well, that sounds interesting. Where is that coming from, and what is the evidence underneath it? So I went looking for what it is based on, and it is based on a study done by an American university, I can't remember which one, but quite a reputable one. The study only had about forty people in it, and what they found was that, in some circumstances, testing in a variety of ways, people had less recall when they used an AI to write something than if they wrote it themselves.
Speaker 3: And I'm kind of...
Speaker 4: ...thinking, well, yes, that seems to be logical. If you didn't write it, you're probably not going to remember what you didn't write.
Speaker 2: Of course. Whereas it's such a fucking dumb claim to pull out of that particular research design.
Speaker 4: Yeah. And so, sure, I did look at something that got my attention initially in popular media. But what I just did there is what I tend to do when I'm looking for research, which is keep drilling down until you find the source and then make judgments about the quality of that source. So, first of all, handily, journals are rated, so you don't have to guess whether this is a reputable source of information. They have ratings, and you can decide, okay, well, this is not a particularly highly rated journal, so I'm already a little bit cautious about whether this has been peer reviewed or not.
The particular study I was talking about there, with the AI, was not peer reviewed. So straight away you're cautious. That doesn't mean throw it in the bin; it just means start being really, really cautious and questioning about anything that's being said, and put your critical thinking head on.
Speaker 3: Okay.
Speaker 4: So you apply that sort of logic to it, and then you start looking at, well, who's paying for this study? Do they have a vested interest in the outcome? All those kinds of questions as you're looking at it. And then, if it withstands all of that, you start going to look for other research, because this can't possibly be the only person who's looked at this question.
Speaker 3: It never is.
Speaker 4: Thousands of studies are published every day. What do other people say about this particular area? What is the contradictory evidence? What does it say? And then I look at all of that and I start to form a view as to how reliable, or otherwise, what's being said is. And then I might form a hypothesis about whether, you know, A plus B equals C, and then I'll start trying to prove it wrong. I'll start looking for research that might prove that what I just said, A plus B equals C, is wrong. And if I can't find any, despite my best efforts, then I start to think, okay, well, this is starting to stack up logically. I can't find anything that says it's wrong. Can I find things that suggest that elements of it are right? Because you rarely ever find one study that covers all of it together, where you can say, well, if A is true and B is true, then C must be true. And what I've just described to you there is kind of the process I go through over and over and over again with everything I look at, and it's an almost automatic thing that I do.
And then every now and then, one in one hundred times, I might find something that I think, well, actually, this is just surprising. This is the opposite of what I had believed, and the opposite of what I think most people believe. And then I'll write about it.
Speaker 1: Yeah, it's such a good process.
Speaker 2: I didn't even know, until I started my PhD, that there are, generally, Q1 through to Q4, or tier one to tier four, journals, which you know about. And just to give our listeners an idea: to get accepted and published in a tier one academic journal... it depends on the journal, they vary, but some of them take less than two percent of the papers that get submitted, and these are all obviously peer reviewed. And then down to Q4, the lowest level, where they're often not peer reviewed and the way they get published is by the authors paying for it to be published.
Speaker 3: Yes, you know, pay to play.
Speaker 2: And people don't know that, not that that's an insult; people aren't expected to know that. So when you go, oh, this was published in a scientific journal, that could be analogous to the fucking Woman's Day or New Idea. It's like, yeah, that's cool, but which one was it? What was the journal? And so on. But even with your forty-people research, you go, well, there are eight billion people in the world and we've done some research with forty; that's not necessarily going to extrapolate, you know.
Speaker 4: And the other thing is, the problem there isn't just the size. I mean, small research can give you interesting insights that you might not otherwise have seen, particularly when it's done in humans. But then you start to look at, well, what exactly did they do?
And it becomes... you look at it and you think, well, that's kind of obvious, you know. What are they proving by showing that people who didn't write things don't know what they didn't write? I guess they are proving that those people didn't think much. But so what?
Speaker 2: Yes, yeah. Well, I think also there are so many things that I never thought about until I started this journey, like the quality of the participants, or the demographic, or the age, or the buppa bappa bah, fill in the blank, right? So I ran a few projects, and in one of my projects I had two hundred and ninety nine people, and it was a bit skewed, but not a lot. It was like one hundred and forty students, so undergrad psychology students, and the other group was general public. You don't think about the limitations, and you don't think about, like, when I explain to people, good data and bad data. And this is true because many, many research papers use students as their participants, because they're so accessible and easy to motivate to do the research. But the problem, in inverted commas, with some of my research is that I had one hundred and sixty odd general public who wanted to be there. They chose to be there. They're interested in the research, interested in the topic. They potentially follow me on social media. They were motivated by the fact that I was doing a public workshop at the end, which they paid zero for. How many came because of that, I'm not sure. But anyway, the average time, give or take... and this is written in the paper, like, you have to write up the limitations and the problems, you know, in terms of the design: what did you fuck up, what would you do differently?
So, to answer all these questionnaires, paper questionnaires with a pen and paper, took, if you did it properly, about fifty minutes. And for the general public, it took all of them somewhere in the forty to sixty minute range. And then, I won't even say how many, but a lot of the students were getting through it in under fifteen minutes. And the main reason is that they were motivated to be there because they were getting academic credits, you know. And so you go, oh, so you have a varying quality of data, and then you've got to try and figure out which... you literally remove some of the data, because some of them didn't even finish the thing properly, so you just take that out. But yeah, there's a huge range of variables. Another thing that is a red flag for me is when you see some kind of pseudoscience that says, oh, fifty percent of people respond this way if they do that, or they see an improvement in that if they take this, and you're like, fifty percent? What are the chances? Or, that's thirty percent of the population. It's always a nice round number, but in real research it's almost never a round number. Like, never. I didn't have any that came out at exactly, you know, fifty or sixty; it's always something like thirty seven point four, or whatever it was. And that's what you don't realize until you kind of open that door and try to understand how it works.
Speaker 4: Well, data transparency is one of the big problems in research. Good papers make their data available so that any other researcher can go in and use their data to see if they get the same results. And that's one of the big problems.
And we talked about this briefly before, when I think we talked about statin research. Statin research is almost exclusively controlled by consortia funded by the drug companies that make statins, and all of that research is published without access to the original data. In fact, it's explicitly, and on purpose, kept secret from any other researcher. So you will never, ever see papers that contradict the results of statin papers, because no one else can see the data, which means you can't tell who they eliminated and why, what exactly they did to the data set to get this result, what they manipulated about it. And we know some of the things that they do; there's a peculiar way that they count whether or not someone persisted with the drug, which is quite unusual in that kind of research. And that's just one example where data matters. Here's another one, which is really important. There was a massive study in the United States called the Minnesota Coronary Experiment, done, I think, in the nineteen sixties and early nineteen seventies. And what they were trying to prove was that if you took people off the traditional American diet at that time, which was heavy in saturated fat, and replaced the saturated fat with margarines and seed oils, you would reduce the incidence of heart attacks. Now, the study cost an absolute fortune, because unlike the study we were talking about before, with about forty people in it, this one had, I think, almost one hundred thousand people involved, and it ran for the better part of a decade. So this thing was massive and hugely expensive, and funded by the American government. And it proved nothing, in the sense that it didn't prove the hypothesis, which was that there'd be fewer heart attacks in the people eating margarine, and so they published nothing. Well, they ran that study and produced a paper at the end of it, which was meaningless.
They only produced a paper because, I presume, they had to, but it was meaningless. And recently, and I say recently, in the last decade, the National Institutes of Health in the US had access to the original data. So they went back, looked at the data and said, well, it didn't prove the hypothesis, but did it prove anything else? Because it's a really important and useful data set; you don't often study such a huge number of humans like this. Did it prove anything? And what they found is that it didn't... actually, the people who were on the traditional diet had better heart disease outcomes than the people whose diet was changed. And worse than that, the all cause mortality was much higher in the changed group. So reducing people's cholesterol, which was the aim, actually got worse outcomes, not just from heart disease but from everything. Now, everything is usually code for cancer, so there were more people dying of cancer, and more people dying of heart disease, when they re-looked at the data. And that was the research that got published by the National Institutes of Health about a decade ago. Now, that's pretty big news, I would have thought.
Speaker 2: I mean, you think about it... in reality, what they'd done to those people, by making them eat this altered diet, which was allegedly better and healthier and would produce better health outcomes, was overall increase morbidity and mortality.
Speaker 3: So it was bad.
Speaker 2: It was literally bad for people, but they buried that. I mean, talk about fucking unethical, if not evil. Imagine knowing that this is actually bad for people, but then, because of your ego or your commitment to some kind of authority or financial backing or whatever, you hide the actual evidence of what you know to be true, for whatever range of reasons. Like, so immoral.
Speaker 3: Yeah. Sorry, I don't want to mislead people.
Speaker 4: So, just while you were rabbiting on there, probably saying really important things, I actually looked up the study, and I want to be really accurate about some of the things I just said to you. So it wasn't one hundred thousand. The cohort was nine thousand, four hundred and twenty three women and men, aged twenty to ninety seven, so it was just short of ten thousand. And what they were measuring was serum cholesterol and the exposure. So they had a cholesterol-lowering diet that replaced saturated fat with corn oil and corn oil polyunsaturated margarine, and the control diet was the normal American diet at the time, which was high in animal fats and butter and so on. So then they looked at, okay, what happened when you reanalyze that data? And they found there was a twenty two percent higher risk of death for every thirty milligrams per deciliter, or, in the more common measure for Australia, zero point seven eight millimoles per liter, reduction in cholesterol. So you had a twenty two percent increase in death for every zero point seven eight reduction in cholesterol.
Speaker 3: So that's pretty nasty.
Speaker 4: The people who were being fed the experimental diet, the margarine, were much more likely to die.
Speaker 2: That's exactly what I said when you alleged that I was rabbiting on.
Speaker 4: And then there was the all cause increase as well. So the thing about that is, even if the health authorities have completely ignored it, which they have, at least it was possible to do that analysis by going back to the source data. By the way, anyone who wants to read this, it's in the British Medical Journal in twenty sixteen, and it's by researchers at the National Institutes of Health; the lead researcher is Chris Ramsden. So it's there for everyone.
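A quick aside on the units quoted there: cholesterol in milligrams per deciliter converts to millimoles per liter by dividing by roughly 38.7 (the standard factor derived from cholesterol's molar mass), so the two numbers describe the same reduction:

$$ \frac{30\ \text{mg/dL}}{38.7\ \text{mg/dL per mmol/L}} \approx 0.78\ \text{mmol/L} $$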
Speaker 2: Just in case you're wondering, there's a bunch of academic search engines, but probably the easiest one is just Google Scholar. So if you just go into Google and literally type in Google Scholar, it'll bring it up, and it's free. It's not quite as good as some of the other ones, but it's pretty good, so start there.
Speaker 4: Well, and this one you can get the full text of, because it's in the BMJ and it's open access. So for anyone in your audience who's saying, oh, that can't be true, it can't possibly be true that a cholesterol-lowering diet actually increases your risk of heart disease: well, there you go. It is. It's well documented. The data has been there for decades, ignored for decades, because it contradicted the mainstream message. Ramsden looked at it again and said, well, actually, this data says the opposite. It says you're in big trouble if you do what the health experts tell you to do. And still the health advice was totally resistant to that information. And that's what we're fighting against: you have a consensus, and the average doctor in Australia today would still say to you, oh, you should be eating margarine and lowering your cholesterol. And yet the evidence, very, very high quality evidence, says that's nonsense.
Speaker 2: Yeah, I'm just writing down the topic for tonight; you just gave it to me. Thanks: health advice, who to trust. Hey, you know what's hilarious? There are, I don't know, seven questions or something that Chatters gave me. We're currently on number one, but I'm going to tell you something funny. I've got one more question for you at the end, which is a great question, though probably won't be as good as Chatters', of course. At the end, Chatters says:
Speaker 2: If you want, I can: one, sharpen these into more provocative and confrontational versions (like, really?); two, rewrite them in full You Project conversational tone; or three, add neuro-behavior-change angles to align with your metaperception work. And then it goes, just tell me the vibe.
Speaker 1: So funny. All right, so let's wind up.
Speaker 2: But I want to wind up with this. It could take you thirty seconds or five minutes or ten, it doesn't matter. Some people are listening to this going, that's interesting, that's fascinating, I'm fucking terrified. But how do I know who to trust? Like, who do I listen to? Who do I not listen to? And I know there's not a simple answer, but how do I know who or what to trust with this stuff that's literally going to impact everything from my healthspan to my lifespan?
Speaker 4: Do you know, it's a lot easier these days, now that we have access to AI. In this sense, my answer to that has always been: only trust people who can put a footnote after every fact they state. Lawyers and law students are trained in this. We're trained that our opinion is valueless, our opinion is nonsense, our opinion doesn't matter to a court. All that matters to a court is the evidence. And so we are trained that whenever we state anything, the sun is shining, we should put a little number after that and, you know, put a reference that tells where we got that information from. And the reason the courts require us to do that is so that, if they don't believe us for whatever reason, or have cause to investigate further, they can dig in themselves. They can go and look at the footnote and see if they agree with the evidence that the sun is shining. And anyone who does that, who is trained to do that, is always saying to the reader: don't trust me. You don't have to trust me. I'm not requiring you to trust me. If you don't think what I'm saying is true, then read the evidence that I'm basing the statement on, which is why every single book I've ever written comes with a complete index of the studies I'm relying on, and they're all footnoted. So you don't have to trust me. If you think what I'm saying is BS, go to the source that I say I'm getting it from, shove that into an AI and say, does this say that?
Speaker 1: Yes, yes.
Speaker 2: And also, the thing about you is that you are the last person to try to shove an opinion, or an idea, down someone's throat.
Speaker 4: Well, I'm not entitled to. I'm not entitled to an opinion, because I'm not an expert. The problem with experts, though, is that they give opinions that are not necessarily evidence based but are leaps: they know A and D, so B and C are probably true, because they've never had to investigate it. They've never had to satisfy themselves that it is in fact true. And I don't get to do that, because I'm not an expert. I don't get to make statements about diet without verifying that the statement has evidence behind it, whereas an expert will happily do that based on, oh, I've heard people say this.
Speaker 2: Yeah. And it's... you're probably not all over this, because you probably don't follow a lot of the people I follow, or who come across my...
Speaker 3: I don't follow you, Craig, but you know...
Speaker 1: You don't need to. There's nothing new.
Speaker 4: I must have followed you somewhere, because every now and then I do see your inspiration quotes come up somewhere or other.
Speaker 1: Oh well, at least you call them that. Thank you. I'm surprised you didn't say self-help dogshit.
Speaker 3: No, Chatters is doing a great job for you.
Speaker 1: What was I going to say? Oh...
Speaker 2: Yeah, there's a whole bunch of influencers, and, quite literally, legitimately qualified and pseudo-qualified people, who are making all of these claims about all of these things, and invariably, when you get through it, they're actually selling a supplement or a product or a program or something. The whole spiel up front is meant to be, oh, here's just some information that might help you, and then you get to the end and it's like, by the way, I've got this range of supplements, and it's all this and that...
Speaker 4: That doesn't automatically mean that what they're saying is not true. It just means be careful, because this person has other motivations. It might be the case that the person has in fact done the research, has looked at the evidence and has come to the conclusion that the only possible way you can achieve this outcome is by buying this supplement, and they believe in it so much that they've invested their life savings into, you know, getting it out there. That might be the case. It probably isn't, but it might be. And you shouldn't automatically discount something just because someone stands to make money out of what they're saying. It just means you have to be much more careful.
Speaker 2: Yeah, I agree.
Speaker 4: You're making a fortune out of our conversation.
Speaker 2: Oh yeah, I'm just sitting on a pile of cash as we speak; it's actually chafing my ass. I should put some pants on.
Speaker 4: Yeah. And so you have to be careful about what Craig tells us, because, you know, money could be coloring what he says. He could only be presenting me because he's actually, you know, part of, I don't know, Big Meat or something, trying to encourage you to eat more...
Speaker 1: Beef fat. Beef tallow. That's exactly it.
Speaker 2: Yeah, I don't even trust me, and I am me, so definitely trust me. Do you want to plug anything or point anyone in a direction today? You never normally do that, but...
Speaker 4: Well, I haven't even got an article to point people to. Just, as always, keep an eye on my Substack, which you can get to from my website, which is David Gillespie dot org. I'll probably try and put out an article tomorrow. Don't ask me what it's about, I haven't thought about it yet, although I was interested in something that caught my eye this morning: someone suggesting that, you know, it should be compulsory for children to be slathered in sunscreen at school, and since I think most sunscreen is toxic, I might write about that.
Speaker 1: You definitely should do that.
Speaker 2: You could write another article one day about... no, I was going to say, who to trust and why, but I feel like that could not work out well.
Speaker 4: The short answer to that is: trust anyone who can show you where they're getting their information from, so that you can go and check it out yourself. You want someone who says, I don't want you to believe me, because you don't have to; you can just go and look at it yourself.
Speaker 1: Somebody who's not doing any arm twisting.
Speaker 2: Mate, I will say goodbye off air, but as always, appreciate you. Thanks for being on the show.
Speaker 3: Talk to you soon. See ya.