Speaker 1 [00:00:00]: G'day, team. It's The You Project. Doctor Bill's back. I was going to say my absolute favorite, but that's probably not fair, but definitely in my top five favorites. And we have a lot of people here who appreciate you, sir. It's great to have you back. I love chatting with you. I was just pumping up your tires. Have you heard that expression, pumping up your tires?

Speaker 2 [00:00:20]: No, but I can gather what it means.

Speaker 1 [00:00:22]: Yeah, that's an Australian expression for, you know, telling you you're amazing.

Speaker 2 [00:00:27]: Yeah.

Speaker 1 [00:00:28]: Yeah. So if you're pumping up someone's tires, you're making them feel good. But, you know, in this case it's genuine. How have you been?

Speaker 2 [00:00:35]: I've been great.

Speaker 3 [00:00:36]: And let me just say, it's always a pleasure to come back to your podcast. You know, I really enjoy speaking with you.

Speaker 4 [00:00:44]: Is it?

Speaker 1 [00:00:45]: What, not necessarily about this one, but in general? Like, doing this is probably the easiest, and by easiest I just mean the least pressure, the most comfortable. This is almost the easiest way to do it, right?

Speaker 3 [00:01:02]: Probably. Most of the other interviews are more formal, and they play into my credentials in biomedicine, and the advice that I might be giving is, you know, more standard of care type of thing. So it is a pleasure to come on here in a little more of an informal atmosphere and, you know, just shoot the breeze about various topics.

Speaker 1 [00:01:27]: I used to work in television for a little bit, and by a little bit I mean a few years, but like a weekly thing, and my weekly thing was anywhere between five and seven minutes. So I'd literally leave home, drive into the city.
Speaker 1 [00:01:41]: You know, you've done it before, you've done television. You go in and you sit in makeup, and they put stuff on you, and you get prepped, and they wire you up, and then it's just... it's a really impractical, impossible way to deliver quality information and build any kind of connection and rapport, because you walk onto the TV set between ads, you sit down, they count you down, and then they kind of intro the show, and then they intro you, and then they talk to you for three or four minutes, and half the talking is them, and then it's like, that's your appearance, you know. And it's like, oh, there are so many things that we can't say. It's a really difficult way to share genuine insight and helpful information, because the practical reality of the situation is you've got almost no time to talk.

Speaker 2 [00:02:38]: That's true.

Speaker 3 [00:02:39]: You've got time enough for maybe three or four bullet points, and you don't have time to discuss caveats or pitfalls with any of those. So your message really has to be filtered to a degree where sometimes it's almost meaningless. So it can be frustrating at times.

Speaker 4 [00:02:57]: Yeah, exactly, especially when the interviewer or interviewers have got a little bee in their bonnet about a particular angle, or they want to dive in on something, and they have an opinion, and they almost want your endorsement, when you're like, well...

Speaker 1 [00:03:16]: It isn't exactly that. And then it's, but don't you think, though? You know, it's like, now we're about opinion, we're not about science or information. But yeah, that's what I love about this. I mean, for me, being able to talk... you know, you and I can talk for twenty minutes, we can talk for an hour, we can go wherever we want to go. We don't need to throw to an ad, we don't need to keep the sponsors happy. We have sponsors, but, you know, there's no interruption, you know.
Speaker 1 [00:03:43]: So for me, the free-range kind of capacity to be able to just conversationally go wherever we want is quite liberating, especially in a time, twenty twenty-five, where everything is five-second posts and ten-second reels and sound bites and grabs, you know.

Speaker 3 [00:04:04]: Now, it's really refreshing to be able to go deeper on some of these issues and get at the nuance of some of these topics. And the other bonus is that when you post these on your Facebook, Craig, your listeners are fantastic. They ask some intelligent questions or ask for resources, and it's great to have that venue to follow up on.

Speaker 1 [00:04:24]: Yeah, exactly. And if you are a listener and you're not part of the Facebook group, it's just the You Project podcast Facebook group, and yeah, we post every episode that goes up, we talk about it, and feel free to jump in there. And if you've got a question or you've got any thoughts on potential guests, hit us up and we'll see how we go. But we love having that community, and I love creating something that's, you know, thankfully, for the most part, very positive. We have very, very little negativity in that space, which is just a little bit of a light in the social media darkness sometimes. So I'm very grateful for all of that from all of our listeners. Speaking of such things, I was looking at something... something came across my social media landscape, and that was some guy being hysterical about the fact that this new study is out. And we're not going to dive into the study, I just want to talk about this thing, where, you know, apparently the COVID vaccine is causing all this cancer and blah blah blah, and it's a new research study out of South Korea. And I thought, well, one, either that's crap, or two, they're misinterpreting something, or it doesn't exist, right?
Speaker 1 [00:05:46]: And quite often with these things, when you go to try to find that research, the research doesn't exist. Anyway, the paper does exist, and I had a look at it, and yes, the video clip was quite misrepresentative, to say the least. But does that frustrate you, being a researcher and being someone who's only interested in truth and facts and science, when you see things, or things get brought to your attention, which are super duper misleading and very, very incomplete?

Speaker 3 [00:06:19]: Yeah, well, it concerns me on a number of levels. And just full disclaimer, I have not read that study. In fact, Craig was one of the first to bring it to my attention today. It's brand new, so I can't speak one way or the other on it. All I can tell you is there are a lot of other studies that have been done that say the exact opposite: absolutely no link between the COVID-19 vaccine and increased cancer. So it's probably a red herring. But the upshot of the question boils down to this: sensationalism, or, you know, a study that comes out that may bolster someone's bias or preconceived notions, gets a lot more press than the many other studies that show the opposite result. And this is known as confirmation bias. And when you're cherry-picking data and just representing one side of the story that happens to align with whatever your personal beliefs might be, that does a disservice to the entire biomedical community. It does a disservice to your listeners. I think it's irresponsible to not, you know, weigh both sides of the equation with a weight that correlates with the amount of evidence that's out there. You know, if you're going to say this one study shows this, I think it's responsible to say there are many other studies, however, that do not show this effect. I think that is just giving the audience fair and balanced information.
Speaker 1 [00:07:56]: Yeah, totally. And also, in that paper, which the highlight reel thing that I saw on social media didn't mention, the authors basically say this doesn't in any way reflect causation, and it may not. Like, there's lots more research that needs to be done. So they actually weren't claiming anything. But that paper, if people want to look at it, it is "One-Year Risks of Cancer Associated with COVID-19 Vaccination", published September twenty-six. But if you want to save yourself some time, I had a scan this morning and it's relatively inert. It doesn't really say a whole lot more than what we've just spoken about. Speaking about things that might affect our body, or that will definitely affect our body: there was a bit of controversy about what's called Tylenol in the States, and we call it paracetamol, and an alleged link between that and autism. And I might get some of this wrong, I'm not all over it, but I'm pretty sure RFK, who for some reason is like the health minister or something... what is his job? How the hell did he get that job?

Speaker 3 [00:09:17]: That's a really great question. Well, he got that job because he's a Trump loyalist, basically. He is woefully unqualified to be the Secretary of HHS, which is the American Health and Human Services. This is a guy who literally said no one should be taking medical advice from me. But that's his freaking job, to give medical advice. So your question is really on point. How in the world did we come to this point? It's an excellent question, and it boils down to politics. Unfortunately, it does not boil down to what's in the public's best interest.

Speaker 1 [00:09:56]: Hmm. Yeah, it's like me being appointed the creative director of the Bolshoi Ballet or something. I don't know. It's like, no, Craig, you're not a good fit for that. You have no talent, you're not creative, you can't dance.
Speaker 1 [00:10:15]: So the allegation, or the message, was that it causes... potentially, it doesn't cause, but there's a potential link to increased rates of autism with women, pregnant women, was it? I think, yeah, pregnant women using Tylenol. Where did that claim come from? And what's the actual skinny on it? What's the truth?

Speaker 2 [00:10:39]: Yeah.

Speaker 3 [00:10:39]: So this was a disastrous press conference on a number of different levels. First of all, they completely mischaracterized what autism really is as a condition, and very insensitively painted everyone on the spectrum with, you know, a very broad brush, and it really misrepresented and offended a lot of people who are on the ASD spectrum, autism spectrum disorder. But getting to the point about the drug: acetaminophen, or what we call Tylenol in the States, paracetamol I guess is what you guys refer to it as, all the same drug, same molecule. And this press conference pointed to a series of very dubious studies that were summarized in a review article suggesting that there may be a link between people who are pregnant taking Tylenol and then having children who have an increased risk for developing autism. So there was no nuance provided here, and no mention of studies that proved otherwise. So it was a really irresponsible announcement. Just unbelievable that this came from an HHS secretary in the States, because that position commands a great deal of attention worldwide. You know, many people outside of the US pay attention to what our Health and Human Services office states. So this has the potential to be, you know, quite devastating for families, because what it might end up doing is steering pregnant people away from taking a medication that could be very important with respect to treating pain and fever during a pregnancy. Pain and fever are known to produce miscarriage, stillbirths, birth defects, and, wait for it, autism.
Speaker 3 [00:12:58]: So by discouraging people from taking this medication, you actually may be increasing the risk of autism. So it boils down to your question: where did this come from? Well, it came from this one dubious review article that cherry-picks studies that showed a very tiny risk, which was not emphasized in this press conference. The risk of autism developing is extremely small, and they failed to mention two studies that were much larger, much better controlled, and did what we call a sibling pair analysis, which controlled for the confounding variable of genetics, because that is the main driver of autism. If you talk to anyone with a medical or scientific background, it boils down to genetics. But this paper, what this press conference was trying to say... actually, I think I have a quote here from Trump, who had to chime in, of course. He said Tylenol is, quote, not good, to fight like hell not to take it, and even admonished women to just tough it out. You know, you have a fever of one hundred and five, just tough it out. You're going to tell the fetus to tough it out too, you know. It was extraordinarily insensitive and irresponsible from a biomedical scientist's position. And we can go over some of the data, which I think might help your listeners understand a little bit better where I'm coming from.

Speaker 2 [00:14:34]: But does everything make sense so far? Great.

Speaker 1 [00:14:36]: Yeah, yeah. I just... I know Trump has no filter, but to me, I'm like, even he must realize sometimes, surely. Surely you're not the guy to comment, and even if you think something, you don't know it, and I don't... anyway. Yep, all right, give us some of the stuff.

Speaker 3 [00:15:00]: Actually, that reminds me of something, Craig. I would love to hear your interpretation, you know, as being a representative of all of Australia, which I'm sure you are.
Speaker 3 [00:15:12]: Yeah, what your country thinks of what has happened with respect to, you know, this press conference and so on. How your government agencies responded. I'm sure they refuted the claims and encouraged your medical professionals to continue giving paracetamol during pregnancy for pain and fever, as did, like, every other worldwide medical organization. So we can circle back to that at the end, because I'd love to hear, you know, an outsider's perspective on how you're perceiving the US now.

Speaker 1 [00:15:51]: Yeah, let's get to that later. But definitely. I mean, they're just my thoughts, but yeah. So give us the deeper kind of understanding of what we're not understanding with this Tylenol autism kind of issue.

Speaker 3 [00:16:05]: Yeah, absolutely. So first, a little bit of common sense, you know, that can dispel it pretty quickly. Two thirds of pregnant women in the US take Tylenol for pain and fever, and the standard of care is to give as little as possible for as short as possible. Okay, that's just the standard of care. That's not a great revelation that RFK made. So when you consider the sheer number of pregnant women in the US taking Tylenol already, and the fact that the vast majority of our kids do not have ASD, you know, it's actually less than three percent that do, and only one fourth of that three percent have the severe and profound autism that RFK misled people to believe is what everybody with autism is like. Gross misrepresentation. You can see right off the bat that these numbers don't make any sense. They don't align, you know, in any meaningful way. Many people use Tylenol without incident. Autism cases have been documented prior to the introduction of Tylenol. It wasn't mass marketed until the late fifties, but there's plenty of cases in the literature of autism spectrum disorder, and even profound autism. So that doesn't make any sense.
Speaker 3 [00:17:25]: The studies that they were citing were correlational, just like the South Korean study that you mentioned claiming a correlation between the COVID vaccine and certain cancers. It's not causation. There's a wonderful meme that circulates that illustrates the dangers of drawing conclusions from correlative studies. You can make a graph of rising autism diagnoses and the popularity of organic foods in the marketplace, and lo and behold, they correlate. Okay, both of those graphs go up in the same direction. So what RFK might have you believe, if he was reading this graph based on his interpretation of this data, is that, oh, organic foods cause autism. So correlation doesn't mean anything unless you have a biological mechanism, so we have to keep that limitation in mind. The association, as I mentioned before, is also very small. It's one point one times more. That translates to less than ten percent greater risk. To put that in perspective, let's consider the increased risk that smoking carries for lung cancer. That's a ten to twenty times greater risk, which translates to almost a thousand percent greater risk. So these studies that they were citing were marginal at best. Okay. No dose-dependent effect was ever recorded, which means, like, you would expect that the more Tylenol taken during pregnancy, the higher the risk, or the more severe the autism that should show up. That's not the case; that doesn't hold up. And there's no mechanism that's been proposed. If you're going to tell me Tylenol causes autism, I want you to tell me at a molecular level how that's happening. And nothing like that exists. Same with the COVID-19 vaccine and this cancer link. Okay, fine, tell me how in the world that can possibly happen. I haven't heard any plausible model.
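(A quick aside for anyone who wants to check that arithmetic: here is a minimal sketch, using only the round relative-risk figures quoted above for illustration rather than numbers taken from the studies themselves, of how a relative risk converts into a percent increase over baseline.)

```python
def percent_increase(relative_risk: float) -> float:
    """Convert a relative risk (RR) into a percent increase over baseline risk.

    RR is the risk in the exposed group divided by the risk in the
    unexposed group, so the excess over baseline is (RR - 1) * 100 percent.
    """
    return (relative_risk - 1.0) * 100.0

# Round figures quoted in the conversation, for illustration only.
print(percent_increase(1.1))   # roughly 10% greater risk (the reported Tylenol association)
print(percent_increase(10.0))  # 900% greater risk (low end quoted for smoking and lung cancer)
print(percent_increase(20.0))  # 1900% greater risk (high end quoted for smoking and lung cancer)
```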
Speaker 3 [00:19:46]: So those are the things that I need in place before my government makes some kind of, you know, sledgehammer demand that we should stop using acetaminophen during pregnancy.

Speaker 2 [00:20:01]: I've got... yeah, go ahead, ask a question.

Speaker 1 [00:20:03]: I'm just... sorry, I'm being completely silly here, but I'm thinking about correlation versus causation, right? I'm thinking, well, I'm going to do a graph. In fact, I probably might do this on social media, because I'm a fucking idiot. But, like, in the last ten years we've seen both electric cars, the use of electric cars, go through the roof, and diabetes.

Speaker 3 [00:20:23]: Now, obviously, Craig, you know those electric cars are causing diabetes.

Speaker 1 [00:20:30]: Well, I mean, that can't just be a coincidence, Professor Bill Happy, I mean...

Speaker 2 [00:20:37]: Go publish it. Go publish it.

Speaker 1 [00:20:38]: I think I've just solved the type two diabetes crisis. We've just got to, I think, go back to internal combustion engines, get rid of electric cars. Problem solved. You're welcome, world. That sounds ridiculous, but it's not that far away, is it?

Speaker 2 [00:20:59]: No, in principle, that's exactly what's happening here.

Speaker 3 [00:21:02]: Because what you need to think about in these studies is Tylenol. They're claiming there's an association with autism. There's no plausible mechanism for how that can happen, at least nothing that I've read. So what that might be indicating to us is that it's an indirect effect. What are these pregnant people taking the Tylenol for? Is it an infection? Is it some kind of pain disorder? Maybe that's what's causing the autism. You know, that's a study that's worth following up on. But you don't come out and say Tylenol's causing it, because that could be a total red herring, meaning it could be totally an indirect effect, and you're going to miss out on what the direct effect is.
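(The electric cars and diabetes gag is easy to reproduce. Here is a minimal, purely illustrative sketch, with invented numbers rather than real registration or health data, showing that any two series that simply trend upward over the same stretch of years will correlate almost perfectly even though neither causes the other, which is exactly why a correlation with no plausible mechanism proves very little.)

```python
# Two made-up series that both just rise year over year:
# hypothetical EV registrations and hypothetical diabetes diagnoses.
ev_registrations = [5, 9, 14, 22, 35, 52, 80, 120, 175, 240]             # invented numbers
diabetes_diagnoses = [300, 310, 322, 333, 347, 360, 371, 385, 398, 410]  # invented numbers

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Both series trend upward, so this prints a value close to 1.0,
# despite there being no causal connection between them.
print(round(pearson(ev_registrations, diabetes_diagnoses), 3))
```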
Speaker 3 [00:21:55]: Okay, you can take away Tylenol from pregnant women but still not fix this problem, because that wasn't the issue to begin with. So that's why these types of reckless announcements are just so dangerous and not serving the public well. And the biggest sin that they committed at this press conference was that bigger and better studies have been done, and there's no link whatsoever between the use of acetaminophen during pregnancy and the risk of autism. And what those studies have found points to the confounding variable of genetics, and the way they do that is they compare siblings, which is something these other studies never did. And when you compare siblings, which rules out the genetic factor, or, you know, allows it to be considered, there's no link whatsoever; the acetaminophen medication was not a player here. So what are these two studies? Well, there was one in Sweden conducted in twenty twenty-four. It's the largest study on this topic ever conducted to date. It involved two point five million children monitored over twenty-six years. You couldn't ask for a better and more comprehensive study, and they found no link when they did the sibling pair analysis. And if one study is not good enough for you, well, in Japan there was another study done just this year: two hundred thousand children. It's not in the millions, but two hundred thousand is an appreciable study; I would call that a good size. Again, they found absolutely no link when the sibling pair analysis, when genetics, was considered.
Speaker 3 [00:23:40]: All the scientific and medical societies that I'm aware of, including the American Academy of Pediatrics, the American College of OB-GYNs, and the World Health Organization, all of them denounced this press conference and reiterated that Tylenol is absolutely safe to give for pain and fever during pregnancy, and in fact it should be given, because those can cause complications if you don't treat them, and there's no other medicine that's safer, and there's no compelling evidence of the link, you know, based on these two very large recent studies that have been conducted. The AAP, the American Academy of Pediatrics, characterized this press conference as, quote, filled with dangerous claims and misleading information. And that's just inexcusable from our government officials.

Speaker 1 [00:24:30]: You know what's interesting is the actual lack of transparency and scientific kind of integrity. Like, they're standing up there going, this is the real science, this is what they don't want you to know. And it's really just, as you were talking about before, it's really just confirmation bias playing out, where we're looking at this thing which is not a very well run study, with this dodgy, what we call in Australia dodgy, evidence or interpretation of results, versus these mega studies that we're not talking about, that actually contradict what we're talking about. I mean, the funny thing about science and scientists is that every scientist has got an opinion and emotions and an identity and a brand. And I'm not saying... I'm doing a PhD, so I'm not throwing me and you under the bus. But the problem is that it's so hard to find people who are truly, truly, I mean especially in the public space, perhaps, where they don't care about the outcome. They just want the outcome, like, whatever that is. Like, they don't have an emotional or a financial or a career kind of interest, you know.
Speaker 1 [00:25:57]: It's like, how do we just get to truth without all the bullshit, without all the confirmation bias and the intolerance and the agendas? And yeah, for the average punter who listens to this, who isn't a scientist or a researcher, or a medical professional or a health professional, no wonder it's confusing. Sometimes we don't know. It's like we've got more information than ever and more confusion than ever. It's tricky, I think, for the average person to know who to trust, who to pay attention to. You know, even with us, and we're not pointing people in a direction, we're trying to have a rational conversation about something that's been big news.

Speaker 3 [00:26:49]: Yeah, and that doesn't bode well for the future, not just for the US but globally. And, you know, I think this is a global trend as we become more and more saturated with information, and we have a bunch of non-experts or adjacent experts getting out there and just saying a bunch of stuff that is not true, or is downright misleading, or is a flat out lie. And they're doing this for attention or for money, or maybe they think they are doing it for a good cause. But what it boils down to at the end of the day, like you said, is that the public is not getting the information they need. You know, we can't be experts in everything. We rely on experts to tell us what's up and to give us evidence-based policy and guidelines so that we can protect ourselves and our families. And if we destroy that system of trust, you know, I don't know where we go as a species moving forward. We're going to implode, because people will be making decisions that just aren't based in reality, and that can be absolutely devastating for them and their family. And that's what really breaks my heart.
Speaker 3 [00:28:13]: You know, I've been studying biomedicine for thirty-some years now, and then people come and show some YouTube video to me of someone who has absolutely no experience, or hasn't been in the trenches on this, and they ask me what I think. And it's like, why would you even consider taking this person's advice? What credentials do they have? You know, what have they done in their lives to contribute to biomedicine? And it's just such a misorientation of, you know, of what we should be paying attention to. And it's quite scary. You know, I would not call a car mechanic to do heart surgery on me, and vice versa. We have to be able to trust and rely on experts. And when you have a government telling you not to trust the experts, you've got a real serious problem on your hands.

Speaker 1 [00:29:09]: That's true. I feel for you, I feel your pain. Like, I'm listening to you and I'm thinking, you know, you're not perfect, but you know a lot, and this is your field, and you are an expert, and I think it must be frustrating for you. I'm not at your level, and I'll probably never be at your level in anything. But, you know, I've owned multiple gyms, I've worked in the fitness industry my whole life, I've trained thousands of humans, worked with tens of thousands of bodies, and sometimes, you know, when people are talking about training or something that I absolutely know, not I think or I guess, it's like, no, I know this for certain, because I've been in the middle of this thousands of times, and somebody says something and it's completely untrue. It's not like, oh, it's a little bit wrong; it's completely wrong, but they say it with such conviction, right? Or, probably the most misused phrase on the Internet in regards to science, or alleged science, or health or wellness information, is "research tells us."
Speaker 1 [00:30:13]: All somebody's got to do is preface any fucking statement with "research tells us", and then people think that's science. I'm like, research doesn't tell us that. Or when people go, oh, I've done my own research, I go, what does that mean? Tell me what your research entailed, because, no disrespect, I'm pretty sure you've done no research. Maybe you jumped on the internet and looked at things. That isn't research.

Speaker 2 [00:30:39]: Yeah, yeah, you don't need a lab coat to do that.

Speaker 1 [00:30:43]: No, no.

Speaker 2 [00:30:46]: Yeah.

Speaker 1 [00:30:46]: Oh, well. So is there any light at the end of the tunnel? Like, what is the panacea? How do we... is it just people like you and me talking about this publicly and, like, pushing back a little bit and trying to shine a light on it?

Speaker 3 [00:31:05]: Well, I'll answer that question, but I do want to touch on a point you made, and it's just my personal opinion, I don't know, I haven't done my googling on this, Craig. So, I would imagine, research shows that it's our negativity bias that has a lot to do with this, because experts and doctors are wrong from time to time, or, you know, they say one thing, new studies come out, and they change their mind, you know. That's how we progress, right? If we never changed our minds, that means we're not learning from new data that came out. So that should be part and parcel of the process. And I think that's misunderstood. And when we hear about a doctor who did something bad, or, you know, a scientific study that was later proven wrong, that sticks to us like glue, and we forget about all the thousands of studies that have held up. We forget about the tens of millions of doctors who are giving good advice every day, saving lives every minute. Those don't make the news. All that really gets sensationalized are the negative stories.
Speaker 3 [00:32:15]: And I'm not saying they shouldn't be reported. But I am saying that we, as media digesters, need to keep in mind that we are only being shown a sliver of reality, and that sliver is usually something that is going to drive a headline or drive clicks. It's probably a little sensationalized, and we need to be cognizant of that, and keep in mind that there's lots of stories not being reported, because they're not exciting, that show that experts are getting it right all the time. So, you know, to me, I think that's a big part of the problem. It's a built-in bias in our brain. I'm not blaming anybody, it's not our fault, but we should recognize that and try to be more savvy and intelligent when we're listening to these influencers or watching these TikToks. We've got to check out their credentials, and we have to fact-check them in robust ways that rely on evidence and scientific or medical consensus.

Speaker 1 [00:33:22]: My training partner, God bless him, he's the current reigning Australian conspiracy theory champion. He sends me about ten conspiracy theory videos a day. Shout out to the Crab. His name is Mark; I lovingly call him the Crab because he looks like a crab and he moves like a crab. But yeah, he's way down the rabbit hole. Like, he only comes out of the rabbit hole for lunch, you know, and then back.

Speaker 2 [00:33:52]: Wow.

Speaker 1 [00:33:53]: And I'm like, oh yeah, it's like, who do you trust? He trusts nobody. He sent me... I shouldn't say this, but I'm going to throw him under the bus. He sent me one yesterday of this guy who, I don't know, worked for the Air Force or NASA or someone, talking about, yeah, flat Earth. And I'm like, come on, bro, surely we draw a line. Surely, surely we draw a line at flat Earth, even you. And I'm like, yeah. So it is...
Speaker 1 [00:34:26]: I mean, we've never had this level of access to information: good and bad information, misinformation, actual information, pseudoscience, real science, hysteria, data. It's all coming at us, and unless you're very discerning or educated... you know, I'm mildly educated, and it's still hard for me to figure out what's what.

Speaker 3 [00:34:51]: Yeah. You know, I actually feel a little sympathy for conspiracy theorists. I mean, they glom onto something that seems exciting, that seems different. I think their intentions are often genuine. You know, they're trying to help. They think they've got something, you know, fascinating or important to share, even though it's demonstrably false. For some reason, they are immune to hearing about the truth, or they think it's a hoax, which is another thing I get all the time. You know, whenever you show data or debunk something, the easy out is, well, it's just a hoax, or you're lying. I mean, okay, well, then it just boils down to a matter of trust, and, you know, you can trust the conspiracy theorists or you can trust experts who live in the real world. And yeah, I don't know, but I do have sympathy, because I sense that a lot of those conspiracy theorists have a genuine feeling that they're trying to help. They've got some kind of top secret information that the world needs to hear. But it's just so, so terribly misguided, you know. Yeah, it's a little heartbreaking. But to get back to your point, is there some light at the end of this tunnel? Well, I was really encouraged to see so many scientific and medical organizations coming forward immediately, you know, against what these recommendations were that came from our own government. And that was great to see, because a lot of people are now standing up to some of the lies that we are being fed, some of the falsehoods that are being propagated.
Speaker 3 [00:36:52]: So it was encouraging to see them take a stand and say, absolutely not, this is totally nuts, here's why. Listen to your doctors, not the guy who said, don't take medical advice from me.

Speaker 1 [00:37:03]: Yeah. There seems to be... I mean, from the outside looking in, and the very faraway lens through which we observe the US, which of course is just filtered through our media, is there a bit of pushback, more pushback coming? Like, we have this thing, all these rallies, No Kings. This has been occupying our TV screens for the last day or so. Just give me an insight into that.

Speaker 3 [00:37:33]: Yeah, that happened today, and it looked to be an extraordinarily huge event. Or, no, I'm sorry, it happened yesterday, but it looked to be an extraordinarily huge event. The turnouts were far greater than I think anyone really expected. So I think, you know, there's an extraordinary number of people in the US who see through all these lies, who see through the propaganda, and have a greater understanding of what fascism and authoritarianism really are, and, you know, can see that this is a dangerous road that we are going down. We have, you know, a president who is just so unpresidential, so unprofessional, and so puerile in his remarks and actions that it is just not befitting the integrity of that office. To many people, this is a global embarrassment, and they just can't believe, you know, that it happened once, much less that it happened twice. So I don't quite know what the future holds, because now that the Republicans have power over so many things, they have both houses of Congress, they have the presidency, they've stacked the courts, it almost seems like a hopeless situation, that there's no way the Democrats are going to ever reclaim power, because of all the gerrymandering and things that are going on that don't really allow them to have a fair fight.
Speaker 3 [00:39:21]: But seeing something like the No Kings rallies does give you a glimmer of hope, you know, when you can visibly see a bunch of people out there saying, we're not going to take this. That's inspiring.

Speaker 1 [00:39:36]: There were something like, I don't know, two and a half thousand different groups, rallies, around the country, so definitely there's some momentum. So, always good chatting to you, sir. We appreciate your wisdom and your insight and your time, and your cheekiness. Thank you so much. We'll say goodbye off air, but thanks, mate. I appreciate you being on the project and being such a fun part of our team. Thank you.

Speaker 2 [00:40:07]: You got it, Craig. Thanks again.