Speaker 1: Good day, boys and girls, mums and dads, kids. It's Craig Anthony Harper's You Project. It's David Kevin Patrick James Gillespie, from the Sunshine State and other places. Happy New Year, my friend. How are you?

Speaker 2: Yeah, Happy New Year to you. I'm good, refreshed. I've been on holidays, as I was saying to you before. Feeling all charged up, ready to go. Been working on my new book, so I've been playing with some ideas there.

Speaker 1: How to Get Rich Painlessly? Is that the book? Is that the one? How to Get Jacked, Ripped and Rock Hard? How to Be More Like David Gillespie?

Speaker 2: Kind of the opposite, actually. It's how to survive when AI comes for your job. Also, it's, yeah, looking at what you should be doing if you don't want a robot to take your job.

Speaker 1: That is such an interesting and relevant and timely conversation to have and topic to explore. I didn't know you were doing that. I knew you were doing another book.

Speaker 2: I didn't know I was doing that before the holiday either. But that's just where I got to.
Speaker 1: Right. Give us, without giving away anything, give us like two minutes on that if you can. Like, okay, so a bit of the concept in your head.

Speaker 2: Yeah, it's sort of... I started with the idea of the parallels between this and the railroad boom that happened in the late eighteen hundreds in the United States. That was a massive change in US society, and eventually world society, in that you went from... so if you were the local blacksmith in a town, you know, out of nowhere, middle of the USA, your competition for work was pretty much as far as anyone could walk, before the railroad. So you could be as good or as bad as you liked and charge whatever you liked, because unless someone else set up shop in your little town, you were pretty much competition free. And the railroad changed all of that. Suddenly the blacksmith in little, you know, Poke, New Jersey or wherever was not competing with whoever was within walking distance. He was competing with everybody in the country, and in particular, he was competing with factories that employed thousands of people.
Speaker 2: And that was the big change that happened, and it was a massive change in the way the economy of that country, and eventually the world, worked. And I feel we're seeing something similar happening here with AI. It's the railroad for the knowledge worker. It's going to be at least as disruptive; it is being at least as disruptive. And so I've sort of, in my head, called this book Escape from the Middle, because I think there is developing a large chunk of the middle of society, and the middle of the job scene, where you are unlikely to survive unless you move to either end. And what's at either end is the things that AI can't do. So there are two big and important things that AI can't do, and I think we've briefly talked about this one before. One of the things it can't do is invent things. So, as I mentioned, you know, if you gave an AI training on all the documents that existed in eighteen fifty, it would never invent the motor vehicle. It would just invent a better horse-drawn cart. And it might be significantly better.
Speaker 2: It might have rounder wheels or better whips or better horses or something, I don't know, but it wouldn't invent a car, because it can't invent things. And so there's obviously going to be jobs for inventors, people who can join the dots outside the square and imagine things that don't exist. And then there's going to be jobs for what I would call the artisans, you know, the people who make things that aren't as good as a machine's, on purpose. You know, even today, people are quite happy to pay ten dollars for a pretty rough sort of loaf of sourdough bread, which is surprising given they can buy a perfectly perfect loaf from a supermarket for two dollars.

Speaker 1: Yeah, yeah, yeah. Well, you can buy one in Australia that looks like it was made by your grandma with the lights out for a hundred bucks. Or, I don't know, people who buy them now can probably get one from the op shop that's incredible for fifty cents.

Speaker 2: Yeah.
Speaker 2: So I think there are... and obviously that's not the whole book, but that's where my thoughts are going, and some of the drafts I've done on the manuscript. Thinking about, okay, well, what does that imply for the kinds of jobs that you should be looking at now? What does it imply for the other jobs? And what does it imply for society in general? You know, if that kind of change is going to happen in society, then what has to change about the way everything is done?

Speaker 1: Yeah, yeah. So this is an ironic question: in your book about essentially preparing for the dangers of AI, or, you know, trying to mitigate the personal damage to you and your career or whatever, how are you using AI now in the writing process? Now that we have such an amazing tool available that you didn't have available for most of your books.

Speaker 2: For research. Its ability to research is incredible. You know, I think one of the things that I brought to the table when I first started writing...
Speaker 2: So when I wrote my book about sugar back in two thousand and seven, the Internet wasn't very useful for research, and very few people were using it that way. Sure, people were searching for the nearest hot dog stand or something, but they weren't doing the thing that I did, which was to say, okay, well, here's this database of medical journals that says this thing, and then here's this database of medical journals, read by a completely different set of professionals, that says this other thing. But if both of those things are true, then this third thing must be true, and let's see if that's the case, whether this third database agrees with that. I did a lot of that kind of thing, which is drawing connections that nobody else was drawing, and then putting that together in the form of notes to myself, which ultimately became a book. Well, today, a good AI does all of that in seconds. And the skill of the researcher is in staying off the rails it tries to shove you down.
Speaker 2: It's so convincing in its ability to do what I just described that it's very, very easy to just accept it, and not ask questions and not push the boundaries and ask what-ifs. And I think that's the skill I bring to the table when using a tool that's that powerful and that fast.

Speaker 1: Yeah, yeah, it's so interesting. Like, I have not been very good at the prompts, you know, the questions, and then the secondary and third question. And, you know, learning how to... how do I ask this in a way where I actually get what I... not necessarily want, as in to get it to agree with me, but to get the real answer that I want. Not the convenient, politically correct right answer, but, like you said, to actually find out what I want to find out. So yeah, I think that ability to be able to ask great questions and write the...

Speaker 2: Well, there's a cheat there, Chief: ask it to write the prompt.

Speaker 1: Give me an example of that.
Speaker 2: Well, say you want to find out all the research about a particular... so say there's a study you're interested in, and you want to find out all the research that agrees or disagrees with that study. Now, if you just type that in, you may or may not get a particularly good response. But generally the way to do it is to say: this is what I want, now construct me a prompt that would be written by a well-educated researcher in this space.

Speaker 1: Wow.

Speaker 2: And it'll write a prompt, and it'll be a lot longer, and think of a lot of things that you wouldn't think of as limiters that you should be putting in place. And it'll also probably have you start thinking, oh, it's saying do these things, don't do these things, in its prompt. It'll essentially write an essay as a prompt. But in reading it before you use it, you think about, oh yeah, that's something I didn't think of. Maybe I should tell it not to do that, or I should tell it to do that. You'll get a much better prompt if you do that.
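The trick David describes can be sketched in code. This is a minimal illustration, not anything from the episode: the `ask` helper, the function names, and the wording of the meta-prompt are all assumptions, and wiring `ask` up to a real chat model's API is left as an exercise.

```python
# Sketch of the "ask it to write the prompt" trick: one round trip to
# generate a researcher-grade prompt, a human review step, then a second
# round trip running that generated prompt. All wording is illustrative.

def build_meta_prompt(goal: str) -> str:
    """Wrap a plain-English goal in a prompt that asks for a better prompt."""
    return (
        "This is what I want: " + goal + "\n"
        "Construct a prompt that a well-educated researcher in this field "
        "would write to get this from an AI assistant. Include any limiters "
        "(scope, date range, study types, things to avoid) I should set."
    )

def research_via_meta_prompt(goal: str, ask) -> str:
    """Two round trips: generate the prompt, then run it.

    `ask` is any callable that sends a string to a chat model and returns
    its reply -- e.g. a thin wrapper around whatever API client you use.
    """
    generated_prompt = ask(build_meta_prompt(goal))
    # In practice you would read generated_prompt here and edit it --
    # keeping or dropping the limiters it suggests -- before running it.
    return ask(generated_prompt)

if __name__ == "__main__":
    goal = "find all research that agrees or disagrees with study X"
    print(build_meta_prompt(goal))
```

The review step in the middle is the point of the exercise: the generated prompt is where you spot the constraints you hadn't thought to set.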
Speaker 1: I feel like the limitation with AI, probably the most common limitation, is that we just don't know how to use it well. You know, like exactly what you're talking about now. I'm like, well, that actually makes sense. Why don't I do that? Why didn't I think of that? And so it's got way more capacity than I'm getting from it, not because it can't generate better answers for me, but because I don't know how to get them.

Speaker 2: Yeah. And also you've got to get around the thing that all AIs are written for, which is that, at the end of the day, they are a sales tool. And just like social media, their metric is engagement, which is: how long can I keep you engaged with this? And for that reason, that's why they're such sycophantic so-and-sos and tell you that everything you do is fabulous, you know. So you'd be used to this by now. Whenever you put something in and say, so, what do you think about this, it'll say, oh my god, Craig, that's the best writing I've ever seen in my life, how on earth did you produce that, here are just a few tips.
Speaker 1: That's the main reason it's my best friend.

Speaker 2: It never comes back and goes, oh Jesus, Craig, you've really stuffed it this time.

Speaker 1: What is wrong with you? That question is shit.

Speaker 2: Worst piece of garbage I've seen in my life. Have you thought about giving up?

Speaker 1: You know?

Speaker 2: It never does that. It wants to pretend to be your friend, because it wants to keep you engaged. And that's why a lot of AIs now, no matter what you ask, it'll give you the answer, and then right at the bottom it'll say, oh, would you like me to do this, that or the other? It's sort of an extension of what you're thinking, to just keep you going for the next question. And it'll keep doing that for everything you do. So if you put in the next question, it'll say, oh, now that we've done that, do you want to do this next thing?

Speaker 1: Yes, yes. Well, when I put in something that's, I don't know, I guess anything on performance or human behavior or the mind, or this...
Speaker 1: So firstly it's going to say, hey Craig, great question. Thanks. Well, thank you for judging my question positively, I appreciate that. And then once it answers the question, or responds to my directive or prompt or whatever, then it says, so, would you like me to write just a snapshot of what I just wrote, you know, something that you could use as a synopsis or an overview? Or it goes, would you like me to turn this into a whiteboard post? Which is my Instagram, right? Not that I ever use it for that. But I will bounce ideas off it, right? I'll put up a post and I'll go, what's wrong with this? What's right with this? You know?

Speaker 2: Be critical. By the way, Craig, you know, I'm sensing in you, and I get this from a lot of people, hesitation to admit that you use AI. And I sense it there. You're sneaking around the edges of it. And I don't think anyone should be hesitating about admitting to using AI. That's like talking to a scientist or a mathematician and them fudging the fact that they used a calculator.
Speaker 2: It's just a tool, and, you know, it will be as good or as bad as the person in whose hands it is.

Speaker 1: I definitely use it, definitely, and I'm open about that. But what I don't do is get it to write things for me and then pretend that I wrote what it wrote. I don't do that.

Speaker 2: Yes.

Speaker 1: But I tell you where there's a bloody can of worms: in universities at the moment. They're all running around with their undies on their heads, because you've got all these, typically, older men and women who are the bloody powers that be, who are all kind of fifty, fifty-five, sixty plus, who do not really understand AI, and who are ultimately the decision makers, making decisions and rulings on shit they don't fully understand. So that's... that's a thing.

Speaker 2: Well, the problem there is that the entire model of university has just been thrown out the window.
Speaker 2: And this is one of the things I'm looking at in the book: you can't do university, or school for that matter, the way it used to be done, because it's no longer acceptable to say to a student, regurgitate this stuff that you came and theoretically listened to for ten weeks, because the student can point an AI at your lectures and regurgitate whatever you damn well want, having learned exactly nothing. So the traditional methods of examining, all the tasks you set for students to be examined on, just don't do anything. They don't prove that the student learned anything, and in all probability the student didn't learn anything. What the student learned how to do is get an AI to do the work for them. And I think it's a fundamental break in the model of the way university is conducted. Universities and schools should be focusing on the reality that these tools exist, and teaching kids how to do the things we've just been discussing, which is: okay, there's a tool, now how do we make it do what we want it to do, to add value?
Speaker 2: Not, let's all pretend everyone's not getting their assignments written by AI, because I guarantee you they all are. And don't sit there, you know, on your academic high horse saying, oh yeah, but I've got this piece of software that checks it and tells me if an AI wrote it. Well, that's BS. The first thing any student learns is to tell the AI to make sure that AI detection software can't detect the thing it writes.

Speaker 1: Yes, yes, yeah, one hundred percent. So, I mean, this really has kind of accelerated a lot in the academic space in the last two years. I mean, I don't know... I know that at Monash, where I'm at, they're reviewing it a lot. That's my understanding, which is very limited, but from what I know: lots of conversations, lots of meetings, lots of potential strategies, ideas. What's your, off the top of your head, like, your gut feel about how education is going to work?
Speaker 1: And look, if not as it has been, which it can't be as it has been, because this new tool makes that kind of learning essentially redundant and impossible, what's going to work moving forward?

Speaker 2: It's going to have to go back to the way it was before everyone started doing education online. Back to the good old days, Craig. You know, when I was at university in my undergraduate days in the nineteen eighties, you actually had to show up, you know, and handwrite your exam responses. Let's see an AI do that. I certainly, in my undergraduate degree, had to show up, had to go to an exam in person, had to sit there for four hours and handwrite the answers.

Speaker 1: Do you think that would do it? Especially if they obviously had no computers and no phones and no tablets with them. Like, if they needed to go in and write essays, do exams, answer questions with no help, they're either going to learn or they're going to crash and burn.
Speaker 2: Mind you, that wouldn't stop them using AI. I mean, if I was back then but had AI available to me, I'd be using it to help me study. So it would still help.

Speaker 1: Yeah, but you've got to be able to go into the exam room without AI and prove that you actually know the stuff.

Speaker 2: At least for that hour.

Speaker 1: Yes. And, well, yeah, literally, you know, write an essay or whatever.

Speaker 2: In response to a question you haven't seen.

Speaker 1: Yeah, exactly. And you can only do that with, you know, knowledge, with acquired knowledge.

Speaker 2: So I think that's where we're going to end up. I can't see any alternative to a significantly greater requirement for in-person assessment. And if you were to write assignments, I think the only sort of assignments you could write would be... by the way, when I went to university and did my undergraduate degree, there were no assignments at all.
Speaker 2: You did a full year of study, you did a full-year subject, and you did a four-hour exam at the end of it, and that was that, really. So, you know, you could go back to that. But if you wanted to do assignments, they could be more like: we had a discussion in class today about X or Y topic, what did you think of Craig's take on that? Let's see an AI do that one.

Speaker 1: Yeah, right, right, right, yeah. I've spoken to Melissa, who runs my life, about this, and she's my RA on my PhD. And obviously we're using AI in some bits, like in the research bits, but all of my papers are coming out of the research that I've done, which isn't anywhere on the internet. It's all, you know... so there's not really even a possibility of cheating. But I said to her, you could design a PhD depending on that, you know. Especially when you do something like psychology, where, depending on the university and how much leeway they give you, you can design a study.
Speaker 1: You can write your own research questions, come up with your own aims and hypotheses, and... I think you could almost, if you knew how to do it... not that you'd get through, but you could literally do a PhD in a week, if you knew how to... you know what I'm saying.

Speaker 2: Yeah, I agree with you. I think you could. You know, it wouldn't necessarily get through some universities. But look, I think if you knew what you were doing, you could crank it out. You know, what's the length of a PhD? Is it sixty, seventy thousand words, something like that? You could crank that out in a night. The trick is making it something that is PhD-worthy, I guess.

Speaker 1: Yeah, yeah, yeah. Mine's different, because I've got to write papers and get them published. So mine is by publication, not by thesis. But for people who do it by thesis, it's like, yeah, you could.
Yeah, 350 00:21:11,480 --> 00:21:13,720 Speaker 1: you're probably right. If you had a big 351 00:21:13,760 --> 00:21:17,560 Speaker 1: crack and took, you know, some Nicorette gum or 352 00:21:17,560 --> 00:21:21,000 Speaker 1: something and sat down for twelve and a half hours, 353 00:21:21,040 --> 00:21:24,360 Speaker 1: you could probably start-to-finish your PhD. Yeah... 354 00:21:26,480 --> 00:21:29,240 Speaker 1: what about... This wasn't what we were going to chat 355 00:21:29,280 --> 00:21:29,760 Speaker 1: about tonight. 356 00:21:29,960 --> 00:21:32,440 Speaker 2: You'd probably fail an oral examination of it, though, which 357 00:21:32,560 --> 00:21:35,880 Speaker 2: kind of leaves a bit of a clue to how 358 00:21:35,960 --> 00:21:37,840 Speaker 2: PhDs should be assessed as well. 359 00:21:38,359 --> 00:21:41,120 Speaker 1: Well, yeah, you make a good point. Like, in mine, 360 00:21:41,200 --> 00:21:48,560 Speaker 1: I had five assessments along the way. Four assessments along 361 00:21:48,560 --> 00:21:53,960 Speaker 1: the way were called academic reviews, where you go and 362 00:21:54,080 --> 00:21:57,160 Speaker 1: stand in front of a board and you've got 363 00:21:57,160 --> 00:21:59,679 Speaker 1: to defend your thesis and talk about your study, your 364 00:21:59,720 --> 00:22:02,240 Speaker 1: research, how you did it, why you did it, 365 00:22:02,320 --> 00:22:04,600 Speaker 1: how it was designed, why it was designed, why this, 366 00:22:05,760 --> 00:22:08,240 Speaker 1: why this science needs to be brought into the world, 367 00:22:08,760 --> 00:22:11,520 Speaker 1: why it's different, blah blah blah, all that stuff. You 368 00:22:11,920 --> 00:22:13,959 Speaker 1: have to do that four times, which I've done, 369 00:22:14,080 --> 00:22:17,920 Speaker 1: and that, as someone who does this for a living, talking 370 00:22:18,200 --> 00:22:22,320 Speaker 1: publicly and speaking in front of audiences...
That's the scaredest 371 00:22:22,400 --> 00:22:26,000 Speaker 1: I've ever been speaking to a group. I've never felt 372 00:22:26,240 --> 00:22:29,600 Speaker 1: dumber, I've never felt more insecure, I've 373 00:22:29,680 --> 00:22:33,720 Speaker 1: never felt more out of my, you know, my environment, 374 00:22:34,080 --> 00:22:37,680 Speaker 1: because I'm, as you've well established with my audience, thanks 375 00:22:37,720 --> 00:22:40,520 Speaker 1: so much, not a real academic, and I'm talking 376 00:22:40,560 --> 00:22:43,520 Speaker 1: to real academics who know more than I do. 377 00:22:43,600 --> 00:22:45,960 Speaker 2: Craig... Well, you know. 378 00:22:46,040 --> 00:22:50,520 Speaker 1: It's... yeah, it's just fucking terrifying. But that is... yeah. Look, 379 00:22:50,520 --> 00:22:53,320 Speaker 1: there are a few what they call academic milestones along 380 00:22:53,359 --> 00:22:57,440 Speaker 1: the way, but yeah, other than that, I think you could. 381 00:22:58,040 --> 00:23:02,000 Speaker 1: You could. Isn't there a famous story about blokes who 382 00:23:03,520 --> 00:23:07,639 Speaker 1: wrote all these academic papers that were completely bullshit and... 383 00:23:07,720 --> 00:23:10,400 Speaker 2: Oh yes, yes, yeah, very much on purpose. 384 00:23:11,200 --> 00:23:15,760 Speaker 1: Yeah, got accepted into Q1 journals, which, everyone, that's 385 00:23:15,800 --> 00:23:19,679 Speaker 1: the highest level, like the most highly regarded, you know, 386 00:23:19,800 --> 00:23:24,040 Speaker 1: peer-reviewed, kind of well-credentialed. Yeah, and they wrote 387 00:23:24,040 --> 00:23:29,919 Speaker 1: these bullshit papers based on bullshit data and outcomes and 388 00:23:30,119 --> 00:23:33,600 Speaker 1: got them actually published. And they did it just to 389 00:23:33,640 --> 00:23:34,880 Speaker 1: prove that that was possible.
390 00:23:35,440 --> 00:23:40,639 Speaker 2: Yeah, yeah. Look, a significant percentage of so-called quality 391 00:23:40,680 --> 00:23:45,439 Speaker 2: research probably falls into the category of bullshit. I do 392 00:23:45,560 --> 00:23:49,520 Speaker 2: remember there was a study about what percentage it was, 393 00:23:49,560 --> 00:23:51,480 Speaker 2: and it was a surprisingly large one. I don't have 394 00:23:51,480 --> 00:23:54,840 Speaker 2: it off the top of my head, but yes, just 395 00:23:54,880 --> 00:24:00,080 Speaker 2: because it's in a journal doesn't mean it's true. 396 00:24:00,520 --> 00:24:04,760 Speaker 1: So we'll leave our conversation about the food pyramid for next time. 397 00:24:04,800 --> 00:24:10,919 Speaker 1: But my last question around all of this is, it 398 00:24:11,000 --> 00:24:14,200 Speaker 1: seems like, understandably, a lot of people who work 399 00:24:14,240 --> 00:24:20,600 Speaker 1: in the creative space, you know, writers, musicians, you know, 400 00:24:20,800 --> 00:24:25,399 Speaker 1: all different kinds of artists, are not fans of 401 00:24:25,440 --> 00:24:27,639 Speaker 1: AI at all, because, you know, like, some of the... 402 00:24:28,600 --> 00:24:30,840 Speaker 1: of course it's subjective, but some of the art that's 403 00:24:30,840 --> 00:24:34,119 Speaker 1: produced by AI, if you thought a person did that, 404 00:24:34,240 --> 00:24:37,040 Speaker 1: you'd go, well, that's fucking incredible. And some of the music, 405 00:24:37,119 --> 00:24:39,800 Speaker 1: and, you know, some of the writing and some of 406 00:24:39,840 --> 00:24:41,960 Speaker 1: the, you know, the poetry that it can produce and 407 00:24:42,200 --> 00:24:49,320 Speaker 1: songs that it can write.
So that's not a... it's 408 00:24:49,320 --> 00:24:54,040 Speaker 1: not a great outcome for them, I guess. 409 00:24:55,440 --> 00:24:59,240 Speaker 1: Yeah. But I guess that would be harder to take if they 410 00:24:59,280 --> 00:25:04,280 Speaker 1: hadn't all been stuffed by online use of their work. 411 00:25:04,520 --> 00:25:07,040 Speaker 2: Now, I mean, I don't know, what does a Spotify 412 00:25:07,119 --> 00:25:10,080 Speaker 2: artist get? Is it point zero zero zero zero zero zero 413 00:25:10,160 --> 00:25:16,480 Speaker 2: one cent every time it's played? You know, the entitlement 414 00:25:17,080 --> 00:25:22,800 Speaker 2: to be paid for your work is at best theoretical now, 415 00:25:23,200 --> 00:25:28,560 Speaker 2: I would say, largely. And, you know, the remnants of 416 00:25:28,600 --> 00:25:30,879 Speaker 2: it are the industry. I mean, I guess, when my 417 00:25:31,160 --> 00:25:34,400 Speaker 2: books get published, there are publishing houses 418 00:25:34,440 --> 00:25:37,280 Speaker 2: that still make money out of doing this and still 419 00:25:37,560 --> 00:25:39,919 Speaker 2: send a few cents on the dollar my 420 00:25:40,040 --> 00:25:44,080 Speaker 2: way for doing it. But, you know, unless you're 421 00:25:44,119 --> 00:25:50,080 Speaker 2: writing, you know, million-selling blockbusters, you're not... no one's 422 00:25:50,119 --> 00:25:55,000 Speaker 2: making a fortune out of publishing anything. Which is 423 00:25:55,000 --> 00:25:58,720 Speaker 2: why I always find it really amusing when someone accuses 424 00:25:58,840 --> 00:26:04,479 Speaker 2: me of taking a position just for the royalties. That 425 00:26:04,640 --> 00:26:05,399 Speaker 2: is hilarious. 426 00:26:06,560 --> 00:26:07,439 Speaker 1: People don't get it.
427 00:26:07,960 --> 00:26:13,200 Speaker 2: Yeah. But so, to the extent that AI is screwing 428 00:26:13,240 --> 00:26:18,879 Speaker 2: that, in the sense that it's using your work in 429 00:26:18,920 --> 00:26:23,640 Speaker 2: a derivative fashion... I'm sure, I guess it is. Prove it. 430 00:26:24,800 --> 00:26:30,520 Speaker 2: And what percentage of what it does are you entitled to? 431 00:26:30,960 --> 00:26:35,120 Speaker 2: You know, are you entitled to point zero seventeen times one 432 00:26:35,280 --> 00:26:40,440 Speaker 2: cent for it? You know... I've been 433 00:26:40,480 --> 00:26:44,240 Speaker 2: involved in situations where people have flat-out plagiarized things 434 00:26:44,240 --> 00:26:48,080 Speaker 2: that I've written in other books by other publishers, and 435 00:26:48,119 --> 00:26:51,840 Speaker 2: you can see it's not like, is this derivative or paraphrased? 436 00:26:51,920 --> 00:26:53,960 Speaker 2: It's like, here's a chunk of text from my book, 437 00:26:54,000 --> 00:26:56,439 Speaker 2: and there it is in the other book, exactly the same. 438 00:26:57,960 --> 00:27:02,520 Speaker 2: And publishers have said to me, yeah, not really worth 439 00:27:02,640 --> 00:27:05,920 Speaker 2: briefing the lawyers on that, you know. And they're right. 440 00:27:06,640 --> 00:27:11,240 Speaker 1: Yeah, yeah, yeah. One of my books was 441 00:27:11,280 --> 00:27:16,439 Speaker 1: published by Penguin and it sells for thirty dollars, and 442 00:27:16,480 --> 00:27:19,760 Speaker 1: people are like, thirty dollars? That's a fair bit for 443 00:27:20,440 --> 00:27:24,680 Speaker 1: a paperback. Like, well, yeah, it's two hundred and eighty pages. 444 00:27:24,720 --> 00:27:29,000 Speaker 1: But yeah, and I go, by the way, I get 445 00:27:29,000 --> 00:27:29,600 Speaker 1: three dollars. 446 00:27:31,400 --> 00:27:33,919 Speaker 2: Well, they get... they get the other twenty-seven dollars.
447 00:27:35,640 --> 00:27:38,160 Speaker 1: Yeah, well, they get twenty-seven dollars. I get three. 448 00:27:38,320 --> 00:27:43,240 Speaker 1: So don't be mad at me. I'm not getting rich 449 00:27:43,280 --> 00:27:46,480 Speaker 1: off this, so, you know. But anyway, it is what 450 00:27:46,560 --> 00:27:50,359 Speaker 1: it is. So when are you back on deck? When 451 00:27:50,400 --> 00:27:53,520 Speaker 1: are you, like, putting your big boy pants on and starting work? 452 00:27:53,600 --> 00:27:57,520 Speaker 2: Ah, Monday, Monday. I'll get back at it next Monday. Hey, 453 00:27:57,560 --> 00:27:59,119 Speaker 2: you know what we didn't talk about today? And I 454 00:27:59,160 --> 00:28:01,399 Speaker 2: don't know whether it's of interest to your listeners, and 455 00:28:01,480 --> 00:28:03,280 Speaker 2: if so, we can hold it over to next time. 456 00:28:03,400 --> 00:28:04,639 Speaker 2: It's the bacon paradox. 457 00:28:05,119 --> 00:28:07,400 Speaker 1: Well, well, I mean, if you're up for it, I'm up. 458 00:28:07,520 --> 00:28:09,520 Speaker 2: Yeah, yeah, yeah. 459 00:28:09,560 --> 00:28:15,600 Speaker 1: My best mate rang me today, and his name's Vin. 460 00:28:15,640 --> 00:28:19,919 Speaker 1: Shout out to Vin. And he's like, tell Gillespo I 461 00:28:19,960 --> 00:28:22,679 Speaker 1: love him even more. I'm like, why is that? He 462 00:28:22,760 --> 00:28:28,320 Speaker 1: goes, bacon. Because he's just a big, you know, tradie bogan, 463 00:28:28,760 --> 00:28:32,000 Speaker 1: like, fucking bacon sandwiches, that's his thing, you know. He's like, 464 00:28:32,040 --> 00:28:35,800 Speaker 1: oh yeah, bacon. Tell him he's a great man. I'm like, oh, 465 00:28:35,920 --> 00:28:39,240 Speaker 1: I'll let him know, Vin. All right. So I haven't even 466 00:28:39,240 --> 00:28:41,080 Speaker 1: read it. I don't even know what it's about. So 467 00:28:41,120 --> 00:28:42,320 Speaker 1: tell me the bacon story. 468 00:28:42,080 --> 00:28:45,160 Speaker 2: All right.
So you must have, in all your 469 00:28:45,240 --> 00:28:48,880 Speaker 2: years traveling through the health industry, come across people saying, oh, 470 00:28:48,920 --> 00:28:52,960 Speaker 2: you shouldn't eat bacon because of the nitrates. Of course, yes, definitely, yes. 471 00:28:54,280 --> 00:28:59,840 Speaker 2: Well, that's BS. And the article I wrote about that, 472 00:29:00,080 --> 00:29:04,840 Speaker 2: called the Bacon Paradox, is about that, which is, you know, 473 00:29:04,920 --> 00:29:07,840 Speaker 2: so that's the big reason people give you, or one 474 00:29:07,880 --> 00:29:10,040 Speaker 2: of the big reasons, why you shouldn't be eating bacon. And 475 00:29:10,280 --> 00:29:12,360 Speaker 2: you know, there's sort of two camps: the ones who 476 00:29:12,400 --> 00:29:15,080 Speaker 2: won't touch bacon at all and the ones who then 477 00:29:15,120 --> 00:29:18,360 Speaker 2: go hunting for nitrate-free bacon, which does exist, by 478 00:29:18,400 --> 00:29:23,920 Speaker 2: the way. And the article is about pointing out some 479 00:29:24,040 --> 00:29:28,760 Speaker 2: facts about nitrates that most people seem to not understand 480 00:29:28,920 --> 00:29:33,440 Speaker 2: or ignore, which is, one, yes, there are nitrates 481 00:29:33,480 --> 00:29:38,160 Speaker 2: added to bacon as part of the curing process, but 482 00:29:39,320 --> 00:29:44,000 Speaker 2: they're insignificant compared to the nitrates you would consume from, say, 483 00:29:45,040 --> 00:29:50,560 Speaker 2: rocket salad or beetroot or spinach. So, just to 484 00:29:50,560 --> 00:29:54,600 Speaker 2: give you a sense of scale, a one-hundred-gram serving 485 00:29:54,600 --> 00:29:59,200 Speaker 2: of rocket or beetroot or asparagus or celery.
If 486 00:29:59,240 --> 00:30:01,880 Speaker 2: you had to consume an equivalent amount of bacon to 487 00:30:01,920 --> 00:30:04,400 Speaker 2: get the same amount of nitrates as in those things, 488 00:30:04,880 --> 00:30:08,040 Speaker 2: you'd have to consume five kilos of bacon. 489 00:30:08,600 --> 00:30:12,200 Speaker 1: Oh, that's hilarious. Hang on, hang on, stop. So one 490 00:30:12,280 --> 00:30:16,560 Speaker 1: hundred grams, five kilos. Yeah, so gram for gram, it's 491 00:30:16,600 --> 00:30:21,920 Speaker 1: got fifty times less, fifty times less. Yeah, yeah, that's 492 00:30:21,960 --> 00:30:22,560 Speaker 1: so funny. 493 00:30:22,920 --> 00:30:26,680 Speaker 2: So, you know, if nitrates were truly the toxic assassin 494 00:30:26,720 --> 00:30:28,680 Speaker 2: we are told they are, a salad would be a 495 00:30:28,720 --> 00:30:33,280 Speaker 2: suicide note. So, you know, clearly there's a bit of 496 00:30:33,320 --> 00:30:36,240 Speaker 2: an issue there. Now, if you put that to people, 497 00:30:36,560 --> 00:30:38,640 Speaker 2: they do one of two things. They say, oh yeah, yeah, 498 00:30:38,640 --> 00:30:43,120 Speaker 2: but there are artificial nitrates in bacon. Well, no, they're added, 499 00:30:43,360 --> 00:30:46,320 Speaker 2: but they're exactly the same chemical. They're the same molecule. 500 00:30:46,880 --> 00:30:51,160 Speaker 2: No, it doesn't matter whether it came from, you know, 501 00:30:51,200 --> 00:30:52,960 Speaker 2: a bottle in a lab that they added to the 502 00:30:53,000 --> 00:30:55,520 Speaker 2: curing of the meat, or whether they ground up some 503 00:30:55,600 --> 00:30:59,840 Speaker 2: celery and got it. It's the same molecule. So there's 504 00:30:59,840 --> 00:31:03,680 Speaker 2: none of that.
But they would make another point, which is valid, 505 00:31:03,720 --> 00:31:07,480 Speaker 2: which is, ah, yeah, yeah, but what makes nitrates in 506 00:31:07,600 --> 00:31:12,600 Speaker 2: celery less dangerous is that the celery comes with some 507 00:31:12,720 --> 00:31:17,000 Speaker 2: vitamin C, and vitamin C is an antioxidant, which essentially 508 00:31:17,120 --> 00:31:22,960 Speaker 2: neutralizes the danger presented by nitrates. And that's true, it does, 509 00:31:23,480 --> 00:31:26,440 Speaker 2: which is exactly why they put vitamin C in the 510 00:31:26,520 --> 00:31:30,560 Speaker 2: curing of bacon. So all bacon that's cured with nitrate 511 00:31:31,640 --> 00:31:34,960 Speaker 2: either has vitamin C or a very closely aligned molecule 512 00:31:35,000 --> 00:31:38,200 Speaker 2: which is also an antioxidant. I think it's E three 513 00:31:38,480 --> 00:31:42,560 Speaker 2: six or something, same thing. So that's why it's there, 514 00:31:42,920 --> 00:31:45,160 Speaker 2: and you'll see it on the label every single time. 515 00:31:45,320 --> 00:31:47,840 Speaker 2: It's there to make sure that the nitrate doesn't do 516 00:31:47,920 --> 00:31:48,680 Speaker 2: anything to you. 517 00:31:49,520 --> 00:31:53,000 Speaker 1: So in the... which we'll talk about next time, but 518 00:31:53,080 --> 00:31:56,960 Speaker 1: in the new inverted version of the food pyramid, essentially, 519 00:31:57,000 --> 00:31:59,320 Speaker 1: I say it's inverted because it's almost just the old 520 00:31:59,320 --> 00:32:04,480 Speaker 1: one upside down, bacon would rank as a healthy food, 521 00:32:04,520 --> 00:32:06,440 Speaker 1: or as a relatively healthy food. 522 00:32:06,960 --> 00:32:10,160 Speaker 2: Well, not an unhealthy food, let's put it that way.
523 00:32:10,440 --> 00:32:13,800 Speaker 2: Although, you know, there is something to be said for 524 00:32:13,880 --> 00:32:17,120 Speaker 2: being careful about where the pig was raised or what 525 00:32:17,240 --> 00:32:22,080 Speaker 2: it was fed. So pigs, like humans, are omnivores, and 526 00:32:22,160 --> 00:32:26,080 Speaker 2: they absorb polyunsaturated fats if they're fed them. Now, 527 00:32:26,120 --> 00:32:28,280 Speaker 2: pigs in nature don't get access to much in the 528 00:32:28,320 --> 00:32:32,040 Speaker 2: way of polyunsaturated fat. So, you know, a wild 529 00:32:32,200 --> 00:32:36,320 Speaker 2: pig or a pig that's able to feed itself in 530 00:32:36,400 --> 00:32:43,680 Speaker 2: nature is fine. But a farm-harvested or farm-raised 531 00:32:43,720 --> 00:32:46,320 Speaker 2: pig is going to be fed grains, which are high 532 00:32:46,360 --> 00:32:48,840 Speaker 2: in omega-6 fats, and so their fat 533 00:32:48,880 --> 00:32:50,640 Speaker 2: is going to be high in omega-6 fat too. 534 00:32:50,720 --> 00:32:55,400 Speaker 2: So you want to trim the fat off non-free 535 00:32:55,480 --> 00:33:01,520 Speaker 2: range ham or bacon. In Australia, almost all of our 536 00:33:01,560 --> 00:33:05,320 Speaker 2: bacon is not free-range; about ninety-seven 537 00:33:05,320 --> 00:33:11,160 Speaker 2: percent of bacon sold in Australia is grain-fed. So 538 00:33:12,200 --> 00:33:14,240 Speaker 2: if you're going to eat the fat, make sure it's 539 00:33:14,640 --> 00:33:17,160 Speaker 2: not grain-fed. So you'd have to go to some 540 00:33:17,240 --> 00:33:18,360 Speaker 2: lengths to make sure of that. 541 00:33:19,360 --> 00:33:22,320 Speaker 1: So pigs in the wild, wild pigs, I guess we 542 00:33:22,400 --> 00:33:26,680 Speaker 1: call them, or boars, do they eat meat? Like, they can? 543 00:33:26,920 --> 00:33:27,800 Speaker 1: I know they're omnivorous. 544 00:33:28,280 --> 00:33:31,400 Speaker 2: They're omnivores.
Yeah, they'll eat anything they can get. Wow. 545 00:33:32,280 --> 00:33:35,920 Speaker 2: So, like humans, whatever is available. 546 00:33:36,000 --> 00:33:38,600 Speaker 1: Yeah yeah, yeah, yeah yeah wow. Wow. 547 00:33:39,240 --> 00:33:41,880 Speaker 2: They haven't invented fire yet, so they probably find it 548 00:33:41,920 --> 00:33:43,280 Speaker 2: a bit more tricky to eat meat. 549 00:33:45,040 --> 00:33:48,760 Speaker 1: Yeah, I think, yeah, just getting their hands, or their 550 00:33:48,760 --> 00:33:52,360 Speaker 1: little... what do you call them? Trotters. Trotters, thank you. 551 00:33:52,480 --> 00:33:55,160 Speaker 1: Getting their trotters on a box of matches is somewhat 552 00:33:55,200 --> 00:34:00,800 Speaker 1: complex out in the middle of... And what, what brought 553 00:34:00,840 --> 00:34:03,840 Speaker 1: that to your attention? Like, what... like, when you come 554 00:34:03,920 --> 00:34:05,680 Speaker 1: up with... and we'll go in a second, but like, 555 00:34:06,320 --> 00:34:09,359 Speaker 1: I'm thinking, where did this idea come from? Like, for that, 556 00:34:09,480 --> 00:34:13,600 Speaker 1: what caught your attention, the nitrates, or just even the bacon 557 00:34:13,680 --> 00:34:14,919 Speaker 1: story in general? 558 00:34:17,000 --> 00:34:19,439 Speaker 2: I don't know. I've seen it... every now and then 559 00:34:19,680 --> 00:34:21,640 Speaker 2: you come across these things. One of the things that 560 00:34:21,840 --> 00:34:26,319 Speaker 2: irritates me is this sort of vegetarian-slash-vegan propaganda, 561 00:34:26,600 --> 00:34:31,360 Speaker 2: and often part of that is this story about, 562 00:34:31,840 --> 00:34:34,520 Speaker 2: you know, oh, you can't eat bacon because of the nitrates. 563 00:34:35,560 --> 00:34:39,719 Speaker 2: And so I did look at it a long time 564 00:34:39,760 --> 00:34:41,880 Speaker 2: ago, when I was writing a book about seed oils, 565 00:34:41,880 --> 00:34:44,520 Speaker 2: and looked into the nitrates in bacon.
And then I 566 00:34:44,560 --> 00:34:46,919 Speaker 2: remembered it the other day when I was thinking about something 567 00:34:46,920 --> 00:34:49,879 Speaker 2: I might want to write. I thought, what? I don't 568 00:34:49,880 --> 00:34:52,480 Speaker 2: think I've ever actually written a piece about the nitrates. 569 00:34:52,640 --> 00:34:54,440 Speaker 2: So I thought I'd do it. 570 00:34:55,760 --> 00:34:58,920 Speaker 1: Do you know what? I would love, love, love, love 571 00:34:59,080 --> 00:35:01,759 Speaker 1: for you to write an article, and I think 572 00:35:01,800 --> 00:35:04,919 Speaker 1: it will get huge traction. And I'm not being facetious. 573 00:35:06,120 --> 00:35:11,239 Speaker 1: So I don't know if you've seen the saturation 574 00:35:11,600 --> 00:35:17,360 Speaker 1: exposure that Oprah's getting at the moment with her new book. No? Okay. 575 00:35:17,440 --> 00:35:21,000 Speaker 1: So her new book is called... give me a minute... 576 00:35:21,840 --> 00:35:25,200 Speaker 1: her new book is called Enough: Your Health, Your Weight, 577 00:35:25,320 --> 00:35:28,880 Speaker 1: and What It's Like to Be Free. It's co-authored 578 00:35:28,880 --> 00:35:34,880 Speaker 1: with a doctor lady called... Anya, or Anya... anyway. 579 00:35:36,160 --> 00:35:40,920 Speaker 1: Okay, I'm actually doing a podcast on Saturday... 580 00:35:41,000 --> 00:35:45,080 Speaker 1: as we record this, everyone, it's Thursday night... and I'm 581 00:35:45,080 --> 00:35:49,120 Speaker 1: going to explore some of these. And so she doesn't 582 00:35:49,160 --> 00:35:54,000 Speaker 1: say people are obese. She says people have obesity, right, 583 00:35:54,320 --> 00:35:59,440 Speaker 1: and it's a chronic disease. And she says, if you 584 00:35:59,520 --> 00:36:04,200 Speaker 1: have obesity, it's not because you're overeating, it's because you 585 00:36:04,360 --> 00:36:10,239 Speaker 1: have the disease that causes you to overeat.
And she 586 00:36:10,320 --> 00:36:14,319 Speaker 1: says willpower is not going to work if you have obesity. 587 00:36:15,680 --> 00:36:19,200 Speaker 1: She says, I was carrying a chronic disease that caused 588 00:36:19,200 --> 00:36:25,080 Speaker 1: me to overeat. And one of the recurring themes... and, 589 00:36:25,200 --> 00:36:28,880 Speaker 1: by the way, everyone listening, to me, I think, for 590 00:36:28,920 --> 00:36:31,520 Speaker 1: the most part, in general terms, she's amazing, the stuff 591 00:36:31,520 --> 00:36:35,080 Speaker 1: that she's done. You know, I've done nothing compared to her. 592 00:36:35,160 --> 00:36:38,719 Speaker 1: She's a superstar. But I have worked a lot with 593 00:36:38,800 --> 00:36:41,400 Speaker 1: bodies and diets and food, and people who are obese 594 00:36:41,440 --> 00:36:43,600 Speaker 1: and people who are lean and fit and highly functional 595 00:36:43,600 --> 00:36:47,520 Speaker 1: and poorly functional, and all of that. But one of 596 00:36:47,520 --> 00:36:51,120 Speaker 1: the things that she says emphatically is that, like, when, 597 00:36:51,320 --> 00:36:53,200 Speaker 1: you know, talking to people about their weight, that this 598 00:36:53,320 --> 00:36:57,960 Speaker 1: is fat shaming. And I'm like, well, no, no, no. 599 00:36:58,040 --> 00:37:01,840 Speaker 1: If I'm talking to somebody about, like, the reality of 600 00:37:01,880 --> 00:37:04,520 Speaker 1: what they weigh or their size, and I'm coming from 601 00:37:04,600 --> 00:37:06,759 Speaker 1: a point of view of, well, do you want to 602 00:37:06,760 --> 00:37:09,440 Speaker 1: be obese? The answer is usually no. And how do 603 00:37:09,440 --> 00:37:11,480 Speaker 1: you feel? And the answer is, you know, it's 604 00:37:11,520 --> 00:37:14,279 Speaker 1: not always great.
But for me to do that is 605 00:37:14,320 --> 00:37:20,280 Speaker 1: apparently triggering and fat shaming, whereas I'm really... I just want... 606 00:37:20,760 --> 00:37:23,640 Speaker 1: like, you didn't accidentally end up in this spot. Like, 607 00:37:23,680 --> 00:37:25,960 Speaker 1: I've been obese. Now, when I was obese, it was 608 00:37:26,000 --> 00:37:28,960 Speaker 1: all about my choices and my behaviors, and when I 609 00:37:29,120 --> 00:37:32,759 Speaker 1: changed my operating system, I lost the weight and I've 610 00:37:32,840 --> 00:37:36,279 Speaker 1: kept it off since I was fourteen, essentially. So, you know, 611 00:37:36,320 --> 00:37:40,440 Speaker 1: whatever that is, nearly fifty years. But it is some 612 00:37:40,480 --> 00:37:44,319 Speaker 1: of the stuff that they're saying. Like, I think... I've 613 00:37:44,360 --> 00:37:47,200 Speaker 1: figured... I'm not positive, but I'm pretty sure it was 614 00:37:47,239 --> 00:37:52,360 Speaker 1: like seventy-four percent of people are hardwired to be obese, 615 00:37:52,760 --> 00:37:56,279 Speaker 1: right? And I'm thinking, okay. So if it's not about 616 00:37:56,320 --> 00:37:58,800 Speaker 1: your lifestyle, it's not about your choices, it's not about 617 00:37:58,840 --> 00:38:03,560 Speaker 1: your habits, and it's not your fault, then why 618 00:38:03,600 --> 00:38:06,359 Speaker 1: do we have sixty-six percent obese in America, give 619 00:38:06,440 --> 00:38:10,400 Speaker 1: or take, right now? Overweight and obese in America right now? 620 00:38:10,800 --> 00:38:12,480 Speaker 1: And if we look one hundred years ago, I don't 621 00:38:12,480 --> 00:38:14,400 Speaker 1: know what the number would be, but I'm sure it 622 00:38:14,400 --> 00:38:20,080 Speaker 1: would have been less than ten, you know.
It's just 623 00:38:20,360 --> 00:38:24,200 Speaker 1: like there's this whole weird psychology going on, and 624 00:38:24,600 --> 00:38:30,000 Speaker 1: if anybody says anything that's practical or, to me, extremely 625 00:38:30,080 --> 00:38:34,319 Speaker 1: reasonable about, let's not beat ourselves up, let's not self-loathe, 626 00:38:34,400 --> 00:38:38,080 Speaker 1: but let's be self-aware, let's be honest, let's acknowledge that 627 00:38:38,120 --> 00:38:41,000 Speaker 1: the decisions that we make and the behaviors that we embrace, 628 00:38:41,040 --> 00:38:45,799 Speaker 1: the things that we do, actually have a physiological consequence. 629 00:38:46,600 --> 00:38:49,560 Speaker 1: And that's not because you're bad or terrible, but that 630 00:38:49,680 --> 00:38:53,160 Speaker 1: is just... that is a consequence of those choices and behaviors. 631 00:38:53,680 --> 00:38:57,239 Speaker 1: But her message is essentially, oh, that's out of your control. 632 00:38:58,080 --> 00:39:00,719 Speaker 2: And what's the solution? 633 00:39:02,239 --> 00:39:02,680 Speaker 1: Drugs? 634 00:39:02,719 --> 00:39:05,680 Speaker 2: The drugs. All right, okay. Oh, there we go. 635 00:39:05,680 --> 00:39:11,640 Speaker 2: So, so obesity is an insufficiency of weight-loss drugs. 636 00:39:13,640 --> 00:39:18,120 Speaker 1: Well, no, it's biological. She uses that word a lot, 637 00:39:18,239 --> 00:39:21,160 Speaker 1: and the other lady, the doctor lady: oh, it's biological, 638 00:39:21,200 --> 00:39:25,480 Speaker 1: and people just, you know, they don't understand. But 639 00:39:25,520 --> 00:39:28,640 Speaker 1: yeah, look, they really, really like it. She said 640 00:39:28,680 --> 00:39:29,200 Speaker 1: one of her 641 00:39:29,160 --> 00:39:31,560 Speaker 1: quotes was, I will be on this forever. This is 642 00:39:31,600 --> 00:39:32,879 Speaker 1: me for the rest of my life.
643 00:39:33,320 --> 00:39:36,000 Speaker 2: Well, you'd have to be. Those drugs stop working the 644 00:39:36,080 --> 00:39:37,160 Speaker 2: minute you stop taking them. 645 00:39:37,800 --> 00:39:40,560 Speaker 1: Anyway, I reckon, when you have a minute, or when 646 00:39:40,600 --> 00:39:42,520 Speaker 1: you have sixty minutes, have a look at some of 647 00:39:42,560 --> 00:39:47,319 Speaker 1: the stuff that she's advocating. I reckon you could have 648 00:39:47,360 --> 00:39:52,480 Speaker 1: a field day with it. And not to, you know, start 649 00:39:52,480 --> 00:39:54,600 Speaker 1: a shit fight or anything, but just to go, okay, so 650 00:39:54,680 --> 00:39:56,960 Speaker 1: let's address these things one at a time. So she 651 00:39:57,080 --> 00:40:00,239 Speaker 1: says this; now, let's unpack that. Let's see if that's 652 00:40:00,280 --> 00:40:02,600 Speaker 1: true or not, or what part's true and what part's not, 653 00:40:03,480 --> 00:40:08,920 Speaker 1: because it very much seems to me to be about relieving people 654 00:40:08,920 --> 00:40:15,240 Speaker 1: of any responsibility or accountability. Anyway, have a look. Interesting, 655 00:40:15,280 --> 00:40:18,200 Speaker 1: I'll have a look at it. All right, well, we'll say goodbye. 656 00:40:18,320 --> 00:40:19,920 Speaker 1: But always good to chat, mate. 657 00:40:19,960 --> 00:40:23,520 Speaker 2: Appreciate you. Absolute pleasure. See you, Craig. See you, buddy.