[00:00:07] Speaker 1: You're listening to the Saturday Morning with Jack Tame podcast from Newstalk ZB.

[00:00:12] Speaker 2: I was saying just after nine o'clock this morning that I've had a bit of a wow moment once again with AI. I'm not a total AI hyper; I'm not going "woo" for AI everything. But I've had a few moments over the last couple of years where, using the AI tools that are available, I thought, oh my goodness, this is really impressive. The first time I used ChatGPT, I thought that. The first time I saw some of the video generation, I thought, okay, this is pretty impressive. But now, for the first time in my life, I have created a website, I've created an app, and I've created a few other bits and pieces via vibe coding. So: zero programming experience, zero coding skills, but talking to Claude Code, which is one of the big AI systems, I've been able to just use English prompts to make some of these things, which has been quite a remarkable, a magic thing to experience. Honestly, our texpert Paul Stenhouse is a little more skilful and a little more tech adept than I am, and he's with us now. Morning, Paul.

[00:01:17] Speaker 3: Yeah, morning, Jack. So you went straight for Claude Code. That's pretty impressive, because Claude Code you have to use in the terminal. Did you use it in the terminal?

[00:01:26] Speaker 2: No, I used the desktop app to do it. Although, yeah, I did try the terminal, and yes, I've connected to GitHub.

[00:01:36] Speaker 1: There you go.

[00:01:37] Speaker 2: See, I got Claude... I talked to the chatbot first of all. I said, treat me like an idiot, Claude, and it said, don't worry, Jack, we know you're an idiot. And I said, you talk me through, in really basic steps, what I need to do. And it said, okay, you need to get a GitHub account, you need to use the terminal. I did that, and then it was like, oh, the easier way to do this is on the desktop app. So that's what I did.
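For anyone who wants a concrete picture of what "just using English prompts" can look like under the hood, here is a minimal, hypothetical sketch using Anthropic's Python SDK. The model id and the prompt are illustrative assumptions, not anything from the show, and Claude Code itself is a separate terminal and desktop tool rather than this API.

```python
# Minimal sketch: send a plain-English prompt to Claude via Anthropic's
# Python SDK. Assumes `pip install anthropic` and an ANTHROPIC_API_KEY
# environment variable; the model id below is a placeholder assumption.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # placeholder; substitute a current model
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": "Treat me like a total beginner. Talk me through, in "
                   "really basic steps, how to build a one-page website.",
    }],
)

print(response.content[0].text)  # Claude's step-by-step reply
```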
[00:01:59] Speaker 2: But it is hard to distinguish the AI hype from reality, right, and to work out how warranted it is. But seeing what it could do with my basic English-language prompts, and seeing how quickly it could do it, it's hard not to imagine that people who are a little more skilful than me when it comes to computer programming and coding are able to build really impressive things.

[00:02:25] Speaker 3: Oh, for sure. I think what's really fascinating about the AI stuff is that if you're a really good communicator and you know the right terminology, the words in particular industries, you can go really far. And the thing in AI is that every single word matters. So when you're thinking about how you're prompting it and what you're saying, it just gets better and better: the more specific you are, and the more you use the correct terminology, the less it has to guess. It doesn't make quite as many mistakes, because what it really is, is autocomplete on steroids.

[00:02:58] Speaker 2: Yeah, right.

[00:02:59] Speaker 3: It's looking at this massive amount of information, old projects and all this sort of stuff, and it's literally just comparing ones and zeros and looking to see where the patterns are. But you can create with it. It's remarkable.
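To make "autocomplete on steroids" concrete, here is a toy sketch of the idea (my illustration, not anything from the show): a word-level bigram counter that tallies which word tends to follow which, then greedily predicts the next one. Real models work over tokens with billions of learned parameters rather than raw counts, but the basic move, continuing text from patterns in past data, is the same.

```python
# Toy "autocomplete": count which word follows which in some training text,
# then extend a prompt by repeatedly picking the most common successor.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Bigram table: for each word, a tally of the words seen right after it.
follows: defaultdict = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def autocomplete(word: str, steps: int = 4) -> str:
    """Greedily extend `word` by taking the most common next word each step."""
    out = [word]
    for _ in range(steps):
        if out[-1] not in follows:
            break  # no known continuation for this word
        out.append(follows[out[-1]].most_common(1)[0][0])
    return " ".join(out)

print(autocomplete("the"))  # -> "the cat sat on the"
```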
[00:03:17] Speaker 2: But the thing that's interesting is that it can work backwards, right? So if you say to it, I want a rhyming sonnet, and you give it the first line... or say, give me a rhyming couplet: I want you to come up with a second line that makes sense and rhymes with this...

[00:03:35] Speaker 3: It will.

[00:03:36] Speaker 2: If it were just autocomplete on steroids, the obvious thing would be that it would go to the next word, and then the next word, and then the next word. But it knows, because you want a rhyming couplet, that it's got to hit the rhyming point, and so it works backwards from that. I mean, that's quite a basic example, but it's smart enough to be able to kind of contextualise requests in that way.
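Here is a toy sketch of that "work backwards" idea (again my illustration; it says nothing about how Claude is actually built, and modern models still emit words left to right while learning to plan for the constraint): satisfy the rhyme by choosing the final word first, then build the rest of the line around it.

```python
# Toy couplet finisher: pick a rhyming *last* word first, then fill in the
# line, i.e. plan toward the constraint rather than drifting word by word
# and hoping the ending happens to rhyme.
def rhymes(a: str, b: str) -> bool:
    """Crude rhyme test: different words sharing a three-letter ending."""
    return a != b and a[-3:] == b[-3:]

VOCAB = ["bright", "night", "light", "tree", "sea", "free"]

def second_line(first_line: str, template: str) -> str:
    last_word = first_line.rstrip(".!?").split()[-1].lower()
    # Step 1: lock in the ending so the rhyme constraint is guaranteed.
    ending = next(w for w in VOCAB if rhymes(w, last_word))
    # Step 2: only then fill in the words leading up to it.
    return template.format(ending)

line1 = "The stars above were burning bright"
print(second_line(line1, "and kept me company through the {}"))
# -> "and kept me company through the night"
```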
[00:04:00] Speaker 1: Yeah.

[00:04:00] Speaker 2: The thing that's amazing about these new systems, too, is that obviously a lot of the code for the systems themselves is now being written by the systems, which is... it's crazy.

[00:04:10] Speaker 3: It's incredible, isn't it? It's crazily circular. Okay, so can we go slightly even more dystopian? So there's Jack Tame, you know, New Zealand's sexiest person alive, turned software engineer...

[00:04:21] Speaker 2: What was it?

[00:04:21] Speaker 3: Wasn't that the award you won a few years ago?

[00:04:26] Speaker 2: I mean, you were the one who brought it up. But you know, we don't need to get lost in ancient history.

[00:04:31] Speaker 3: Okay. So imagine yourself: you're now working at the US Department of War, and you just create a little app, and it decides that it wants to launch an attack from an overhead drone. Is that a good idea or a bad idea? Because it's happening. Well, the conversation about that exact thing is happening right now. While I was waiting to talk to you, I got an alert from CNN about this rift involving Anthropic, who makes Claude Code and Claude the chatbot. They've got some hard lines in the sand. They work with the Department of Defense, slash War, at the moment, and they've said, you're not going to be using it for anything like that; you can't just start launching missiles that are AI-directed and AI-decided. No, thank you. It also said, you won't be using this to help surveil the public, so, I guess, looking at vast quantities of data and figuring out what's going on. The Department of Defense, slash War, said, well, we want to use it, and we don't want you to say what we can and can't do. And so, it's currently four forty-three here in New York at the moment, and they had until five pm to reach an agreement with the Department of Defense. But President Trump has just tweeted, posted on Truth Social I should say, that every government department now has six months to get off everything Anthropic. And the Department of Defense is now deciding whether or not they'll be invoking the Defense Production Act. Well, they were previously; who knows what they're doing now. But the threat was that they'd invoke the Defense Production Act, which would force Anthropic to make its product available, or they'd go down a different road and label it a supply-chain risk, to prevent every military supplier, of which there are obviously a number, from using it. So it's a crazy kind of situation at the moment. And you've got Anthropic, whose biggest competitor is OpenAI, behind ChatGPT, and OpenAI has backed Anthropic and said, yeah, those guardrails and things sound, you know, probably pretty good. So it sounds like it's going to be the Department of War battling the AI companies.

[00:06:51] Speaker 1: Yeah.

[00:06:51] Speaker 2: Man, it's going to be amazing to see how this ends up. And from what you've just read us of the president's response, it certainly sounds like it's only going to get more and more feisty from here. Thank you, Paul. Texpert Paul Stenhouse there. It's going to be one to watch over the next few, well, minutes, hours, weeks, months, et cetera.

[00:07:08] Speaker 1: For more from Saturday Morning with Jack Tame, listen live to Newstalk ZB from 9am Saturday, or follow the podcast on iHeartRadio.