Speaker 1: You're listening to the Saturday Morning with Jack Tame podcast from Newstalk ZB.

Speaker 2: You know, so much attention with AI, at least for us plebs, has been around the large language models, the likes of ChatGPT, where you go on, you ask a question, it comes back. But actually the big developments on the AI front, and the consumer-facing AI front, over the last few weeks and months are on the agentic coding side. What does that mean? It means the stuff where you can put in a command and it basically writes computer code. So even if you have zero coding knowledge, it's able to do it for you. Our texpert Paul Stenhouse is here, and a couple of new models this week from Anthropic and OpenAI suggest the agentic coding war is really heating up.

Speaker 3: Oh, big time, Jack. I mean, we've got new models from both of those big players, and it's incredible just how good this is getting and how much information you can feed these models. It's just wild.

Speaker 2: So how does it work?
Speaker 3: So they basically know everything there is to know about the programming languages, and as you're giving it things you want it to do, it references all that information, like autocomplete on steroids. That's basically the easiest way to think of it.

Speaker 2: So, as someone with zero knowledge of coding languages, what could I do if I went on there today with these new models?

Speaker 3: Yeah, so there are a few applications that have built wrappers around some of these things. You can go onto websites like Lovable or Bolt, and you honestly can give it anything. You can get it to give you a to-do list, you can get it to give you a recipe web app, a whole number of things. It really is quite impressive, and it just churns away and creates working things.

Speaker 2: Right, so it can effectively make a website, or even a kind of application?

Speaker 3: Oh yeah, for sure. And it builds the front end and the back end, it does all of it.
Now, the experts would say you need to be careful: is it doing the right thing? But I think what's crazy is that it oftentimes does do the right thing. It's quite incredible.

Speaker 2: Yeah, okay. Now, the EU reckons that TikTok is too addictive for kids.

Speaker 3: Yes, well, it's because you scroll and you scroll and you scroll, and they say that's not good. And so they've now said that these app makers need to do something to make the apps less addictive. They have some ideas, but none of the ideas seem very good, Jack. They've considered adding things like screen-time breaks. Can you imagine how that might work? The app just kind of goes dark, or you get disabled for a little bit of time.

Speaker 2: I mean, would that be a bad thing?

Speaker 3: Well, I mean, no, but is it really going to, you know, prevent kids? It's going to give them a break, but it's not going to stop them from coming back.
They've also talked about changing the algorithms, though I guess the disabling one actually stops the kids from using it, while the algorithm one would just create less interesting content, maybe. But then the EU has also said: should you just disable the infinite scroll? The scroll is too much, it's too addictive, and the kids should instead maybe have to go back and pick their next video. But I don't know, I'm really not sure that one is going to make too much of a difference.

Speaker 2: But I don't know how...

Speaker 3: I don't know how mindless these kids are when they're on these apps. But I guess you can quite easily lose yourself in these holes, can't you? And I guess that's what it's trying to stop. But some of the content's good. I've learned some things from TikTok or Instagram Reels.

Speaker 2: Yeah, well, I don't have TikTok, because... I mean, I'm as susceptible as anyone to these things.
I do find myself mindlessly scrolling Instagram. I try not to get too deep into the Reels, but sometimes my wife, she'll do it for twenty minutes, half an hour: "Come and watch these, come watch these." And I'll be like, oh, this is going to be stupid, and then I'll go and sit down, and I do find them very funny. There are lots of funny things. But you know, I'm not being pulled into the more harmful stuff yet, I don't think.

Speaker 3: No, no. And look, there is the problem, right? And I think that maybe these breaks, especially later at night when kids are trying to go to bed, maybe that's a good thing. But look, to be real, we've talked a lot about some of these EU mandates that they've put on the tech companies over the years, even including being forced to use the USB-C plug.
Yeah, and it's incredible the ripple effect that it has, because these app makers don't want to create special software for certain parts of the world, right? So something the EU is pushing may actually end up forcing TikTok changes across the rest of the world.

Speaker 2: Yeah, right, it'll be interesting to see. Thanks, Paul, take care. That's our texpert Paul Stenhouse.

Speaker 1: For more from Saturday Morning with Jack Tame, listen live to Newstalk ZB from 9am Saturday, or follow the podcast on iHeartRadio.