1
00:00:00,200 --> 00:00:03,840
Speaker 1: The Fitzy and Wippa with Kate Ritchie podcast. Eighty-three

2
00:00:03,880 --> 00:00:06,800
Speaker 1: million jobs will vanish globally by twenty twenty-seven.

3
00:00:07,640 --> 00:00:09,120
Speaker 2: See you later, workplaces.

4
00:00:10,119 --> 00:00:12,480
Speaker 1: This is due to AI and where the world is

5
00:00:12,520 --> 00:00:15,239
Speaker 1: moving, and how many jobs can then be done by

6
00:00:15,480 --> 00:00:18,920
Speaker 1: artificial intelligence or systems or robots.

7
00:00:19,040 --> 00:00:21,840
Speaker 3: Okay, did you see the woman the other day? Now

8
00:00:21,960 --> 00:00:26,520
Speaker 3: this wasn't too bad. She's on a gap year, she's finished studying,

9
00:00:26,880 --> 00:00:28,920
Speaker 3: and she and her friends want to go to Europe.

10
00:00:28,960 --> 00:00:30,960
Speaker 3: But obviously you've got to do a lot of

11
00:00:31,000 --> 00:00:32,680
Speaker 3: research on the computer about where you're going to go.

12
00:00:32,720 --> 00:00:34,960
Speaker 3: They want to go to certain places. She basically said

13
00:00:34,960 --> 00:00:37,240
Speaker 3: to ChatGPT, I want you to give me four

14
00:00:37,280 --> 00:00:39,400
Speaker 3: places in Europe for our friends to go. Give us

15
00:00:39,400 --> 00:00:42,000
Speaker 3: an itinerary, tell us how much it's going to cost

16
00:00:42,000 --> 00:00:44,800
Speaker 3: to get from here to there, and tell us what

17
00:00:45,040 --> 00:00:48,200
Speaker 3: the top three things to do in each are. Write out the itinerary,

18
00:00:49,080 --> 00:00:51,200
Speaker 3: the full, full itinerary.

19
00:00:51,280 --> 00:00:52,600
Speaker 4: I don't like all this stuff.

20
00:00:53,159 --> 00:00:55,040
Speaker 1: It is dangerous, but at the same time it's what

21
00:00:55,120 --> 00:00:58,000
Speaker 1: people like. It works for a reason, because they've collected

22
00:00:58,000 --> 00:01:02,280
Speaker 1: the data of individuals or groups of humans. So yes,

23
00:01:02,360 --> 00:01:07,360
Speaker 1: the accuracy is there, but you pay so much for it.

24
00:01:07,080 --> 00:01:10,480
Speaker 4: I understand that you have the information, but

25
00:01:10,560 --> 00:01:11,839
Speaker 4: you don't have connection.

26
00:01:12,440 --> 00:01:14,800
Speaker 1: You know the other day, I said, I went

27
00:01:14,800 --> 00:01:19,080
Speaker 1: on ChatGPT and I said, do me... I said,

28
00:01:19,120 --> 00:01:21,560
Speaker 1: make up... do me... you can do anything. And it

29
00:01:21,680 --> 00:01:25,440
Speaker 1: said, whoo up, whoo up. I said, twenty bucks is

30
00:01:25,480 --> 00:01:29,360
Speaker 1: twenty bucks. And then I don't even get that. So

31
00:01:29,560 --> 00:01:32,280
Speaker 1: it says these silly things. I said, can you write

32
00:01:32,360 --> 00:01:36,640
Speaker 1: me a menu, like a food diary, equalling X

33
00:01:36,680 --> 00:01:40,160
Speaker 1: amount of calories, for seven days? And it did that.

34
00:01:40,480 --> 00:01:45,080
Speaker 1: So it said lunch, four hundred calories, went through everything. Tomorrow...

35
00:01:45,280 --> 00:01:48,160
Speaker 3: Can I do the break? The big one? And it's

36
00:01:48,160 --> 00:01:55,120
Speaker 3: on its way: AI partners, holograms at home.

37
00:01:55,400 --> 00:01:56,080
Speaker 4: What do you mean?

38
00:01:56,320 --> 00:01:59,520
Speaker 2: It's becoming reality. So you can have...

39
00:01:59,440 --> 00:02:02,400
Speaker 3: You don't need a human in your life. You can have a wife.

40
00:02:02,760 --> 00:02:05,120
Speaker 3: You can have a wife and she's at home. If

41
00:02:05,160 --> 00:02:07,920
Speaker 3: you've ever seen Blade Runner twenty forty-nine, you

42
00:02:08,000 --> 00:02:10,000
Speaker 3: can have a wife at home who's a hologram.

43
00:02:10,400 --> 00:02:11,919
Speaker 2: You can program her.

44
00:02:12,040 --> 00:02:15,200
Speaker 4: This conversation is making me sick.
45
00:02:15,280 --> 00:02:19,760
Speaker 2: It's becoming reality, Kate. Bank tellers are gone. Clerks are gone.

46
00:02:19,919 --> 00:02:20,720
Speaker 3: What's a clerk?

47
00:02:22,040 --> 00:02:23,400
Speaker 2: Such an old term, isn't it?

48
00:02:23,400 --> 00:02:23,760
Speaker 5: It is.

49
00:02:24,720 --> 00:02:25,840
Speaker 2: They write down information.

50
00:02:27,560 --> 00:02:32,720
Speaker 1: Postal clerks, so off like an work.

51
00:02:32,520 --> 00:02:34,600
Speaker 2: Clerks, but you know what, clerk...

52
00:02:34,680 --> 00:02:36,640
Speaker 3: No one knows what a clerk is, so I reckon

53
00:02:36,680 --> 00:02:38,520
Speaker 3: that job's been redundant for... Yeah, I know.

54
00:02:38,600 --> 00:02:40,320
Speaker 4: Do we have a clerk in our team?

55
00:02:40,400 --> 00:02:43,560
Speaker 2: Who would be the clerk? He's a clerk?

56
00:02:43,720 --> 00:02:49,000
Speaker 1: Oh, yes, sorry, I got it wrong. Sorry, mate. Cashiers,

57
00:02:49,040 --> 00:02:55,600
Speaker 1: ticket sellers, data entry teams, secretaries, stock-keeping, accounting, bookkeeping, payroll.

58
00:02:55,680 --> 00:02:57,840
Speaker 1: See you later. The systems will do it, because the

59
00:02:57,880 --> 00:03:03,720
Speaker 1: systems will know the outcome of all the data that

60
00:03:03,760 --> 00:03:06,120
Speaker 1: they have better than you do. Have a listen to

61
00:03:06,160 --> 00:03:07,360
Speaker 1: this though, because there was a guy, I'm not sure

62
00:03:07,400 --> 00:03:09,720
Speaker 1: if this is the actual bloke, the guy that created

63
00:03:09,760 --> 00:03:12,880
Speaker 1: AI for Google. When Google saw where the trend was going,

64
00:03:13,200 --> 00:03:16,320
Speaker 1: he quit and said, guys, I'm pulling the pin on this.

65
00:03:16,560 --> 00:03:17,639
Speaker 2: It was looking dangerous.

66
00:03:17,639 --> 00:03:19,520
Speaker 1: I've now seen the impact of what this can do

67
00:03:19,600 --> 00:03:22,680
Speaker 1: to the world. So he said he actually left Google

68
00:03:22,720 --> 00:03:25,640
Speaker 1: because of this. And this is the professor speaking.

69
00:03:26,160 --> 00:03:28,680
Speaker 5: If it gets to be much smarter than us, it'll

70
00:03:28,720 --> 00:03:30,840
Speaker 5: be very good at manipulation, because it will have learned

71
00:03:30,880 --> 00:03:32,960
Speaker 5: that from us, and it knows how to program, so

72
00:03:32,960 --> 00:03:35,839
Speaker 5: it'll figure out ways of getting around restrictions we put

73
00:03:35,840 --> 00:03:38,280
Speaker 5: on it. It'll figure out ways of manipulating people to

74
00:03:38,280 --> 00:03:40,560
Speaker 5: do what it wants. It's not clear to me that

75
00:03:40,600 --> 00:03:43,280
Speaker 5: we can solve this problem. I don't think we can

76
00:03:43,280 --> 00:03:46,040
Speaker 5: stop the progress. I didn't sign the petition saying we

77
00:03:46,080 --> 00:03:49,680
Speaker 5: should stop working on AI, because if people in America stopped,

78
00:03:49,680 --> 00:03:50,680
Speaker 5: people in China wouldn't.

79
00:03:51,840 --> 00:03:53,720
Speaker 2: That's the thing. Can you just press a button and

80
00:03:53,760 --> 00:03:54,200
Speaker 2: shut it

81
00:03:54,160 --> 00:03:56,080
Speaker 1: all down? Because the ball is rolling.

82
00:03:57,920 --> 00:04:00,560
Speaker 4: I can't help but think back to what you were

83
00:04:00,560 --> 00:04:04,920
Speaker 4: saying about, like, the role of, say, a secretary not

84
00:04:05,120 --> 00:04:08,200
Speaker 4: existing anymore. It would be a very good way of

85
00:04:08,400 --> 00:04:10,920
Speaker 4: working out whether your husband is having an affair on the...

86
00:04:11,080 --> 00:04:14,680
Speaker 1: You can't anymore, because they... when... so you...

87
00:04:14,600 --> 00:04:16,960
Speaker 4: ...could. You don't need the secretary anymore.

88
00:04:17,640 --> 00:04:20,480
Speaker 1: Oh no, it's, ah, "Honey, it was a hologram

89
00:04:20,520 --> 00:04:25,159
Speaker 1: that I was in." Oh yeah, "Honey, it wasn't another

90
00:04:25,200 --> 00:04:25,920
Speaker 1: girl in the bed."

91
00:04:26,080 --> 00:04:30,640
Speaker 3: It was holograms! Tomorrow... I've already ordered one. It says

92
00:04:30,839 --> 00:04:31,760
Speaker 3: a couple of weeks away.

93
00:04:32,120 --> 00:04:34,920
Speaker 4: Fitzy and Wippa with Kate Ritchie is a Nova podcast.

94
00:04:35,040 --> 00:04:38,119
Speaker 3: For more great comedy shows like this, head to novapodcast

95
00:04:38,200 --> 00:04:38,680
Speaker 3: dot com

96
00:04:38,720 --> 00:04:39,919
Speaker 5: dot au.