1 00:00:01,520 --> 00:00:06,319 Speaker 1: Fitzy and Wippa with Kate Ritchie podcast. 2 00:00:06,440 --> 00:00:10,200 Speaker 2: Tommy's just chewed our ears off during that song, saying 3 00:00:10,200 --> 00:00:12,960 Speaker 2: that five hundred mil of water gets burned every question 4 00:00:13,080 --> 00:00:16,640 Speaker 2: asked. 5 00:00:15,000 --> 00:00:17,400 Speaker 1: One bottle of water? Tell me your story. One. 6 00:00:17,320 --> 00:00:21,320 Speaker 3: Bottle of fresh water for every question you ask. 7 00:00:22,640 --> 00:00:24,600 Speaker 1: So you mean cooling the servers down. 8 00:00:25,840 --> 00:00:28,800 Speaker 4: But see then, Tommy, I'm visualizing that there's one person 9 00:00:28,880 --> 00:00:31,600 Speaker 4: there that's opening a bottle of water every time another 10 00:00:31,680 --> 00:00:32,479 Speaker 4: question comes in. 11 00:00:32,560 --> 00:00:36,159 Speaker 1: It's not a bottle of... it's another question. They had 12 00:00:36,159 --> 00:00:36,440 Speaker 1: to get... 13 00:00:36,560 --> 00:00:40,320 Speaker 2: They had to get Andy from the Fyre Festival documentary 14 00:00:39,800 --> 00:00:45,199 Speaker 1: to get the water off the truck. Yeah, Tommy, the 15 00:00:45,240 --> 00:00:47,239 Speaker 1: water is used to cool the servers. 16 00:00:47,320 --> 00:00:50,320 Speaker 3: Yeah, exactly, because otherwise they overheat, of course, but 17 00:00:50,960 --> 00:00:53,880 Speaker 3: some data servers are using up to sixty per cent of a 18 00:00:53,960 --> 00:0055,280 Speaker 3: city's energy. 19 00:00:55,480 --> 00:00:57,480 Speaker 1: You do know that a lot of it is cloud 20 00:00:57,560 --> 00:00:58,360 Speaker 1: storage as well. 21 00:00:58,600 --> 00:01:04,600 Speaker 3: Yeah, where do you think the water in those clouds comes from? Something to 22 00:01:04,640 --> 00:01:06,560 Speaker 3: think about when you're saying thank you. 23 00:01:06,480 --> 00:01:10,000 Speaker 1: For your anti-AI campaign, Tom, I will never think 24 00:01:10,000 --> 00:01:11,280 Speaker 1: about it ever again, ever again. 25 00:01:12,480 --> 00:01:18,120 Speaker 2: This AI program developed out of China has really, really 26 00:01:18,160 --> 00:01:21,200 Speaker 2: got people concerned but excited at the same time. If 27 00:01:21,240 --> 00:01:24,720 Speaker 2: you're a pet lover, they've been able to create AI 28 00:01:25,000 --> 00:01:28,280 Speaker 2: which is like Doctor Doolittle, where you are able to 29 00:01:28,360 --> 00:01:30,880 Speaker 2: talk to the animals. In fact, they're able to talk 30 00:01:30,920 --> 00:01:33,760 Speaker 2: to you. So the AI can assess the 31 00:01:33,800 --> 00:01:36,080 Speaker 2: movement of the pet and the sounds from the pet 32 00:01:36,440 --> 00:01:38,039 Speaker 2: and translate it into the 33 00:01:38,120 --> 00:01:39,560 Speaker 1: language of your choice. 34 00:01:40,400 --> 00:01:43,480 Speaker 2: So your dog, based on what it's doing, could have 35 00:01:43,520 --> 00:01:47,040 Speaker 2: a speaker around its neck on the collar, Fitz, which 36 00:01:47,160 --> 00:01:49,880 Speaker 2: talks to you, and it's meant to be accurate to 37 00:01:49,920 --> 00:01:50,640 Speaker 2: what the dog is 38 00:01:50,640 --> 00:01:55,200 Speaker 1: thinking and how it's behaving. I'm a fooling toad. Oh 39 00:01:55,400 --> 00:01:57,440 Speaker 1: my gosh, a snack. 40 00:01:58,040 --> 00:01:59,840 Speaker 4: It'll just be food, more 41 00:01:59,680 --> 00:02:06,960 Speaker 1: food, food, food, food, toilet, food, walk, give me the ball.
42 00:02:07,320 --> 00:02:09,639 Speaker 2: The other option is you're able to point and shoot 43 00:02:09,680 --> 00:02:12,840 Speaker 2: your camera at the dog and the text will come 44 00:02:12,880 --> 00:02:14,680 Speaker 2: up on the screen to say what your dog's thinking 45 00:02:14,680 --> 00:02:15,120 Speaker 1: or doing. 46 00:02:15,320 --> 00:02:19,160 Speaker 2: Well, it has to learn your dog first. Yeah, 47 00:02:19,160 --> 00:02:21,480 Speaker 2: there must be some sort of training part to the AI. 48 00:02:22,720 --> 00:02:24,920 Speaker 2: But at the same time, what I like about this 49 00:02:25,000 --> 00:02:27,760 Speaker 2: technology is, how can you ever challenge it? 50 00:02:28,720 --> 00:02:31,000 Speaker 1: You can't prove it or disprove it. 51 00:02:31,560 --> 00:02:34,400 Speaker 2: Of course, if it just goes with generic dog things 52 00:02:34,760 --> 00:02:37,200 Speaker 2: like, yeah, could do with more food, if in doubt, 53 00:02:37,240 --> 00:02:40,080 Speaker 2: do with more food. Food and walk, I reckon, are 54 00:02:40,160 --> 00:02:42,400 Speaker 2: just the defaults of the platform, and no one can 55 00:02:42,480 --> 00:02:43,240 Speaker 2: argue it's wrong. 56 00:02:43,480 --> 00:02:45,960 Speaker 4: Yeah, the default would just be pat me, pat me, 57 00:02:46,280 --> 00:02:46,640 Speaker 4: pat me. 58 00:02:49,200 --> 00:02:50,120 Speaker 1: Can we isolate that? 59 00:02:50,200 --> 00:02:53,000 Speaker 2: Grab Tom, absolutely. Talk about what he was feeling on 60 00:02:53,080 --> 00:02:53,600 Speaker 2: Mother's Day? 61 00:02:54,200 --> 00:02:57,800 Speaker 4: For some reason, I'm getting served up on my algorithm 62 00:02:57,800 --> 00:02:58,280 Speaker 4: at the moment... 63 00:02:58,320 --> 00:03:01,560 Speaker 1: Has anyone seen the translation glasses? 64 00:03:02,639 --> 00:03:02,839 Speaker 5: Oh? 65 00:03:03,000 --> 00:03:03,800 Speaker 1: How do they work? 66 00:03:03,919 --> 00:03:06,560 Speaker 4: So you wear a pair of glasses, and whatever country 67 00:03:06,600 --> 00:03:09,040 Speaker 4: you go to, when somebody talks to you, on the 68 00:03:09,080 --> 00:03:11,000 Speaker 4: glasses it pops up what they're saying. 69 00:03:11,080 --> 00:03:12,640 Speaker 1: Oh, that's good, isn't it? Yeah? 70 00:03:12,639 --> 00:03:15,960 Speaker 4: But then you instantly go and get the reviews 71 00:03:16,000 --> 00:03:17,120 Speaker 4: on it. 72 00:03:17,120 --> 00:03:18,560 Speaker 1: It's been absolutely slated. 73 00:03:18,720 --> 00:03:21,679 Speaker 2: You know, there was the in-ear technology not long ago, 74 00:03:21,760 --> 00:03:24,360 Speaker 2: Fitz, and so it did a similar thing, except they 75 00:03:24,360 --> 00:03:26,160 Speaker 2: didn't put up the text, but you would wear it like 76 00:03:26,200 --> 00:03:28,959 Speaker 2: an earbud, and then if somebody spoke to you in Italian, 77 00:03:29,000 --> 00:03:32,720 Speaker 2: your earbud would translate it, so you had the English 78 00:03:32,960 --> 00:03:34,080 Speaker 2: version for yourself. 79 00:03:34,240 --> 00:03:34,440 Speaker 5: Yeah. 80 00:03:34,440 --> 00:03:37,240 Speaker 4: But see, is that not... because then you've got to 81 00:03:37,480 --> 00:03:41,160 Speaker 4: navigate when they're finished talking, and then you've got to 82 00:03:41,240 --> 00:03:43,680 Speaker 4: compute what they're saying and then have an answer. 83 00:03:43,720 --> 00:03:47,520 Speaker 1: It'd be very confusing. It's a very weird, delayed conversation
84 00:03:49,280 --> 00:03:51,520 Speaker 2: with someone, as it takes five or ten seconds to 85 00:03:51,560 --> 00:03:53,280 Speaker 2: process and then you come out with your answer. 86 00:03:54,280 --> 00:03:57,360 Speaker 1: But the glasses, are they selling well? Or are 87 00:03:57,360 --> 00:03:58,440 Speaker 1: they... is it? 88 00:03:58,920 --> 00:03:59,000 Speaker 2: So? 89 00:03:59,440 --> 00:03:59,680 Speaker 1: Yeah? 90 00:04:00,360 --> 00:04:04,680 Speaker 4: Well, you know that... well, that Internet sensation, Speed? You 91 00:04:04,680 --> 00:04:08,360 Speaker 4: know, that guy who runs really fast and he travels? Speed. 92 00:04:08,600 --> 00:04:12,440 Speaker 1: Yeah, IShowSpeed. He put them on, and as 93 00:04:12,480 --> 00:04:14,760 Speaker 1: soon as he started going, I'm like, God, this is amazing. 94 00:04:15,160 --> 00:04:17,440 Speaker 4: But then you go, you go to the reviews straight 95 00:04:17,440 --> 00:04:19,840 Speaker 4: away and it says they can't pick up... 96 00:04:19,839 --> 00:04:22,760 Speaker 1: You've got to stand really close to the glasses. You've 97 00:04:22,760 --> 00:04:23,560 Speaker 1: got to speak 98 00:04:23,440 --> 00:04:26,160 Speaker 4: very slowly into the glasses, in the language that you're 99 00:04:26,200 --> 00:04:29,360 Speaker 4: talking, before it can actually work out what... Tom? 100 00:04:30,200 --> 00:04:32,279 Speaker 1: Oh, no, I was just... you'd like to gather, and 101 00:04:32,279 --> 00:04:32,640 Speaker 1: you said you... 102 00:04:33,400 --> 00:04:38,839 Speaker 3: Optus launched a technology where you could call any country 103 00:04:38,880 --> 00:04:41,960 Speaker 3: overseas and it would translate the call, what they 104 00:04:41,960 --> 00:04:44,640 Speaker 3: would say to you and what you would say. That was 105 00:04:44,680 --> 00:04:45,520 Speaker 3: pretty wild. 106 00:04:45,640 --> 00:04:53,159 Speaker 2: I called South Australia once. I couldn't understand... I'd love to 107 00:04:53,200 --> 00:04:58,760 Speaker 2: have understood it, idiot. Sorry, hallucinated again. 108 00:04:59,120 --> 00:05:01,560 Speaker 6: Fitzy and Wippa with Kate Ritchie podcast. 109 00:05:01,960 --> 00:05:06,880 Speaker 1: Let's move on to this one, guys. Talk about fast talkers. Swani, 110 00:05:06,960 --> 00:05:10,400 Speaker 1: do you buy into fast talkers, or do you not trust them? 111 00:05:11,040 --> 00:05:15,599 Speaker 6: No, I get very confused. I am... I will very 112 00:05:16,360 --> 00:05:20,160 Speaker 6: clearly say, I'm sorry, could you slow down? You're talking 113 00:05:20,760 --> 00:05:22,719 Speaker 6: far too quickly for the pace. 114 00:05:22,720 --> 00:05:24,680 Speaker 4: That is the worst thing you can say to him, isn't it? 115 00:05:24,720 --> 00:05:27,599 Speaker 4: Slow down. Well, you do it with your live ads, 116 00:05:27,640 --> 00:05:30,080 Speaker 4: you know? We have to slow you down a little bit 117 00:05:30,120 --> 00:05:32,400 Speaker 4: sometimes. Take a deep breath, slow down and just 118 00:05:32,360 --> 00:05:34,040 Speaker 1: read the words, mate, one word after the other. 119 00:05:34,080 --> 00:05:34,599 Speaker 2: See how you go. 120 00:05:34,680 --> 00:05:38,080 Speaker 1: Let's go from the top. Three, two, one. This is interesting. 121 00:05:38,120 --> 00:05:41,600 Speaker 1: Gary Vee is a social media expert. He's a futurist. 122 00:05:42,080 --> 00:05:43,000 Speaker 1: He's an entrepreneur.
123 00:05:43,120 --> 00:05:45,120 Speaker 2: He's invested in a lot of businesses, and he's made 124 00:05:45,120 --> 00:05:46,080 Speaker 2: a lot of money. 125 00:05:46,400 --> 00:05:49,200 Speaker 4: His own energy drink too, which is amazing, the energy drink. 126 00:05:49,880 --> 00:05:54,400 Speaker 6: He's got an unusual combination of speaking very, very quickly 127 00:05:54,440 --> 00:05:57,159 Speaker 6: and also very loudly. 128 00:05:56,960 --> 00:06:03,279 Speaker 2: Very loudly. He presents on stage and makes outrageous predictions about 129 00:06:03,320 --> 00:06:05,960 Speaker 2: the future. He's got a new one, guys, and he's 130 00:06:05,960 --> 00:06:07,560 Speaker 2: standing by it. Have a listen to this. 131 00:06:08,160 --> 00:06:10,839 Speaker 7: I believe a robot that is fully AI will marry 132 00:06:10,960 --> 00:06:14,440 Speaker 7: a human, and vice versa, for real, in your lifetime. 133 00:06:14,600 --> 00:06:15,760 Speaker 1: And when you say an AI... 134 00:06:15,960 --> 00:06:18,479 Speaker 4: Does that mean like a physical body of a robot? 135 00:06:18,600 --> 00:06:21,000 Speaker 4: So you're saying that the bodies will get so good 136 00:06:21,120 --> 00:06:21,920 Speaker 4: that it will look like... 137 00:06:21,920 --> 00:06:24,000 Speaker 7: Yes. Actually, it's funny, I've thought about this. I actually 138 00:06:24,040 --> 00:06:27,279 Speaker 7: think AI robots may save marriages. Just like real couples 139 00:06:27,320 --> 00:06:30,799 Speaker 7: introduce swinging in another partner, now, is it even better 140 00:06:30,920 --> 00:06:34,440 Speaker 7: to introduce an AI robot that may help whatever sexual 141 00:06:34,560 --> 00:06:37,200 Speaker 7: or emotional things are going on in that relationship to 142 00:06:37,240 --> 00:06:38,920 Speaker 7: become an offsetting contributor? 143 00:06:39,360 --> 00:06:40,840 Speaker 2: I mean, you can hear him banging the desk at 144 00:06:40,880 --> 00:06:42,919 Speaker 2: the same time. But great to think that in the 145 00:06:42,960 --> 00:06:47,200 Speaker 2: future every relationship or married couple will just have an 146 00:06:47,279 --> 00:06:51,360 Speaker 2: AI robot in the cupboard that you can spend some time with. 147 00:06:51,440 --> 00:06:54,200 Speaker 6: That's not a massive prediction. What's the documentary about 148 00:06:54,200 --> 00:06:58,280 Speaker 6: a woman who married a bridge? 149 00:06:56,680 --> 00:07:00,440 Speaker 1: Yeah, that happened. 150 00:07:01,040 --> 00:07:02,880 Speaker 4: I thought a hologram would be first. 151 00:07:02,960 --> 00:07:05,000 Speaker 2: And then, because you'd walk through it, you want to 152 00:07:05,000 --> 00:07:06,800 Speaker 2: be able to touch and feel it. And you know what, 153 00:07:06,880 --> 00:07:08,719 Speaker 2: some of the robots out there and some of the 154 00:07:08,760 --> 00:07:12,720 Speaker 2: silicone skins and things that they're making is really scary. 155 00:07:13,160 --> 00:07:17,120 Speaker 4: Well, it's like we got into virtual reality there 156 00:07:17,160 --> 00:07:19,600 Speaker 4: for a while, love Chrissy, and he said, you know 157 00:07:19,640 --> 00:07:20,120 Speaker 4: what this is? 158 00:07:20,600 --> 00:07:22,960 Speaker 1: This is because people like... for 159 00:07:22,960 --> 00:07:26,600 Speaker 4: the industry, for the erotic industry, which is worth 160 00:07:26,680 --> 00:07:30,480 Speaker 4: billions and trillions. Well, you know how much it's worth. Massive. 161 00:07:30,200 --> 00:07:30,520 Speaker 3: He said,
162 00:07:30,520 --> 00:07:32,000 Speaker 4: you know what, people are just going to put these 163 00:07:32,000 --> 00:07:34,560 Speaker 4: goggles on and they're going to transport themselves into 164 00:07:34,600 --> 00:07:36,520 Speaker 4: a room with someone that they want to be with. 165 00:07:36,920 --> 00:07:38,840 Speaker 4: And do you know the only thing I think 166 00:07:38,880 --> 00:07:41,880 Speaker 4: of the whole time? Imagine walking into that room 167 00:07:41,880 --> 00:07:45,440 Speaker 4: and finding that person just by themselves, trying to make 168 00:07:45,560 --> 00:07:48,480 Speaker 4: love to a fake person, thrusting. 169 00:07:49,720 --> 00:07:51,600 Speaker 2: You would have your rubber doll and then you could 170 00:07:51,600 --> 00:07:53,440 Speaker 2: have your goggles on at the same time, Fitz, and 171 00:07:53,520 --> 00:07:53,880 Speaker 2: it could be 172 00:07:54,040 --> 00:07:54,640 Speaker 1: what you want it to be. 173 00:07:56,720 --> 00:08:00,240 Speaker 2: I'm not... I just bought a new camera. You know, 174 00:08:00,840 --> 00:08:02,760 Speaker 2: this is the future and no one can predict it. 175 00:08:02,800 --> 00:08:06,559 Speaker 2: But imagine if one of your kids down 176 00:08:06,600 --> 00:08:08,440 Speaker 2: the track says to you, I think I'm going to 177 00:08:08,440 --> 00:08:12,240 Speaker 2: propose to my girlfriend. What's your girlfriend's name? Number Thirty 178 00:08:12,280 --> 00:08:17,120 Speaker 2: Two. Silicone Number Thirty Two. Because they would be there 179 00:08:17,120 --> 00:08:18,840 Speaker 2: on their wedding day with a robot that can walk, 180 00:08:18,880 --> 00:08:20,760 Speaker 2: and a robot that can talk, and a robot that 181 00:08:20,840 --> 00:08:24,040 Speaker 2: thinks and does everything else that a human could do. 182 00:08:24,480 --> 00:08:26,640 Speaker 4: I think some people... I think there'd be a lot 183 00:08:26,640 --> 00:08:29,040 Speaker 4: of people in this world that are compatible with a robot, 184 00:08:29,200 --> 00:08:31,800 Speaker 4: and you know what, they probably... this is the thing. 185 00:08:31,880 --> 00:08:34,440 Speaker 4: And we all know when you find relationships, it's not 186 00:08:34,480 --> 00:08:36,640 Speaker 4: a fairy tale. You always think it is a fairy tale, 187 00:08:36,679 --> 00:08:39,320 Speaker 4: but things do go wrong. People don't want that. With 188 00:08:39,400 --> 00:08:42,920 Speaker 4: a robot, you can program... this is the thing, you know, 189 00:08:43,120 --> 00:08:46,079 Speaker 4: you program it to your lifestyle. 190 00:08:46,640 --> 00:08:49,080 Speaker 1: It's not bad. How much are they? I don't know 191 00:08:49,120 --> 00:08:49,720 Speaker 1: at this stage. 192 00:08:49,880 --> 00:08:53,560 Speaker 2: Rochelle, hello. Hi. Hi, you guys. Sure, what did you 193 00:08:53,559 --> 00:08:54,080 Speaker 2: want to say? 194 00:08:55,679 --> 00:08:58,080 Speaker 8: I haven't watched this movie, but just hearing you guys 195 00:08:58,120 --> 00:09:01,880 Speaker 8: talk about this, it reminds me that I've seen Megan Fox, 196 00:09:02,080 --> 00:09:04,480 Speaker 8: who we all know is extremely hot, in a 197 00:09:04,559 --> 00:09:07,880 Speaker 8: movie as, like, an AI robot. And I'm pretty sure 198 00:09:08,000 --> 00:09:12,080 Speaker 8: she sleeps with her owner, who is married to someone, 199 00:09:12,160 --> 00:09:15,040 Speaker 8: and it destroys the... the partner's marriage, and then 200 00:09:15,080 --> 00:09:16,880 Speaker 8: she goes rogue.
201 00:09:17,000 --> 00:09:19,160 Speaker 6: Okay, that's basically a documentary. 202 00:09:19,320 --> 00:09:21,560 Speaker 1: So there's a robot. So is she a robot in 203 00:09:21,559 --> 00:09:23,000 Speaker 1: the movie, or are you saying she just looks like 204 00:09:23,040 --> 00:09:23,960 Speaker 1: a robot? 205 00:09:24,320 --> 00:09:26,720 Speaker 8: No, she's acting as a robot. Like, there's a whole thing: 206 00:09:26,720 --> 00:09:30,400 Speaker 8: they go into this store, they choose the one they want, 207 00:09:30,760 --> 00:09:35,920 Speaker 8: and yeah, she obviously looks a bit plastic. In real life, 208 00:09:36,240 --> 00:09:39,760 Speaker 8: she is fully animated to be, and acts as, a 209 00:09:39,840 --> 00:09:42,360 Speaker 8: robot in this movie. It's all like purchasing one and 210 00:09:42,360 --> 00:09:44,480 Speaker 8: bringing them into the home. And I think it's all 211 00:09:44,520 --> 00:09:45,240 Speaker 8: about the AI. 212 00:09:45,480 --> 00:09:48,440 Speaker 6: And I was like, I mean, this isn't 213 00:09:48,480 --> 00:09:52,240 Speaker 6: a new comment, a new concept. I remember in the 214 00:09:52,320 --> 00:09:57,719 Speaker 6: eighties, at Blockbuster, a week for ten bucks, there was a 215 00:09:57,760 --> 00:10:01,360 Speaker 6: film called Weird Science. A guy, you know, puts in the 216 00:10:01,400 --> 00:10:03,560 Speaker 6: perfect woman, and you know, it was... 217 00:10:05,559 --> 00:10:06,000 Speaker 1: LeBrock. 218 00:10:06,440 --> 00:10:09,120 Speaker 6: But I think, I think, I mean, you know, whether 219 00:10:09,200 --> 00:10:12,600 Speaker 6: or not this AI situation, the robot situation, would save 220 00:10:13,120 --> 00:10:17,000 Speaker 6: relationships is, you know, obviously up for questioning. I think 221 00:10:17,000 --> 00:10:22,079 Speaker 6: what would save relationships is if we had a supplementary wife, 222 00:10:22,920 --> 00:10:28,920 Speaker 6: if a woman could also have a wife, right? Happy days. 223 00:10:29,200 --> 00:10:33,160 Speaker 4: Okay, and like a supplementary... number one? Powerful. 224 00:10:35,240 --> 00:10:37,720 Speaker 2: Can we go to Gary in Mount Druitt? Hello, Gaz, 225 00:10:38,440 --> 00:10:40,440 Speaker 2: how are you? You've seen this happening, buddy? 226 00:10:40,440 --> 00:10:41,400 Speaker 1: Would you date a robot? 227 00:10:42,240 --> 00:10:45,760 Speaker 9: Oh, yeah, of course, mate, because they look so amazing 228 00:10:45,880 --> 00:10:47,960 Speaker 9: at this moment. I've got... I'm getting a lot of 229 00:10:48,080 --> 00:10:53,040 Speaker 9: videos of them. Yeah, there's even one I've seen. I 230 00:10:53,160 --> 00:10:57,240 Speaker 9: can talk to her. She looks real. She was lying 231 00:10:57,400 --> 00:11:01,959 Speaker 9: on the bed, he asked, can you get something? She actually 232 00:11:02,000 --> 00:11:05,360 Speaker 9: got off the bed, picked it up and handed it 233 00:11:05,400 --> 00:11:08,320 Speaker 9: to the person, and got back on the bed and said, 234 00:11:08,360 --> 00:11:11,319 Speaker 9: what do you want now? So, Gary, they do all 235 00:11:11,320 --> 00:11:12,680 Speaker 9: that sort of stuff, you know what I mean. 236 00:11:12,600 --> 00:11:15,760 Speaker 2: So you're picturing you being there, the TV remote's fallen 237 00:11:15,840 --> 00:11:17,960 Speaker 2: on the floor and you feel like a beer at 238 00:11:17,960 --> 00:11:19,640 Speaker 2: the same time, and she can do that for 239 00:11:19,559 --> 00:11:25,880 Speaker 5: you, Gary? Yes.
And then after that, Gary, she would last 240 00:11:26,200 --> 00:11:28,439 Speaker 5: just ten minutes in Mount Druitt, that's for sure. If 241 00:11:28,440 --> 00:11:30,400 Speaker 5: you brought it home and your mates found out about 242 00:11:30,400 --> 00:11:34,160 Speaker 5: your robot girlfriend, Gary, she is gone, mate. She would 243 00:11:34,200 --> 00:11:38,960 Speaker 5: have... she'd be stolen in five minutes. Napped, absolutely. The 244 00:11:38,880 --> 00:11:43,160 Speaker 1: first robot-napping ever. Gary's on the hunt. All right, 245 00:11:43,760 --> 00:11:44,439 Speaker 1: so you might 246 00:11:44,280 --> 00:11:46,880 Speaker 6: not need a robot if you bloody occasionally unpacked the 247 00:11:46,920 --> 00:11:48,079 Speaker 6: dishwasher, you know what I mean. 248 00:11:49,040 --> 00:11:49,880 Speaker 1: They can do other things. 249 00:11:49,960 --> 00:11:53,240 Speaker 2: Fitzy and Wippa with Kate Ritchie is a Nova podcast. For 250 00:11:53,280 --> 00:11:56,480 Speaker 2: great shows like this, download the Nova Player via the 251 00:11:56,520 --> 00:11:58,080 Speaker 2: App Store or Google Play.