1 00:00:00,280 --> 00:00:03,040 Speaker 1: You know those such-and-such For Dummies books? Yeah,
2 00:00:03,080 --> 00:00:05,960 Speaker 1: that are pretty prolific. We're going to do a bit
3 00:00:05,960 --> 00:00:08,920 Speaker 1: of a series of that this week. But on AI, because...
4 00:00:08,640 --> 00:00:11,000 Speaker 2: AI, it can't be stopped, I don't think, now.
5 00:00:11,000 --> 00:00:14,600 Speaker 3: AI is here and affecting people in all sorts of
6 00:00:14,600 --> 00:00:18,840 Speaker 3: ways, but there are many of us that are still
7 00:00:19,079 --> 00:00:22,040 Speaker 3: not exactly sure of how it's all working.
8 00:00:22,640 --> 00:00:23,480 Speaker 1: Definitely one of them.
9 00:00:23,520 --> 00:00:25,600 Speaker 2: Well, I am trying to learn it,
10 00:00:25,640 --> 00:00:27,400 Speaker 2: and I am trying to use it a little bit. But
11 00:00:27,480 --> 00:00:29,479 Speaker 2: I, you know, I think we've opened a
12 00:00:29,520 --> 00:00:32,280 Speaker 2: bit of a Pandora's box. I really do. And I
13 00:00:32,320 --> 00:00:34,599 Speaker 2: don't think, I don't even think the people who run
14 00:00:34,640 --> 00:00:37,080 Speaker 2: the big AI companies, I don't even think they know
15 00:00:37,360 --> 00:00:38,840 Speaker 2: where this is going.
16 00:00:39,159 --> 00:00:42,080 Speaker 1: They're asking ChatGPT, you know, to tell them what's going on.
17 00:00:42,680 --> 00:00:44,080 Speaker 2: So I was just going to say, apparently there's
18 00:00:44,080 --> 00:00:45,960 Speaker 2: billboards in some of the big cities in America at
19 00:00:45,960 --> 00:00:48,720 Speaker 2: the moment from AI companies. One of them says, don't
20 00:00:48,800 --> 00:00:49,800 Speaker 2: hire humans.
21 00:00:50,280 --> 00:00:51,040 Speaker 1: Don't hire humans?
22 00:00:51,159 --> 00:00:54,160 Speaker 2: Don't hire humans. Like, that's their...
23 00:00:54,480 --> 00:00:56,600 Speaker 1: And this is why, this is why it terrifies me.
24 00:00:56,960 --> 00:01:01,040 Speaker 1: Paul in Joondalup, good morning. Morning.
25 00:01:01,080 --> 00:01:04,880 Speaker 2: How are you? How are you tackling this?
26 00:01:06,280 --> 00:01:09,039 Speaker 4: Oh yeah, well it's only just starting, like it's only,
27 00:01:09,200 --> 00:01:12,680 Speaker 4: it hasn't fully been operational. And it's not us either,
28 00:01:12,720 --> 00:01:14,840 Speaker 4: but a lot of rubbish trucks have got scanners in
29 00:01:14,880 --> 00:01:18,360 Speaker 4: them now with AI to find batteries and stuff that
30 00:01:18,400 --> 00:01:22,040 Speaker 4: shouldn't be in there. I was just speaking, yeah, and
31 00:01:22,080 --> 00:01:24,319 Speaker 4: I was just speaking to someone who does have it,
32 00:01:24,400 --> 00:01:28,160 Speaker 4: and it can actually send the letters out and eventually fines.
33 00:01:29,440 --> 00:01:32,120 Speaker 4: The problem is, the person that did ten years of
34 00:01:32,240 --> 00:01:36,240 Speaker 4: university to think of this idea would hate for everyone
35 00:01:36,319 --> 00:01:38,840 Speaker 4: to find out that, I mean, the bins aren't locked.
36 00:01:38,880 --> 00:01:40,959 Speaker 4: I mean, anyone could have thrown it in there, so
37 00:01:41,000 --> 00:01:42,400 Speaker 4: they can't really do you, anyway.
38 00:01:42,760 --> 00:01:43,520 Speaker 5: Yeah, you know what I mean.
39 00:01:43,680 --> 00:01:45,000 Speaker 2: Yes, that's true.
40 00:01:46,120 --> 00:01:49,040 Speaker 4: And they do have upside-down locks over east.
41 00:01:49,760 --> 00:01:51,720 Speaker 2: Maybe AI bins are on the way and they'll be
42 00:01:51,720 --> 00:01:53,640 Speaker 2: able to tell when it was opened and take a photo.
43 00:01:54,760 --> 00:01:57,840 Speaker 3: Yeah yeah, maybe, like, or...
44 00:01:57,840 --> 00:01:59,880 Speaker 1: be in there inside...
45 00:02:00,920 --> 00:02:06,000 Speaker 3: Basically, that's as high as I want it to get. Okay? Ridiculous.
46 00:02:06,080 --> 00:02:08,200 Speaker 3: I guess it's good that, you know, they can
47 00:02:08,240 --> 00:02:10,520 Speaker 3: scan for stuff that's not meant to be in the bin,
48 00:02:10,840 --> 00:02:13,320 Speaker 3: like batteries and things. But yeah, you know, good
49 00:02:13,440 --> 00:02:16,079 Speaker 3: luck with trying to prosecute people for it, because...
50 00:02:16,600 --> 00:02:19,200 Speaker 1: No, you really can't. You've got a defense.
51 00:02:19,280 --> 00:02:24,840 Speaker 2: They'll find a way. They'll find a way, I promise you.
52 00:02:26,120 --> 00:02:29,720 Speaker 2: So rubbish trucks. There you go, the AI revolution.
53 00:02:31,040 --> 00:02:34,400 Speaker 3: They'll leave you a note if, you know, there was stuff
54 00:02:34,840 --> 00:02:36,440 Speaker 3: in a bin that wasn't meant to be in a bin.
55 00:02:36,639 --> 00:02:42,520 Speaker 2: Yeah, yeah, thanks for your time. Lisa, AI is infiltrating everything.
56 00:02:43,000 --> 00:02:45,040 Speaker 2: I saw a couple of weeks ago on one of
57 00:02:45,040 --> 00:02:48,440 Speaker 2: the download charts, an AI song was number one.
58 00:02:48,320 --> 00:02:50,359 Speaker 1: And it was... that can't be good.
59 00:02:50,440 --> 00:02:53,800 Speaker 2: It was in country music. It wasn't techno
60 00:02:54,160 --> 00:02:56,640 Speaker 2: or rap or anything like that. This was a country
61 00:02:56,720 --> 00:02:59,680 Speaker 2: music song that was number one on a download chart.
62 00:03:00,200 --> 00:03:04,040 Speaker 3: Completely. There's nothing, there's nothing good about that.
63 00:03:04,440 --> 00:03:09,360 Speaker 2: No, there hasn't been such a bad musical move since Crazy Frog.
64 00:03:09,520 --> 00:03:12,240 Speaker 1: Maree on the text said, I'm going to have that
65 00:03:12,280 --> 00:03:12,920 Speaker 1: in my head.
66 00:03:13,120 --> 00:03:14,639 Speaker 2: Now I'll find that.
67 00:03:15,600 --> 00:03:18,280 Speaker 1: Maree on the text says, I'm just glad I'm retiring soon.
68 00:03:18,400 --> 00:03:20,720 Speaker 3: AI's a game changer and if you don't keep up
69 00:03:20,720 --> 00:03:22,440 Speaker 3: with it, you'll be left behind.
70 00:03:23,320 --> 00:03:24,000 Speaker 1: Concerns me.
71 00:03:24,160 --> 00:03:30,480 Speaker 3: Let's go to Bayswater and Steve, good morning. Morning, ladies. Good.
72 00:03:30,560 --> 00:03:31,600 Speaker 1: How scared should we be?
73 00:03:33,320 --> 00:03:37,080 Speaker 6: I use it for everything. Like, my son's just taken
74 00:03:37,120 --> 00:03:39,600 Speaker 6: up skateboarding. And by taking up skateboarding, I mean he
75 00:03:39,720 --> 00:03:43,080 Speaker 6: got on and fell over and sprained his wrist. Yes,
76 00:03:44,520 --> 00:03:47,200 Speaker 6: so I used Doctor ChatGPT. Did I need to
77 00:03:47,200 --> 00:03:49,880 Speaker 6: take him to the doctor straight away? How was the swelling?
78 00:03:49,960 --> 00:03:52,840 Speaker 6: And it's amazing. It gives you symptoms, tells you when
79 00:03:52,840 --> 00:03:55,160 Speaker 6: you should take him.
80 00:03:55,280 --> 00:03:55,920 Speaker 6: I ended up getting X-rays, just to be sure.
81 00:03:56,160 --> 00:04:00,960 Speaker 2: Yeah, I would, yes. And you got a medical degree
82 00:04:01,000 --> 00:04:05,120 Speaker 2: by, you know, having a word with ChatGPT.
83 00:04:05,480 --> 00:04:08,320 Speaker 6: Now I'm looking at Christmas presents and I'm getting
84 00:04:08,800 --> 00:04:14,200 Speaker 6: ChatGPT to do reviews and links for skateboards. And I
85 00:04:14,320 --> 00:04:16,800 Speaker 6: use it for work. Like, I'm in sales. I do
86 00:04:17,200 --> 00:04:19,520 Speaker 6: presos all the time, and I put my presos into
87 00:04:19,560 --> 00:04:23,279 Speaker 6: ChatGPT to make sure my spelling is on point.
88 00:04:23,560 --> 00:04:25,800 Speaker 1: Okay, there's spell check for that.
89 00:04:26,000 --> 00:04:30,839 Speaker 3: I mean, is ChatGPT writing the preso for you?
90 00:04:31,000 --> 00:04:38,480 Speaker 6: Or... well, it's helping, it's helping phrase it, it's helping my paragraphs. Okay,
91 00:04:40,240 --> 00:04:41,680 Speaker 6: too much energy in there.
92 00:04:42,440 --> 00:04:43,440 Speaker 1: So it's your own editor.
93 00:04:44,000 --> 00:04:46,760 Speaker 2: But Steve, is ChatGPT getting a slice of your
94 00:04:46,760 --> 00:04:49,240 Speaker 2: commission, given the work that it's putting in there for you,
95 00:04:49,200 --> 00:04:50,960 Speaker 2: you lazy...
96 00:04:51,839 --> 00:04:55,039 Speaker 6: Well, no, but it's getting my subscription dollars, so it's
97 00:04:55,040 --> 00:04:57,000 Speaker 6: making a little bit of coin. There you go.
98 00:04:57,160 --> 00:05:00,960 Speaker 3: Yes, fair enough, that's true. All right, that's another side
99 00:05:01,000 --> 00:05:09,760 Speaker 3: to look at. I'm not too sure he should be relying on it.
100 00:05:08,960 --> 00:05:10,919 Speaker 2: All right. So I think Steve sees the benefits.
101 00:05:11,120 --> 00:05:15,719 Speaker 3: Yeah, it can be like your own personal editor.
102 00:05:16,000 --> 00:05:19,160 Speaker 2: Just go in there, it's not so bad. You just go in there,
103 00:05:19,240 --> 00:05:20,320 Speaker 2: have a bit of a play around.
104 00:05:20,960 --> 00:05:21,800 Speaker 1: You know.
105 00:05:22,200 --> 00:05:26,000 Speaker 3: I just replaced my iPad because my iPad was so
106 00:05:26,080 --> 00:05:28,760 Speaker 3: old it wouldn't let me download ChatGPT, so I
107 00:05:28,800 --> 00:05:30,560 Speaker 3: had to go get a new iPad. And you know
108 00:05:30,600 --> 00:05:33,839 Speaker 3: what's really annoying? The new iPad has a different sized
109 00:05:34,200 --> 00:05:40,560 Speaker 3: charger thing. So now the charger that sits, you know... well,
110 00:05:40,560 --> 00:05:42,600 Speaker 3: because I only just got the new iPad, it had
111 00:05:42,600 --> 00:05:45,320 Speaker 3: this before. The old iPad had the same size charger
112 00:05:45,360 --> 00:05:47,159 Speaker 3: as the charger for the mobile phone, so I could use the
113 00:05:47,160 --> 00:05:47,599 Speaker 3: same one.
114 00:05:47,640 --> 00:05:50,880 Speaker 1: Now I've got two cords going everywhere.
115 00:05:50,640 --> 00:05:52,680 Speaker 2: Now Lisa's on a...
116 00:05:52,720 --> 00:05:56,919 Speaker 3: Random. Before we worry about AI and crap, let's just
117 00:05:56,960 --> 00:06:00,919 Speaker 3: get the cords, you know, aligned. Joe and Billia, what
118 00:06:00,960 --> 00:06:03,080 Speaker 3: do you think of AI?
119 00:06:04,000 --> 00:06:09,920 Speaker 7: Good morning. Morning. I'm concerned over the dehumanization of our world.
120 00:06:10,080 --> 00:06:12,560 Speaker 6: Yes, I've got a
121 00:06:12,240 --> 00:06:15,400 Speaker 7: sixteen-year-old who's currently trying to get work, and
122 00:06:16,200 --> 00:06:20,240 Speaker 7: we've already dehumanized with our self-serve checkouts and taken
123 00:06:20,279 --> 00:06:22,159 Speaker 7: away the rite of passage for our kids to be
124 00:06:22,240 --> 00:06:26,080 Speaker 7: checkout chicks to get into this world. What's the next step?
125 00:06:26,360 --> 00:06:28,400 Speaker 7: How many more jobs are we going to remove for
126 00:06:28,480 --> 00:06:30,760 Speaker 7: our young people and make it harder for them to
127 00:06:30,800 --> 00:06:31,800 Speaker 7: get into the workforce?
128 00:06:31,880 --> 00:06:34,479 Speaker 2: That's true, Joe. On the flip side, some people might
129 00:06:34,520 --> 00:06:37,960 Speaker 2: say there's possibly new jobs that haven't even been invented
130 00:06:38,040 --> 00:06:41,840 Speaker 2: yet that may come out of this. But I
131 00:06:41,920 --> 00:06:44,320 Speaker 2: understand we're in that transition at the moment. That's right,
132 00:06:44,640 --> 00:06:45,920 Speaker 2: and that's the concerning bit.
133 00:06:46,279 --> 00:06:50,760 Speaker 3: Yeah, yeah, absolutely. I think dehumanization is a very good
134 00:06:51,240 --> 00:06:52,120 Speaker 3: term to use.
135 00:06:52,240 --> 00:06:54,800 Speaker 2: I also think that when it comes to jobs, that
136 00:06:54,960 --> 00:06:57,200 Speaker 2: maybe it's trades that are going to be some of
137 00:06:57,200 --> 00:06:59,600 Speaker 2: the safer jobs rather than some of the white collar
138 00:06:59,720 --> 00:07:02,640 Speaker 2: jobs in the future, you know, because AI can't unblock
139 00:07:02,680 --> 00:07:05,200 Speaker 2: a dunny.
140 00:07:05,680 --> 00:07:08,560 Speaker 7: It certainly can't. One of my kids is an apprentice
141 00:07:08,560 --> 00:07:12,320 Speaker 7: tradie, so you know, but you know, AI jobs are
142 00:07:12,320 --> 00:07:16,200 Speaker 7: still going to take people away from that face
143 00:07:16,240 --> 00:07:18,960 Speaker 7: to face contact, aren't they? It might be a new
144 00:07:19,080 --> 00:07:21,600 Speaker 7: job, but that face-to-face stuff is gone. Definitely.
145 00:07:22,000 --> 00:07:23,160 Speaker 2: You've got a good point there, Joe.
146 00:07:23,160 --> 00:07:29,960 Speaker 1: They're already having enough trouble face to face. Yeah, yeah, yeah,
147 00:07:30,360 --> 00:07:31,920 Speaker 1: I warrant.
148 00:07:31,960 --> 00:07:38,960 Speaker 2: Concerned? Right, slightly worst-case scenario. Hello. Hello.
149 00:07:39,000 --> 00:07:41,280 Speaker 5: So yeah, I use it a lot for writing emails
150 00:07:41,280 --> 00:07:43,640 Speaker 5: to people that have been idiots or done something wrong,
151 00:07:43,800 --> 00:07:48,040 Speaker 5: without telling them they're idiots. Very, very handy. So I'm
152 00:07:48,080 --> 00:07:50,280 Speaker 5: thinking I'm going to use it to draft the letter
153 00:07:50,360 --> 00:07:55,000 Speaker 5: to the Minister for the Bureau of Meteorology and maybe
154 00:07:55,080 --> 00:07:57,800 Speaker 5: Chris Bowen as well, and a few other people that,
155 00:07:58,120 --> 00:07:59,960 Speaker 5: you know, so we can tell them they're all idiots
156 00:08:00,040 --> 00:08:02,160 Speaker 5: without telling them they're all idiots, and they probably wouldn't
157 00:08:02,160 --> 00:08:03,600 Speaker 5: be able to understand the English anyway.
158 00:08:03,640 --> 00:08:05,640 Speaker 2: Can I, can I put my, can I put my
159 00:08:05,680 --> 00:08:06,480 Speaker 2: signature on that one?
160 00:08:06,520 --> 00:08:06,720 Speaker 8: Yeah?
161 00:08:06,760 --> 00:08:09,679 Speaker 2: Right, I'm not a big fan of the new BOM site.
162 00:08:10,360 --> 00:08:15,160 Speaker 5: Pretty good petition, that one. Pretty well, it's millions of dollars,
163 00:08:15,200 --> 00:08:17,080 Speaker 1: evidently, now they're saying. Well...
164 00:08:16,920 --> 00:08:18,920 Speaker 5: on how, well, you could just...
165 00:08:21,600 --> 00:08:23,680 Speaker 1: AI is baffled, like, I'm not there.
166 00:08:23,760 --> 00:08:25,320 Speaker 2: Yeah, I would have said I could have done it
167 00:08:25,360 --> 00:08:27,960 Speaker 2: for a twenty-five-buck subscription. We're a little bit confused
168 00:08:28,000 --> 00:08:30,160 Speaker 2: this morning about AI.
169 00:08:30,600 --> 00:08:33,000 Speaker 3: Yes, I've been a little bit confused about it for
170 00:08:33,040 --> 00:08:35,320 Speaker 3: some time now, but I thought it was time to
171 00:08:35,360 --> 00:08:37,120 Speaker 3: sort of go on the record and say, before this
172 00:08:37,320 --> 00:08:40,240 Speaker 3: really gets away from us, can we have a little
173 00:08:40,360 --> 00:08:44,880 Speaker 3: chat. Now, Sarah James is a data and AI executive
174 00:08:45,240 --> 00:08:48,480 Speaker 3: and she advises other executives on data and AI projects
175 00:08:48,480 --> 00:08:51,640 Speaker 3: for their clients. She has an expansive background across every
176 00:08:51,679 --> 00:08:55,160 Speaker 3: sector of technology and has been leading AI projects since,
177 00:08:55,160 --> 00:08:57,720 Speaker 3: and this date scared me because this is how
178 00:08:57,760 --> 00:09:00,280 Speaker 3: long it's been around for, since two thousand and three.
179 00:09:00,559 --> 00:09:03,199 Speaker 3: She is joining us to kick off this week of
180 00:09:03,400 --> 00:09:05,319 Speaker 3: talking to people about AI.
181 00:09:05,800 --> 00:09:07,679 Speaker 1: Good morning, Sarah.
182 00:09:07,320 --> 00:09:09,640 Speaker 2: Morning, Sarah. Good morning.
183 00:09:14,040 --> 00:09:19,160 Speaker 1: In the most layman of layman's terms, what is AI?
184 00:09:20,520 --> 00:09:20,600 Speaker 7: So,
185 00:09:21,240 --> 00:09:25,160 Speaker 8: AI is the creation of computer systems and software that
186 00:09:25,240 --> 00:09:29,840 Speaker 8: can perform tasks that typically require intelligence, so like reasoning,
187 00:09:30,000 --> 00:09:35,920 Speaker 8: solving problems, understanding language, and recognizing images and making decisions.
188 00:09:36,480 --> 00:09:38,480 Speaker 8: So yeah, that's what it basically is.
189 00:09:40,240 --> 00:09:43,640 Speaker 2: How does it, how does it learn? Because we're always
190 00:09:43,640 --> 00:09:46,520 Speaker 2: told about how it learns and how it can be trained.
191 00:09:47,040 --> 00:09:47,800 Speaker 2: How does it do that?
192 00:09:48,760 --> 00:09:53,000 Speaker 8: Yeah. So it uses something called machine learning, but in
193 00:09:53,080 --> 00:09:55,680 Speaker 8: our terms, so if you think about that in a
194 00:09:55,760 --> 00:10:01,040 Speaker 8: logical way, it's basically thinking, and it's being
195 00:10:01,120 --> 00:10:05,520 Speaker 8: trained to think. So that's based on data. So there
196 00:10:05,559 --> 00:10:08,240 Speaker 8: are three different types of learning. So there's
197 00:10:08,280 --> 00:10:14,120 Speaker 8: supervised learning, there's unsupervised learning, and reinforcement learning. So reinforcement
198 00:10:14,200 --> 00:10:17,320 Speaker 8: learning is like when you give your dog a treat,
199 00:10:17,600 --> 00:10:20,400 Speaker 8: that type of thing, so that reinforces the data that is
200 00:10:20,440 --> 00:10:24,040 Speaker 8: being used. Supervised learning is when you label the data,
201 00:10:24,600 --> 00:10:26,720 Speaker 8: so any input that you put in, you can say,
202 00:10:26,800 --> 00:10:29,640 Speaker 8: I like this data, this data is good, this is
203 00:10:29,679 --> 00:10:32,840 Speaker 8: what good looks like. And then unsupervised learning is where
204 00:10:32,880 --> 00:10:36,720 Speaker 8: you give the data a bit more free rein to
205 00:10:36,800 --> 00:10:40,200 Speaker 8: do what it wants, and you let it find hidden patterns,
206 00:10:40,240 --> 00:10:43,840 Speaker 8: structures and groupings on its own. So that's kind of
207 00:10:43,840 --> 00:10:45,080 Speaker 8: the ways in which it learns.
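For anyone who wants to see those three learning styles in practice, here is a minimal Python sketch using the scikit-learn library. It is an illustration only, not something from the broadcast: the tiny "email" dataset, the spam-filter framing and the variable names are all invented for the example, and scikit-learn is assumed to be installed.

```python
# A rough sketch of the three learning styles Sarah describes.
from sklearn.linear_model import LogisticRegression  # supervised: learns from labelled examples
from sklearn.cluster import KMeans                   # unsupervised: finds groupings on its own

# Each made-up "email" is described by two numbers: word count and number of links.
emails = [[120, 0], [80, 1], [900, 7], [40, 0], [700, 5]]

# Supervised learning: we label the data ("this is what good looks like").
labels = [0, 0, 1, 0, 1]  # 0 = normal email, 1 = spam
spam_model = LogisticRegression().fit(emails, labels)
print(spam_model.predict([[850, 6]]))  # the model guesses a label for new data

# Unsupervised learning: no labels, just "find hidden patterns on your own".
groups = KMeans(n_clusters=2, n_init=10).fit_predict(emails)
print(groups)  # which cluster each email was placed in

# Reinforcement learning (the "give the dog a treat" one) works differently:
# an agent tries actions, gets a reward when an action works out (the treat),
# and gradually prefers the actions that earned rewards.
```

The distinction Sarah draws is visible in the calls themselves: the supervised model only works because we told it which emails were spam, while the clustering call is given no labels at all and has to find the groupings itself.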
208 00:10:45,840 --> 00:10:48,640 Speaker 1: Is AI the same as a robot?
209 00:10:49,600 --> 00:10:54,120 Speaker 8: No, definitely not. Okay. However, a robot is physical.
210 00:10:54,400 --> 00:10:58,400 Speaker 8: So think of your car, think of the pretty cool
211 00:10:58,480 --> 00:11:02,439 Speaker 8: robots that are coming now as helpers, so to speak,
212 00:11:02,440 --> 00:11:05,600 Speaker 8: in the workplace that, yeah, a certain large company worldwide
213 00:11:05,720 --> 00:11:10,280 Speaker 8: is doing. But you have to have a separate system
214 00:11:10,520 --> 00:11:13,240 Speaker 8: that is the AI, and that's the software or the
215 00:11:13,280 --> 00:11:16,480 Speaker 8: system that runs it. Think of it like the smarts. Yes,
216 00:11:16,559 --> 00:11:19,240 Speaker 8: it's like in your car, it's the smarts in your car.
217 00:11:19,400 --> 00:11:20,079 Speaker 8: That type of thing.
218 00:11:20,520 --> 00:11:22,599 Speaker 2: One thing that gets talked about, Sarah, I've got to
219 00:11:22,600 --> 00:11:25,760 Speaker 2: ask a question: do AI systems, do they have emotions
220 00:11:25,920 --> 00:11:28,199 Speaker 2: or consciousness? Can they? Could they?
221 00:11:30,000 --> 00:11:30,240 Speaker 7: No.
222 00:11:30,320 --> 00:11:34,280 Speaker 8: Not currently. In the way we talk about it, we don't
223 00:11:34,280 --> 00:11:38,880 Speaker 8: think so. But if you think about some of the
224 00:11:38,880 --> 00:11:41,800 Speaker 8: cool things that are happening at the moment, there's like haptics,
225 00:11:41,840 --> 00:11:44,640 Speaker 8: which Disney has been playing with for many, many, many years,
226 00:11:45,240 --> 00:11:53,600 Speaker 8: and that's about putting in feelings, so feeling heat, feeling touch,
227 00:11:53,840 --> 00:11:56,760 Speaker 8: that type of thing. So if you start to put
228 00:11:56,760 --> 00:12:01,480 Speaker 8: these technologies together, that's when it might get interesting, but
229 00:12:01,600 --> 00:12:03,439 Speaker 8: definitely not consciousness yet.
230 00:12:04,800 --> 00:12:05,280 Speaker 6: Interesting.
231 00:12:05,360 --> 00:12:06,480 Speaker 1: What are your thoughts, Sarah?
232 00:12:06,559 --> 00:12:09,679 Speaker 3: On people that say they're having a relationship with something?
233 00:12:09,360 --> 00:12:13,480 Speaker 1: That's laugh.
234 00:12:14,840 --> 00:12:16,839 Speaker 8: I don't know. I think there's a time and
235 00:12:16,880 --> 00:12:21,040 Speaker 8: a place for everything. I think, I think there are...
236 00:12:21,559 --> 00:12:25,000 Speaker 8: In terms of loneliness, there are a lot of people
237 00:12:25,000 --> 00:12:28,440 Speaker 8: who are lonely in the world. So if you think
238 00:12:28,480 --> 00:12:31,760 Speaker 8: of it, like, for an elderly person who needs
239 00:12:32,000 --> 00:12:34,440 Speaker 8: an animal or a dog, if you think of it
240 00:12:34,520 --> 00:12:38,280 Speaker 8: in those terms, there's good. But these things can
241 00:12:38,280 --> 00:12:40,600 Speaker 8: always be used for not-so-good things as well.
242 00:12:41,080 --> 00:12:45,880 Speaker 8: So it's about having balance and having ethics in AI
243 00:12:46,559 --> 00:12:51,880 Speaker 8: and regulations around these things, which I'm sure you'll have
244 00:12:51,960 --> 00:12:55,520 Speaker 8: people talk about later in the week.
245 00:12:56,600 --> 00:13:00,800 Speaker 3: Nice, Sarah. All right, well, that's a good start, something
246 00:13:00,840 --> 00:13:02,960 Speaker 3: to take in. We do have so many
247 00:13:03,000 --> 00:13:05,640 Speaker 3: other things to talk about. But Sarah, thank you for
248 00:13:06,200 --> 00:13:09,280 Speaker 3: setting us on our way. AI is not the same
249 00:13:09,360 --> 00:13:11,760 Speaker 3: as a robot or the dog.
250 00:13:11,840 --> 00:13:16,000 Speaker 2: Thank God RoboCop and Terminator aren't here yet, but they're
251 00:13:16,000 --> 00:13:16,560 Speaker 2: on the way.
252 00:13:17,840 --> 00:13:20,280 Speaker 8: You might be surprised to learn that we've had AI
253 00:13:20,400 --> 00:13:21,840 Speaker 8: on us for many, many years.
254 00:13:22,160 --> 00:13:23,080 Speaker 1: Well, I was.
255 00:13:23,160 --> 00:13:26,800 Speaker 3: Yes, I couldn't believe how long you've been involved with it.
256 00:13:26,800 --> 00:13:32,280 Speaker 8: It's... anyway, well, yeah, yeah, look at smartphones. Yeah,
257 00:13:32,400 --> 00:13:35,440 Speaker 8: look at your maps that are on the phone. Yes,
258 00:13:35,559 --> 00:13:37,240 Speaker 8: and those have had it in there since...
259 00:13:37,679 --> 00:13:39,199 Speaker 1: Good things, they are all the good things.
260 00:13:39,360 --> 00:13:41,880 Speaker 3: Yeah, all right. Well, Sarah, thank you, thank you for
261 00:13:41,920 --> 00:13:44,080 Speaker 3: sending us on our way on this AI journey that
262 00:13:44,160 --> 00:13:44,920 Speaker 3: we're on this week.
263 00:13:45,040 --> 00:13:47,120 Speaker 2: Wow, it's a, it's a brave new world.