1 00:00:01,200 --> 00:00:03,520 Speaker 1: My name is Lille Madden and I'm a proud Arrernte 2 00:00:03,760 --> 00:00:08,520 Speaker 1: Bundjalung Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges 3 00:00:08,600 --> 00:00:10,800 Speaker 1: that this podcast is recorded on the lands of the 4 00:00:10,800 --> 00:00:14,360 Speaker 1: Gadigal people and pays respect to all Aboriginal and Torres 5 00:00:14,360 --> 00:00:17,279 Speaker 1: Strait Islander nations. We pay our respects to the 6 00:00:17,320 --> 00:00:20,079 Speaker 1: first peoples of these countries, both past and present. 7 00:00:25,360 --> 00:00:27,880 Speaker 2: Good morning and welcome to The Daily Aus. It's Thursday, 8 00:00:27,920 --> 00:00:30,520 Speaker 2: the tenth of August. I'm Sam Koslowski. 9 00:00:30,160 --> 00:00:31,040 Speaker 3: I'm Zara Seidler. 10 00:00:31,160 --> 00:00:34,280 Speaker 2: It's easy to imagine how AI could pose a threat 11 00:00:34,360 --> 00:00:38,080 Speaker 2: to writers, but how could this technology pose a threat 12 00:00:38,280 --> 00:00:39,000 Speaker 2: to actors? 13 00:00:39,240 --> 00:00:42,599 Speaker 4: If a movie came out with me and Tom Cruise 14 00:00:42,760 --> 00:00:45,640 Speaker 4: and Julia Roberts and none of us actually worked on 15 00:00:45,680 --> 00:00:47,600 Speaker 4: the movie, I think I would lose my mind. 16 00:00:47,720 --> 00:00:50,920 Speaker 2: Joe Kiley, our head of video here at TDA, is 17 00:00:50,960 --> 00:00:53,320 Speaker 2: going to join us on the podcast today to explain 18 00:00:53,440 --> 00:00:56,040 Speaker 2: what you need to know in the Deep Dive. But first, 19 00:00:56,120 --> 00:00:57,480 Speaker 2: Zara, the headlines. 20 00:01:00,200 --> 00:01:03,760 Speaker 3: A new report shows coral recovery on the Great Barrier Reef has stalled. 21 00:01:04,280 --> 00:01:07,800 Speaker 3: In its annual report, the Australian Institute of Marine Science 22 00:01:07,920 --> 00:01:11,080 Speaker 3: found coral cover had declined in all regions of the 23 00:01:11,120 --> 00:01:14,720 Speaker 3: reef following significant recovery efforts in recent years. 24 00:01:15,560 --> 00:01:17,840 Speaker 2: A crime scene has been established after a body was 25 00:01:17,880 --> 00:01:21,200 Speaker 2: discovered in a pond at a private golf club in Sydney. 26 00:01:21,760 --> 00:01:24,360 Speaker 2: Police divers were called in to retrieve the body at 27 00:01:24,400 --> 00:01:27,400 Speaker 2: The Lakes Golf Club in the city's east on Wednesday. 28 00:01:27,880 --> 00:01:29,640 Speaker 2: Investigations are continuing. 29 00:01:31,160 --> 00:01:34,240 Speaker 3: Aussie fashion brand Zimmermann has sold a majority stake in 30 00:01:34,280 --> 00:01:36,800 Speaker 3: the business. The sale is aimed at 31 00:01:36,920 --> 00:01:41,280 Speaker 3: helping Zimmermann expand into overseas markets and strengthen its online presence. 32 00:01:41,760 --> 00:01:45,039 Speaker 3: The brand's founders, sisters Nicky and Simone Zimmermann, will continue 33 00:01:45,040 --> 00:01:47,080 Speaker 3: to occupy senior roles in the business. 34 00:01:48,200 --> 00:01:50,480 Speaker 2: And today's good news: for the first time ever, there's 35 00:01:50,520 --> 00:01:54,320 Speaker 2: an all-female nominee list for the MTV Video Music 36 00:01:54,360 --> 00:01:58,760 Speaker 2: Awards Artist of the Year category. Taylor Swift, Beyoncé, Shakira, 37 00:01:58,920 --> 00:02:03,040 Speaker 2: and Nicki Minaj are among those nominated.
The awards scrapped 38 00:02:03,120 --> 00:02:07,120 Speaker 2: gendered categories in twenty seventeen. The twenty twenty three VMAs 39 00:02:07,160 --> 00:02:13,919 Speaker 2: will be held next month in New Jersey. It's been 40 00:02:13,960 --> 00:02:17,200 Speaker 2: really fun over the last little while bringing different members 41 00:02:17,320 --> 00:02:20,640 Speaker 2: of the TDA team onto the podcast, and today I'd 42 00:02:20,680 --> 00:02:22,920 Speaker 2: like to introduce you to Joe Kiley, our head of 43 00:02:23,000 --> 00:02:25,360 Speaker 2: video. Joe, welcome to your podcast debut. 44 00:02:25,440 --> 00:02:27,200 Speaker 5: Thank you very much, Sam, happy to be here. 45 00:02:27,280 --> 00:02:30,280 Speaker 2: It's fantastic to have you on, especially to talk about 46 00:02:30,280 --> 00:02:35,400 Speaker 2: this investigation you've been doing into AI, artificial intelligence, in Hollywood. 47 00:02:35,840 --> 00:02:37,919 Speaker 2: You've been doing this for a few weeks now. Go 48 00:02:38,000 --> 00:02:40,400 Speaker 2: back a little bit for me. Where does all of 49 00:02:40,440 --> 00:02:41,520 Speaker 2: this start for you? 50 00:02:41,760 --> 00:02:43,799 Speaker 5: So, since May, we've been following the story of the 51 00:02:43,840 --> 00:02:46,639 Speaker 5: writers strike happening in Hollywood, and then in July 52 00:02:46,680 --> 00:02:48,520 Speaker 5: the actors also joined the picket lines, you know, one 53 00:02:48,600 --> 00:02:50,800 Speaker 5: hundred and sixty thousand of them. It's not just your 54 00:02:50,840 --> 00:02:54,720 Speaker 5: Matt Damons and your Julia Roberts. It's all 55 00:02:54,760 --> 00:02:57,480 Speaker 5: of these people coming together to strike with writers. But 56 00:02:57,520 --> 00:02:59,040 Speaker 5: when I saw that one of the issues that they 57 00:02:59,120 --> 00:03:02,160 Speaker 5: had was that they were facing a reality where their 58 00:03:02,280 --> 00:03:05,800 Speaker 5: likeness could appear in future productions without their consent, and, 59 00:03:05,840 --> 00:03:08,600 Speaker 5: of course, with no payment for that work, that's when 60 00:03:08,680 --> 00:03:10,639 Speaker 5: alarm bells started ringing in my head and I thought, 61 00:03:10,680 --> 00:03:12,320 Speaker 5: there's a really interesting story here. 62 00:03:12,400 --> 00:03:14,839 Speaker 2: And I feel like we're all going on some sort 63 00:03:14,880 --> 00:03:17,880 Speaker 2: of an education journey with AI. The first I 64 00:03:17,960 --> 00:03:20,880 Speaker 2: really knew about it was through ChatGPT, 65 00:03:21,400 --> 00:03:23,520 Speaker 2: which was only a couple of months ago, and now 66 00:03:23,520 --> 00:03:25,480 Speaker 2: it's working its way, as you said, into all of 67 00:03:25,520 --> 00:03:29,480 Speaker 2: these different sectors. We have spoken about those actor and 68 00:03:29,520 --> 00:03:32,239 Speaker 2: writer strikes on the podcast. I'll throw a link into 69 00:03:32,240 --> 00:03:34,840 Speaker 2: the show notes for anyone to go have a back listen. 70 00:03:35,360 --> 00:03:38,800 Speaker 2: But where does AI come into the reason why actors 71 00:03:38,800 --> 00:03:40,360 Speaker 2: and writers are actually striking?
72 00:03:40,520 --> 00:03:42,480 Speaker 5: We'll get to the AI stuff in a second, but 73 00:03:42,480 --> 00:03:45,680 Speaker 5: I think it's important to unpack the financial side of 74 00:03:45,720 --> 00:03:49,280 Speaker 5: this story and understand what a residual actually is. Residuals, 75 00:03:49,640 --> 00:03:51,920 Speaker 5: you can kind of think of them like royalties. So 76 00:03:52,040 --> 00:03:55,160 Speaker 5: whenever someone works on a TV show or a film, 77 00:03:55,560 --> 00:03:58,640 Speaker 5: when it's broadcast or rerun, they receive a small payment 78 00:03:58,680 --> 00:03:59,200 Speaker 5: each time. 79 00:03:59,400 --> 00:04:02,400 Speaker 2: So to give an example, if you're Jennifer Aniston and 80 00:04:02,400 --> 00:04:04,720 Speaker 2: you're working on Friends, you get paid for that first 81 00:04:04,920 --> 00:04:07,720 Speaker 2: year that you're actually filming the series, but then every 82 00:04:07,720 --> 00:04:10,920 Speaker 2: time I'm watching it on free-to-air networks in Australia, she's 83 00:04:10,960 --> 00:04:13,120 Speaker 2: getting a little bit of money there. Exactly. 84 00:04:13,160 --> 00:04:15,400 Speaker 5: So every time, you know, The Nanny or Friends or 85 00:04:15,440 --> 00:04:17,240 Speaker 5: Seinfeld or these kinds of shows that have been on 86 00:04:17,279 --> 00:04:20,320 Speaker 5: air for a long time, anytime those episodes aired anywhere 87 00:04:20,320 --> 00:04:23,480 Speaker 5: around the world, those original creators were paid a nice 88 00:04:23,560 --> 00:04:25,599 Speaker 5: fee. Right, so you could make a living off 89 00:04:25,720 --> 00:04:27,760 Speaker 5: what they were getting paid? Absolutely, and this is one 90 00:04:27,800 --> 00:04:30,240 Speaker 5: of the big issues. Streaming has come along and completely 91 00:04:30,360 --> 00:04:32,960 Speaker 5: upended a lot of these kinds of systems. Not only 92 00:04:33,080 --> 00:04:35,960 Speaker 5: are actors and writers working for shorter periods of time because 93 00:04:36,000 --> 00:04:37,960 Speaker 5: the number of episodes being produced is a 94 00:04:38,000 --> 00:04:41,239 Speaker 5: lot lower, but the income that was filling those gaps 95 00:04:41,240 --> 00:04:44,679 Speaker 5: between jobs has reduced. Okay, so how does AI 96 00:04:44,880 --> 00:04:48,320 Speaker 5: then fit into that? So, yes. The chief negotiator 97 00:04:48,480 --> 00:04:51,839 Speaker 5: for the Actors' Union, Duncan Crabtree-Ireland, talked about 98 00:04:51,839 --> 00:04:55,000 Speaker 5: a proposal that they'd received from film studios and production companies. 99 00:04:55,320 --> 00:04:58,640 Speaker 6: They proposed that our background performers should be able to 100 00:04:58,680 --> 00:05:01,680 Speaker 6: be scanned, get paid for one day's pay, and their 101 00:05:01,720 --> 00:05:05,320 Speaker 6: companies should own that scan, their image, their likeness, and 102 00:05:05,360 --> 00:05:07,280 Speaker 6: should be able to use it for the rest of 103 00:05:07,360 --> 00:05:10,360 Speaker 6: eternity in any project they want, with no consent and 104 00:05:10,480 --> 00:05:11,240 Speaker 6: no compensation.
105 00:05:11,600 --> 00:05:14,680 Speaker 5: We've all heard about writers talking about ChatGPT being used 106 00:05:14,720 --> 00:05:16,880 Speaker 5: to come up with scripts, but it really got me 107 00:05:17,000 --> 00:05:19,360 Speaker 5: thinking when I heard that all of a sudden, actors might 108 00:05:19,440 --> 00:05:22,320 Speaker 5: face a similar kind of future where their physical selves 109 00:05:22,400 --> 00:05:26,160 Speaker 5: won't even be required on film sets. So clearly the Actors' 110 00:05:26,200 --> 00:05:29,559 Speaker 5: Union is against that. We've been hearing a lot about 111 00:05:29,680 --> 00:05:32,560 Speaker 5: AI and how it's going to shake up the film industry, 112 00:05:32,600 --> 00:05:35,280 Speaker 5: and it hasn't all been negative. There's been talk about 113 00:05:35,320 --> 00:05:37,479 Speaker 5: how it could be a positive and enhance visual effects 114 00:05:37,560 --> 00:05:40,080 Speaker 5: and, you know, virtual reality, all that kind of stuff. 115 00:05:40,800 --> 00:05:43,120 Speaker 5: What type of tech are we talking about here? Yeah, well, 116 00:05:43,240 --> 00:05:45,520 Speaker 5: a lot of it's not actually that new. People have 117 00:05:45,680 --> 00:05:48,000 Speaker 5: been using these kinds of concepts for many, many years. 118 00:05:48,120 --> 00:05:50,800 Speaker 5: So you think about a stunt performer who's going to 119 00:05:50,880 --> 00:05:53,080 Speaker 5: do a really difficult stunt: they're obviously not going to 120 00:05:53,120 --> 00:05:55,320 Speaker 5: get the hero actor and put them in any kind 121 00:05:55,320 --> 00:05:57,680 Speaker 5: of danger. So these kinds of practices have been in 122 00:05:57,760 --> 00:06:00,560 Speaker 5: place in Hollywood for quite a while. So while it's 123 00:06:00,640 --> 00:06:03,440 Speaker 5: never really been a threat before, with everything moving at 124 00:06:03,480 --> 00:06:06,000 Speaker 5: such a rapid pace, I just wanted someone who's on 125 00:06:06,040 --> 00:06:08,320 Speaker 5: the inside, who's working on these kinds of films, to 126 00:06:08,480 --> 00:06:10,680 Speaker 5: just break it down for me. So I reached out 127 00:06:10,720 --> 00:06:13,120 Speaker 5: to Tyson Donnelly, who's a visual effects artist. 128 00:06:13,200 --> 00:06:14,679 Speaker 7: Yeah, so more and more, there's a lot of hidden 129 00:06:14,800 --> 00:06:15,600 Speaker 7: visual effects work. 130 00:06:15,720 --> 00:06:17,919 Speaker 5: He's worked on things like the recent Star Wars films 131 00:06:18,000 --> 00:06:20,000 Speaker 5: and the live-action Aladdin and The Matrix. 132 00:06:20,080 --> 00:06:21,839 Speaker 7: The Matrix four was the last one, which was very 133 00:06:21,880 --> 00:06:23,480 Speaker 7: exciting personally as a VFX artist. 134 00:06:23,600 --> 00:06:23,640 Speaker 1: No. 135 00:06:24,080 --> 00:06:26,360 Speaker 5: I think a lot of people are confused by the 136 00:06:26,440 --> 00:06:29,320 Speaker 5: term AI and they don't really understand how that 137 00:06:29,720 --> 00:06:33,520 Speaker 5: might fit into the visual effects world. Can you explain to 138 00:06:33,600 --> 00:06:36,120 Speaker 5: me in what ways you're using AI in visual effects?
139 00:06:36,160 --> 00:06:38,880 Speaker 7: So I think the easiest example of how we 140 00:06:39,080 --> 00:06:43,160 Speaker 7: use it now-ish would be for rotoscoping, which is 141 00:06:43,520 --> 00:06:45,680 Speaker 7: when you have to put someone on a sunny beach 142 00:06:45,760 --> 00:06:48,080 Speaker 7: instead of a car park. And then I think 143 00:06:48,160 --> 00:06:50,480 Speaker 7: the other one that people are more familiar with is 144 00:06:50,560 --> 00:06:54,159 Speaker 7: deepfakes, or replacing people's faces with either another actor 145 00:06:54,320 --> 00:06:55,920 Speaker 7: or a younger version of themselves. 146 00:06:56,320 --> 00:06:58,720 Speaker 5: So Tyson's talking about de-aging and those kinds of 147 00:06:58,760 --> 00:07:01,320 Speaker 5: techniques that you see in things like the latest Indiana 148 00:07:01,360 --> 00:07:03,360 Speaker 5: Jones film, where all of a sudden Harrison Ford is 149 00:07:03,480 --> 00:07:06,320 Speaker 5: thirty or forty years younger. So while it is all early 150 00:07:06,440 --> 00:07:09,000 Speaker 5: days for this, what we're hearing is that actors and 151 00:07:09,040 --> 00:07:12,360 Speaker 5: writers are concerned about how this might progress into the future. 152 00:07:12,280 --> 00:07:15,680 Speaker 2: Which is understandable. I mean, it's kind of reshaping the 153 00:07:15,840 --> 00:07:20,440 Speaker 2: industry in real time. Let's hone in on the actors specifically: 154 00:07:21,280 --> 00:07:23,520 Speaker 2: what kind of issues are they worried about in this 155 00:07:23,800 --> 00:07:24,400 Speaker 2: AI realm? 156 00:07:24,560 --> 00:07:25,880 Speaker 5: I think they're worried about a lot of things. But 157 00:07:26,080 --> 00:07:27,960 Speaker 5: I thought, let's go straight to the source. So I 158 00:07:28,080 --> 00:07:30,160 Speaker 5: reached out to an actor, Rachel Nichols. 159 00:07:30,320 --> 00:07:34,120 Speaker 4: I'm an actress. I live in Los Angeles, and I 160 00:07:34,320 --> 00:07:37,400 Speaker 4: wanted to speak today about everything 161 00:07:37,080 --> 00:07:38,280 Speaker 4: that's going on in the industry. 162 00:07:38,400 --> 00:07:40,400 Speaker 5: So Rachel's acted in a whole host of things like 163 00:07:40,680 --> 00:07:43,480 Speaker 5: Alias, Criminal Minds, The Man in the High Castle and Star Trek. 164 00:07:43,680 --> 00:07:46,000 Speaker 5: She's got quite an extensive list on IMDb, and she 165 00:07:46,200 --> 00:07:48,800 Speaker 5: says that she's really worried about the implications of AI 166 00:07:49,000 --> 00:07:51,600 Speaker 5: and deepfakes for both herself and other actors. 167 00:07:51,800 --> 00:07:55,160 Speaker 4: The idea that they could take my voice and put 168 00:07:55,200 --> 00:07:57,280 Speaker 4: it on my body and put it in 169 00:07:57,480 --> 00:08:01,520 Speaker 4: a different project that I wasn't physically or mentally involved 170 00:08:01,560 --> 00:08:02,960 Speaker 4: in is very threatening. 171 00:08:03,400 --> 00:08:04,640 Speaker 4: It's also terrifying. 172 00:08:04,840 --> 00:08:07,720 Speaker 5: Not only are they facing a reality where their residuals 173 00:08:07,760 --> 00:08:10,560 Speaker 5: from the work that they've previously done (residuals, remember, are like 174 00:08:10,680 --> 00:08:14,920 Speaker 5: royalties) are shrinking.
They might end up in future productions 175 00:08:15,000 --> 00:08:17,640 Speaker 5: that they've never even consented to be in, they've never 176 00:08:17,800 --> 00:08:20,240 Speaker 5: been on a film set, they haven't physically been involved 177 00:08:20,280 --> 00:08:22,520 Speaker 5: in these kinds of productions, and they're not going to 178 00:08:22,560 --> 00:08:24,560 Speaker 5: see any kind of payment for those works. 179 00:08:24,560 --> 00:08:27,760 Speaker 5: So it's almost like this horrible double squeeze where they're 180 00:08:27,800 --> 00:08:29,880 Speaker 5: losing out on stuff that they've done in the past 181 00:08:30,160 --> 00:08:32,280 Speaker 5: and they're not going to have any kind of payment 182 00:08:32,320 --> 00:08:33,160 Speaker 5: for anything in the future. 183 00:08:33,480 --> 00:08:36,800 Speaker 4: If a movie came out with me and Tom Cruise 184 00:08:37,040 --> 00:08:39,880 Speaker 4: and Julia Roberts and none of us actually worked on 185 00:08:39,920 --> 00:08:42,319 Speaker 4: the movie, I think I would lose my mind and 186 00:08:42,520 --> 00:08:43,800 Speaker 4: then I would go be a 187 00:08:43,800 --> 00:08:47,360 Speaker 4: librarian or something. Speaker 5: The people who are probably first to go are 188 00:08:47,800 --> 00:08:51,559 Speaker 5: the background actors, extras, those kinds of roles. And I 189 00:08:51,600 --> 00:08:54,160 Speaker 5: spoke to a producer about this, and she explained to me, well, 190 00:08:54,200 --> 00:08:57,000 Speaker 5: once you start to remove background actors, it's going to 191 00:08:57,040 --> 00:08:59,439 Speaker 5: have a ripple effect throughout the whole industry, because you 192 00:08:59,760 --> 00:09:04,360 Speaker 5: then require fewer makeup people, fewer costume people, fewer assistant directors. 193 00:09:04,840 --> 00:09:06,360 Speaker 5: Everybody will be affected by this. 194 00:09:06,880 --> 00:09:09,240 Speaker 2: I can understand why everyone across the board is concerned 195 00:09:09,240 --> 00:09:11,920 Speaker 2: about the emergence of AI. On the other side of 196 00:09:11,920 --> 00:09:15,400 Speaker 2: things, what are we hearing from studios and producers? 197 00:09:15,920 --> 00:09:18,520 Speaker 5: Well, they're going to argue that they're also being hit 198 00:09:18,679 --> 00:09:22,280 Speaker 5: by financial losses, right. They will say that things like 199 00:09:22,600 --> 00:09:25,000 Speaker 5: the revenue they used to make from cable TV, that's 200 00:09:25,080 --> 00:09:27,920 Speaker 5: been in decline for many, many years. The box office 201 00:09:28,280 --> 00:09:30,360 Speaker 5: isn't what it used to be coming out of COVID. 202 00:09:30,400 --> 00:09:32,800 Speaker 5: You know, cinemas are still struggling to get people to 203 00:09:32,840 --> 00:09:34,959 Speaker 5: come back to theatres. And so you've got people like 204 00:09:35,040 --> 00:09:37,719 Speaker 5: Bob Iger, who's the CEO of Disney. He said in 205 00:09:37,800 --> 00:09:40,600 Speaker 5: interviews that actors and writers are being completely unrealistic. 206 00:09:40,880 --> 00:09:43,800 Speaker 2: So with the speed of technology developing, particularly in the 207 00:09:43,840 --> 00:09:48,120 Speaker 2: film and entertainment industry, without getting too dramatic here, do 208 00:09:48,240 --> 00:09:51,320 Speaker 2: you think there's a world in which actors don't exist?
209 00:09:51,520 --> 00:09:55,080 Speaker 5: No, I don't think actors will become irrelevant. Like, people 210 00:09:55,160 --> 00:09:57,319 Speaker 5: go to the movies for a human connection, right? So 211 00:09:57,800 --> 00:09:59,880 Speaker 5: I actually put this to Rachel herself: do you think 212 00:10:00,080 --> 00:10:03,679 Speaker 5: AI-generated performances will be able to evoke the 213 00:10:03,800 --> 00:10:05,320 Speaker 5: same levels of emotion? 214 00:10:05,840 --> 00:10:10,559 Speaker 4: Nope. Nope. You know, show me an AI of myself 215 00:10:10,840 --> 00:10:13,520 Speaker 4: where different words are coming out of my mouth when 216 00:10:13,640 --> 00:10:15,080 Speaker 4: you need the tears. Don't. 217 00:10:15,360 --> 00:10:15,920 Speaker 4: I don't buy it. 218 00:10:16,160 --> 00:10:18,440 Speaker 4: And then also, don't show me, because I don't 219 00:10:18,480 --> 00:10:21,560 Speaker 4: want to know that it can be done, because that would suck. 220 00:10:21,800 --> 00:10:23,800 Speaker 5: So yeah, I think we might be a while away 221 00:10:23,880 --> 00:10:26,400 Speaker 5: before an AI video is able to make us connect 222 00:10:26,480 --> 00:10:28,520 Speaker 5: and feel something the way that a human actor is 223 00:10:28,600 --> 00:10:31,319 Speaker 5: able to. But they've actually got a difficult task ahead 224 00:10:31,320 --> 00:10:34,880 Speaker 5: of them, the developers of these AI technologies, because as 225 00:10:35,000 --> 00:10:38,200 Speaker 5: you start to create a digital human in a film 226 00:10:38,320 --> 00:10:40,520 Speaker 5: or a TV show, as it starts to get better 227 00:10:40,600 --> 00:10:42,839 Speaker 5: and better and more realistic, you think, ah, this is 228 00:10:43,080 --> 00:10:45,640 Speaker 5: happening, we're on the right track. It actually 229 00:10:45,760 --> 00:10:49,439 Speaker 5: hits a point where it completely flips on itself and 230 00:10:49,559 --> 00:10:52,880 Speaker 5: it looks creepy. So I actually spoke to Tyson about 231 00:10:52,920 --> 00:10:54,719 Speaker 5: this and he kind of explained it really well. It's 232 00:10:54,720 --> 00:10:56,680 Speaker 5: actually called the uncanny valley. 233 00:10:56,520 --> 00:10:58,679 Speaker 7: We're super good at recognizing faces. We'll get to a 234 00:10:58,760 --> 00:11:01,120 Speaker 7: point and we just know it's not a real 235 00:11:01,200 --> 00:11:02,959 Speaker 7: person, and we can just sense it. 236 00:11:03,240 --> 00:11:05,439 Speaker 5: A really good example of this, actually: you know, 237 00:11:05,559 --> 00:11:07,760 Speaker 5: cast your mind back to the early two thousands and the first 238 00:11:07,840 --> 00:11:11,640 Speaker 5: Shrek film. Princess Fiona actually looked too real 239 00:11:11,920 --> 00:11:14,440 Speaker 5: in the film, and children were actually freaked out by 240 00:11:14,520 --> 00:11:16,640 Speaker 5: this because, you know, it was an animation, but it 241 00:11:16,760 --> 00:11:21,079 Speaker 5: was so close to human realism, and so after doing 242 00:11:21,200 --> 00:11:23,640 Speaker 5: market research, they went back and actually made her look 243 00:11:23,760 --> 00:11:26,680 Speaker 5: more cartoony so that she wouldn't scare the kids.
244 00:11:27,040 --> 00:11:29,719 Speaker 2: Joe, having spent so much time talking to experts in 245 00:11:29,760 --> 00:11:33,160 Speaker 2: the field, stakeholders, people who are innovating in AI but 246 00:11:33,320 --> 00:11:36,680 Speaker 2: also are in danger of being replaced by AI, what's 247 00:11:36,760 --> 00:11:38,280 Speaker 2: your take on where we're headed? 248 00:11:38,360 --> 00:11:41,120 Speaker 5: Honestly, I think it's very easy to say we need 249 00:11:41,240 --> 00:11:44,400 Speaker 5: to stop all of this because it's just getting too scary. 250 00:11:44,760 --> 00:11:47,840 Speaker 5: I think, when I've spoken to, you know, people from 251 00:11:48,160 --> 00:11:53,240 Speaker 5: lots of different industries, like scientists, people in health, other artists, musicians, 252 00:11:53,320 --> 00:11:56,040 Speaker 5: there are ways we can harness the power of 253 00:11:56,120 --> 00:11:59,000 Speaker 5: what these AI tools can do, and it's amazing, and 254 00:11:59,040 --> 00:12:01,000 Speaker 5: I'm sure we'll be able to create things that we 255 00:12:01,080 --> 00:12:04,160 Speaker 5: can't even imagine right now. But I think along the 256 00:12:04,200 --> 00:12:06,880 Speaker 5: way, throughout this journey, it's about finding the right balance 257 00:12:07,120 --> 00:12:09,040 Speaker 5: so that we can start to use these 258 00:12:09,120 --> 00:12:10,760 Speaker 5: kinds of tools, but it's not going to be at the 259 00:12:10,840 --> 00:12:13,479 Speaker 5: expense of, you know, people's jobs and livelihoods. 260 00:12:13,760 --> 00:12:16,120 Speaker 2: Joe, thanks for coming on The Daily Aus today. You're 261 00:12:16,200 --> 00:12:18,719 Speaker 2: never going to be replaced by AI. Speaker 5: Well, that's a 262 00:12:18,720 --> 00:12:21,360 Speaker 5: special thank you, Sam, but when you are, I'll be 263 00:12:21,440 --> 00:12:22,280 Speaker 5: the first to let you know. 264 00:12:26,760 --> 00:12:30,040 Speaker 3: Thanks for listening to this episode of The Daily Aus. 265 00:12:30,280 --> 00:12:34,560 Speaker 8: We have loved reading your reviews on Spotify, so if 266 00:12:34,600 --> 00:12:37,120 Speaker 8: you liked this episode, you can go to your Spotify 267 00:12:37,200 --> 00:12:39,400 Speaker 8: app and it'll give you a little box under the 268 00:12:39,520 --> 00:12:41,520 Speaker 8: episode name, and there you can tell us how you 269 00:12:41,600 --> 00:12:44,520 Speaker 8: felt about the episode and any questions you might have. 270 00:12:45,280 --> 00:12:47,240 Speaker 3: Have a fabulous day and we'll see you tomorrow.