Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, here with Kara Price.
Speaker 2: Hello, hello. So I just read this piece about how AI has already started displacing jobs, but it's actually not the jobs you would think.
Speaker 1: So do we have something to celebrate this upcoming Labor Day? AI not stealing our jobs?
Speaker 2: Well, NotebookLM, as you know, is trying to make podcast hosts obsolete. But no. According to MIT's State of AI in Business report, AI is replacing jobs that are usually outsourced to other countries. That's the short term. Long term, around twenty-seven percent of white-collar jobs will be eliminated. Yeah.
Speaker 1: I know a number of people who are worried about the security and longevity of their jobs. But I also know people who are thinking about the kind of glass-half-full perspective, about new jobs that AI might create. And we're sharing a conversation with one of those people today.
Speaker 2: Yeah. You and I recently spoke with journalist Robert Capps, who has been reporting on what jobs could exist in an AI-driven future, and he actually came up with a list of twenty-two jobs that don't necessarily exist yet but are likely to exist when humans and AI work, sort of, in a hybrid capacity.
Speaker 3: Yeah.
Speaker 1: He lists a few jobs that sounded pretty interesting, and then wrote up this article for The New York Times Magazine. I think my personal favorite was probably AI plumber. That's the person who will figure out why an AI system might not be working the way it should be and, as Rob puts it, will snake the pipes.
Speaker 3: Yeah.
Speaker 2: And I was happy to hear him say that tastemaker jobs will last a while.
Speaker 1: Is that because you consider yourself a tastemaker?
Speaker 2: Yes, a nondescript creative professional. And he says that those will continue to exist, so I'm very excited.
He mentioned one job called a world designer, where a person fabricates an entire universe, complete with fictional characters and locations, which could apply to everything from marketing campaigns to video games.
Speaker 1: Yeah, I remember him talking about that. He also said that he wrote the first draft of the piece using AI, to see what would happen. Here's Robert describing how that went down with his editor.
Speaker 3: I thought it would be a funny joke to play on my editor, Bill Wasik, that, you know, he assigned me the article and then, like, two hours later I'm like, here you go, here's my invoice, I love this new future. No, it was a little bit of a lark, but it was also, you know, you're thinking about AI and jobs, and you're thinking about what the future jobs are. So I just thought it was a logical place to start: okay, well, what will my role be as a writer? And you know, freelance journalism is a hard place to be in, and we're not extremely well-paid people anymore. So if I can write more, I can make more money, you know, just as a purely capitalistic play. Like, that's the dream, right? But of course that wasn't possible. It wasn't nearly good enough. And I would suggest this to anybody out there who's very worried about it: just go have it do your job. Just try it, because it'll teach you a lot about how far it still has to go, which can be surprising in our current hype cycle.
Speaker 2: Right. To absolutely no one's surprise, the AI version of Robert's story was not published, because he couldn't risk his reputation on an article where AI may or may not be hallucinating, or may or may not be exaggerating things or understating things.
Speaker 1: You know, we've talked about cognitive offloading a bunch on this show, basically atrophying your own skills by using AI too much, and that's something that Robert said he's very conscious of.
Speaker 3: When you start to really work with the AI, at some point you start to hand over your sense of taste and your sense of, uh, like, is this good? You sort of hand over the authority to the AI, and the AI doesn't have any capability to accept that authority. It will tell you, like, oh yeah, this is great, this is true, but it's just a machine, right? It doesn't really have the ability to accept that moral responsibility. At some level there are just these elements that have to come from the human, because the AI is just not capable of providing them.
Speaker 2: You know what I love the most is that after using AI, his next step was to reach out to a bunch of people, which I know is basic reporting, but I think it really speaks to the core of his argument that AI will need the human touch for quite some time to come, right?
Speaker 1: Human work will continue, but the nature of the work may adapt to the AI revolution, and Robert has some interesting speculations about what the jobs of the future might look like and how they fit into three distinct buckets: trust, integration, and taste.
Speaker 2: So we started out by asking how he came up with these three buckets. Here's the rest of our conversation.
Speaker 3: The first person I called for the article was Ethan Mollick, who wrote the book Co-Intelligence, who's a professor at Wharton, and who thinks and writes a ton about AI. And I just wanted somebody to talk with about, like, hey, how should I even be approaching this? And right away, even from that first conversation, I knew that this was going to be way more philosophical than anticipated, because he was basically like, there's no way I can tell you specific jobs.
Speaker 1: Why did he say that he couldn't tell you specific jobs?
What was the kind of intellectual exercise, or missing step, from where we are today to knowing what those jobs will be?
Speaker 3: It's just too fast-moving, you know. One of his big phrases, which he told me a couple of times, was: it depends on how good the AI gets, and how fast. Right? But then there are some things... and, you know, he talks a lot about the jagged frontier of AI, which is sort of like, the AI can be really good at some things and then just really horrible at some things you would expect it to handle, even the basics, like estimating the word count of an article, right? You'd expect any human to nail that pretty easily. But when you think about all the different jobs, or even your job, or any job: at what level are you bringing some sort of moral or technical or whatever responsibility to that job? Like, you are signing off on something. You are the person saying, yes, this is good and right and best and, you know, whatever it be. And that can exist in a lot of different things. In writing this article, I'm accepting responsibility for the truthfulness and accuracy of this article, and the AI can't. It's just not part of our moral world, right? You can't turn to the AI and blame it when something goes wrong. I mean, you can, but it doesn't care. And so you can think about the extreme far end of that, like autonomous warfare or something, where, like, AI robots kill somebody; they can't accept moral responsibility. So the first category was trust: where are humans gonna still be very necessary, where does AI need them to authenticate and to provide that sort of trust?
Speaker 1: I mean, to your point, this piece appeared in The New York Times Magazine.
And when we think about, like, the history of newspapers, or the history of publishing, or the history of publishing houses, there's this concept of imprimatur: like, I know that if this article or this book or this movie comes from this studio or this producer or whatever, then I can trust it to a certain extent. The output of AI is often entirely divorced from its creator. But, you know, you mentioned in the piece some of the roles that may emerge in this trust bucket, like AI auditors, trust authenticators, AI ethicists. And I wanted to ask you, obviously one of the things you have to think about is how much cultural demand will there be for these types of things. I mean, we're living in a world now of alternative facts and conspiracies, and I'm just wondering, do you think the demand signal will be there from the world, even if it should be, morally?
Speaker 3: I think that there are a whole lot of tasks and a whole lot of things that we really would love AI to do, right, that we would be just fine with AI doing, and in fact we probably already do a lot of them, right? Like, I have AI transcribe my interviews, and I don't think twice about the moral implications of doing that. And I think that, as humans, we sometimes jump to these very extreme cases, right? Like, I just used the warfare example, but, you know, replacing human creativity and things like that. And yeah, we're gonna have to think very carefully about the lines there. But I think that there's a whole bunch of tasks that you're just fine with. And so, you know, one example that I kind of reference in the piece, in trust, is that I thought that HVAC repairmen might be some of the last people to be affected by AI. But in fact, HVAC repairmen have to do a lot of paperwork, right, and a lot of things that are not really core to what they enjoy about the job.
But if they're using AI to do their contracts and to do their paperwork, at some point someone needs to validate that those contracts are accurate, that they're legally fair, because you can't trust the AI; the AI is not a trustworthy entity. So even trust comes in there: somebody needs to validate that. And it becomes a little bit different when you haven't created the thing, the contract or the whatever, yourself. It's the same problem I was having with the article. You have to have a slightly different set of skills to be able to say, I'm very familiar with where the jagged edge of AI is, what kind of mistakes it makes. And there really can be very weird mistakes, right, unexpected ones; it's not like backreading something a human wrote, you know. And you can scale it up to where it's more complicated than HVAC contracts, and it's something in an organization where they have a whole chain of systems, and somebody has to really know the AI well enough to be able to go through that chain and give it the human approval of, like, yes, this is trustworthy and accurate. And I think when I talk about ethicists, you know, a big organization might be integrating AI enough into the organization that they sort of have to have some level of explainability, right, and some level of justification of, like, we let the AI make these decisions and not these decisions, and here's why we do it. And those things need to be explainable in a way that is satisfying to all kinds of constituents, right? So investors, customers, you know. But it can be, like, if the organization ends up in court, right, they have to explain it to a judge and a jury in a way that they're like, okay, that's rational and ethically sustainable.
So, 207 00:10:07,520 --> 00:10:08,719 Speaker 3: you know, one of the things I like to say 208 00:10:08,800 --> 00:10:11,360 Speaker 3: is that like the AI boon might actually be a 209 00:10:12,160 --> 00:10:15,320 Speaker 3: big boost for philosophy majors who can think through the 210 00:10:15,720 --> 00:10:19,000 Speaker 3: sort of philosophical implications of how the AI is used 211 00:10:19,000 --> 00:10:24,239 Speaker 3: in an organization and create a rational chain of ethical 212 00:10:24,640 --> 00:10:28,040 Speaker 3: explainability for why it's done that way. Because so much 213 00:10:28,040 --> 00:10:31,600 Speaker 3: of large corporations comes down to as they get bigger, 214 00:10:31,640 --> 00:10:34,199 Speaker 3: they're just there's just more and more liability everywhere. 215 00:10:34,679 --> 00:10:37,720 Speaker 2: Maybe one day there will be an AI hallucination interpreter, 216 00:10:38,559 --> 00:10:41,240 Speaker 2: which would be very interesting. I think, I don't know. 217 00:10:42,120 --> 00:10:45,320 Speaker 2: The second bucket is integration, and these jobs seem to 218 00:10:45,360 --> 00:10:48,040 Speaker 2: be more technical in nature. Can you talk a little 219 00:10:48,040 --> 00:10:51,600 Speaker 2: bit more about the integration category and what those jobs 220 00:10:51,640 --> 00:10:52,080 Speaker 2: will be like? 221 00:10:52,640 --> 00:10:55,360 Speaker 3: Sure? So another expert I talked to was a fellow 222 00:10:55,400 --> 00:10:58,640 Speaker 3: named Robert Siemens, who's a professor at New York University 223 00:10:58,679 --> 00:11:00,920 Speaker 3: who studies a lot of this, and he's like, well, 224 00:11:01,000 --> 00:11:03,040 Speaker 3: certainly there's going to be some technical rules, right, like 225 00:11:03,160 --> 00:11:06,280 Speaker 3: people need to understand the AI, and not just like 226 00:11:06,320 --> 00:11:08,040 Speaker 3: we know how to build models. We know how to 227 00:11:08,040 --> 00:11:11,280 Speaker 3: build AI. But these are sort of the most immediate 228 00:11:11,440 --> 00:11:14,600 Speaker 3: jobs that you can see coming up that'll be very 229 00:11:14,920 --> 00:11:17,680 Speaker 3: big for the next some amount of years, you know, 230 00:11:17,720 --> 00:11:20,920 Speaker 3: and they may shrink as the AI changes, right, we 231 00:11:21,000 --> 00:11:23,679 Speaker 3: may need less integrators over time, but like right from 232 00:11:23,720 --> 00:11:26,040 Speaker 3: the jump, you need someone at your organization who really 233 00:11:26,120 --> 00:11:28,640 Speaker 3: understands the models, can really dig down, can really understand 234 00:11:28,679 --> 00:11:30,720 Speaker 3: what the I is doing and why, and can help 235 00:11:30,800 --> 00:11:35,040 Speaker 3: map it to you know, the specific company's peculiarities and 236 00:11:35,160 --> 00:11:38,880 Speaker 3: needs and wants and desires, right, And so one of 237 00:11:38,920 --> 00:11:41,200 Speaker 3: the first ones he talked about was just the AI auditor, 238 00:11:42,000 --> 00:11:47,240 Speaker 3: someone who can go and just really create some sort 239 00:11:47,280 --> 00:11:50,240 Speaker 3: of understanding of the AI for people in the business. 240 00:11:51,280 --> 00:11:53,120 Speaker 3: So you know, you can almost think of these as 241 00:11:53,120 --> 00:11:56,760 Speaker 3: like a sort of twist on or an addition to 242 00:11:56,840 --> 00:12:01,080 Speaker 3: your typical IT manager, but isn't quite so. 
technically focused, right? They're just like, okay, our sales teams and models are not showing this, we're not hitting the optimization right. So, just keeping up with the models, and, like, which one is actually best at algebra right now, right? Which one is actually best at writing right now? It changes every few months. Just even keeping up with that, if it changes at this pace, can be a pretty substantial job. So you can see AI being a great optimizer and a great tool, but someone needs to understand the system enough to work with the AI to make sure that, like, oh, we're having this problem, people aren't washing their hands enough, how can AI help us in a hospital setting? Someone needs to be thinking about how to make the AI get to the outcomes that they want to see. Because it's a very powerful tool for some of these things, but, you know, it needs someone kind of prompting it and helping and integrating it into these very, very complex, big organizations.
Speaker 2: After the break, why tastemaking will be the industry of the future. Stay with us.
Speaker 1: So the first bucket was trust, essentially human in the loop, a category of jobs around ultimately taking responsibility, right, or being the final arbiter of what is just, what is legal, what is correct. The second bucket is integration, which is essentially, like, how do we know enough about these tools to harness them effectively?
Speaker 3: Optimize them and harness them.
Speaker 1: The third bucket is taste, and this one I think relates most directly to what you, Kara, and I do for a living. And I was curious, how did you choose that word, and what does it mean in the context of your piece and the jobs that may emerge?
Speaker 3: Yeah, I mean, I think taste is going to be very core to a lot of things, and not just creative jobs.
It's also the one where I feel like people kind of perk up, because it's also sort of humanly appealing. But, you know, as I was just thinking about it and talking to people, and I use this in the piece, I just had this viral clip of Rick Rubin in my head, right, of him on 60 Minutes, and Anderson Cooper talking to him and saying, like, do you know how to work a soundboard? And he's like, no. You know how to play instruments?
Speaker 2: No.
Speaker 3: Do you know anything about music?
Speaker 2: No.
Speaker 3: And he's like, well, what do you do? And his answer, I'll paraphrase, I don't have it in front of me, is, you know, I am very confident in my taste. Right? And so I had this in my head because, you know, just thinking about what do we still need humans for? And at a base level, the AI doesn't want anything on its own, right? Unprompted, it will just sit there, idle. So at some point, the basic human-AI interaction is the human asking the AI for something. And to me, one logical route you go down, if you explore that enough, is that the human is providing the taste for what it wants created. And by created, I mean it could be a creative thing, like a piece of music, or it could be a non-creative thing, like a business process, right? But at some point somebody is looking at it and saying, like, hey, I have the vision for what I want. But what they are really doing is making creative choices. And they're not wrong or right choices; they're choices of aesthetic, or they're choices of function, right? There are multiple ways to find a solution, but they're sort of using their taste and their judgment to make these creative decisions to get to their outcome.
Speaker 1: Kara, I'm curious for your take here, because, you know, wearing your other hat, you're a TV producer, a successful TV producer.
How do you think about this idea of taste being a key place of human irreplaceability?
Speaker 2: Taste is certainly, I think, the final frontier to be messed with. I try to think of it: does AI have taste? AI has taste insofar as what is fed to a large language model. If you feed a lot of Beethoven to a large language model and it's spinning out music that is supposed to be modeled after Beethoven or Bach or whoever, you still need a human being who's going to decide: is this music good? I think things that are synthetically made by large language models can be good. But I still think there's someone who's deciding, ultimately, if that thing is good or not. And that's a person.
Speaker 3: And that's what I mean: at some level, someone's making those decisions. And I think, again, there are these far-out examples of, like, oh, I use the AI to do everything, and that's going to have a certain quality. But there's also, oh, I use the AI to do something and I do some things myself, right, I do some things analog, and that's going to have a different quality, right? Those can both exist.
Speaker 2: Do you find that quality control, that QCing, has become a harder job? Like, I do notice that, even just in LinkedIn posts or in Instagram posts, people are relying more on ChatGPT to create the language that they're using. And there is a sort of enshittification of things, because we have become used to just accepting that things aren't as good anymore, because people are using ChatGPT to produce content, I know. Is there a job that exists to push back on enshittification?
Speaker 3: I guess. I mean, I've done enough writing experiments now that I can tell; especially LinkedIn is just rife with it. There are just certain constructions. The em dash is one, but, like, I love the em dash. The one that gets me is the "it's not X, it's Y."
So, once I tell you this, you'll see it on LinkedIn in, like, every single post. It's like: the future is not, you know, apples and oranges, it's bananas. That construction, just standing on its own, is, like, when the AI is trying to be a muscular copywriter, especially Claude. It just loves that construction, right? And yeah, I think it's funny, because when I think about AI in creative fields, I try to think about it sort of more optimistically. In writing, there's this extreme example of, like, I just use it to write the whole thing, or I use it to do whatever. There's a lot of room between where we are and that being the outcome, and there's a lot of, sort of, positive room in between. And so, like, one of the things that I also do is I work on documentary films, and I'm making a documentary film right now about AI weapons. And it's going to cost a little over a million dollars to make that film, right? And that's relatively inexpensive for a documentary film. And would I love it if AI could help me make that film for five hundred thousand dollars? There's a lot it can't do. It's not gonna, like, DP my shoots and stuff. But, you know, I have an editor, and it's gonna take months and months to get the edit together, and I wouldn't want to replace that editor, because that editor is a valuable story collaborator whose taste I love. But I would love the tools to help him be able to do in ten weeks what takes him twenty, because it can do sort of first cuts quickly for him, and it can let him try things much faster, so therefore we can make that documentary faster and we can move on to another one.
I would absolutely love that, right? And I think that he would too. But to go right to, like, well, let's just have the AI edit the whole thing and it'll be super cheap? Yeah, but it'll suck.
Speaker 1: But what about with this piece, though? Because V1 you knocked out in two hours using AI, and maybe it came in as a little bit of a straw man in the piece, right? But on V2, which you wrote yourself, were there places where you helpfully leveraged AI to make it better? And if not, when do you imagine starting to do that, and in what way?
Speaker 3: So I didn't use AI in this piece for that. And I would say one of the reasons is I have editors at The New York Times Magazine who are very good.
Speaker 1: And even better than GPT-5.
Speaker 3: But what I will say is I've done a lot of experiments with writing, in both fiction and nonfiction, trying to figure out where to use the AI. And what I find is that, for me, it becomes unglued very quickly. Even Mollick talks about this in the piece, which I very much agree with. He says that he never lets the AI create a first draft, and that's something that people have talked about: like, oh, use it to get over your writer's block, have it create a first draft. But his point, which I find for myself too, is that the AI starts to dominate your thinking. It puts you in the AI box, and then you're like, well, is this where I would have gone without the AI? And you sort of end up in your own existential crisis. But then I find, even as I use it as an editor, I'm sort of saying, like, hey, is this good? And it'll start to give some suggestions, like, oh, that's great, you could probably lean on this point a little harder, or maybe this is a good opportunity to introduce some moral ambiguity or whatever.
But as you start using it to do that, you're handing that taste to the AI, right? In the same way that, when I write a piece for The New York Times, I give it to my editor because I'm so close to it that I'm like, is this even any good? Am I any good? Have I ever been any good at anything in my life? That's sort of the writer's descent into madness. But you're trusting your editor to tell you, like, yes, this is very good, right, this part's working, this part isn't working. I find it really fraught to hand that to the AI, because it really doesn't know, but it will tell you that it knows.
Speaker 1: When we look at the three buckets in this article, which are trust, integration, and taste, one of the things that strikes me is that these have a somewhat pyramid-like structure, right? So, ultimately, when it comes to trust, somebody has to take responsibility, legally, morally, or otherwise, for outputs. When it comes to integration, somebody has to diligence the tools and decide which ones are helpful and how to integrate them. And when it comes to taste, somebody has to be a tastemaker; somebody has to be a talented creative or an experienced creative, whether it's in music or organizational design or writing or whatever it is. So I guess my question is: how many of the jobs that will emerge in the AI revolution, because you also came up with the concept of the AI plumber, right, are very much top-of-the-pyramid types of jobs? And was that a consideration in the piece? Like, even if there are these new jobs created, will they make up for the jobs lost?
Speaker 3: You know, I would say that the piece is going to be kind of, and I'm going to be kind of, woeful at answering that, because I think, while I did it under the structure of listing some new jobs, really my hope was to help people philosophically think about where new jobs will be, and maybe even what their own job will be, and what interacting with AI will be like. As I say, I think that part of it is thinking about where the AI needs humans. I try to resist the idea of it as humans serving the AI, because, again, the AI doesn't want anything; if you follow the chain far enough, eventually you get to a human that wants the thing, right? So it's like, where does the AI need the human? It needs some trust. And it doesn't really understand your business, so it needs some integration. And it doesn't really understand what you want, so it needs your taste. So it's hard to say how much will be, sort of, AI plumber versus, sort of, setting the taste. You know, I think that probably, ultimately, the tastemaker part of it is so fundamental to human creation of any kind that that will be the longest lasting, right? I think integration might be the shortest lasting, but it could still be decades. It's interesting, because every single person I talked to expressed serious trepidation about where we are headed in terms of jobs. All of them would say AI is going to result in more prosperity, is going to increase wealth. But will that prosperity and wealth accrue to capital, or will it accrue to labor? Right? And that's the big unknown that I think people really kind of wring their hands about. And I think that we are definitely at the place where it could go either way.
I am a little bit of a firm believer in, you know, sort of the phrase "the car goes where the eyes go": we will go where we sort of collectively point ourselves to go, right? None of this is foreordained; the technology does not make it guaranteed to go one way or the other. I think that, in theory, right, you have a bunch of companies working on this, a bunch of them are playing with open source, and in theory the tools will be commodities. Which is to say, unless OpenAI and unless Anthropic and everybody decide that they've gotten to a model that's so good that they just close it off for themselves, we'll all have access to these tools. That should, in theory, and this might not be very comforting, but in theory that should be democratizing, right, like the Internet was. That should enable more of us to be able to do more things, and that should actually be threatening to bigger organizations. So my hope is that we should be entering a very entrepreneurial age, where small teams of people can take on big incumbents. But we're in this weird quirk area right now where it doesn't look like that, right? What it looks like is we have these massive monoliths that are just getting stronger and more impenetrable, which I think is part of why it seems so scary, right? They'll just accumulate cash till the cows come home, and the rest of us are going to be poor. And I do think that, in the sense of being a corporate cog in one of those massive enterprises, there's never been a worse future for that. But if you can start to think entrepreneurially, if you can start to think about, like, hey, how can I use these tools to do something that excites me,
right, something that I'm really thrilled about? Like, I want to make documentary films, right, and it's so expensive, but, hey, maybe the costs can come down and I can start to tell the stories I want to tell, in the ways I want to tell them. That's just for me. You know, my hope is that there's all sorts of stuff that we don't see coming that is going to be really empowering to people. I don't know that that's true, right? There's certainly enough worry that it's just capital building on itself, which would require some other kind of intervention, and that's fairly bleak. But again, I think the tools will be commodities, not the people.
Speaker 1: Robert Capps, thank you for joining us on Tech Stuff.
Speaker 3: Thank you, thanks for having me.
Speaker 2: That's it for this week for Tech Stuff.
Speaker 1: I'm Kara Price, and I'm Oz Woloshyn. And this episode was produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced by me, Kara Price, and Kate Osborne for Kaleidoscope, and Katria Norvelle for iHeart Podcasts. The engineer is Beheid Fraser, and Jack Insley mixed this episode. Kyle Murdoch wrote our theme song. Please do rate, review, and reach out to us at tech stuff podcast at gmail dot com. We love hearing from you.