Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio.

Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and Jerry's here too, and that makes this a timely, topical... not timely as in fork tines. I meant to say timely, topical episode of Stuff You Should Know. That's a great forecast of...

Speaker 1: How it's going to go, too, I think. Fork-cast. Oh God, is this really us, or is it AI-generated Josh and Chuck?

Speaker 2: Until this year I would have been like, don't be preposterous. Now I'm like, just give it some time.

Speaker 1: You know how we would know is if one of us said, "Of course it's the real us. We met in the office and bonded over our Van Halen denim vests."

Speaker 2: Yeah, we'd be like, "Sucker, you just fell for the oldest trap in the book, the Sicilian switcher."

Speaker 1: Yeah, aka the fake Wikipedia entry stuff.

Speaker 2: Is that still up?

Speaker 1: I haven't been to our Wikipedia page in years, so I don't know.

Speaker 2: Well, regardless, we're not talking about Wikipedia, although it does kind of fall into the... sure, it figures into the rubric of this. Not sure if I used that word correctly, but it felt right. We're talking today about what are known in the biz as large language models, but more colloquially known by basically their public-facing names, things like ChatGPT or Bard or Bing AI. But essentially what they all are are algorithms, artificially intelligent algorithms, that are trained on text, tons and tons and tons of text, written English-language stuff, and that are so good at recognizing patterns in those things that they can actually simulate a conversation with you, the person on the other side of the computer asking them questions.

Speaker 1: Yeah, it's gonna be fun doing this episode over every six months, right, until we're replaced totally.
Speaker 2: So I think we should say, though, like, this is such a huge, wide topic, and we're just in the ignition phase, like the fuse just caught, right? So we're going to really try to keep it narrow, just strictly to large language models and the immediate effect they're planning on having, or they're going to have... hopefully not planning on anything yet. But I really would like to do one on how to keep AI friendly and keep it from running away. Yeah, I say we just kind of avoid that whole kind of stuff for now, and really I'm talking to myself right now, at least for this episode.

Speaker 1: Okay. Yeah, you know, we're going to kind of explain how these things work and what the initial applications look like and kind of where we are right now, and then what it could mean for, like, jobs and the economy and stuff like that. But you're right, it is a whole ball of wax, as you well know. And this is a great time to plug The End of the World with Josh Clark, which is still out there. You can still listen to it.

Speaker 2: The truth is out there, in the form of The End of the World with Josh Clark.

Speaker 1: Yeah, it's a great ten-part series that you did, and AI is among those existential risks that you covered.

Speaker 2: Yeah, it's episode four, I believe. And Chuck, like, just from having done that research and forming my own opinions over the years about this, it's staggering to me that we've just entered what's going to be the most revolutionary transitional phase in the entire history of humanity. You can argue everything else took place over very long periods of time. We started playing with stone tools and then we started building cities, and all this stuff took place over thousands and thousands, hundreds of thousands, millions of years. We just entered a period where stuff's going to start happening within weeks pretty soon. As of twenty twenty three, the whole thing just started.
Speaker 1: Yeah, and none of this was around like this when you did The End of the World, and that was, what, like five years ago?

Speaker 2: Ish. Yeah, it was twenty eighteen. All this was being worked on, but we hadn't hit that point. Like, all this was pretty much predicted and projected, and it was clear that this was the direction people were going.

Speaker 1: And it's here, baby, it is.

Speaker 2: It's nuts, but it's actually here. So we're talking about large language models, which are a type of neural network that's easiest to think of in terms of, like, a human brain, where you have neurons that are connected to other neurons, but they're not connected to some other neurons, and all of those neural connections kind of are activated by input, and then that puts out something, like your conscious experience, or you say a sentence or something like that. It's very similar in its most basic nature, I guess.

Speaker 1: Yeah. I mean, Olivia helped us out with this, and she did a great job, I think. And Google themselves basically say, you know what, it's really sort of like how, when you go to search for something on our search engine tool here... weird way I could have said that.

Speaker 2: I don't think so.

Speaker 1: Our handy search tool, then. You know, basically what we're doing is autocompleting, like, an analysis of, like, probability of what you're typing. If you type in, you know, John Coltrane, or start to type in "John Col," it might finish it out as "John Coltrane A Love Supreme" or "John Coltrane jazz." And they're saying, you know what, what is happening now with these LLMs is the same thing. It's just it's got way more data, way more calculations in the algorithm. So it's not just completing, like, a word or two. It's potentially, you know, "hey, rewrite the Bible," or whatever you tell it to do.
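[To make that autocomplete idea concrete, here's a toy version in Python. The queries and counts are invented for illustration, not Google's actual data:]

```python
# Suggest completions for a prefix by ranking past queries by count.
query_counts = {
    "john coltrane a love supreme": 9000,
    "john coltrane jazz": 4000,
    "john coltrane giant steps": 2500,
    "john cougar mellencamp": 1200,
}

def autocomplete(prefix, k=2):
    matches = {q: n for q, n in query_counts.items() if q.startswith(prefix)}
    return sorted(matches, key=matches.get, reverse=True)[:k]

print(autocomplete("john col"))
# -> ['john coltrane a love supreme', 'john coltrane jazz']
```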
Speaker 2: Yeah. And the big difference is in the amount of info the neural network is capable of taking into consideration. So may I, for a minute?

Speaker 1: Oh, please.

Speaker 2: So imagine, with one of those autocomplete suggestion tools like they have on Google Search: if there's five hundred thousand words in the English language, that means that you have five hundred thousand words that a person could possibly put in. That's the input into the neural network. And then there's five hundred thousand possible words that that network could put out. So you have five hundred thousand connections to five hundred thousand other connections. So it's, like, I think, two hundred and fifty billion connections you're starting with right there, and that's just the autocomplete suggestion. Because, based on those connections, in studying words in the English language and phrases in the English language, it places emphasis more on some connections than others. So John Coltrane... what's his album? I can't remember... A Love Supreme, classic album. So "John Coltrane" is much more closely related to "A Love Supreme" in the mind of a neural network than "John Coltrane Charlie Brown disco" is, right? Just to take something off the top of my head. And so based on that analysis, and that weight that it gives to some things over others, it suggests those words. The large language models like ChatGPT that we're seeing today, they do the same thing. They have all those same connections, but the analysis they do, the weight that they put on the connections, is so much more advanced and exponential that it's actually not just capable of suggesting the next word, it's capable of holding a conversation with you. That's how much it understands how the English language works.
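[A toy picture of those weighted connections, with invented numbers. Making the suggestion is just following the heaviest connection:]

```python
# Each (word, next phrase) connection carries a weight; suggesting is
# just picking the heaviest one. The weights here are made up.
connection_weight = {
    ("coltrane", "a love supreme"):      0.92,
    ("coltrane", "jazz"):                0.71,
    ("coltrane", "charlie brown disco"): 0.0001,
}

def suggest(word):
    options = {nxt: w for (prev, nxt), w in connection_weight.items()
               if prev == word}
    return max(options, key=options.get)

print(suggest("coltrane"))   # -> 'a love supreme'
```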
Speaker 1: Yeah. Like, if you said, you know, "write a story about wintertime," and, you know, it got to the word snowy, it would go through, you know... and this is, like, instantaneously it's doing these calculations... it might say, like, you know, "hillside" or "winter" or "snowy day." Like, these are all things that make sense, "because I've learned that that makes sense," I being, you know, the chatbot or whatever. But it probably won't be "snowy chicken wing," because that doesn't seem to fit the algorithm. And it learns all this stuff by reading the internet. And, you know, put a pin in that, because that's pretty thorny for a whole lot of reasons, not the least of which is the fact that some companies, and again we'll get to it, are starting to say, like, "wait a minute, we created this content, and now you're just scraping it and then using it and charging people to use it, and we're not getting a piece of it." So that's just one tiny little thorn. But in order to do this, like you said, it's like it needs to know more. And you came up with a great example, like the word lemon. In a very basic way, it might understand that a lemon is roundish and sour and yellow. But if it needs to get smart enough to really write as if it were a human, it needs to know that it can make lemonade, and that it grows on a tree in these agricultural zones, and that it's a citrus fruit, because it has to be able to group lemon together with like things. And those groups are either, like, you know, "hey, it's super similar to this," like maybe other citrus fruits, or it's, you know, sort of similar to this but not as similar as citrus fruits, like desserts. And then you get to chicken wings. Although actually that's not true, because lemon chicken wings.

Speaker 2: You could have lemon pepper chicken wings.

Speaker 1: Right, yeah, that's what I'm saying. So, yeah. But the instance you used is, like, Greenland, which I guess doesn't grow lemons.
Speaker 2: No, but I mean, I'm sure they import lemons, so there's some connection there. But based on how connected, how often these words show up together in the billions and billions of lines of text that these large language models are trained on, it starts to get more and more dimensions and make more and more connections, right? So as that happens, words start to cluster together, like lemon and pie and icebox all kind of cluster together. And by taking words and understanding how they connect to other words, you can take the English language, just the words of the English language, and make meaning out of it. That's all we do. And large language models are capable of doing the same thing. But it's really, really important for you to understand that the large language model doesn't understand what it's doing. It doesn't have any meaning for the word lemon whatsoever. All of these dimensions that it weights to decide what word it should use next, they're called embeddings. They're just numerical representations. So the higher the number, the likelier it is it goes with the word that the user just put in, or that the large language model just used; the lower the number, the further away it is in the cluster, right? It doesn't understand what it's saying to you. And as we'll see later, that accounts for a phenomenon that we're going to have to overcome for them to get smarter, which is called hallucinations. But that's a really critically important thing to remember.
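[A hand-rolled sketch of what those embeddings look like. These are made-up three-dimensional vectors; real models learn hundreds or thousands of dimensions from text. The clustering idea is the same, though: similar words point in similar directions:]

```python
import math

# Invented 3-d embeddings, for illustration only.
emb = {
    "lemon":        [0.90, 0.80, 0.10],
    "orange":       [0.85, 0.75, 0.15],
    "dessert":      [0.50, 0.60, 0.30],
    "chicken wing": [0.10, 0.20, 0.90],
}

def similarity(a, b):
    # Cosine similarity: near 1.0 means "same cluster", near 0 unrelated.
    dot = sum(x * y for x, y in zip(emb[a], emb[b]))
    return dot / (math.hypot(*emb[a]) * math.hypot(*emb[b]))

for word in ("orange", "dessert", "chicken wing"):
    print(word, round(similarity("lemon", word), 2))
# orange scores highest and chicken wing lowest: lemon's nearest neighbors.
```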
Speaker 1: Yeah. Another critically important thing to remember, and you probably get this from what we've said so far if you already know a little bit about it: there's no programmer that's teaching these things and typing in inputs and then saying, "here's how you learn things." Like, it's doing this on its own, and it's learning things on its own. And what we're talking about eventually, like, you know, where it could get super scary, is when it gets to what's called emergent abilities, where it's so powerful and there's so much data that the nuance that's missing now will be there.

Speaker 2: Right, exactly. So yeah, that's when things are going to get even harder to understand... you know, to remind yourself that you're talking to a machine, you know?

Speaker 1: Yeah. And the other thing, too, though, even though I said humans aren't inputting this data: one of the big things that is allowing this stuff to get smarter is human feedback. It's called RLHF, which is reinforcement learning from human feedback. So at the end of whatever you've told it to create, you can go back in and say, "well, you got this wrong and this wrong, this is what that really is," and it says, "thank you, I have now just gotten smart."
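[A back-of-the-napkin sketch of the preference-learning idea behind RLHF. This is nowhere near a real training pipeline; the features and the feedback pair are invented. It just shows how "A was better than B" feedback can tune a score that then steers a model's answers:]

```python
import math, random

def features(answer):
    # Hypothetical hand-picked signals; a real reward model scores the
    # whole text with a neural network instead.
    return [len(answer) / 100.0, float("elephant" in answer)]

def reward(w, answer):
    return sum(wi * xi for wi, xi in zip(w, features(answer)))

# Human feedback: (preferred answer, rejected answer) pairs.
comparisons = [
    ("No mammal lays large eggs; almost all bear live young.",
     "The elephant lays the largest eggs of any mammal."),
]

w = [0.0, 0.0]
for _ in range(500):
    good, bad = random.choice(comparisons)
    gap = reward(w, good) - reward(w, bad)
    step = 1.0 / (1.0 + math.exp(gap))   # how wrong the ranking still is
    for i in range(len(w)):
        w[i] += 0.1 * step * (features(good)[i] - features(bad)[i])

# The learned reward now ranks the correction above the hallucination.
assert reward(w, comparisons[0][0]) > reward(w, comparisons[0][1])
```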
Speaker 2: Exactly. So one of the reasons why these things are suddenly just so smart and can say "thank you, I've just gotten so much smarter" is because of a paper that Google engineers published openly in twenty seventeen, describing what's now, like, the essential ingredient for a large language model, or probably any neural network from now on. It's called a transformer. And rather than analyzing each bit of text... let's say you say one of the very famous things: Marvin Minsky was one of the founders of the field of AI, and his son Henry prompted ChatGPT to describe what losing a sock in the dryer is like, in the style of the Declaration of Independence, right? So, depending on how Henry Minsky typed that in: before transformers, the neural network would analyze each word and do it one increment at a time. Maybe not even words, sometimes strings of just letters together, phonemes even, if you can believe it. And what the transformer does is it changes that. It allows it to analyze everything all at once. So it's so much faster, not just in putting out a coherent answer to your question or request, but in also training itself on that text. So you just feed it the internet and it starts analyzing it and self-correcting. It trains itself, it learns on its own. And that, unfortunately, also makes AI, including large language models, what are known as black boxes. We don't know how they're doing what they're doing. We have a good idea how to make them do the things we want, but the in-between stuff, we cannot one hundred percent say what they're doing, how they come up with these conclusions. Which also explains hallucinations and them not really making sense to us.
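[A minimal sketch of the attention step at the heart of a transformer, with the learned projections stripped out. The point is the shape of the computation: every token is compared with every other token in one parallel pass, instead of one increment at a time. Real transformers add learned query, key, and value matrices and stack many such layers; this is just the skeleton:]

```python
import numpy as np

def attention(X):
    # X has one row per token. scores compares all tokens with all
    # tokens at once: the "analyze everything all at once" part.
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax per row
    return weights @ X   # each token becomes a weighted blend of all tokens

tokens = np.random.randn(6, 8)    # 6 tokens, 8-dimensional embeddings
print(attention(tokens).shape)    # (6, 8): all six updated in one pass
```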
Speaker 1: Yeah, and you know, the T in GPT stands for transformer. It's "generative pre-trained transformer." And the reason they call it GPT for short is because if they called it "generative pre-trained transformer," everybody would be scared out of their minds.

Speaker 2: We'd just start running around to nowhere in particular.

Speaker 1: Yeah. Should we take a break?

Speaker 2: I say we do. I think that we kind of explained that fairly well.

Speaker 1: Yeah, fairly robust beginning, my friend. All right. So OpenAI launched their ChatGPT very recently, in November of twenty twenty two, and just in that brief window, which was like six or eight months ago, things are kind of flying high, and all kinds of companies are launching their own stuff. Some of it is... well, first of all, OpenAI is now at GPT-4, and I'm sure, you know, more will be coming in quick succession. But companies are launching, and we're going to talk about all of them, kind of, both broad stuff like ChatGPT and really specific stuff, like, "well, hey, I'm in the banking business, can we just design something for banking?" Or just something for real estate. So they're also getting specific on a smaller level, in addition to these large ones like Google and Microsoft and Bing and all that stuff.

Speaker 2: Yeah. And to get specific, all you have to do is take an existing GPT, a large language model, and add some software that helps guide it a little more, and there you go. Or just train it on specific stuff, like medical notes. That's another one. One of the other things that's changed very quickly between November of twenty twenty two and March of twenty twenty three, when I think GPT-4 became available... just think about that, that's such a short amount of time... all of a sudden, now you can take a picture and feed it into a large language model and it will describe the picture. It will look at the picture, essentially, and describe what's going on. There's a demonstration from one of the guys from OpenAI who doodles, like, on a little, like, scratch piece of paper some ideas for a website. He takes a picture of that paper that he's written on, feeds it into GPT-4, and it builds a website for him in a couple of minutes that functions the way he was thinking of on the doodle scratch pad.

Speaker 1: I wonder if the only way to slow this stuff down is to literally slow down the internet again. Go back to, like, the old days, when a picture would load, like, three lines at a time, right? And it would describe a picture and be like: someone's hair... someone's nose... someone's chin, don't forget, in between... Yeah, an hour later, you have a complete picture.

Speaker 2: Right. I don't think there's any way to slow this down, because we're in, not to be alarmist, but we're in a second-worst-case scenario for introducing AI to the world. Which is: rather than state actors doing this, which would be really bad, we have private companies doing it, which is just slightly less bad. But they're competing in an arms race to get the best, brightest, smartest AI out there as fast as they can, and they're not taking into account, like, all of the downsides to it. They're just throwing it out there as much as they can. Because one of the ways that these things get smarter is by interacting with the public. They get better and better at what they do from getting feedback from people using them.
Speaker 1: Yeah, even if it's for just some goofy, fun thing you're doing, it's learning from that. And you talked about the advancements made between the launch of 3.5 and GPT-4: 3.5 scored in the tenth percentile when it took the uniform bar exam, and 4 has already scored in the ninetieth percentile. And they found that GPT-4 is really... it's great at taking tests, and it's scoring really well on tests, particularly, you know, standardized tests. I think it basically aced all of the AP tests that you would take to get into AP classes. Well, it took a couple of AP classes' worth, but the max score is five, and I think it got fives kind of on everything except for math; it got a four in math. It's kind of weirdly counterintuitive, because it's a numbers-based thing, but it has more trouble with math, like rudimentary math, than it does with, like, constructing a paragraph on, you know, Shakespeare or something, or as Shakespeare. It does better with, like, math word problems and more advanced math than it does at just basic math, apparently.

Speaker 2: Or, like, describing how a formula functions using, you know, writing. The thing is, though, and this is another great example of how fast this is moving: they've already figured out that all you have to do is do what's called prompting, where you basically take the incorrect answer that the large language model gives you and then basically re-explain it by breaking it down into different parts, and it learns as you're doing that, and then all of a sudden it gets better at math. So they've figured out tools, extra software you can lay over a GPT, that basically teach it to do math, or prompt it in the correct way so that you get the answer you're looking for that's based on math.
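[A hedged sketch of that "break it down" prompting trick. ask_model below is a stand-in for whichever chat model you'd call, not a real API; the arithmetic is the only part guaranteed here:]

```python
# Asking straight vs. walking the model through the steps.
naive = "A shirt costs $25 after a 20% discount. What was the original price?"

step_by_step = (
    "A shirt costs $25 after a 20% discount. What was the original price?\n"
    "Work through it one step at a time:\n"
    "1. The sale price is 100% - 20% = 80% of the original.\n"
    "2. So the original price is 25 / 0.80.\n"
    "Answer:"
)
# ask_model(step_by_step) is far likelier to land on 31.25 than
# ask_model(naive), because each small step is an easy continuation.
```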
Speaker 1: Yeah. I mean, every time I read something that said, "well, right now it's not so great at this," I just assumed that meant "and we'll have that worked out in the next few weeks."

Speaker 2: Yeah, pretty much. I mean, because as these things get, like, bigger and smarter, and the data sets that they're trained on get wider, they're just going to get better and better at this. Because, again, they learn from their mistakes.

Speaker 1: Yeah, just like humans, right? Exactly like humans. So, you mentioned these hallucinations kind of briefly, and this is one of the big problems with them so far, that, again, I'm sure they will figure out in due time. But one example that Livia found was to prompt it with "what mammal lays the largest eggs?" And one of the problems is, when it gives hallucinations, or wrong answers, you know, it's not saying, "well, I'm not so sure about this." It's saying "this is true," just like anything else it's spitting out, with a lot of confidence. So the answer there was: "The mammal that lays the largest eggs is the elephant. Elephant's eggs are so small that they are often invisible to the naked eye, so they're not commonly known to lay eggs at all. However, in terms of sheer size, an elephant's eggs are the largest of any mammal."

Speaker 2: Which makes sense in a really weird way, if you think about it.

Speaker 1: Sure. Those little invisible eggs.
Speaker 2: Yeah. Because mammals don't lay eggs, obviously. But the way that it put it was, if you didn't know that mammals don't lay eggs, or you didn't know anything about elephants, you'd be like, "oh, that's interesting," and take that as a fact, because it's saying this confidently. And I saw written somewhere that one GPT actually argued with the user and told them they were wrong when they told the GPT that it was wrong, which is not a behavior you want at all. But that's what's termed a hallucination, and "hallucination" is a good way to understand it, I saw. Because, again, this GPT, this large language model, doesn't have any idea what it's saying means. It's just picked up... it's noticed patterns that we've not noticed before, and it's putting them together in nonsensical ways. But they're still sensible if you read them; it's just factually they're not sensible, because it doesn't have any fact-checking, necessarily. It just knows that what it's finding kind of correlates with other things. So there's some sensible stuff in there, like the phrase "invisible to the naked eye," or laying eggs, or elephants and mammals. Like, this stuff all makes sense. It's not like these are just strings of letters. It's just putting them together in ways that are not true. They're factually incorrect, and that's a hallucination. It's not like the computer is thinking that this is true. It doesn't understand things like truth and falsehood. It just creates, and some of the time it gets it really wrong.

Speaker 1: Yeah. It didn't know what an elephant is.

Speaker 2: No. It just knows that it correlates, in some really small way that we've never noticed before, to the word eggs.
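[A cartoon of that failure mode, with invented counts. "Plausible" is the only thing being measured; "true" never enters into it:]

```python
# Co-occurrence statistics stand in for everything the model "knows".
cooccurrence = {
    ("elephant", "largest"): 80,   # elephants show up near "largest" a lot
    ("largest", "eggs"): 120,
    ("largest", "trunk"): 2,
    ("mammal", "eggs"): 3,         # rare, but never zero on a huge corpus
}

def next_word(prev, candidates):
    # Whatever co-occurs most wins; there is no fact-check step at all.
    return max(candidates, key=lambda w: cooccurrence.get((prev, w), 0))

print(next_word("largest", ["eggs", "trunk"]))   # -> 'eggs'
```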
Speaker 1: Yeah. And this is... that's a problem. If it's just, like, "oh well, this thing isn't quite where it needs to be yet, because it thinks elephants lay eggs," fine. But there have already been plenty of real-world examples where people are using this and it's screwing things up for their business, or for commerce, or their client. Well, that's one: there was an attorney who was representing a passenger who was suing an airline, and he used ChatGPT to do research, and it came up with a bunch of fake cases that this attorney didn't bother to fact-check, I guess. And there were like a dozen fake cases that this attorney submitted in his brief.

Speaker 2: And it wasn't like... so, from what I understand, the brief was largely compiled from what the GPT spit out. It wasn't like the GPT just made up the names of cases; it made up the names of cases and then described the background of the cases and how they related to the case at hand, right? So it just completely made these up out of the blue. And yeah, that lawyer had no idea. He said in a brief later that he had no idea that this thing was capable of being incorrect. It was like one of the first times he'd used it, and he threw himself on the mercy of the court. And I'm not quite sure exactly what happened; I think they're still figuring out what to do about it.

Speaker 1: Maybe just go spend some quality time with your little chatbot.
Speaker 2: Exactly. Similarly, Meta had a large language model that basically got laughed off of the internet, because it was very science-focused and it would make up things that just didn't exist, like mathematical formulas. There was one it called the Yoko... or, no, the Lennon-Ono correlation, or something like that. Completely made up, this thing that I read. And I was like, "oh, that's interesting, I had no idea, I have never heard this stuff before," and I would have just thought that it was real had I not realized, and known ahead of time, that it was a hallucination, that this math thing does not exist anywhere. And it even attributed it to a live mathematician, said that this was the guy who discovered it. So, like, it really can get hard to discern what's true and what's not, which, again, is a really big problem, if we haven't gotten that across yet.

Speaker 1: Did they say that mathematician's name was Math B. Calculus? Yeah. Another example, and, you know, we're going to talk a little bit about replacing jobs and the various ways that can, and already is, happening. But CNET, for instance, said, "oh, you know what, let me try this thing out and see if we can get it to write an actual story." And so they got an AI tool to write one on what compound interest is, and there was just a lot of stuff wrong in it. There was some plagiarism, you know, directly lifted. So, you know, these things aren't foolproof yet, and it's definitely not something that should be utilized for, like, a public-facing website that's supposed to have really solid, vetted articles. Especially CNET, a tech site, of all things.

Speaker 2: That's something that the National Eating Disorders Association found out the hard way. They apparently replaced their entirely human-staffed hotline with a chatbot, and supposedly they were accused of doing this to bust the union that had formed there. And so when they released the chatbot into the world and it started offering advice to people suffering from eating disorders, it gave standard, you know, weight-loss advice, which you'd probably get from your doctor who didn't realize you had an eating disorder. But in the context of an eating disorder, it was all like trigger, trigger, trigger, one right after the other. Like, it was telling these people with eating disorders to, like, weigh yourself every week and try to cut out five hundred to a thousand calories a day and you'll lose some weight. Just stuff that would set everybody off. And very quickly they took it offline and, I guess, brought their humans back, hopefully at double the pay.
Speaker 1: Yeah. But I mean, this stuff is already being solved as well, because they point out that GPT-4 has already scored forty percent higher than 3.5, again just a handful of months ago, on these accuracy tests. So that is even getting better. And, you know, where I guess people want it to get to is to the point where it doesn't need human supervision to spit out really, really accurate stuff.

Speaker 2: Exactly. That's pretty much where they're hoping to get it. And I mean, they have the model, they have everything they need. It just has to be tinkered with.

Speaker 1: Now, should we take another break?

Speaker 2: I think so.

Speaker 1: All right, we'll take another break and then get into sort of the economics of it, and whether or not your job may be at risk, right after this.

Speaker 2: So one of the astounding things about this, that really caught everybody off guard, is that these large language models, the jobs they're coming after are white-collar knowledge jobs. They're so good at things like writing, they're good at researching, they're good at analyzing photos now. And that's a huge sea change from what it's been like traditionally, right? Whenever we've automated things, it's usually replaced manual labor. Now it's the manual labor that's safe. In this generation of automation, it's the white-collar knowledge jobs that are at risk. And not just white-collar jobs, but artists, yeah, who have nothing to do with white-collar jobs, they're at risk as well.

Speaker 1: Yeah. I'm sure the farmers are all sitting around going, "how's that going for you?"

Speaker 2: Yeah. "How's that taste?"
Speaker 1: So, yeah, art. When DALL-E came out, that was an art tool where a lot of people, a lot of people I know, would input... I guess I never did it. I never do anything like that, not because I'm afraid or anything, I'm just not interested, basically. But I guess you would submit, like, a photograph of yourself, and then it would say, well, here's you as a superhero, or here's you as a Renaissance painting, or whatever. And, you know, it's sourcing images from real artists throughout history, from Getty Images and places like that. And there are already artists that are suing for infringement. Getty Images is suing for infringement and saying you can't do that. Even if you're mixing up things and it's not, like, a Rembrandt, let's say you're using all of the artists from that era and mashing it up together, in a way that, like, we think basically is illegal.

Speaker 2: Yeah, they say this doesn't count as transformative use, which is typically protected under the law, right? This is instead just some sort of mash-up that a machine is doing. To me, it's almost splitting hairs. But I also very much get where they're coming from, not just from a place of panic, but, like, they have a real... like, they have a basis in fact that these things are not transforming, because they don't understand what they're doing.

Speaker 1: Yeah. And companies are taking notice very quickly. There are some companies, and I'm sure everyone's going to kind of fall in line, that are already saying, "well, no, you've got to start paying us for access to this stuff. We paid human beings to create this content, for lack of a better word, and put it online for people to access. But you can't come in here now and access it with a bot and use it and charge for it without giving us a little juice." And there are a lot of companies that are already saying, like, you can't use this: if you're an employee of our company, you can't use chatbots at all, because some of our company's secrets might end up being spilled somehow, or, you know, our databases are all of a sudden exposed. So companies are really moving fast to try to protect their IP, I guess.
Speaker 2: Well, yeah. And, I mean, some of the companies that are behind the GPTs that are out right now, the large language models that are out right now, are well known for not only not protecting their users' information, but for raiding it for their own use. Like, for example, Meta is one of the ones... they have their large language model called LLaMA, and there's a chatbot called Alpaca. And it makes total sense that you are probably signing away your right to protect your information when you use those things, on whatever computer you're using it on or whatever network you're using it on. I don't understand exactly... I haven't seen anything that says this is how they're doing it, or even that they are definitely doing this. I think it's just that the powers that be know, like, they would totally do this if they can, and they probably are, so, "we should just keep our employees away from it, you know, as much as we can."

Speaker 1: Yeah. Like we said, it's being used on smaller levels. One of the uses that Livia dug up was, like, let's say a real estate agent, instead of taking time to write up listings, has a chatbot do it, and then they can go through afterward and make adjustments to it as needed.

Speaker 2: Well, in exchange, that database now knows exactly what you think of that one ugly bathroom.

Speaker 1: That's right. Or doctors may be using it to compile lists of possible diseases or conditions that someone might have based on symptoms. These all sound like uses where you're like, "hey, this sounds like it could be a good thing in some ways," and it can be, in some ways. But it's the Wild West right now, so it's not like there's anyone saying, "well, you can't use it for that, you can only use it for this," you know what I'm saying?
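[A hedged sketch of what "a chatbot that writes the listing" often amounts to under the hood: a fixed instruction wrapped around a general-purpose model. The system/user message shape is the common chat format; the actual model call is left out because it varies by vendor, and all the wording here is invented:]

```python
import json

def build_listing_prompt(facts):
    # The "extra software" guiding a general model toward one niche is
    # often little more than a canned system message plus the agent's
    # structured facts.
    return [
        {"role": "system",
         "content": "You write accurate, upbeat real-estate listings. "
                    "Use only the facts provided. Never invent details."},
        {"role": "user", "content": json.dumps(facts)},
    ]

print(build_listing_prompt({"beds": 3, "baths": 2, "sqft": 1450,
                            "note": "one dated bathroom, be tactful"}))
```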
596 00:34:37,840 --> 00:34:40,560 Speaker 2: Plus, also, everything that we've come up with as just 597 00:34:41,080 --> 00:34:44,840 Speaker 2: Internet users in the general public has been what we 598 00:34:44,880 --> 00:34:48,400 Speaker 2: could come up with given three months, with no 599 00:34:48,560 --> 00:34:51,399 Speaker 2: warning that we should start thinking about this. It's just like, hey, 600 00:34:51,440 --> 00:34:53,080 Speaker 2: this is here, what are you going to do with it? 601 00:34:53,120 --> 00:34:55,640 Speaker 2: And people are just finding new things to do with 602 00:34:55,680 --> 00:34:58,840 Speaker 2: it every day. And yeah, some of them are benign, 603 00:34:58,960 --> 00:35:02,880 Speaker 2: like having it draft a blog post for your business. 604 00:35:03,280 --> 00:35:05,279 Speaker 2: I thought they were already doing that, based on some 605 00:35:05,360 --> 00:35:08,920 Speaker 2: of the emails that I get from, like, businesses, right, Yeah, 606 00:35:08,920 --> 00:35:11,719 Speaker 2: but they definitely are now if they weren't before. And 607 00:35:11,760 --> 00:35:16,000 Speaker 2: that's totally cool, because it's just taking 608 00:35:16,040 --> 00:35:18,960 Speaker 2: some of the weight off of the humans that are 609 00:35:19,000 --> 00:35:23,759 Speaker 2: already doing this work. What's going to be problematic 610 00:35:24,080 --> 00:35:28,200 Speaker 2: is when it comes for the full job, or enough 611 00:35:28,239 --> 00:35:32,839 Speaker 2: of the job that the company can transfer whatever's left 612 00:35:32,840 --> 00:35:35,839 Speaker 2: of that person's job to other people and make them 613 00:35:35,920 --> 00:35:39,120 Speaker 2: just work a little harder while they're supported by the AI. 614 00:35:39,840 --> 00:35:43,400 Speaker 1: Yeah, here are some stats that were pretty shocking to me. 615 00:35:43,480 --> 00:35:47,239 Speaker 1: I didn't know it was moving this fast. But there's 616 00:35:47,239 --> 00:35:50,400 Speaker 1: a networking app called Fishbowl, and in twenty twenty three, 617 00:35:50,560 --> 00:35:54,520 Speaker 1: just earlier this year, they found that forty percent of 618 00:35:55,040 --> 00:35:58,759 Speaker 1: what they call working professionals are already using some kind 619 00:35:58,800 --> 00:36:01,799 Speaker 1: of AI tool, either ChatGPT or something else, 620 00:36:01,960 --> 00:36:07,200 Speaker 1: while they work, whether it's generating idea lists or brainstorming lists, 621 00:36:07,320 --> 00:36:11,799 Speaker 1: or actually writing stuff, or maybe looking at code. And 622 00:36:12,120 --> 00:36:15,080 Speaker 1: this is the troubling part. Of those forty percent, 623 00:36:16,000 --> 00:36:19,240 Speaker 1: almost seventy percent are doing that in secret and hadn't 624 00:36:19,239 --> 00:36:20,920 Speaker 1: told their bosses that they were doing that. 625 00:36:21,040 --> 00:36:23,360 Speaker 2: Right, and those are just working professionals. We haven't even started 626 00:36:23,360 --> 00:36:24,600 Speaker 2: talking about students yet. 627 00:36:25,200 --> 00:36:27,680 Speaker 1: Yeah. I mean, you combine that with work from home, 628 00:36:27,760 --> 00:36:29,200 Speaker 1: you've got a real racket going on. 629 00:36:29,719 --> 00:36:33,120 Speaker 2: For sure. Yeah, no, totally. Again, though, I mean, 630 00:36:33,160 --> 00:36:35,200 Speaker 2: like, if you can use it to do good work, 631 00:36:35,239 --> 00:36:38,040 Speaker 2: and you can now do more work,
I think you 632 00:36:38,040 --> 00:36:40,880 Speaker 2: should be paid for more work. Like, if your productivity's 633 00:36:40,920 --> 00:36:44,239 Speaker 2: gone through the roof, great, you figured it out. I've 634 00:36:44,280 --> 00:36:46,799 Speaker 2: got no problem with that. It's the opposite that I 635 00:36:46,800 --> 00:36:47,600 Speaker 2: have the problem with. 636 00:36:48,280 --> 00:36:50,920 Speaker 1: Well, let's skip students for a second and 637 00:36:51,000 --> 00:36:54,160 Speaker 1: talk about that, since you brought it up. Because here's 638 00:36:54,239 --> 00:36:59,840 Speaker 1: the thing: the United States doesn't have a 639 00:36:59,840 --> 00:37:04,040 Speaker 1: great track record of ignoring the bottom line in 640 00:37:04,080 --> 00:37:09,839 Speaker 1: favor of just keeping hardworking humans at their jobs. So 641 00:37:10,920 --> 00:37:13,239 Speaker 1: I think it was Goldman Sachs that said they 642 00:37:13,320 --> 00:37:17,840 Speaker 1: found there could actually be an increase in 643 00:37:17,880 --> 00:37:21,160 Speaker 1: annual GDP of about seven percent over ten years because 644 00:37:21,680 --> 00:37:26,200 Speaker 1: of productivity increases. And I guess the idea is that productivity 645 00:37:26,280 --> 00:37:29,720 Speaker 1: is increasing because, let's say, you've got twenty to thirty 646 00:37:29,719 --> 00:37:32,680 Speaker 1: percent of stuff being done by AI. That opens up 647 00:37:32,680 --> 00:37:36,000 Speaker 1: twenty to thirty percent of your time for your employees 648 00:37:36,040 --> 00:37:41,720 Speaker 1: to maybe innovate or, you know, do other capitalistic things. 649 00:37:42,960 --> 00:37:45,240 Speaker 1: But to me, and this is just my opinion, 650 00:37:45,280 --> 00:37:48,000 Speaker 1: and again, we're really early in all this, but it's 651 00:37:48,000 --> 00:37:51,440 Speaker 1: a bottom-line world, and especially a bottom-line country 652 00:37:51,440 --> 00:37:55,279 Speaker 1: that we live in, and I imagine what it would 653 00:37:55,360 --> 00:37:59,319 Speaker 1: likely mean is bye-bye, jobs, more than it means, well, hey, 654 00:37:59,360 --> 00:38:01,840 Speaker 1: you've got more time, so why don't you innovate at 655 00:38:01,920 --> 00:38:06,200 Speaker 1: your job. Because for most jobs, it'll probably be like, oh, 656 00:38:06,200 --> 00:38:07,840 Speaker 1: wait a minute, if we can teach it to do 657 00:38:07,880 --> 00:38:10,640 Speaker 1: forty percent of your job, I bet we could train it 658 00:38:10,640 --> 00:38:12,400 Speaker 1: to do one hundred percent. 659 00:38:12,239 --> 00:38:15,080 Speaker 2: Yeah, or we can get rid of, you know, a 660 00:38:15,120 --> 00:38:16,759 Speaker 2: bunch of you and just keep some of you to 661 00:38:17,080 --> 00:38:18,440 Speaker 2: do the other sixty percent. 662 00:38:19,000 --> 00:38:20,880 Speaker 1: You know? But now see, these people are out of 663 00:38:20,960 --> 00:38:22,839 Speaker 1: jobs, and it's going to bite them in the rear, though, 664 00:38:22,880 --> 00:38:26,640 Speaker 1: because ultimately, well, who knows, 665 00:38:26,719 --> 00:38:28,120 Speaker 1: it doesn't seem like it could be good for the 666 00:38:28,160 --> 00:38:30,000 Speaker 1: overall economy if all of a sudden all these people 667 00:38:30,000 --> 00:38:32,800 Speaker 1: are out of jobs. Because people being out of jobs 668 00:38:32,880 --> 00:38:35,640 Speaker 1: means they're not spending,
669 00:38:35,680 --> 00:38:38,680 Speaker 1: and that means the economy is going to tank. And it's not like a situation where 670 00:38:40,440 --> 00:38:44,960 Speaker 1: you know, the tractor replaced the plow and then the 671 00:38:45,080 --> 00:38:48,480 Speaker 1: robot tractor replaced the tractor, but hey, now we've got 672 00:38:48,480 --> 00:38:51,400 Speaker 1: these better jobs where you're designing and building these robot 673 00:38:51,440 --> 00:38:55,160 Speaker 1: tractors, and they're higher paying and they're great. It's not 674 00:38:55,400 --> 00:38:59,520 Speaker 1: like that, because, you know, the farmer who 675 00:38:59,600 --> 00:39:02,840 Speaker 1: drove that tractor was replaced and isn't skilled in the practice of 676 00:39:02,920 --> 00:39:07,640 Speaker 1: designing robot tractors. And in this case, in most cases, 677 00:39:08,480 --> 00:39:12,400 Speaker 1: there's not some other job waiting for 678 00:39:12,480 --> 00:39:15,560 Speaker 1: someone who got fired in the world of designing AI. 679 00:39:16,680 --> 00:39:17,399 Speaker 1: Does that make sense? 680 00:39:17,440 --> 00:39:19,799 Speaker 2: No, it makes total sense. Yeah, and in this case, 681 00:39:19,840 --> 00:39:22,640 Speaker 2: one of the big differences is, instead of the farmer 682 00:39:22,719 --> 00:39:24,840 Speaker 2: having to go figure out how to work a computer, 683 00:39:25,120 --> 00:39:27,319 Speaker 2: it's the people working computers who now have to go figure out 684 00:39:27,320 --> 00:39:30,719 Speaker 2: how to be farmers in order to sustain themselves. Right? 685 00:39:31,160 --> 00:39:33,960 Speaker 2: But you're right, we don't have a track record of 686 00:39:34,040 --> 00:39:38,120 Speaker 2: taking care of people very well, at least people who are 687 00:39:38,200 --> 00:39:40,600 Speaker 2: out of a job. And I mean, without getting on 688 00:39:40,640 --> 00:39:44,319 Speaker 2: a soapbox here, one of two things is going to come out of this, 689 00:39:44,360 --> 00:39:45,840 Speaker 2: because there's going to be one or the other. The 690 00:39:45,840 --> 00:39:48,400 Speaker 2: status quo as it is now, or as it was 691 00:39:48,600 --> 00:39:51,080 Speaker 2: as of up to twenty twenty two? We don't know 692 00:39:51,160 --> 00:39:55,120 Speaker 2: that that's going to be around anymore. Instead, we'll either 693 00:39:55,200 --> 00:39:59,120 Speaker 2: do something like create universal basic income, for people to 694 00:39:59,239 --> 00:40:03,720 Speaker 2: be like, hey, your industry literally does not exist anymore, 695 00:40:03,960 --> 00:40:07,400 Speaker 2: and it just happened overnight, basically, so we're just gonna make 696 00:40:07,400 --> 00:40:09,759 Speaker 2: sure that everybody's at least minimally taken care of while 697 00:40:09,800 --> 00:40:13,120 Speaker 2: we're figuring out what comes next. Or it's gonna be 698 00:40:13,160 --> 00:40:17,200 Speaker 2: like, good luck, chump, you're fired, you're out on your own. Instead, 699 00:40:17,239 --> 00:40:19,960 Speaker 2: we're gonna take all this extra wealth, this extra two 700 00:40:20,080 --> 00:40:23,400 Speaker 2: trillion dollars that's gonna be generated, and push it upward 701 00:40:23,840 --> 00:40:27,279 Speaker 2: toward the wealthy, and for everybody else, the 702 00:40:28,280 --> 00:40:32,000 Speaker 2: divide between wealthy and not wealthy is just going to 703 00:40:32,480 --> 00:40:35,120 Speaker 2: exponentially grow.
One of those two things is gonna happen, 704 00:40:35,200 --> 00:40:37,560 Speaker 2: because I don't see how there's just gonna be a 705 00:40:37,600 --> 00:40:40,360 Speaker 2: regular middle ground like there is now, where it's kind 706 00:40:40,360 --> 00:40:44,600 Speaker 2: of shaky how we're taking care of people, because 707 00:40:44,600 --> 00:40:48,600 Speaker 2: there's just gonna be so many layoffs, and fairly skilled 708 00:40:48,640 --> 00:40:52,280 Speaker 2: workers being laid off too. We've just never encountered that before. 709 00:40:52,800 --> 00:40:56,320 Speaker 1: Yeah. I mean, that's the thing that the largest 710 00:40:56,320 --> 00:40:59,839 Speaker 1: corporations might want to think about. All it's gonna 711 00:40:59,880 --> 00:41:05,880 Speaker 1: take is one CEO of a huge corporation to say, 712 00:41:06,000 --> 00:41:09,160 Speaker 1: wait a minute, I think I can get rid 713 00:41:09,200 --> 00:41:12,319 Speaker 1: of seventy five percent of the VPs in 714 00:41:12,360 --> 00:41:18,120 Speaker 1: my company, right? And, like, who except the 715 00:41:18,160 --> 00:41:20,399 Speaker 1: person at the very, very top of that food chain 716 00:41:21,239 --> 00:41:23,880 Speaker 1: is protected? And the answer is nobody. 717 00:41:24,680 --> 00:41:26,919 Speaker 2: No, no one is safe, essentially. 718 00:41:26,719 --> 00:41:27,920 Speaker 1: At the end of the day, because they make a 719 00:41:28,000 --> 00:41:29,879 Speaker 1: lot of money. It's one thing to lay 720 00:41:29,880 --> 00:41:32,279 Speaker 1: off a bunch of, you know, technical writers who are 721 00:41:32,640 --> 00:41:35,000 Speaker 1: all sitting in their cubicles. But if you start laying 722 00:41:35,000 --> 00:41:38,680 Speaker 1: off those VPs who get those big bonuses, that's 723 00:41:38,880 --> 00:41:41,239 Speaker 1: more bonus money. And you know, are we looking at 724 00:41:41,280 --> 00:41:44,240 Speaker 1: a situation where a corporation is run by one human? 725 00:41:44,760 --> 00:41:47,480 Speaker 2: I mean, it's entirely possible. Like, you can make a 726 00:41:47,520 --> 00:41:49,239 Speaker 2: really good case that what it is going to wipe 727 00:41:49,280 --> 00:41:53,120 Speaker 2: out is the middle management, right, the VPs, exactly 728 00:41:53,200 --> 00:41:55,400 Speaker 2: like you said, and that we still will need some 729 00:41:55,520 --> 00:41:57,160 Speaker 2: humans to do some stuff. 730 00:41:57,160 --> 00:41:59,320 Speaker 1: Like the board. Take care of the board, right? Sure. 731 00:41:59,280 --> 00:42:02,279 Speaker 2: Of course. Yeah, but yes, I mean, who knows, we 732 00:42:02,880 --> 00:42:07,439 Speaker 2: have no idea at this point. Ultimately, it could very 733 00:42:07,520 --> 00:42:13,799 Speaker 2: easily provide for a much better, healthier society, at least 734 00:42:13,840 --> 00:42:17,759 Speaker 2: financially speaking. It could do that, especially given a long 735 00:42:17,840 --> 00:42:18,760 Speaker 2: enough period of time. 736 00:42:19,160 --> 00:42:21,040 Speaker 1: I'm a cynic when it comes to that kind of 737 00:42:21,040 --> 00:42:21,600 Speaker 1: trust, though. 738 00:42:21,680 --> 00:42:23,560 Speaker 2: I am as well, for sure.
But if you look 739 00:42:23,600 --> 00:42:28,080 Speaker 2: back at the history of technology overall, especially 740 00:42:28,160 --> 00:42:30,280 Speaker 2: if you just turn a blind eye to human suffering 741 00:42:30,280 --> 00:42:32,719 Speaker 2: for a second and you just look at the progress 742 00:42:32,719 --> 00:42:36,359 Speaker 2: of society, right, in a lot of ways it has 743 00:42:36,400 --> 00:42:39,840 Speaker 2: gotten better and better thanks to technology. There are also 744 00:42:39,880 --> 00:42:42,640 Speaker 2: a lot of downsides to it. Nothing's black and white; 745 00:42:43,120 --> 00:42:46,200 Speaker 2: that's just not how things are. So 746 00:42:46,239 --> 00:42:48,680 Speaker 2: there's of course going to be problems. There's going to 747 00:42:48,719 --> 00:42:50,759 Speaker 2: be suffering, there's going to be people left behind, there's 748 00:42:50,800 --> 00:42:52,800 Speaker 2: going to be people that fall through the cracks. It's 749 00:42:52,880 --> 00:42:56,360 Speaker 2: just inevitable. We just don't know how many people, for 750 00:42:56,480 --> 00:42:59,400 Speaker 2: how long, and what will happen to those people on 751 00:42:59,440 --> 00:43:01,280 Speaker 2: the other side of this transition. 752 00:43:02,320 --> 00:43:06,120 Speaker 1: Yeah, I was talking with somebody the other day about 753 00:43:06,880 --> 00:43:11,800 Speaker 1: the writers' strike in Hollywood. The WGA is striking right now, 754 00:43:12,520 --> 00:43:14,160 Speaker 1: for those of you who don't know. It's kind of 755 00:43:14,160 --> 00:43:16,680 Speaker 1: all over the place. But one of the things that 756 00:43:16,719 --> 00:43:21,359 Speaker 1: they have argued for in this round of negotiations is, hey, 757 00:43:21,440 --> 00:43:25,520 Speaker 1: you can't replace us with AI. And the studios all 758 00:43:25,560 --> 00:43:28,880 Speaker 1: came back and said, well, how about this, we'll assess 759 00:43:28,960 --> 00:43:33,240 Speaker 1: that on a year-to-year basis. And that's frightening 760 00:43:33,320 --> 00:43:36,520 Speaker 1: if you're either a writer in Hollywood or 761 00:43:36,680 --> 00:43:41,239 Speaker 1: you're somebody who loves TV and films, and quality TV 762 00:43:41,360 --> 00:43:46,120 Speaker 1: and films, because, I don't know, ideation 763 00:43:46,800 --> 00:43:52,319 Speaker 1: and initial scripts, maybe even right now, I could 764 00:43:52,320 --> 00:43:54,359 Speaker 1: see that happening, where they're like, all right, now 765 00:43:54,400 --> 00:43:58,120 Speaker 1: we'll bring in a human to refine this thing at 766 00:43:58,120 --> 00:44:01,000 Speaker 1: a much lower wage. That's probably what they're most 767 00:44:01,000 --> 00:44:04,680 Speaker 1: afraid of, rather than being wholesale replaced, because, like you said, 768 00:44:04,920 --> 00:44:09,160 Speaker 1: these programs are all about just data and numbers. 769 00:44:09,200 --> 00:44:12,120 Speaker 1: They don't have human feelings, and that's what 770 00:44:12,560 --> 00:44:15,319 Speaker 1: art is. And so I think I would be more 771 00:44:15,320 --> 00:44:19,880 Speaker 1: concerned if I was writing pamphlets for Verizon or something, 772 00:44:20,719 --> 00:44:21,759 Speaker 1: or if I was... 773 00:44:22,040 --> 00:44:23,800 Speaker 2: Some pamphlet writer for Verizon 774 00:44:23,880 --> 00:44:28,000 Speaker 1: just went, gulp. No, I'm so sorry.
But like 775 00:44:28,239 --> 00:44:30,720 Speaker 1: BuzzFeed back in the day: instead of having a dozen 776 00:44:30,760 --> 00:44:34,320 Speaker 1: writers writing clickbait articles, why not have just one human 777 00:44:34,680 --> 00:44:38,560 Speaker 1: that is a prompt engineer managing a virtual AI 778 00:44:39,600 --> 00:44:43,799 Speaker 1: clickbait room that's just pumping out these articles that, you 779 00:44:43,840 --> 00:44:46,160 Speaker 1: know, they were paying someone down to forty grand a 780 00:44:46,200 --> 00:44:47,160 Speaker 1: year to write previously. 781 00:44:47,760 --> 00:44:50,160 Speaker 2: Yeah, I mean, it's a great question. Like, that was 782 00:44:50,200 --> 00:44:54,160 Speaker 2: a horrific, horrible job to have not too many years ago, 783 00:44:54,320 --> 00:44:56,319 Speaker 2: so it's great to have a computer do it. But 784 00:44:56,719 --> 00:44:58,839 Speaker 2: that means that we need these other people to go 785 00:44:58,920 --> 00:45:02,040 Speaker 2: on to have writing jobs that are more 786 00:45:02,080 --> 00:45:05,560 Speaker 2: satisfying to them than that. But that's not necessarily the case, 787 00:45:05,640 --> 00:45:10,360 Speaker 2: because as these things get smarter and better, they're just 788 00:45:10,400 --> 00:45:12,120 Speaker 2: going to be relied upon more. We're not going to 789 00:45:12,160 --> 00:45:15,160 Speaker 2: go back. There's no going back now. It just happened, 790 00:45:15,200 --> 00:45:18,719 Speaker 2: like, it just happened basically as of March twenty twenty three. 791 00:45:19,480 --> 00:45:23,600 Speaker 2: And one of the big problems that people have already 792 00:45:23,640 --> 00:45:31,280 Speaker 2: projected running into is, if computers replace humans, say writers, 793 00:45:33,200 --> 00:45:36,480 Speaker 2: basically entirely, eventually all the stuff that humans have written 794 00:45:36,480 --> 00:45:38,799 Speaker 2: on the Internet is going to become dated. It's going 795 00:45:38,840 --> 00:45:41,560 Speaker 2: to stop, yeah, and it will have been replaced and 796 00:45:41,640 --> 00:45:48,640 Speaker 2: picked up on by generative pre-trained transformers. Right? And 797 00:45:48,680 --> 00:45:52,040 Speaker 2: eventually all the writing on the Internet after a certain 798 00:45:52,120 --> 00:45:54,480 Speaker 2: date will have been written by computers, but will be 799 00:45:54,480 --> 00:45:57,600 Speaker 2: being scraped by computers. When humans go ask the computer 800 00:45:57,680 --> 00:46:01,160 Speaker 2: a question, the computer then goes and references something 801 00:46:01,200 --> 00:46:04,040 Speaker 2: written by a computer. So humans will be completely taken 802 00:46:04,080 --> 00:46:06,239 Speaker 2: out of the equation in that respect. We'll 803 00:46:06,239 --> 00:46:09,360 Speaker 2: be getting all of our information, at least non-historical 804 00:46:09,400 --> 00:46:13,279 Speaker 2: information, from non-humans, and that could be a really 805 00:46:13,320 --> 00:46:15,839 Speaker 2: big problem, not just in the fact that we're losing 806 00:46:16,000 --> 00:46:18,959 Speaker 2: jobs, or in the fact that computers are now telling 807 00:46:19,040 --> 00:46:21,600 Speaker 2: us all of our information, but also in that there's some 808 00:46:23,400 --> 00:46:27,520 Speaker 2: part of what humans put into things that 809 00:46:27,600 --> 00:46:31,120 Speaker 2: will be lost that I think we're going to demand.
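The computers-scraping-computers loop Josh describes was already being studied in 2023 under the name "model collapse." Here is a toy sketch of the idea, with a one-number "model" (a Gaussian fit) standing in for an LLM; the sample size, seed, and generation count are illustrative assumptions:

```python
# Toy "model collapse" demo: each generation is trained only on data
# sampled from the previous generation's model, never on the original.
import random
import statistics

random.seed(42)  # arbitrary seed so runs are repeatable

# Generation 0: "human-written" data with plenty of variety.
data = [random.gauss(0.0, 1.0) for _ in range(20)]

for generation in range(1, 31):
    mu = statistics.fmean(data)      # "train" on the current data...
    sigma = statistics.pstdev(data)
    # ...then produce the next generation's training data from the model.
    data = [random.gauss(mu, sigma) for _ in range(20)]
    print(f"gen {generation:2d} trained on data with spread {sigma:.3f}")

# In expectation the spread shrinks a little every generation, so over
# many rounds the copies of copies lose the variety of the original data.
```

Real models are vastly bigger, but the direction of the effect, variety draining out of copies of copies, is the "part of what humans put into things" that the hosts are worried about losing.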
810 00:46:31,239 --> 00:46:34,319 Speaker 2: I saw somebody put it, I can't 811 00:46:34,320 --> 00:46:37,000 Speaker 2: remember who it was, but they said 812 00:46:37,280 --> 00:46:39,960 Speaker 2: people will go seek out human-written stuff. There will 813 00:46:40,000 --> 00:46:43,040 Speaker 2: always be audiences for human-written stuff. Yeah, maybe, like 814 00:46:43,080 --> 00:46:47,120 Speaker 2: you said, we'll rely on computers to write the Verizon pamphlets, 815 00:46:47,480 --> 00:46:49,439 Speaker 2: but we're not going to rely on computers to write 816 00:46:49,440 --> 00:46:52,200 Speaker 2: great works of literature or to create great works of art. 817 00:46:52,320 --> 00:46:55,120 Speaker 2: We're just not going to. They'll still do that, 818 00:46:55,160 --> 00:46:56,840 Speaker 2: they're going to be writing books and movies and all that, 819 00:46:56,880 --> 00:46:59,640 Speaker 2: but there will always be a taste and a market 820 00:46:59,680 --> 00:47:03,399 Speaker 2: for human-created stuff. That's what this guy said, and I think he's right. 821 00:47:04,400 --> 00:47:08,440 Speaker 1: Yeah. And Justin Bateman, I don't know if you saw that, 822 00:47:09,560 --> 00:47:10,800 Speaker 1: I don't know if it was a blog post or what. 823 00:47:11,880 --> 00:47:14,080 Speaker 2: Are you having a hallucination right now? Did you mean 824 00:47:14,200 --> 00:47:14,880 Speaker 2: Justine Bateman? 825 00:47:15,400 --> 00:47:20,560 Speaker 1: Yeah, yeah, Justine Bateman, Jason Bateman's sister, the actor, 826 00:47:20,680 --> 00:47:22,960 Speaker 1: and she's done all kinds of things since then. I 827 00:47:23,000 --> 00:47:26,440 Speaker 1: know she has a computer science degree, so she's very 828 00:47:26,440 --> 00:47:28,880 Speaker 1: smart and knows a lot about this stuff. But she 829 00:47:29,000 --> 00:47:32,080 Speaker 1: basically said, and this is beyond just the chatbot stuff, 830 00:47:32,520 --> 00:47:36,400 Speaker 1: but she was like, right now, there are major Hollywood 831 00:47:36,400 --> 00:47:41,719 Speaker 1: stars being scanned, and there may be a brand new 832 00:47:41,760 --> 00:47:45,600 Speaker 1: Tom Cruise movie in sixty years, yeah, long after he's 833 00:47:45,680 --> 00:47:49,239 Speaker 1: dead, starring Tom Cruise. He may be making movies for 834 00:47:49,239 --> 00:47:52,480 Speaker 1: the next two hundred years. And, like, is this what 835 00:47:52,520 --> 00:47:55,279 Speaker 1: you want, actors? Do you want to be scanned and 836 00:47:55,400 --> 00:47:58,880 Speaker 1: have them use your image like this in perpetuity? 837 00:47:59,160 --> 00:48:02,120 Speaker 1: You know, there will be money involved. It's not like 838 00:48:02,160 --> 00:48:03,640 Speaker 1: they can just say, okay, we can just do whatever 839 00:48:03,640 --> 00:48:06,360 Speaker 1: we want. But what if they're like, here's a billion dollars, 840 00:48:06,480 --> 00:48:11,520 Speaker 1: Tom Cruise, just for the use of your image in perpetuity, 841 00:48:11,760 --> 00:48:15,239 Speaker 1: because we will be able to duplicate that so realistically 842 00:48:15,800 --> 00:48:20,320 Speaker 1: that people won't know. Human voices, same thing, that's already happening. 843 00:48:20,880 --> 00:48:26,400 Speaker 1: What? Yeah, it is. So that stuff is kind of scary. 844 00:48:26,560 --> 00:48:29,400 Speaker 1: And, you know, I didn't really know 845 00:48:29,440 --> 00:48:32,960 Speaker 1: this was kind of already happening in companies.
But Olivia 846 00:48:33,040 --> 00:48:37,080 Speaker 1: found this stuff: IBM CEO Arvind Krishna said just last 847 00:48:37,080 --> 00:48:39,879 Speaker 1: month, in May, that he believed thirty percent of back 848 00:48:39,880 --> 00:48:44,719 Speaker 1: office jobs could be replaced over five years, and that IBM 849 00:48:44,800 --> 00:48:48,760 Speaker 1: was pausing hiring for close to eight thousand positions because 850 00:48:48,800 --> 00:48:52,160 Speaker 1: they might be able to use AI instead. And then 851 00:48:52,280 --> 00:48:56,759 Speaker 1: Dropbox talked about the AI era when they announced a 852 00:48:56,840 --> 00:49:00,239 Speaker 1: round of layoffs. So it is happening right now, in 853 00:49:00,320 --> 00:49:00,719 Speaker 1: real time. 854 00:49:00,800 --> 00:49:04,680 Speaker 2: Pretty amazing. Yeah, that's, I mean, proof positive right there. 855 00:49:05,040 --> 00:49:08,479 Speaker 2: Like, that guy couldn't even wait a couple of months, a year. 856 00:49:09,120 --> 00:49:12,280 Speaker 2: Like, this really started up in March, and he's saying 857 00:49:12,320 --> 00:49:14,719 Speaker 2: this in May. Already in May, they're like, wait, wait, 858 00:49:14,760 --> 00:49:17,759 Speaker 2: stop hiring. We're gonna eventually replace these guys with AI, 859 00:49:18,160 --> 00:49:21,480 Speaker 2: so soon that we're going to stop hiring those positions 860 00:49:21,480 --> 00:49:24,040 Speaker 2: for now until the AI is competent enough to take over. 861 00:49:24,600 --> 00:49:27,319 Speaker 1: I mean, how many people does IBM employ? What's 862 00:49:27,360 --> 00:49:28,160 Speaker 1: thirty percent of that? 863 00:49:28,400 --> 00:49:30,960 Speaker 2: I don't know. I would say at least 864 00:49:31,000 --> 00:49:34,879 Speaker 2: one hundred people, right? So, yeah, like you said, it's 865 00:49:34,880 --> 00:49:37,040 Speaker 2: happening already. And then one other thing to look out 866 00:49:37,040 --> 00:49:40,600 Speaker 2: for too, that I believe is already at least theoretically 867 00:49:40,640 --> 00:49:45,879 Speaker 2: possible: since AI can write code now, they'll be able 868 00:49:45,920 --> 00:49:50,640 Speaker 2: to create new large language models themselves. So the computers 869 00:49:50,640 --> 00:49:53,400 Speaker 2: will be able to create new AI. 870 00:49:54,080 --> 00:49:56,160 Speaker 1: Well, that's the singularity, right? 871 00:49:56,320 --> 00:50:00,799 Speaker 2: No, the singularity is when one of them understands 872 00:50:00,880 --> 00:50:05,480 Speaker 2: what it is and becomes... yes, that's the singularity. 873 00:50:05,840 --> 00:50:07,560 Speaker 1: But this leads to that, though, doesn't it? It does. 874 00:50:07,840 --> 00:50:11,600 Speaker 2: Hypothetically, yes. But we just understand what's going on 875 00:50:11,680 --> 00:50:13,880 Speaker 2: so little that you just can't say either way, really. 876 00:50:13,920 --> 00:50:16,239 Speaker 2: You definitely can't say that, no, it won't happen, 877 00:50:16,280 --> 00:50:18,359 Speaker 2: it's just fantasy. And you also can't say, yes, it's 878 00:50:18,360 --> 00:50:19,280 Speaker 2: definitely gonna happen. 879 00:50:19,960 --> 00:50:25,520 Speaker 1: Yeah. And here's the thing, man, I'm not a paranoid technophobe. 880 00:50:26,080 --> 00:50:27,960 Speaker 2: You don't, by any measure, have a foil cap on. 881 00:50:28,719 --> 00:50:33,120 Speaker 1: No, by any measure. I'm a pretty positive thinker, and 882 00:50:33,200 --> 00:50:35,239 Speaker 1: this is pretty scary to me.
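Josh's point, that software which writes code could in principle write the code for another model, is easier to weigh once you see how small the core of a language model can be. Here is a toy sketch of next-word prediction, the job the episode has been describing; the training sentence is an illustrative assumption, and a real GPT does this with a neural network over trillions of words rather than a tally table:

```python
# A toy next-word predictor: tally which word follows which, then sample.
import random
from collections import defaultdict

random.seed(7)  # arbitrary seed so runs are repeatable

text = (
    "the model reads tons of text and learns which word tends to "
    "follow which word and then the model writes text one word at a time"
)
words = text.split()

# "Training": map each word to every word seen following it.
follows = defaultdict(list)
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

# "Generation": start somewhere and repeatedly sample a plausible next word.
word = "the"
output = [word]
for _ in range(12):
    candidates = follows.get(word)
    if not candidates:
        break  # dead end: this word was never followed by anything
    word = random.choice(candidates)
    output.append(word)

print(" ".join(output))
```

The gap between this tally table and ChatGPT is scale and architecture, not the basic job, which is predicting the next token from patterns in text.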
883 00:50:37,560 --> 00:50:44,160 Speaker 2: I'm just gonna leave that there. Agreed, Chuck. Okay, if 884 00:50:44,200 --> 00:50:47,520 Speaker 2: you want to know more about large language models, everybody, 885 00:50:47,640 --> 00:50:51,279 Speaker 2: just start looking around Earth, and when you see 886 00:50:51,320 --> 00:50:54,000 Speaker 2: people running from explosions, go toward it and ask, what's 887 00:50:54,040 --> 00:50:54,440 Speaker 2: going on? 888 00:50:55,040 --> 00:50:56,800 Speaker 1: You almost said, type it into a search engine. 889 00:50:57,080 --> 00:50:59,520 Speaker 2: Yeah, steer clear of those. Yeah, there's so much more 890 00:50:59,520 --> 00:51:02,120 Speaker 2: we could have talked about. But this is, if you 891 00:51:02,160 --> 00:51:04,520 Speaker 2: ask me, this is round one. I think we definitely 892 00:51:04,560 --> 00:51:06,000 Speaker 2: need to do at least one or so 893 00:51:06,080 --> 00:51:08,680 Speaker 1: more on this. Okay. Yeah, and then one day, like 894 00:51:08,719 --> 00:51:11,600 Speaker 1: I said, AI Josh and Chuck will just wrap it all 895 00:51:11,680 --> 00:51:13,480 Speaker 1: up and spank it on the bottom and say, no 896 00:51:13,640 --> 00:51:14,279 Speaker 1: problems here. 897 00:51:14,400 --> 00:51:17,120 Speaker 2: Hopefully they'll give us a billion dollars rather than, like, 898 00:51:17,160 --> 00:51:20,040 Speaker 2: a month free of Blue Apron instead. 899 00:51:20,680 --> 00:51:22,359 Speaker 1: Yeah. I mean, we could talk here. 900 00:51:23,840 --> 00:51:27,000 Speaker 2: Well, since Chuck said we can talk real confidential-like, 901 00:51:27,400 --> 00:51:28,760 Speaker 2: that means it's time for Listener Mail. 902 00:51:28,800 --> 00:51:34,880 Speaker 1: I'm gonna call this conception. Not inception, but conception. 903 00:51:35,040 --> 00:51:36,680 Speaker 2: Oh, I saw this one. I don't know how I 904 00:51:36,719 --> 00:51:37,319 Speaker 2: feel about this. 905 00:51:39,000 --> 00:51:41,200 Speaker 1: Hey, guys. Last year, my wife and I were attempting 906 00:51:41,239 --> 00:51:43,040 Speaker 1: to get pregnant. A couple of months in, we made 907 00:51:43,040 --> 00:51:45,040 Speaker 1: plans to stay with some friends in another town for 908 00:51:45,080 --> 00:51:47,600 Speaker 1: a weekend, and when we did arrive, it happened to 909 00:51:47,640 --> 00:51:52,279 Speaker 1: coincide with my wife's ovulation cycle. As shy people, we 910 00:51:52,320 --> 00:51:55,800 Speaker 1: both felt a little bit awkward about, you know, hugging 911 00:51:55,880 --> 00:51:58,960 Speaker 1: and kissing in a friend's guest room, but we really 912 00:51:59,040 --> 00:52:01,360 Speaker 1: didn't want to miss that chance at that time of 913 00:52:01,360 --> 00:52:04,319 Speaker 1: the month, so we went about getting in the mood 914 00:52:04,320 --> 00:52:07,080 Speaker 1: as quietly as possible, and my wife suggested we play 915 00:52:07,080 --> 00:52:09,600 Speaker 1: a podcast from my phone so that, you know, if 916 00:52:09,640 --> 00:52:12,440 Speaker 1: any noise made it outside the room, it would sound 917 00:52:12,480 --> 00:52:14,560 Speaker 1: like we were just doing a little pre-bedtime listening. 918 00:52:15,920 --> 00:52:18,360 Speaker 1: I knew I needed something with nice, simple production values, 919 00:52:18,440 --> 00:52:21,120 Speaker 1: so we wouldn't get distracted, of course, by the whiz 920 00:52:21,200 --> 00:52:23,640 Speaker 1: bang sounds and whatnot.
And since you were my intro 921 00:52:23,760 --> 00:52:26,040 Speaker 1: to the world of podcasts, I've always had a steady 922 00:52:26,040 --> 00:52:29,840 Speaker 1: supply of yours downloaded. I picked the least interesting-sounding 923 00:52:29,880 --> 00:52:33,640 Speaker 1: one in the feed at the time: How Coal Works. 924 00:52:33,960 --> 00:52:38,600 Speaker 2: Okay, I thought that one turned out to be surprisingly interesting. Yeah, 925 00:52:38,640 --> 00:52:40,239 Speaker 2: I could see how he would have thought that, though. 926 00:52:40,800 --> 00:52:43,359 Speaker 1: Yeah, for sure. We put that on and we did 927 00:52:43,360 --> 00:52:47,000 Speaker 1: our business. Six weeks later, we got a positive pregnancy test, 928 00:52:47,880 --> 00:52:50,439 Speaker 1: and now, over a year later, we've welcomed our son 929 00:52:50,520 --> 00:52:53,640 Speaker 1: into the world. And of course, we 930 00:52:53,719 --> 00:52:57,279 Speaker 1: named him Cole. That is what this person said. 931 00:52:57,400 --> 00:52:59,200 Speaker 2: Wait a minute, wait a minute. They really did name 932 00:52:59,239 --> 00:52:59,680 Speaker 2: him Cole? 933 00:53:00,600 --> 00:53:04,319 Speaker 1: No, he said it as a joke. Oh, but great 934 00:53:04,360 --> 00:53:07,400 Speaker 1: minds, right? Good joke for both of them. It's almost 935 00:53:07,440 --> 00:53:13,120 Speaker 1: like you're both chatbots. And this person said, you're fine 936 00:53:13,160 --> 00:53:15,600 Speaker 1: to read this, but give me a fake name. And 937 00:53:15,640 --> 00:53:18,360 Speaker 1: so I just want to say thanks to Gene for 938 00:53:18,440 --> 00:53:19,520 Speaker 1: writing in about this. 939 00:53:20,040 --> 00:53:21,840 Speaker 2: Gene as in gene transfer. 940 00:53:23,000 --> 00:53:25,400 Speaker 1: Sure. Thanks a lot, Gene. 941 00:53:25,400 --> 00:53:29,759 Speaker 2: We appreciate that, I think. Again, I'm still figuring that 942 00:53:29,800 --> 00:53:32,719 Speaker 2: one out. And if you want to be like Gene, 943 00:53:32,840 --> 00:53:36,360 Speaker 2: I'm making air quotes here, you can send us an 944 00:53:36,400 --> 00:53:39,160 Speaker 2: email. To wrap it up, spank it on the bottom. 945 00:53:39,320 --> 00:53:40,400 Speaker 2: Only humans can do that. 946 00:53:41,120 --> 00:53:43,160 Speaker 1: I wonder, when you said spank it on the bottom, 947 00:53:43,239 --> 00:53:44,840 Speaker 1: if that created any issues. 948 00:53:45,160 --> 00:53:49,760 Speaker 2: Yeah, I hadn't thought about that. Maybe playfully? How about that? Sure. 949 00:53:51,080 --> 00:53:58,680 Speaker 2: And send it off to Stuff Podcast at iHeartRadio dot com. 950 00:53:58,920 --> 00:54:01,799 Speaker 1: Stuff You Should Know is a production of iHeartRadio. For 951 00:54:01,880 --> 00:54:06,080 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 952 00:54:06,200 --> 00:54:08,040 Speaker 1: or wherever you listen to your favorite shows.