1 00:00:05,200 --> 00:00:09,280 Speaker 1: Ah, welcome back to It Could Happen Here, a podcast. 2 00:00:10,760 --> 00:00:15,159 Speaker 1: It's a podcast. I'm Robert Evans, uh, and with me 3 00:00:15,280 --> 00:00:19,560 Speaker 1: today is Garrison Davis and James Stout. 4 00:00:20,079 --> 00:00:24,840 Speaker 2: Hello, a Canadian, a Britishman and a Texan walk 5 00:00:24,880 --> 00:00:25,840 Speaker 2: into a podcast. 6 00:00:26,040 --> 00:00:30,800 Speaker 1: Yeah, welcome to a podcast. Only two of them can 7 00:00:30,880 --> 00:00:31,560 Speaker 1: drink in a bar. 8 00:00:32,960 --> 00:00:34,360 Speaker 2: That's not true in Canada. 9 00:00:34,400 --> 00:00:37,919 Speaker 1: We can all drink in a bar now, Garrison. A 10 00:00:37,920 --> 00:00:40,520 Speaker 1: moment ago, you were holding your hand above a lit 11 00:00:40,560 --> 00:00:43,440 Speaker 1: candle in a way that reminded me of G. Gordon Liddy, 12 00:00:43,920 --> 00:00:47,640 Speaker 1: the Nazi who masterminded the Watergate break-in, and in order 13 00:00:47,680 --> 00:00:49,920 Speaker 1: to convince people that he was a hard man, would 14 00:00:49,960 --> 00:00:52,760 Speaker 1: regularly burn the palm of his hand on a candle 15 00:00:52,760 --> 00:01:03,640 Speaker 1: while staring at them. Great. G. Gordon Liddy, 16 00:01:03,800 --> 00:01:05,800 Speaker 1: you don't know enough about. We'll talk about G. Gordon Liddy, 17 00:01:05,840 --> 00:01:11,640 Speaker 1: but today we're talking about something else, problematic artificial intelligence, 18 00:01:12,640 --> 00:01:15,679 Speaker 1: which is not a thing that exists anywhere. It is 19 00:01:15,760 --> 00:01:20,360 Speaker 1: instead a terrible, terrible error going back to like the 20 00:01:20,400 --> 00:01:25,080 Speaker 1: sixties in terms of terminology. When we talk about all 21 00:01:25,080 --> 00:01:28,280 Speaker 1: of the things that people are, like, you know, flipping 22 00:01:28,319 --> 00:01:31,880 Speaker 1: out over as AIs, ChatGPT and Stable Diffusion and fucking 23 00:01:32,840 --> 00:01:38,360 Speaker 1: all these other sort of like different programs, they're not intelligences. 24 00:01:38,440 --> 00:01:41,320 Speaker 1: They're, you know, ChatGPT is like a large 25 00:01:41,400 --> 00:01:45,800 Speaker 1: language model. They're all essentially like bots that you train 26 00:01:47,240 --> 00:01:50,880 Speaker 1: to understand kind of like what the likeliest thing, 27 00:01:51,120 --> 00:01:54,840 Speaker 1: what the likeliest appropriate response is to like a given prompt. 28 00:01:56,120 --> 00:01:59,440 Speaker 1: That's kind of like the broadest way to explain it. 29 00:01:59,440 --> 00:02:01,880 Speaker 1: It's complicated and they're, you know, very useful. But 30 00:02:01,880 --> 00:02:05,160 Speaker 1: obviously if you've been paying attention to the world right now, 31 00:02:05,520 --> 00:02:08,120 Speaker 1: there's just a whole bunch of bullshit about them, and 32 00:02:08,320 --> 00:02:12,000 Speaker 1: I think to kind of make sense of why we're 33 00:02:12,120 --> 00:02:14,680 Speaker 1: seeing some of the shit around AI that we're seeing.
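[Editor's note: a minimal sketch of the "likeliest appropriate response to a given prompt" idea described above, assuming a toy word-pair counter rather than any real large language model; the corpus, names, and functions here are invented purely for illustration.]

    # Toy illustration of "predict the likeliest next thing." This is a bigram
    # counter, not a real language model; the training text below is made up.
    from collections import defaultdict, Counter

    corpus = "the model predicts the next word the model predicts the likeliest word".split()

    # Count how often each word follows each other word in the training text.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def likeliest_next(word):
        # Return the continuation seen most often after this word, if any.
        counts = following.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(likeliest_next("the"))       # "model", the statistically likeliest continuation
    print(likeliest_next("predicts"))  # "the"

A real LLM does the same sort of statistical guessing, over tokens instead of whole words and with an enormously larger model and corpus, which is part of why the "stochastic parrot" framing discussed later in the episode describes these systems better than "intelligence" does.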
34 00:02:14,720 --> 00:02:18,200 Speaker 1: And for a little bit of specificity, there have been 35 00:02:18,280 --> 00:02:21,359 Speaker 1: like this kind of endless series of articles around this 36 00:02:21,560 --> 00:02:24,280 Speaker 1: open letter signed by a bunch of luminaries in the 37 00:02:24,280 --> 00:02:26,959 Speaker 1: AI field talking about how, you know, there need to 38 00:02:27,000 --> 00:02:29,200 Speaker 1: be laws put in place to stop it from ending 39 00:02:29,240 --> 00:02:32,880 Speaker 1: the world. You know, you've seen articles about like oh, 40 00:02:33,600 --> 00:02:36,480 Speaker 1: X percentage of AI researchers think that it could, it 41 00:02:36,520 --> 00:02:40,280 Speaker 1: could destroy the planet, destroy the human race, kind of. 42 00:02:40,320 --> 00:02:44,480 Speaker 1: Most recently, the biggest article, the biggest like viral hype article, 43 00:02:44,680 --> 00:02:48,760 Speaker 1: was that the Pentagon had supposedly been testing an AI, 44 00:02:49,520 --> 00:02:53,959 Speaker 1: like, missile system that blew up its operator in a 45 00:02:54,040 --> 00:02:57,240 Speaker 1: simulation because the operator was trying to stop it from, 46 00:02:57,320 --> 00:03:00,600 Speaker 1: from firing or whatever. It was bullshit, like what was that? 47 00:03:00,680 --> 00:03:03,240 Speaker 1: What actually happened? Like, Vice ran with the article. It 48 00:03:03,320 --> 00:03:06,160 Speaker 1: was very breathless, as Vice would do, this flipping out about 49 00:03:06,160 --> 00:03:09,639 Speaker 1: how horrifying, you know, our AI weapons future is, and like, yeah, 50 00:03:09,680 --> 00:03:12,560 Speaker 1: we shouldn't give AI the ability to, like, kill people, 51 00:03:12,560 --> 00:03:14,880 Speaker 1: but that's not at all what happened. Basically, a bunch 52 00:03:14,919 --> 00:03:17,760 Speaker 1: of army nerds or air force nerds were sitting around 53 00:03:17,760 --> 00:03:20,120 Speaker 1: a table doing the D and D version of, like, 54 00:03:20,480 --> 00:03:23,240 Speaker 1: military planning, where you say, what if we did this, 55 00:03:23,520 --> 00:03:26,560 Speaker 1: what kinds of things could happen if we built this system? 56 00:03:26,880 --> 00:03:29,320 Speaker 1: And another guy around the table said, oh, well, if 57 00:03:29,320 --> 00:03:32,200 Speaker 1: we build the system this way, it might conceivably attack 58 00:03:32,280 --> 00:03:35,440 Speaker 1: its operator, you know, in order to optimize for this 59 00:03:35,560 --> 00:03:40,120 Speaker 1: kind of result, which is like not scary. Like, it's, 60 00:03:40,280 --> 00:03:42,960 Speaker 1: it's just people talking through, like, a flow chart 61 00:03:43,000 --> 00:03:45,160 Speaker 1: of possibilities around a fucking table. You don't need to 62 00:03:45,160 --> 00:03:47,600 Speaker 1: worry about that. There's so many other things to worry about. 63 00:03:47,960 --> 00:03:50,640 Speaker 1: New York City is blanketed in a layer of smog 64 00:03:50,640 --> 00:03:52,640 Speaker 1: so thick you could cut it with a butter knife. Like, 65 00:03:52,920 --> 00:03:57,920 Speaker 1: don't, don't flip out about AI weapons just yet, folks.
66 00:03:58,160 --> 00:04:00,360 Speaker 1: But I wanted to kind of talk about why this 67 00:04:00,400 --> 00:04:02,160 Speaker 1: shit is happening, and a lot of it comes down 68 00:04:02,200 --> 00:04:05,640 Speaker 1: to the fact that when we're talking about the aspects 69 00:04:05,640 --> 00:04:08,120 Speaker 1: of, like, the tech industry that have an impact 70 00:04:08,640 --> 00:04:12,400 Speaker 1: outside of the tech industry, right, there's basically three jobs 71 00:04:12,400 --> 00:04:16,280 Speaker 1: in big tech. One job is creating iterative improvements on 72 00:04:16,360 --> 00:04:18,560 Speaker 1: existing products. These would be the teams of folks who 73 00:04:18,600 --> 00:04:21,560 Speaker 1: are responsible for designing a new iPhone every year, right, 74 00:04:21,800 --> 00:04:24,680 Speaker 1: every couple of years. Lenovo puts out a new series 75 00:04:24,720 --> 00:04:27,159 Speaker 1: of ThinkPads and IdeaPads every couple of years. 76 00:04:27,240 --> 00:04:30,520 Speaker 1: You know, you get a new MacBook every couple of years. 77 00:04:30,640 --> 00:04:33,919 Speaker 1: Razer puts out a new Blade. This is, you know, 78 00:04:33,960 --> 00:04:37,120 Speaker 1: these are the folks who kind of move along technology 79 00:04:37,160 --> 00:04:42,080 Speaker 1: at a relatively, like, steady pace for consumer devices. And 80 00:04:42,120 --> 00:04:45,760 Speaker 1: then you have the people who are responsible for kind 81 00:04:45,800 --> 00:04:48,719 Speaker 1: of what you might call the moonshot products. This is 82 00:04:48,760 --> 00:04:53,280 Speaker 1: a mix of the next big thing and doomed failures, 83 00:04:53,600 --> 00:04:55,520 Speaker 1: and it's often pretty hard to tell, you know, what's 84 00:04:55,560 --> 00:04:57,600 Speaker 1: going to be what ahead of time. A very good 85 00:04:57,600 --> 00:05:00,360 Speaker 1: example would be, back in the nineties, Apple put a bunch 86 00:05:00,360 --> 00:05:03,080 Speaker 1: of resources into launching an early tablet computer called the 87 00:05:03,120 --> 00:05:06,279 Speaker 1: Newton that was a fabulous disaster. And then in the 88 00:05:06,279 --> 00:05:08,440 Speaker 1: mid-aughts they put a bunch of resources into launching 89 00:05:08,480 --> 00:05:12,560 Speaker 1: the iPad, which was a huge success. And when you 90 00:05:12,640 --> 00:05:15,800 Speaker 1: kind of think about, like, the folks doing this, like, 91 00:05:15,880 --> 00:05:20,279 Speaker 1: working on the moonshot products, the most recent example would 92 00:05:20,320 --> 00:05:22,159 Speaker 1: be whatever team at Apple, the team at Apple that 93 00:05:22,279 --> 00:05:25,479 Speaker 1: was behind putting together these new Apple goggles, which I 94 00:05:25,520 --> 00:05:28,400 Speaker 1: don't think are going to be a wildly successful product 95 00:05:28,480 --> 00:05:30,080 Speaker 1: in the way that they need it to be, like 96 00:05:30,120 --> 00:05:33,240 Speaker 1: a smartphone-scale success. But this is an example of, 97 00:05:33,279 --> 00:05:35,240 Speaker 1: like, a thing that didn't exist, and a bunch of 98 00:05:35,240 --> 00:05:37,960 Speaker 1: people had to invent new technologies or new ways to 99 00:05:37,960 --> 00:05:41,599 Speaker 1: combine technologies in order to make it exist.
The third 100 00:05:41,760 --> 00:05:45,640 Speaker 1: kind of job that the tech industry has, broadly speaking, 101 00:05:46,120 --> 00:05:49,919 Speaker 1: are con men, right. And the state that we are in, 102 00:05:49,920 --> 00:05:52,560 Speaker 1: in the industry right now, is that every major tech 103 00:05:52,560 --> 00:05:55,400 Speaker 1: company is run by some form of con man, right. 104 00:05:56,279 --> 00:06:00,320 Speaker 1: Tim Cook is, you know, kind of the least con man 105 00:06:00,480 --> 00:06:03,960 Speaker 1: of the con men among them. But like Mark Zuckerberg obviously 106 00:06:04,560 --> 00:06:07,680 Speaker 1: is a fucking flim flam artist, you know, and you 107 00:06:07,680 --> 00:06:10,279 Speaker 1: can see this with the huge amount of money, like, 108 00:06:10,320 --> 00:06:12,840 Speaker 1: it's something like eleven billion dollars at least, that Facebook 109 00:06:12,839 --> 00:06:16,800 Speaker 1: pumped into this bullshit metaverse scheme that, like, Apple barely 110 00:06:16,880 --> 00:06:21,800 Speaker 1: even talked about during their event unveiling, like, a headset 111 00:06:21,839 --> 00:06:25,440 Speaker 1: that has VR potential in it. I'm getting away from 112 00:06:25,440 --> 00:06:27,240 Speaker 1: myself here. Kind of the point that I'm making is 113 00:06:27,240 --> 00:06:30,640 Speaker 1: that you can often have very real products, there's actual 114 00:06:30,680 --> 00:06:36,640 Speaker 1: technology going into the Apple glasses, marketed by con men, flim 115 00:06:36,640 --> 00:06:38,920 Speaker 1: flam artists. This is not always, like, a bad thing, right. 116 00:06:38,960 --> 00:06:41,599 Speaker 1: Steve Jobs was a con man, and it worked out 117 00:06:41,640 --> 00:06:44,760 Speaker 1: pretty well for him, because it just so happened that, 118 00:06:44,800 --> 00:06:46,960 Speaker 1: the tech, he had a decent enough idea of 119 00:06:47,000 --> 00:06:49,880 Speaker 1: what the tech was capable of that it was able 120 00:06:49,920 --> 00:06:53,159 Speaker 1: to kind of meet the promises he was making in 121 00:06:53,200 --> 00:06:55,920 Speaker 1: more or less real time. An example of what happens, 122 00:06:56,000 --> 00:06:58,400 Speaker 1: you know, pretty spectacularly when that's not the case is 123 00:06:58,440 --> 00:07:01,159 Speaker 1: what we saw with Theranos and Elizabeth Holmes, who started 124 00:07:01,160 --> 00:07:04,600 Speaker 1: prison last week, right. You've got these promises being made 125 00:07:04,600 --> 00:07:06,760 Speaker 1: by the con man and the people who are responsible 126 00:07:06,760 --> 00:07:10,760 Speaker 1: for the moonshots can't make it work.
I'm bringing this 127 00:07:10,840 --> 00:07:13,560 Speaker 1: up right now because there's a lot of folks, I 128 00:07:13,640 --> 00:07:17,720 Speaker 1: think, who believe that the, the innate, like the actual 129 00:07:17,760 --> 00:07:21,200 Speaker 1: potential of AI has been proven in a spectacular way 130 00:07:21,680 --> 00:07:24,600 Speaker 1: because the tools that have been released are able to 131 00:07:24,640 --> 00:07:28,360 Speaker 1: do cool things, and I think those people are missing 132 00:07:28,400 --> 00:07:32,400 Speaker 1: some key aspect, like, some key things that, like, might 133 00:07:32,600 --> 00:07:36,160 Speaker 1: cause one to think more critically about the actual potential 134 00:07:36,200 --> 00:07:38,160 Speaker 1: the industry has, and also might cause one to think 135 00:07:38,160 --> 00:07:42,640 Speaker 1: more critically about how earth shattering it's all going to be. 136 00:07:42,680 --> 00:07:45,240 Speaker 1: It's being taken kind of as read right now by 137 00:07:45,280 --> 00:07:50,080 Speaker 1: a lot of, particularly, journalists and media analysts outside of 138 00:07:50,400 --> 00:07:53,080 Speaker 1: the tech industry, or, like, outside of, you know, the 139 00:07:53,160 --> 00:07:56,200 Speaker 1: dogged tech press, that, like, well, this is going to 140 00:07:56,280 --> 00:07:58,880 Speaker 1: upend huge numbers of industries and put massive numbers of 141 00:07:58,880 --> 00:08:01,560 Speaker 1: people out of work. And you know, that may seem, 142 00:08:01,720 --> 00:08:03,760 Speaker 1: if you sat down in front of this chatbot and 143 00:08:03,840 --> 00:08:06,960 Speaker 1: had, like, a mind-blowing experience, that may seem credible. 144 00:08:07,960 --> 00:08:11,280 Speaker 1: There's not the evidence behind that yet. If you actually 145 00:08:11,360 --> 00:08:13,920 Speaker 1: look at the numbers behind some of these different companies, 146 00:08:13,960 --> 00:08:17,480 Speaker 1: and, like, how their usership has grown and how it's 147 00:08:17,480 --> 00:08:19,600 Speaker 1: fallen off, one of the things you've seen is that 148 00:08:19,680 --> 00:08:22,760 Speaker 1: a lot of these tools had this kind of massive 149 00:08:23,760 --> 00:08:26,120 Speaker 1: surge, peak, in terms of the number of people adopting 150 00:08:26,160 --> 00:08:28,080 Speaker 1: them and in terms of their profitability. You saw this 151 00:08:28,160 --> 00:08:31,080 Speaker 1: with, like, Stable Diffusion, right, and then this kind of 152 00:08:32,080 --> 00:08:36,720 Speaker 1: fairly rapid fall afterwards, not because people are, like, giving 153 00:08:36,720 --> 00:08:39,679 Speaker 1: it up forever or whatever, but because, like, once you've 154 00:08:39,880 --> 00:08:42,840 Speaker 1: fucked around with it and generated some images or generated 155 00:08:42,840 --> 00:08:45,679 Speaker 1: some stories, there's not a huge amount to do unless 156 00:08:45,760 --> 00:08:48,280 Speaker 1: you're someone who's specifically going to be using this for 157 00:08:48,360 --> 00:08:50,600 Speaker 1: your job. And most of the people that wanted to 158 00:08:50,600 --> 00:08:53,520 Speaker 1: fuck around with a lot of these apps didn't have 159 00:08:53,679 --> 00:08:57,040 Speaker 1: long-term use cases for them.
This is why, while 160 00:08:57,080 --> 00:09:00,680 Speaker 1: you've got, like, for example, Stability, which is the company, 161 00:09:00,800 --> 00:09:04,400 Speaker 1: or at least the main company, behind Stable Diffusion, has 162 00:09:04,440 --> 00:09:07,480 Speaker 1: been valued at like four billion dollars, I think, the last 163 00:09:07,520 --> 00:09:10,720 Speaker 1: time it was checked, but their annualized revenue is only about 164 00:09:10,760 --> 00:09:15,840 Speaker 1: ten million dollars. So that's a pretty significant gap. And 165 00:09:15,880 --> 00:09:19,480 Speaker 1: it's a pretty significant gap because the actual money in 166 00:09:19,559 --> 00:09:23,200 Speaker 1: AI so far isn't with the service providers, really. Like, 167 00:09:23,520 --> 00:09:25,040 Speaker 1: you've got some that have made in, like, the one 168 00:09:25,080 --> 00:09:28,280 Speaker 1: hundred million dollar range, although it's not entirely clear what 169 00:09:28,320 --> 00:09:31,240 Speaker 1: their margins are or what the kind of, the long-term 170 00:09:31,320 --> 00:09:34,559 Speaker 1: reliability of that profit is. But the vast majority 171 00:09:34,600 --> 00:09:37,240 Speaker 1: of money in AI, like almost all of it, has 172 00:09:37,240 --> 00:09:40,160 Speaker 1: been made by companies like Nvidia. Nvidia jumped up 173 00:09:40,160 --> 00:09:42,560 Speaker 1: to become, like, a trillion dollar company as a result 174 00:09:42,600 --> 00:09:49,240 Speaker 1: of this, because the hardware needs of these products are 175 00:09:49,280 --> 00:09:54,240 Speaker 1: so intense, and obviously that shows there's money here for somebody. 176 00:09:54,240 --> 00:09:56,360 Speaker 1: But the fact that, like, a shitload of people got 177 00:09:56,400 --> 00:10:00,520 Speaker 1: curious about these apps and used them in quick 178 00:10:00,559 --> 00:10:03,840 Speaker 1: succession and then kind of dropped off isn't evidence that, 179 00:10:04,000 --> 00:10:07,880 Speaker 1: like, we're seeing entire industries replaced as much as it 180 00:10:07,920 --> 00:10:10,480 Speaker 1: is evidence that, like, a lot of people 181 00:10:10,520 --> 00:10:15,000 Speaker 1: thought this was interesting briefly. And so I think, kind 182 00:10:15,000 --> 00:10:17,360 Speaker 1: of, when you look at the data, one of the 183 00:10:17,360 --> 00:10:19,960 Speaker 1: things that suggests is that we're heading towards a point 184 00:10:20,200 --> 00:10:22,880 Speaker 1: in AI, and I think we're probably going to hit 185 00:10:22,920 --> 00:10:24,840 Speaker 1: it within the next six months to a year, that 186 00:10:24,960 --> 00:10:27,520 Speaker 1: is broadly referred to as, like, the trough of disillusionment. 187 00:10:28,360 --> 00:10:31,600 Speaker 1: And this is what happens when kind of the promises 188 00:10:31,720 --> 00:10:33,880 Speaker 1: of a new technology that are being made by the 189 00:10:33,920 --> 00:10:36,240 Speaker 1: hype men, or con men as I tend to call them, 190 00:10:36,640 --> 00:10:40,400 Speaker 1: meet with, like, the actual reality of its execution, which 191 00:10:40,400 --> 00:10:42,880 Speaker 1: in some areas is going to be significant. There are places, 192 00:10:42,920 --> 00:10:45,719 Speaker 1: I think medical research may be one of them.
We'll talk 193 00:11:45,720 --> 00:11:47,600 Speaker 1: about that in a bit, where a lot of the 194 00:11:47,600 --> 00:11:52,000 Speaker 1: promises people are making about AI will be fairly quickly realized, 195 00:11:52,040 --> 00:11:54,040 Speaker 1: and then there are areas where it won't be. I 196 00:11:54,080 --> 00:11:57,640 Speaker 1: think content generation is one of those things. But yeah, 197 00:11:57,880 --> 00:12:00,760 Speaker 1: so that's kind of like what I'm seeing when I'm 198 00:12:00,760 --> 00:12:04,120 Speaker 1: looking at the broad strokes of where this technology is 199 00:12:04,160 --> 00:12:06,319 Speaker 1: here, and kind of the gap between how people are 200 00:12:06,320 --> 00:12:08,880 Speaker 1: talking about it and what we're actually seeing in terms 201 00:12:08,880 --> 00:12:21,160 Speaker 1: of monetization. I want to talk a little bit now 202 00:12:21,240 --> 00:12:24,040 Speaker 1: about kind of one of the guys, I would call 203 00:12:24,120 --> 00:12:26,120 Speaker 1: him kind of a con man, who's been a big 204 00:12:26,200 --> 00:12:29,640 Speaker 1: driver of the current AI push. He's a dude named 205 00:12:29,640 --> 00:12:34,160 Speaker 1: Emad Mostaque, and he's the founder of Stability AI, right, 206 00:12:34,200 --> 00:12:37,000 Speaker 1: the company behind Stable Diffusion, which is a text-to-image generator that was kind 207 00:12:37,040 --> 00:12:39,920 Speaker 1: of, like, before ChatGPT hit, this was like the 208 00:12:39,920 --> 00:12:44,800 Speaker 1: first really, really big mainstream AI thing. ChatGPT was 209 00:12:45,280 --> 00:12:49,440 Speaker 1: a lot larger, but Stable Diffusion came first, and, you know, 210 00:12:49,640 --> 00:12:51,839 Speaker 1: was critical behind, among other things, a lot of the 211 00:12:52,160 --> 00:12:58,320 Speaker 1: silliest NFT bullshit. And he's a really interesting dude. Like, 212 00:12:58,360 --> 00:13:00,920 Speaker 1: if you look at kind of his own claims about his background, 213 00:13:01,640 --> 00:13:04,440 Speaker 1: he says that he's got an Oxford master's degree, that 214 00:13:04,520 --> 00:13:07,960 Speaker 1: he was, like, behind an award-winning hedge fund, 215 00:13:08,480 --> 00:13:10,800 Speaker 1: that he, like, worked for the United Nations in a 216 00:13:10,840 --> 00:13:14,839 Speaker 1: really important capacity, and also that he obviously founded this, 217 00:13:14,840 --> 00:13:17,920 Speaker 1: this AI bot. None of that's true. He has 218 00:13:17,960 --> 00:13:21,200 Speaker 1: a bachelor's degree from Oxford, not a master's degree. 219 00:13:21,480 --> 00:13:24,280 Speaker 3: He did. Well, that's what he's playing off, a thing 220 00:13:24,360 --> 00:13:27,640 Speaker 3: that happens where, like, you can, you can get, if 221 00:13:27,640 --> 00:13:29,800 Speaker 3: you have a BA at Oxford, you can, you can 222 00:13:29,840 --> 00:13:31,520 Speaker 3: get it to be an MA. Doesn't mean you did 223 00:13:31,559 --> 00:13:36,840 Speaker 3: a master's. It's just a wealthy people flex. Yeah, it's 224 00:13:37,080 --> 00:13:38,760 Speaker 3: not a master's degree. You shouldn't quote it as that. If 225 00:13:38,760 --> 00:13:40,040 Speaker 3: you're quoting it now, you're taking the piss. 226 00:13:40,559 --> 00:13:42,720 Speaker 1: Yeah.
Yeah, he's taking the piss, knowing no one's gonna 227 00:13:42,720 --> 00:13:45,239 Speaker 1: call him on it, or at least knowing that people wouldn't, 228 00:13:45,320 --> 00:13:48,040 Speaker 1: like, at large, like, loudly enough for it to matter 229 00:13:48,120 --> 00:13:52,199 Speaker 1: for him. He hasn't worked with the UN in quite 230 00:13:52,200 --> 00:13:55,320 Speaker 1: some time, and never did in a major capacity. He 231 00:13:55,400 --> 00:13:58,040 Speaker 1: did run a hedge fund that was successful in its 232 00:13:58,080 --> 00:13:59,920 Speaker 1: first year, but then got shut down 233 00:14:00,280 --> 00:14:04,080 Speaker 1: because he lost everybody's money. So, like, this is, 234 00:14:04,160 --> 00:14:05,720 Speaker 1: this is what you see with this guy. If you 235 00:14:05,720 --> 00:14:08,199 Speaker 1: go through his, like, history, he's, like, he's like chasing 236 00:14:08,240 --> 00:14:11,760 Speaker 1: hedge funds in the early aughts. He first gets in 237 00:14:11,880 --> 00:14:14,840 Speaker 1: with Stable Diffusion after COVID, and he's kind of, like, 238 00:14:14,880 --> 00:14:17,160 Speaker 1: billing it as, this is gonna help with, like, research 239 00:14:17,280 --> 00:14:20,720 Speaker 1: into trying to, like, you know, fight the COVID nineteen pandemic, 240 00:14:21,880 --> 00:14:24,120 Speaker 1: and then he kind of pivots to, like, oh, this 241 00:14:24,240 --> 00:14:26,400 Speaker 1: is a great way to, like, make NFTs and shit, 242 00:14:26,520 --> 00:14:28,360 Speaker 1: you know, when that hit. Like, he's, he's just sort 243 00:14:28,400 --> 00:14:31,880 Speaker 1: of, like, chasing where the money is. Yeah, any way 244 00:14:31,920 --> 00:14:34,640 Speaker 1: he kind of can. And he's not, by the way, 245 00:14:34,679 --> 00:14:36,280 Speaker 1: he's not the guy who wrote any of the source 246 00:14:36,320 --> 00:14:38,360 Speaker 1: code for this. That was done by, like, a group 247 00:14:38,440 --> 00:14:42,840 Speaker 1: of researchers, and he, you know, he essentially, like, acquired it, 248 00:14:43,320 --> 00:14:45,600 Speaker 1: which is usually what happens here. Now, none of this 249 00:14:45,640 --> 00:14:48,240 Speaker 1: has stopped him from getting one hundred million dollars or 250 00:14:48,320 --> 00:14:53,840 Speaker 1: so in investments from various venture partners, and hasn't stopped 251 00:14:53,840 --> 00:14:56,439 Speaker 1: his company from getting this massive valuation. It hasn't stopped 252 00:14:56,440 --> 00:15:00,920 Speaker 1: the White House from inviting him to talk as part 253 00:15:00,960 --> 00:15:04,640 Speaker 1: of, like, a federal AI safety initiative. But it is 254 00:15:04,679 --> 00:15:06,640 Speaker 1: one of those, like, when I kind of look into 255 00:15:06,679 --> 00:15:09,520 Speaker 1: this guy and kind of the gap between his claims 256 00:15:09,559 --> 00:15:12,800 Speaker 1: and what's actually happened, and the claims that are being 257 00:15:12,840 --> 00:15:14,960 Speaker 1: made about the value of his company and what it's 258 00:15:15,000 --> 00:15:17,760 Speaker 1: actually, like, proved to be worth so far, I think 259 00:15:17,760 --> 00:15:20,560 Speaker 1: a lot about Sam Bankman-Fried, because a lot of, 260 00:15:20,600 --> 00:15:23,680 Speaker 1: like, the early writing around this guy was similar, and 261 00:15:23,720 --> 00:15:25,840 Speaker 1: a lot of the kind of shit that he's claiming 262 00:15:26,080 --> 00:15:29,560 Speaker 1: is similar.
And yeah, I'm not sure if this is 263 00:14:29,600 --> 00:14:31,800 Speaker 1: a case where, because Bankman-Fried is one of these 264 00:14:31,920 --> 00:14:36,240 Speaker 1: people who, like Elizabeth Holmes, I think, backed the wrong 265 00:14:36,360 --> 00:14:40,920 Speaker 1: technology, because it's fine in Silicon Valley, it's fine, generally 266 00:14:40,920 --> 00:14:43,440 Speaker 1: speaking, in capitalism, to lie about what a product can 267 00:14:43,480 --> 00:14:46,600 Speaker 1: do if you can, you know, fake it till you 268 00:14:46,640 --> 00:14:49,400 Speaker 1: make it. And maybe AI is there. He may have, 269 00:14:49,440 --> 00:14:52,120 Speaker 1: this guy may have made a good bet as to 270 00:14:52,160 --> 00:14:55,600 Speaker 1: the future, but that's kind of far from certain yet. 271 00:14:55,640 --> 00:14:59,280 Speaker 1: And it's, it's just really clear how much of this 272 00:14:59,400 --> 00:15:02,640 Speaker 1: industry is being built on, or is being built by, 273 00:15:02,680 --> 00:15:05,040 Speaker 1: how many of the people running sort of these AI 274 00:15:05,160 --> 00:15:08,760 Speaker 1: companies are dudes who managed one way or another, either 275 00:15:08,800 --> 00:15:12,760 Speaker 1: through access to VC funding or kind of, like, you know, 276 00:15:12,880 --> 00:15:14,600 Speaker 1: just being in the right place at the right time, 277 00:15:15,240 --> 00:15:18,560 Speaker 1: to jump in on the bandwagon in the hopes that 278 00:15:18,560 --> 00:15:20,640 Speaker 1: they'll be able to cash out very, very quickly. I 279 00:15:20,680 --> 00:15:23,720 Speaker 1: found a good quote from a Forbes article talking about, 280 00:15:23,760 --> 00:15:28,400 Speaker 1: like, a big part of why guys like Mostaque are 281 00:15:28,480 --> 00:15:31,720 Speaker 1: so interested in AI right now from a financial perspective. 282 00:15:32,640 --> 00:15:34,640 Speaker 1: And this is true, not just, this was true about, 283 00:15:34,680 --> 00:15:38,840 Speaker 1: like, crypto before, but with AI, because there's more to the technology, 284 00:15:39,960 --> 00:15:44,560 Speaker 1: this is kind of even more valid. Quote: Venture 285 00:15:44,600 --> 00:15:47,960 Speaker 1: capitalists historically spend months performing due diligence, a process that 286 00:15:48,000 --> 00:15:50,760 Speaker 1: involves analyzing the market, vetting the founder, and speaking to 287 00:15:50,760 --> 00:15:53,640 Speaker 1: customers to check for red flags before investing in a startup. 288 00:15:54,000 --> 00:15:56,600 Speaker 1: But start to finish, Mostaque told Forbes, he needed just 289 00:15:56,640 --> 00:15:59,040 Speaker 1: six days to secure one hundred million dollars from leading 290 00:15:59,080 --> 00:16:02,480 Speaker 1: investment firms Coatue and Lightspeed once Stable Diffusion 291 00:16:02,520 --> 00:16:05,120 Speaker 1: went viral. The extent of due diligence the firms 292 00:16:05,160 --> 00:16:08,080 Speaker 1: performed is unclear given the speed of the investment. The 293 00:16:08,080 --> 00:16:10,520 Speaker 1: investment thesis we had is that we don't know exactly 294 00:16:10,560 --> 00:16:12,480 Speaker 1: what all the use cases will be, but we know 295 00:16:12,560 --> 00:16:15,080 Speaker 1: that this technology is truly transformative and has reached a 296 00:16:15,080 --> 00:16:17,880 Speaker 1: tipping point in terms of what it can do.
Gaurav Gupta, 297 00:16:17,920 --> 00:16:20,280 Speaker 1: the Lightspeed partner who led the investment, told Forbes 298 00:16:20,320 --> 00:16:23,040 Speaker 1: in a January interview. So again, they're being like, yeah, 299 00:16:23,040 --> 00:16:25,720 Speaker 1: we're pumping tens of millions of dollars into this. 300 00:16:25,840 --> 00:16:28,680 Speaker 1: We don't know how it'll make money. It just seems 301 00:16:28,720 --> 00:16:31,880 Speaker 1: so impressive that it has to be profitable. Now, that 302 00:16:32,080 --> 00:16:37,280 Speaker 1: line is particularly funny, maybe the wrong word, when compared 303 00:16:37,360 --> 00:16:40,760 Speaker 1: alongside this paragraph from later in the article. In an 304 00:16:40,760 --> 00:16:44,560 Speaker 1: open letter last September, Democratic Representative Anna Eshoo urged action 305 00:16:44,640 --> 00:16:47,360 Speaker 1: in Washington against the open source nature of Stable Diffusion. 306 00:16:47,600 --> 00:16:49,720 Speaker 1: The model, she wrote, had been used to generate images 307 00:16:49,760 --> 00:16:52,800 Speaker 1: of violently beaten Asian women and pornography, some of which 308 00:16:52,800 --> 00:16:56,440 Speaker 1: portrays real people. Bashara said new versions of Stable Diffusion 309 00:16:56,440 --> 00:17:00,440 Speaker 1: filtered data for potentially unsafe content, helping to prevent users 310 00:17:00,480 --> 00:17:04,119 Speaker 1: from generating harmful images in the first place. So it's like, 311 00:17:06,359 --> 00:17:11,000 Speaker 1: part of what's happening here is you've got this thing 312 00:17:11,119 --> 00:17:14,560 Speaker 1: that seems really impressive, and that is, to some extent, 313 00:17:14,560 --> 00:17:17,399 Speaker 1: because it's able to, like, remix stuff that exists in 314 00:17:17,440 --> 00:17:21,120 Speaker 1: a way that hasn't been done automatically before. But all 315 00:17:21,119 --> 00:17:23,560 Speaker 1: of these kind of valuations are based, number one, on 316 00:17:23,680 --> 00:17:28,000 Speaker 1: ignoring the problems with monetizing this stuff, including, like, the 317 00:17:28,080 --> 00:17:30,800 Speaker 1: still very much unsorted nature of how copyright's going to 318 00:17:30,800 --> 00:17:34,040 Speaker 1: affect this, and also, like, the question of, is this 319 00:17:34,160 --> 00:17:39,399 Speaker 1: really worth that much money? Like, is this actually, is 320 00:17:39,440 --> 00:17:43,639 Speaker 1: being able to generate kind of weird, slightly off-putting 321 00:17:43,680 --> 00:17:50,119 Speaker 1: AI images a huge business? Like, how much of, because, 322 00:17:50,200 --> 00:17:51,800 Speaker 1: like, from where I'm seeing it, one of two things 323 00:17:51,840 --> 00:17:55,439 Speaker 1: is possible. Number one, this replaces all art everywhere, and 324 00:17:55,520 --> 00:17:57,960 Speaker 1: so there's a shitload of money in it. Or number two, 325 00:17:58,480 --> 00:18:01,600 Speaker 1: this remains a way that, like, low-quality websites and, 326 00:18:02,000 --> 00:18:05,760 Speaker 1: like, Amazon dropship scammers who are, like, putting up 327 00:18:05,800 --> 00:18:09,600 Speaker 1: fake books on Kindle and whatnot to trick people using 328 00:18:09,640 --> 00:18:12,639 Speaker 1: keywords, like, that this is just, like, a way to 329 00:18:12,680 --> 00:18:15,560 Speaker 1: fill that shit out. Like, I don't see a whole 330 00:18:15,560 --> 00:18:17,680 Speaker 1: lot of room in the middle there.
You know, maybe 331 00:18:17,680 --> 00:18:21,240 Speaker 1: I'm being, like, overly pessimistic there, but that's, that's, that's 332 00:18:21,280 --> 00:18:22,080 Speaker 1: where I'm sitting. 333 00:18:22,600 --> 00:18:25,639 Speaker 2: I mean, some of the models we've seen used are 334 00:18:25,800 --> 00:18:30,439 Speaker 2: selling, like, subscription packs for, like, access to these tools 335 00:18:30,480 --> 00:18:33,560 Speaker 2: and access to use them for, like, commercial reasons. The 336 00:18:33,640 --> 00:18:36,680 Speaker 2: other thing we can see is just, like, corporations selling 337 00:18:36,680 --> 00:18:40,560 Speaker 2: to other corporations, like, basically having Disney and Warner Brothers 338 00:18:40,600 --> 00:18:42,840 Speaker 2: be able to use this to generate concept art, and 339 00:18:42,960 --> 00:18:45,280 Speaker 2: now they don't need to pay concept artists, and instead 340 00:18:45,359 --> 00:18:49,399 Speaker 2: they just have, like, pretty, pretty, uh, pretty, like, nicely 341 00:18:49,480 --> 00:18:53,480 Speaker 2: curated tools for them to generate this type of art. Yeah, 342 00:18:53,640 --> 00:18:56,360 Speaker 2: I'm imagining those are kind of two of the biggest 343 00:18:56,480 --> 00:18:59,440 Speaker 2: use cases that, at least, I'm seeing right now from, 344 00:19:00,200 --> 00:19:04,080 Speaker 2: more, on, like, the creative filmmaking art side of things. 345 00:19:04,800 --> 00:19:06,320 Speaker 2: Because, I mean, I don't think it's going to replace 346 00:19:06,359 --> 00:19:10,959 Speaker 2: all, all art. I think nobody, nobody is actually, uh, 347 00:19:11,400 --> 00:19:13,680 Speaker 2: is actually thinking it's just going to replace all, all art, 348 00:19:13,760 --> 00:19:16,159 Speaker 2: just like photography did not replace all art. It just, 349 00:19:16,200 --> 00:19:19,520 Speaker 2: it changes the paradigm. And because this, this tool does 350 00:19:19,560 --> 00:19:24,119 Speaker 2: seem, like, specifically useful for the, for the way that 351 00:19:24,160 --> 00:19:27,560 Speaker 2: we're seeing, like, corporations make the same movie every 352 00:19:27,600 --> 00:19:29,880 Speaker 2: five years, like, it's all, it's, it's, it's, it's all 353 00:19:29,880 --> 00:19:32,480 Speaker 2: built on all of the same stuff. And I think 354 00:19:32,480 --> 00:19:34,160 Speaker 2: that that's how a lot of, a lot of it's 355 00:19:34,160 --> 00:19:35,840 Speaker 2: gonna get used. It's gonna be a lot of weird 356 00:19:35,840 --> 00:19:40,080 Speaker 2: scam artists, people just messing around for fun, and then 357 00:19:40,320 --> 00:19:43,640 Speaker 2: people not paying, like, illustrators as much. 358 00:19:43,600 --> 00:19:47,040 Speaker 1: Like, yeah, and I think that's kind of, like, I 359 00:19:47,080 --> 00:19:49,760 Speaker 1: see this being adopted widely, but that's not the same 360 00:19:50,040 --> 00:19:54,400 Speaker 1: as it, like, being a huge success. Like, right now 361 00:19:54,400 --> 00:19:57,119 Speaker 1: I'm looking at an article that's estimating the current value 362 00:19:57,119 --> 00:19:59,000 Speaker 1: of AI in the US is at one hundred billion 363 00:19:59,040 --> 00:20:01,800 Speaker 1: dollars and that, by twenty thirty, it'll be worth two 364 00:20:01,800 --> 00:20:05,400 Speaker 1: trillion US dollars. And it's like, I don't know, man, 365 00:20:05,520 --> 00:20:06,360 Speaker 1: like, is.
366 00:20:06,400 --> 00:20:08,920 Speaker 2: I mean, AI is more than just, like, Midjourney 367 00:20:08,960 --> 00:20:12,720 Speaker 2: image creation, right? There is, just, like, OpenAI 368 00:20:12,840 --> 00:20:16,000 Speaker 2: and ChatGPT, and, like, AI is in everything we 369 00:20:16,160 --> 00:20:18,520 Speaker 2: use now. Like, yeah, like, AI is in your smartphone. 370 00:20:18,520 --> 00:20:20,880 Speaker 2: AI is going to be in your refrigerator soon. It's, 371 00:20:20,880 --> 00:20:23,600 Speaker 2: like, it's not just image generation by any means. 372 00:20:23,760 --> 00:20:25,600 Speaker 1: That kind of gets to what I'm saying, because that's, 373 00:20:25,760 --> 00:20:29,119 Speaker 1: that's, when you look at it, AI as a tool is 374 00:20:29,160 --> 00:20:30,800 Speaker 1: more of, like, a paintbrush than a painter. It's 375 00:20:30,840 --> 00:20:33,199 Speaker 1: a tool that will, like, augment or be used in, 376 00:20:33,240 --> 00:20:34,959 Speaker 1: because I think a lot of, a number of times 377 00:20:35,320 --> 00:20:36,800 Speaker 1: it may be used in a way that makes the 378 00:20:36,840 --> 00:20:41,000 Speaker 1: product worse, a lot of existing technologies. Well, that's 379 00:20:41,000 --> 00:20:44,520 Speaker 1: really different from kind of, number one, the doom and gloom, 380 00:20:44,600 --> 00:20:47,440 Speaker 1: like, this is an intelligence on its own that could, 381 00:20:47,480 --> 00:20:51,040 Speaker 1: like, overtake humanity. I think the worry is more, like, 382 00:20:51,800 --> 00:20:54,480 Speaker 1: this could get adopted on such a large scale 383 00:20:54,520 --> 00:20:56,239 Speaker 1: that it, like, makes a lot of shit worse. Like, 384 00:20:56,400 --> 00:20:59,040 Speaker 1: my biggest fear with AI is that it kind of 385 00:20:59,119 --> 00:21:02,119 Speaker 1: hypercharges the SEO industry and the way that that 386 00:21:02,240 --> 00:21:04,919 Speaker 1: has worked to destroy search and destroy so much of 387 00:21:04,960 --> 00:21:05,719 Speaker 1: Internet content. 388 00:21:06,040 --> 00:21:08,560 Speaker 3: Yeah, I think that is very possible. Like, if I 389 00:21:08,600 --> 00:21:11,560 Speaker 3: look at ChatGPT, like, I don't think that's going 390 00:21:11,600 --> 00:21:14,639 Speaker 3: to be writing features for Rolling Stone anytime soon, but 391 00:21:15,080 --> 00:21:19,720 Speaker 3: what it can probably do, because SEO-maxed copy is derivative, right, 392 00:21:19,760 --> 00:21:23,080 Speaker 3: like, like, it's predictable, it's derivative, it's based on other stuff. 393 00:21:22,800 --> 00:21:23,480 Speaker 1: It's supposed to be. 394 00:21:23,840 --> 00:21:26,600 Speaker 3: Yeah, and so it can do that SEO-maxed copy 395 00:21:26,760 --> 00:21:30,360 Speaker 3: and some of that ad copy, like, very well, and, yeah, 396 00:21:30,560 --> 00:21:33,959 Speaker 3: either really fuck up searches, which is quite possible, and 397 00:21:34,040 --> 00:21:37,800 Speaker 3: also make the lowest kind of acceptable tier of that 398 00:21:37,920 --> 00:21:42,439 Speaker 3: kind of copy what it can generate.
And because you 399 00:21:42,440 --> 00:21:43,919 Speaker 3: can just shove that copy in front of people with 400 00:21:44,040 --> 00:21:45,960 Speaker 3: SEO maxing and then have shitty ad copy written by 401 00:21:46,000 --> 00:21:50,919 Speaker 3: ChatGPT, like, that will change, certainly, how 402 00:21:50,920 --> 00:21:53,000 Speaker 3: we buy stuff on the Internet, right, but also how 403 00:21:53,000 --> 00:21:57,000 Speaker 3: we read news, et cetera. Yeah, absolutely, and I already 404 00:21:57,040 --> 00:22:00,520 Speaker 3: see that. Like, I've written for some big publications. 405 00:22:00,560 --> 00:22:04,520 Speaker 3: You have, like, essentially a side. Do people know what 406 00:22:04,600 --> 00:22:08,920 Speaker 3: content-driven commerce is? Oh yeah, yeah, yeah, yeah, it's 407 00:22:08,920 --> 00:22:11,440 Speaker 3: why every article about stuff is now "the best five 408 00:22:11,800 --> 00:22:12,240 Speaker 3: X," right. 409 00:22:12,359 --> 00:22:16,240 Speaker 1: Yeah, like, they have affiliate links, and the publication will 410 00:22:16,240 --> 00:22:18,440 Speaker 1: profit if you buy stuff after clicking the link. 411 00:22:18,520 --> 00:22:23,119 Speaker 3: Yeah, yeah. So, like, in the probably twenty sixteen era, 412 00:22:24,480 --> 00:22:26,919 Speaker 3: all of the stuff. So I did a lot of, 413 00:22:26,920 --> 00:22:30,320 Speaker 3: previously, outdoor journalism, right, about climbing gear, bikes, that 414 00:22:30,400 --> 00:22:33,760 Speaker 3: kind of thing, and, like, that whole industry went to 415 00:22:33,960 --> 00:22:36,800 Speaker 3: just, like, just affiliate links, and they kind 416 00:22:36,840 --> 00:22:41,440 Speaker 3: of trashed any quality review stuff. And I can see, 417 00:22:41,640 --> 00:22:45,280 Speaker 3: like, a similar change to that happening with this, right, 418 00:22:45,320 --> 00:22:48,840 Speaker 3: where, where people will just chase that SEO-maxed copy 419 00:22:48,840 --> 00:22:51,080 Speaker 3: and that will become the new cool thing to do, 420 00:22:51,320 --> 00:22:54,240 Speaker 3: and, like, a lot of outlets as a result. But 421 00:22:54,320 --> 00:22:57,760 Speaker 3: that's not the, like, earth-shattering change that people are 422 00:22:57,760 --> 00:22:59,840 Speaker 3: talking about on Twitter dot com or whatever. 423 00:23:10,960 --> 00:23:12,600 Speaker 2: Well, one thing I saw recently is that more and 424 00:23:12,640 --> 00:23:17,000 Speaker 2: more students are just using ChatGPT to look up 425 00:23:17,080 --> 00:23:20,280 Speaker 2: information, like, as opposed to, like, as opposed 426 00:23:20,280 --> 00:23:22,399 Speaker 2: to Wikipedia, or as opposed to Google, if they 427 00:23:22,400 --> 00:23:25,359 Speaker 2: have a question. They'll ask ChatGPT, which has a 428 00:23:25,400 --> 00:23:27,479 Speaker 2: few problems as soon as you start getting into how 429 00:23:27,560 --> 00:23:31,159 Speaker 2: much of the ChatGPT output is just AI hallucinations, 430 00:23:31,560 --> 00:23:33,800 Speaker 2: where it's not actual information, which, honestly, that's 431 00:23:33,800 --> 00:23:35,760 Speaker 2: something I think I should just write my own thing on in 432 00:23:35,800 --> 00:23:40,359 Speaker 2: the future. But yeah, it's just, it's a really weird problem.
433 00:23:40,720 --> 00:23:43,760 Speaker 1: That's really interesting, that, the problem of, like, because I 434 00:23:43,800 --> 00:23:45,800 Speaker 1: think it's, it's very clear to me at this point 435 00:23:46,160 --> 00:23:50,719 Speaker 1: that AI is a more user-friendly search experience than 436 00:23:50,760 --> 00:23:53,520 Speaker 1: a search engine, right, because you can talk to it 437 00:23:53,640 --> 00:23:56,240 Speaker 1: like a person and explain what you need explained. That 438 00:23:56,280 --> 00:23:58,840 Speaker 1: doesn't mean it's a better option in terms of, it 439 00:23:59,000 --> 00:24:02,680 Speaker 1: provides people with information more effectively, that it, that it 440 00:24:02,720 --> 00:24:04,760 Speaker 1: actually tells them what they want to know as well. 441 00:24:05,160 --> 00:24:12,200 Speaker 1: But it's, like, easier and maybe, like, less kind of 442 00:24:12,240 --> 00:24:15,560 Speaker 1: an imposing task to, like, ask an AI a question than 443 00:24:15,640 --> 00:24:18,120 Speaker 1: it is to ask, like, a search engine, especially 444 00:24:18,800 --> 00:24:21,480 Speaker 1: as much worse as Google has gotten lately. Like, 445 00:24:21,520 --> 00:24:23,119 Speaker 1: one of the things that I found interesting is, as I 446 00:24:23,119 --> 00:24:25,200 Speaker 1: was kind of doing digging for this, I was looking 447 00:24:25,240 --> 00:24:27,520 Speaker 1: at some AI articles that were published in, like, twenty 448 00:24:27,600 --> 00:24:30,560 Speaker 1: nineteen, twenty twenty, twenty twenty one. This is before the 449 00:24:30,600 --> 00:24:34,000 Speaker 1: big, you know, AI push that, like, we're currently all 450 00:24:34,040 --> 00:24:36,760 Speaker 1: in the middle of, before ChatGPT, you know, got 451 00:24:36,800 --> 00:24:39,960 Speaker 1: its widespread release, and they were talking with, like, 452 00:24:40,000 --> 00:24:42,359 Speaker 1: some people from Google who were like, yeah, we really 453 00:24:42,400 --> 00:24:45,920 Speaker 1: see AI, like, supercharging our search results. You know, there's 454 00:24:45,960 --> 00:24:48,160 Speaker 1: a lot of potential in, like, its ability to help 455 00:24:48,200 --> 00:24:51,560 Speaker 1: people with search. And I'm thinking about, in twenty twenty, 456 00:24:51,600 --> 00:24:55,120 Speaker 1: twenty nineteen, Google was a really useful tool, and it's 457 00:24:55,920 --> 00:24:59,800 Speaker 1: a shit show now. Like, it's filled with ads, like, 458 00:25:00,119 --> 00:25:03,040 Speaker 1: search results have gotten markedly worse. Everyone who uses Google 459 00:25:03,080 --> 00:25:04,720 Speaker 1: as part of their job will tell you that it's 460 00:25:04,720 --> 00:25:09,480 Speaker 1: gotten, like, significantly worse in the recent past. And, like, 461 00:25:10,440 --> 00:25:16,959 Speaker 1: that's kind of, like, the thing that I see 462 00:25:17,000 --> 00:25:20,280 Speaker 1: being more of a worry. And it's one of those things. 463 00:25:20,280 --> 00:25:23,800 Speaker 1: It's like, on one hand, in the hype machine you have, 464 00:25:24,000 --> 00:25:27,320 Speaker 1: like, AI could become, like, our new god king and 465 00:25:27,359 --> 00:25:30,560 Speaker 1: destroy us all, and on the other, like, AI is going 466 00:25:30,600 --> 00:25:33,560 Speaker 1: to, like, you know, create all.
There's all this vague 467 00:25:33,560 --> 00:25:35,479 Speaker 1: talk about what it could be, giving people the tools 468 00:25:35,520 --> 00:25:38,639 Speaker 1: to create more art than ever before, to, you know, 469 00:25:39,119 --> 00:25:43,240 Speaker 1: make more good things faster. And I kind of feel like, well, 470 00:25:43,240 --> 00:25:46,160 Speaker 1: what if neither of those things happens, which I, and 471 00:25:46,560 --> 00:25:52,120 Speaker 1: it just sort of allows us to continue making the 472 00:25:52,119 --> 00:25:57,239 Speaker 1: Internet worse for everybody at a more rapid pace? What 473 00:25:57,280 --> 00:26:00,280 Speaker 1: if that's the primary thing that we notice about AI 474 00:26:00,520 --> 00:26:02,080 Speaker 1: as consumers? 475 00:26:02,680 --> 00:26:05,119 Speaker 3: It's probably a reasonable assumption. I think Garrison's point was 476 00:26:05,119 --> 00:26:08,160 Speaker 3: good though, when they said that, like, bigger companies will buy, 477 00:26:08,240 --> 00:26:10,280 Speaker 3: like, companies will just exist to get bought, right, which 478 00:26:10,280 --> 00:26:12,280 Speaker 3: is the thing that's happened to tech for decades, because, 479 00:26:12,359 --> 00:26:16,199 Speaker 3: like, it can't fundamentally change things. Like, if AI is 480 00:26:16,240 --> 00:26:18,399 Speaker 3: another means of production, right, if we want to be, 481 00:26:18,440 --> 00:26:22,159 Speaker 3: like, grossly materialist, if AI is another means of production, 482 00:26:22,160 --> 00:26:24,440 Speaker 3: a tool for making things, if the same people own 483 00:26:24,480 --> 00:26:27,000 Speaker 3: it and benefit from it, then, like, it's incapable of 484 00:26:27,000 --> 00:26:31,080 Speaker 3: fundamentally changing our material conditions, right? It just becomes another way 485 00:26:31,440 --> 00:26:34,000 Speaker 3: for them to churn out shit and say that, 486 00:26:34,480 --> 00:26:36,960 Speaker 3: like, this is fine, this is what you'll get, you know, 487 00:26:37,040 --> 00:26:39,720 Speaker 3: like, churn out shit content on the Internet or whatever 488 00:26:39,760 --> 00:26:40,159 Speaker 3: it might be. 489 00:26:40,640 --> 00:26:45,240 Speaker 1: And likewise, if AI is primarily, like, if it gets 490 00:26:45,320 --> 00:26:47,840 Speaker 1: caught in this kind of SEO loop where it exists 491 00:26:47,880 --> 00:26:51,399 Speaker 1: primarily to help advertise and sell products, whether it's as 492 00:26:51,440 --> 00:26:55,800 Speaker 1: a search engine or generating mass content, you know, for, 493 00:26:56,520 --> 00:26:59,840 Speaker 1: like, the Internet, that's sort of optimized to appear higher 494 00:26:59,880 --> 00:27:03,439 Speaker 1: in search results, and it's also being trained on that, 495 00:27:03,600 --> 00:27:05,320 Speaker 1: is there a point at which it kind of starts 496 00:27:05,320 --> 00:27:08,680 Speaker 1: to lobotomize itself, where it's just recycling shit other AIs 497 00:27:08,880 --> 00:27:12,000 Speaker 1: have written, which also seems kind of inevitable with that? 498 00:27:12,080 --> 00:27:13,520 Speaker 1: This is one of those things.
So one of the 499 00:27:13,520 --> 00:27:18,080 Speaker 1: more famous moments in, like, recent AI research is this 500 00:27:18,440 --> 00:27:23,280 Speaker 1: Google researcher, Timnit Gebru, who no longer works at Google, 501 00:27:23,880 --> 00:27:26,560 Speaker 1: and some other very smart people put together a paper 502 00:27:26,600 --> 00:27:29,320 Speaker 1: that, like, it was, I think, generally regarded by AI 503 00:27:29,400 --> 00:27:31,119 Speaker 1: folks as kind of middle of the road, but it, 504 00:27:32,000 --> 00:27:34,840 Speaker 1: kind of, it developed the term stochastic parrot, which is 505 00:27:34,880 --> 00:27:37,240 Speaker 1: what people know it for, as sort of trying to 506 00:27:37,320 --> 00:27:40,760 Speaker 1: describe what these quote-unquote AIs do in a way 507 00:27:40,800 --> 00:27:42,960 Speaker 1: that's better than "an AI," because, like, part of what 508 00:27:43,000 --> 00:27:45,760 Speaker 1: it was saying is that, like, we have to look 509 00:27:45,760 --> 00:27:47,399 Speaker 1: at this as kind of like a parrot, that if 510 00:27:47,440 --> 00:27:50,080 Speaker 1: you say enough, like, words around it, including enough, like, 511 00:27:50,200 --> 00:27:53,040 Speaker 1: racial slurs, it'll start repeating a bunch of toxic shit. 512 00:27:53,040 --> 00:27:55,360 Speaker 1: It doesn't know what it's doing. It doesn't have intention, 513 00:27:56,000 --> 00:27:58,480 Speaker 1: it's just kind of, like, repeating this stuff because that's 514 00:27:58,520 --> 00:28:00,680 Speaker 1: what's been fed into it. But one of the things 515 00:28:00,680 --> 00:28:03,240 Speaker 1: they point out in that paper is that, like, when 516 00:28:03,280 --> 00:28:05,440 Speaker 1: you have an AI, when you have one of these 517 00:28:05,520 --> 00:28:08,800 Speaker 1: LLMs trained on too large of a data set, it becomes, 518 00:28:08,920 --> 00:28:12,320 Speaker 1: number one, kind of impossible to avoid that toxic stuff. 519 00:28:12,320 --> 00:28:16,359 Speaker 1: But it also reduces the utility of the AI 520 00:28:16,480 --> 00:28:18,560 Speaker 1: in a lot of ways, because, like, when you have 521 00:28:18,800 --> 00:28:23,199 Speaker 1: so much data going in, it's very difficult for the 522 00:28:23,280 --> 00:28:27,439 Speaker 1: humans to kind of tell how competent it is. This 523 00:28:27,520 --> 00:28:31,680 Speaker 1: is why stuff like ChatGPT involves so much human training, 524 00:28:31,720 --> 00:28:34,400 Speaker 1: why they had hundreds of people spending tens of thousands 525 00:28:34,400 --> 00:28:37,440 Speaker 1: of man hours, like, going through responses to tell if 526 00:28:37,440 --> 00:28:40,880 Speaker 1: they made sense. Because, when you've got, like, it's one 527 00:28:40,880 --> 00:28:43,719 Speaker 1: thing if you're, like, using, if you're, for example, 528 00:28:43,760 --> 00:28:46,040 Speaker 1: training an AI on a bunch of different, like, medical 529 00:28:46,120 --> 00:28:50,440 Speaker 1: data to try to determine patterns in, like, antibiotic research, right, 530 00:28:50,640 --> 00:28:52,800 Speaker 1: which is a thing that, that LLMs have been, like, 531 00:28:53,040 --> 00:28:56,280 Speaker 1: shown to have some early utility in, is, like, 532 00:28:56,400 --> 00:28:59,480 Speaker 1: kind of helping to identify new paths for, like, 533 00:28:59,480 --> 00:29:03,760 Speaker 1: antibiotic research, because, like, we've got a lot of data, 534 00:29:03,840 --> 00:29:06,080 Speaker 1: but it's also a really focused kind of data.
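[Editor's note: a minimal sketch of the "humans going through responses to tell if they made sense" step described above. The model_reply() function, the prompts, and the canned answers are invented placeholders; this is only the rough shape of human-feedback labeling, not OpenAI's actual training pipeline.]

    # Toy illustration of collecting human judgments on model outputs. Everything
    # here is a placeholder; real systems use many raters at far larger scale.

    def model_reply(prompt: str) -> str:
        # Stand-in for a language model call; canned answers for illustration.
        canned = {
            "What is penicillin?": "Penicillin is an antibiotic derived from a mould.",
            "Summarize the study.": "The the study study concludes concludes that.",
        }
        return canned.get(prompt, "I'm not sure.")

    def collect_human_ratings(prompts):
        # Show each reply to a human rater and record whether it made sense.
        labeled = []
        for prompt in prompts:
            reply = model_reply(prompt)
            verdict = input(f"Does this make sense? {reply!r} (y/n): ")
            labeled.append((prompt, reply, verdict.strip().lower() == "y"))
        return labeled  # (prompt, reply, good?) triples that steer later fine-tuning

    if __name__ == "__main__":
        print(collect_human_ratings(["What is penicillin?", "Summarize the study."]))

The point being made here is about scale: with a narrow, focused data set, say antibiotic screening data, this kind of checking is tractable, while with a web-scale corpus it takes hundreds of raters and tens of thousands of hours.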
535 00:29:06,160 --> 00:29:06,320 Speaker 3: Right. 536 00:29:06,320 --> 00:29:08,680 Speaker 1: We're not, like, training these things on, like, all of, 537 00:29:08,840 --> 00:29:12,440 Speaker 1: you know, Wikipedia and, you know, thousands and thousands and 538 00:29:12,480 --> 00:29:16,560 Speaker 1: thousands of fan fiction stories about Kirk and Mulder fucking 539 00:29:16,600 --> 00:29:19,960 Speaker 1: each other during some sort of, like, X-Files Star 540 00:29:20,040 --> 00:29:25,960 Speaker 1: Trek crossover. We're using a fairly focused data set to 541 00:29:26,080 --> 00:29:29,080 Speaker 1: try and analyze it in a manner more efficiently than 542 00:29:29,120 --> 00:29:33,520 Speaker 1: people are simply capable of. That's a lot more useful 543 00:29:33,880 --> 00:29:37,920 Speaker 1: in terms of getting good data than, you know, just 544 00:29:38,000 --> 00:29:41,080 Speaker 1: training it on half a trillion different things out there, 545 00:29:41,120 --> 00:29:44,920 Speaker 1: a lot of which are going to be lies. But anyway, 546 00:29:45,280 --> 00:29:48,360 Speaker 1: I found that interesting. It's kind of worth noting that, 547 00:29:48,520 --> 00:29:52,160 Speaker 1: like, Gebru and a number of other people who were 548 00:29:52,280 --> 00:29:56,040 Speaker 1: responsible for that got forced out by Google and kind 549 00:29:56,080 --> 00:30:02,080 Speaker 1: of attacked by the industry. Because I think there's a desperation, 550 00:30:02,480 --> 00:30:05,120 Speaker 1: and I talked about this in that episode I did 551 00:30:05,200 --> 00:30:09,120 Speaker 1: last year, kind of about the fundamental emptiness at the 552 00:30:09,120 --> 00:30:11,280 Speaker 1: core of the modern tech industry. But I think there's 553 00:30:11,280 --> 00:30:15,320 Speaker 1: this desperation that we have to find the new thing, 554 00:30:15,560 --> 00:30:17,440 Speaker 1: the thing that's going to be as big as social 555 00:30:17,480 --> 00:30:19,440 Speaker 1: media was, the thing that's going to deliver the kind 556 00:30:19,440 --> 00:30:23,000 Speaker 1: of stock market returns that social media did, and that 557 00:30:23,240 --> 00:30:27,680 Speaker 1: doesn't exist yet. And AI is that thing, after especially several 558 00:30:27,760 --> 00:30:32,040 Speaker 1: years of disasters with crypto and diminishing returns in social 559 00:30:32,120 --> 00:30:36,320 Speaker 1: media, and honestly diminishing returns in, like, traditional tech, because 560 00:30:36,320 --> 00:30:39,880 Speaker 1: shit like smartphones have reached kind of a point of saturation, right? 561 00:30:40,120 --> 00:30:42,640 Speaker 1: You can make money, obviously, like, you can make 562 00:30:42,680 --> 00:30:46,640 Speaker 1: money selling smartphones, but you can't show exponential growth, right? 563 00:30:46,640 --> 00:30:52,320 Speaker 1: There's just not that many people who need new ones. Yeah, anyway, yeah, 564 00:30:52,400 --> 00:30:55,440 Speaker 1: I think there's, I feel some desperation here. I wanted 565 00:30:55,440 --> 00:30:58,040 Speaker 1: to kind of close by reading you all something.
I found 566 00:30:58,400 --> 00:31:02,080 Speaker 1: a very funny article in the Financial Times that was 567 00:31:02,120 --> 00:31:06,800 Speaker 1: about the potential that the head of Europe's biggest media group, Bertelsmann, 568 00:31:07,800 --> 00:31:12,560 Speaker 1: sees for generative AI, and, yeah, it interviewed a couple 569 00:31:12,560 --> 00:31:17,440 Speaker 1: of people, including a guy, Thomas Rabe, who is 570 00:31:17,480 --> 00:31:20,240 Speaker 1: the chief executive of the German business that owns Penguin 571 00:31:20,400 --> 00:31:24,000 Speaker 1: Random House. And one of the things that he says 572 00:31:24,000 --> 00:31:27,080 Speaker 1: in this is basically, like, I think this is, you know, uh, 573 00:31:27,520 --> 00:31:30,280 Speaker 1: going to be super great for authors. You know, there's 574 00:31:30,280 --> 00:31:34,040 Speaker 1: a potential for copyright infringement problems, but really, like, it 575 00:31:34,120 --> 00:31:36,959 Speaker 1: would allow you to feed your own work into an 576 00:31:37,000 --> 00:31:40,800 Speaker 1: AI and then produce much more content than you were 577 00:31:40,880 --> 00:31:43,760 Speaker 1: ever able to put out before. Like, the exact 578 00:31:43,840 --> 00:31:45,520 Speaker 1: quote is, if it's your content, where you 579 00:31:45,520 --> 00:31:47,120 Speaker 1: own the copyright, and then you use it to train 580 00:31:47,160 --> 00:31:50,640 Speaker 1: the software, you can in theory generate content like never before, 581 00:31:51,640 --> 00:31:55,480 Speaker 1: which I think is, yeah, a fundamental, like, you know, 582 00:31:56,000 --> 00:31:58,240 Speaker 1: I don't actually even think it's going to be possible 583 00:31:58,280 --> 00:32:00,520 Speaker 1: to, like, train them on airport novels. You've got, like, 584 00:32:00,600 --> 00:32:04,560 Speaker 1: James Patterson and other guys who, they're not, they don't 585 00:32:04,560 --> 00:32:06,640 Speaker 1: write their own books anymore. They have, like, a team 586 00:32:06,640 --> 00:32:09,520 Speaker 1: of ghostwriters. But, like, having gone through a lot of 587 00:32:09,560 --> 00:32:13,080 Speaker 1: AI stories, they're not books. Like, they're not capable of 588 00:32:13,160 --> 00:32:17,040 Speaker 1: writing books. They're capable of, like, producing text and producing 589 00:32:17,120 --> 00:32:21,000 Speaker 1: pieces of books that human beings can edit laboriously into 590 00:32:21,040 --> 00:32:24,160 Speaker 1: something that might look like a book. But the use 591 00:32:24,200 --> 00:32:27,640 Speaker 1: in that is not, like, filling up airports with kind 592 00:32:27,720 --> 00:32:30,200 Speaker 1: of mid-grade fiction, because I think that's even beyond 593 00:32:30,200 --> 00:32:34,520 Speaker 1: these models. It's, like, tricking people on Amazon. There was 594 00:32:34,560 --> 00:32:37,720 Speaker 1: a really funny quote in this article, though, where at 595 00:32:37,720 --> 00:32:41,640 Speaker 1: the end of it, Rabe is like, I asked ChatGPT 596 00:32:41,680 --> 00:32:44,840 Speaker 1: what the impact of ChatGPT or generative AI 597 00:32:45,080 --> 00:32:48,360 Speaker 1: is on publishing. It prepared a phenomenal text. Frankly, it was 598 00:32:48,480 --> 00:32:52,440 Speaker 1: very detailed and to the point, which he then presented at 599 00:32:52,480 --> 00:32:54,800 Speaker 1: a staff event. So there is kind of evidence that 600 00:32:55,440 --> 00:32:58,280 Speaker 1: CEO jobs could be pretty easily replaced by this.
601 00:32:58,960 --> 00:33:03,640 Speaker 2: Like, you don't actually have to do anything, comrade ChatGPT. 602 00:33:03,880 --> 00:33:07,680 Speaker 3: We agree, it's just a spinning jenny for bosses. I love it. 603 00:33:08,000 --> 00:33:11,600 Speaker 1: Yeah, anyway, that's, that's what I've got right now. We 604 00:33:11,680 --> 00:33:13,920 Speaker 1: have a, we've been doing some research, and we'll have 605 00:33:13,960 --> 00:33:16,840 Speaker 1: an article out on one of the more unsettling little 606 00:33:16,880 --> 00:33:19,760 Speaker 1: side industries that I think AI is going to create, 607 00:33:20,920 --> 00:33:26,240 Speaker 1: which is, like, scam children's books that exist to make 608 00:33:26,760 --> 00:33:29,080 Speaker 1: con men on the Internet money and poison the minds of 609 00:33:29,160 --> 00:33:33,640 Speaker 1: little kids. But we'll get that to you next week. Yeah, 610 00:33:33,640 --> 00:33:35,760 Speaker 1: it felt like it was worth coming back to this 611 00:33:35,840 --> 00:33:39,800 Speaker 1: subject because, I don't know, it's the most apocalyptic 612 00:33:39,880 --> 00:33:42,920 Speaker 1: thing people in the media are talking about on a 613 00:33:43,040 --> 00:33:45,920 Speaker 1: day in which, like, the entire Northeast is blanketed in 614 00:33:45,960 --> 00:33:47,600 Speaker 1: poison smoke, which seems bad. 615 00:33:48,480 --> 00:33:50,200 Speaker 3: Well, people are talking about that now because they all 616 00:33:50,200 --> 00:33:52,800 Speaker 3: live in New York. I mean, fuck. But yeah, 617 00:33:52,960 --> 00:33:54,400 Speaker 3: previous to this, yeah. 618 00:33:55,080 --> 00:34:03,280 Speaker 1: Anyway. It Could Happen Here 619 00:34:03,280 --> 00:34:05,040 Speaker 1: is a production of Cool Zone Media. 620 00:34:05,120 --> 00:34:07,800 Speaker 2: For more podcasts from Cool Zone Media, visit our website, 621 00:34:07,840 --> 00:34:10,960 Speaker 2: coolzonemedia dot com, or check us out on the iHeartRadio app, 622 00:34:11,000 --> 00:34:14,319 Speaker 2: Apple Podcasts, or wherever you listen to podcasts. You can 623 00:34:14,360 --> 00:34:17,080 Speaker 2: find sources for It Could Happen Here, updated monthly, at 624 00:34:17,080 --> 00:34:19,320 Speaker 2: coolzonemedia dot com slash sources. 625 00:34:19,520 --> 00:34:20,320 Speaker 1: Thanks for listening.