WEBVTT - Radio Better Offline: Brian Koppelman, Cherlynn Low & Mike Drucker

0:00:02.800 --> 0:00:03.960
<v Speaker 1>Cool Zone Media.

0:00:05.559 --> 0:00:08.200
<v Speaker 2>Your scientists have yet to discover how neural networks create

0:00:08.280 --> 0:00:11.760
<v Speaker 2>self-consciousness, let alone how the human brain processes two-

0:00:11.800 --> 0:00:15.200
<v Speaker 2>dimensional retinal images into the three-dimensional phenomenon known as perception.

0:00:15.440 --> 0:00:17.880
<v Speaker 2>Yet you somehow brazenly declare seeing is believing.

0:00:18.200 --> 0:00:18.919
<v Speaker 3>Yes, I do.

0:00:30.840 --> 0:00:33.520
<v Speaker 2>I'm Ed Zitron. This is Better Offline. I'm your host.

0:00:33.880 --> 0:00:36.400
<v Speaker 2>Buy our merchandise. Go to my newsletter, wheresyoured

0:00:36.479 --> 0:00:36.920
<v Speaker 2>dot at.

0:00:37.040 --> 0:00:37.960
<v Speaker 3>Anyway? Fuck all that.

0:00:38.400 --> 0:00:41.680
<v Speaker 2>Brian Koppelman is joining us here in the studio. He's

0:00:41.720 --> 0:00:45.320
<v Speaker 2>the incredible writer, producer, and he's in The Bear. The Bear!

0:00:45.360 --> 0:00:47.800
<v Speaker 2>He's a real deal actor. He's also the co creator,

0:00:47.840 --> 0:00:50.760
<v Speaker 2>showrunner and executive producer of Showtime's Billions and Super Pumped:

0:00:50.760 --> 0:00:52.559
<v Speaker 2>The Battle for Uber. Brian, thank you so much for

0:00:52.640 --> 0:00:53.080
<v Speaker 2>joining us.

0:00:53.080 --> 0:00:53.880
<v Speaker 1>Thrilled to be here with you.

0:00:53.960 --> 0:00:56.240
<v Speaker 2>Man. We've of course got Mike Drucker, the wonderful comedian.

0:00:56.320 --> 0:00:57.560
<v Speaker 2>And of course you have a book, don't you,

0:00:57.600 --> 0:01:00.400
<v Speaker 4>Mike? Yes, sir, it's called Good Game, No Rematch. It's

0:01:00.440 --> 0:01:03.840
<v Speaker 4>about embarrassing myself with video games throughout my entire

0:01:03.600 --> 0:01:06.199
<v Speaker 2>Life. Wonderful. I'll be embarrassing myself with tech on this show.

0:01:06.240 --> 0:01:08.600
<v Speaker 2>Cherlynn Low as well, joining us from Engadget. How you

0:01:08.640 --> 0:01:09.240
<v Speaker 2>doing, Shell?

0:01:09.560 --> 0:01:12.319
<v Speaker 5>I am about to down a protein shake, hopefully.

0:01:12.400 --> 0:01:14.920
<v Speaker 2>Yes, yeah, we're just talking about the Ninja Creami and

0:01:15.080 --> 0:01:16.399
<v Speaker 2>all the various slops.

0:01:16.480 --> 0:01:20.040
<v Speaker 1>They're great, Ninja Creami. I'll tell you, there's no endorsement

0:01:20.040 --> 0:01:21.560
<v Speaker 1>out there for any of us. They don't need it

0:01:21.760 --> 0:01:23.920
<v Speaker 1>because people just get these things and they immediately go

0:01:24.000 --> 0:01:27.880
<v Speaker 1>to TikTok because they want to share how good they are. Yeah,

0:01:27.959 --> 0:01:30.720
<v Speaker 1>they're honestly, they're so good. That's why I have full

0:01:30.800 --> 0:01:32.039
<v Speaker 1>endorse, full endorsement.

0:01:32.160 --> 0:01:33.679
<v Speaker 3>This is the Ninja Creami show.

0:01:33.760 --> 0:01:36.080
<v Speaker 2>Now I'm going to get what's great about this is

0:01:36.160 --> 0:01:38.960
<v Speaker 2>literally any time I mention any product, I will get

0:01:38.959 --> 0:01:41.000
<v Speaker 2>an email from someone being like, here's a post from

0:01:41.000 --> 0:01:42.840
<v Speaker 2>two thousand and three from the CEO, in which

0:01:42.840 --> 0:01:43.440
<v Speaker 2>they shut the door.

0:01:43.680 --> 0:01:46.880
<v Speaker 3>It's like some insane, Ed, how dare you bring that up?

0:01:46.920 --> 0:01:49.440
<v Speaker 3>I'm like, I don't know everything. Please tell me.

0:01:49.800 --> 0:01:52.200
<v Speaker 5>Well, Fairlife bottles have a lot of Dalley's in them,

0:01:52.240 --> 0:01:53.919
<v Speaker 5>so you know that's our that's our thing today.

0:01:54.240 --> 0:01:56.960
<v Speaker 2>Yeah, I just eat like too much yogurt. So I

0:01:57.000 --> 0:01:58.800
<v Speaker 2>want to actually start with a really good thing. So, Mike,

0:01:58.840 --> 0:02:00.640
<v Speaker 2>you wrote a really great piece for TheGamer, I

0:02:00.640 --> 0:02:03.040
<v Speaker 2>think it went out, yeah, very recently. It's called

0:02:03.120 --> 0:02:05.240
<v Speaker 2>I'm starting to worry this industry has no respect for

0:02:05.240 --> 0:02:06.080
<v Speaker 2>the people who work in it.

0:02:06.640 --> 0:02:07.600
<v Speaker 3>Well, you walk us through

0:02:07.440 --> 0:02:09.000
<v Speaker 2>The article, because I think it's a good subject.

0:02:09.000 --> 0:02:10.560
<v Speaker 3>Yeah, sure. Well.

0:02:10.680 --> 0:02:14.440
<v Speaker 4>As listeners may or may not know, Microsoft recently laid

0:02:14.440 --> 0:02:16.640
<v Speaker 4>off about nine thousand people, most of whom were in

0:02:16.680 --> 0:02:20.400
<v Speaker 4>their gaming department. Simultaneously, they were claiming that their gaming

0:02:20.400 --> 0:02:22.799
<v Speaker 4>department's quite profitable and they're making a ton of money

0:02:23.160 --> 0:02:24.359
<v Speaker 4>and everything's going well.

0:02:24.440 --> 0:02:26.120
<v Speaker 3>I don't know if that's actually

0:02:25.840 --> 0:02:29.000
<v Speaker 4>true, but they are saying that. And they've kind of

0:02:29.040 --> 0:02:30.760
<v Speaker 4>also said that a lot of the money that they

0:02:30.760 --> 0:02:34.200
<v Speaker 4>were paying these employees isn't about losses, it's that

0:02:34.240 --> 0:02:36.760
<v Speaker 4>they want to move this money into AI development.

0:02:36.840 --> 0:02:40.119
<v Speaker 2>Did they say that? Yeah, yeah, that's horrifying.

0:02:40.280 --> 0:02:42.399
<v Speaker 4>Yeah, it's kind of, I mean, they didn't say

0:02:42.400 --> 0:02:44.040
<v Speaker 4>that word for word, but it was very implied that

0:02:44.120 --> 0:02:46.519
<v Speaker 4>the budget was you know, they're more focused. That was

0:02:46.600 --> 0:02:48.120
<v Speaker 4>kind of the language. The language was like, you know,

0:02:48.160 --> 0:02:51.120
<v Speaker 4>as we restructure, we're more focused on AI going forwards.

0:02:51.120 --> 0:02:53.000
<v Speaker 2>It's just so ironic, it's like video games is one

0:02:53.040 --> 0:02:57.080
<v Speaker 2>of the first real exposures to AI for most people, yeah,

0:02:57.120 --> 0:02:58.400
<v Speaker 2>I remember when F.E.A.R. came out.

0:02:58.440 --> 0:03:00.520
<v Speaker 3>You won't remember it. Yeah, it was one of the first,

0:03:00.560 --> 0:03:01.079
<v Speaker 3>one of the first.

0:03:01.440 --> 0:03:03.240
<v Speaker 2>It was the first time a guy would like shove

0:03:03.280 --> 0:03:05.400
<v Speaker 2>his head over and look around before just jumping up

0:03:05.400 --> 0:03:06.000
<v Speaker 2>and getting shot.

0:03:06.560 --> 0:03:06.800
<v Speaker 1>Yeah.

0:03:06.800 --> 0:03:08.720
<v Speaker 2>It's just the reason this really spoke to me, other

0:03:08.720 --> 0:03:11.280
<v Speaker 2>than the fact it's very well written, is that I

0:03:11.320 --> 0:03:13.120
<v Speaker 2>feel like this is the central problem with a lot

0:03:13.160 --> 0:03:16.840
<v Speaker 2>of tech, entertainment everything right now, where it's like the

0:03:16.840 --> 0:03:18.000
<v Speaker 2>people at the top, and you make this

0:03:17.960 --> 0:03:18.880
<v Speaker 3>Point well in the article.

0:03:18.960 --> 0:03:21.800
<v Speaker 2>Yeah, the industry is the problem because the industry is

0:03:21.840 --> 0:03:24.400
<v Speaker 2>not being piloted by the people creating it, right, exactly.

0:03:24.440 --> 0:03:26.520
<v Speaker 4>I mean, you know, and it's a problem that also

0:03:26.560 --> 0:03:29.720
<v Speaker 4>extends to the you know, Hollywood, which I also work in.

0:03:29.680 --> 0:03:32.040
<v Speaker 3>Because I'm famous. No, that's not true at all.

0:03:35.280 --> 0:03:37.119
<v Speaker 4>But it's this problem that, like, you know, a lot

0:03:37.160 --> 0:03:40.200
<v Speaker 4>of the decision makers are not as close to the

0:03:40.240 --> 0:03:41.760
<v Speaker 4>product as they used to be. Yeah, I mean, I

0:03:41.760 --> 0:03:43.360
<v Speaker 4>mean you could even say the same same for things

0:03:43.400 --> 0:03:45.880
<v Speaker 4>like Boeing, where you know, you don't have engineers running

0:03:45.880 --> 0:03:48.160
<v Speaker 4>the department anymore. You have business people who've been brought

0:03:48.200 --> 0:03:51.320
<v Speaker 4>in because they're good at checking, well, balance sheets, exactly, exactly,

0:03:51.840 --> 0:03:54.560
<v Speaker 4>and you know, these people are far away from the product,

0:03:54.560 --> 0:03:56.520
<v Speaker 4>they're far away from the development of it, so they

0:03:56.520 --> 0:03:59.200
<v Speaker 4>see these you know, workers as numbers on a spreadsheet.

0:03:59.680 --> 0:04:02.480
<v Speaker 4>And for better or worse, video game development is a

0:04:02.600 --> 0:04:04.520
<v Speaker 4>very long process. It takes a lot of time, it

0:04:04.520 --> 0:04:05.280
<v Speaker 4>takes a lot of money.

0:04:05.360 --> 0:04:08.600
<v Speaker 2>It's getting longer. It's getting longer, yeah. Horizon as well.

0:04:08.840 --> 0:04:11.440
<v Speaker 4>And so it's you know, very easy for these companies

0:04:11.480 --> 0:04:13.960
<v Speaker 4>that are looking for a short term you know, next

0:04:14.000 --> 0:04:15.120
<v Speaker 4>financial quarter.

0:04:14.920 --> 0:04:15.640
<v Speaker 3>Boost to go.

0:04:15.680 --> 0:04:18.080
<v Speaker 4>We fired all these people and this product wasn't going

0:04:18.160 --> 0:04:19.960
<v Speaker 4>to come out for two years, so it really doesn't matter.

0:04:20.200 --> 0:04:22.440
<v Speaker 2>And that's the thing with Microsoft. Well, they've, that's that

0:04:22.680 --> 0:04:26.040
<v Speaker 2>they've laid off fifteen thousand people this year as well. Nuts.

0:04:26.160 --> 0:04:28.960
<v Speaker 2>And is the money really even going anywhere? It's

0:04:29.040 --> 0:04:31.839
<v Speaker 2>just so confusing. But you're seeing it everywhere. With Hollywood, there

0:04:31.880 --> 0:04:35.320
<v Speaker 2>was the Acme versus Coyote thing. Yeah, sorry, Coyote versus

0:04:35.360 --> 0:04:37.800
<v Speaker 2>Acme thing, where they just like mothballed it instead of releasing it,

0:04:37.839 --> 0:04:41.080
<v Speaker 2>a fully mothballed movie. Yeah, to save on taxes. I

0:04:41.120 --> 0:04:42.720
<v Speaker 2>feel like that shouldn't be a loophole.

0:04:42.920 --> 0:04:44.240
<v Speaker 3>I hate that loophole.

0:04:44.320 --> 0:04:47.000
<v Speaker 4>I hate the idea of that loophole. I mean, it

0:04:47.040 --> 0:04:49.919
<v Speaker 4>almost reminds me of that loophole from and Brian. You

0:04:49.920 --> 0:04:51.560
<v Speaker 4>could maybe correct me on this, but remember like in

0:04:51.600 --> 0:04:53.919
<v Speaker 4>the nineties they made a Fantastic Four film they weren't

0:04:53.960 --> 0:04:56.640
<v Speaker 4>going to release just so they could maintain rights for it,

0:04:56.680 --> 0:04:58.880
<v Speaker 4>because the rights for it were you have to make

0:04:58.880 --> 0:05:00.760
<v Speaker 4>a movie to keep the rights, you don't have to

0:05:00.800 --> 0:05:01.280
<v Speaker 4>release it.

0:05:01.800 --> 0:05:05.840
<v Speaker 1>I don't remember that exactly though, now that you say it,

0:05:05.400 --> 0:05:08.840
<v Speaker 1>it rings a bell, but I don't have particulars on it.

0:05:08.920 --> 0:05:14.320
<v Speaker 1>But I think what you're talking about really the answer.

0:05:15.360 --> 0:05:18.400
<v Speaker 1>I agree completely as you diagnose it with how depressing

0:05:18.760 --> 0:05:21.520
<v Speaker 1>it is. But to me, these things feel like a

0:05:21.560 --> 0:05:26.000
<v Speaker 1>systems response, like complexity theory and systems, well, they're like

0:05:26.360 --> 0:05:30.440
<v Speaker 1>systems responses to what's happening in the world, as opposed

0:05:30.440 --> 0:05:33.000
<v Speaker 1>to feeling like because I think it's easier for us

0:05:33.080 --> 0:05:37.080
<v Speaker 1>to look at the human being and go that motherfucker,

0:05:37.080 --> 0:05:40.840
<v Speaker 1>and that may be a motherfucker. But another possible answer

0:05:40.960 --> 0:05:46.480
<v Speaker 1>is that from a far remove, like something's happening right,

0:05:46.839 --> 0:05:51.960
<v Speaker 1>and when this thing is happening, it gets into quantum theory.

0:05:51.960 --> 0:05:58.920
<v Speaker 1>But it is a complexity system's response to this giant

0:05:59.320 --> 0:06:05.840
<v Speaker 1>change of artificial intelligence, having certain capabilities and all of this.

0:06:06.000 --> 0:06:08.160
<v Speaker 1>You can look at all of it, and I mean,

0:06:08.160 --> 0:06:11.120
<v Speaker 1>we've seen people who had one set of beliefs for

0:06:11.200 --> 0:06:14.600
<v Speaker 1>so long and on the record in your industry, completely

0:06:14.640 --> 0:06:17.480
<v Speaker 1>switch and they may think that's an autonomous decision they're making.

0:06:17.520 --> 0:06:20.359
<v Speaker 1>But I found a little bit of solace by reading

0:06:20.400 --> 0:06:25.000
<v Speaker 1>about complexity theory because that for me, offers a potentially

0:06:25.080 --> 0:06:30.320
<v Speaker 1>more, not hopeful, but sort of a more complete understanding.

0:06:30.000 --> 0:06:32.760
<v Speaker 2>You're saying, like a systemic response to conditions. Because I

0:06:32.800 --> 0:06:35.360
<v Speaker 2>don't know about the AI capability side, but the existence

0:06:35.360 --> 0:06:38.719
<v Speaker 2>of capabilities potentially being there makes sense, in that they're

0:06:38.760 --> 0:06:40.840
<v Speaker 2>all trying to reconfigure for a future they don't know

0:06:40.880 --> 0:06:43.520
<v Speaker 2>if it's actually there. The generative AI doesn't seem to

0:06:43.560 --> 0:06:46.640
<v Speaker 2>do it, but the potential of that, the idea of it,

0:06:46.720 --> 0:06:49.440
<v Speaker 2>is motivating them so much. Microsoft of all people as well,

0:06:49.480 --> 0:06:51.919
<v Speaker 2>should know how little money's actually being made from this,

0:06:52.080 --> 0:06:53.800
<v Speaker 2>considering out the thirty.

0:06:53.600 --> 0:06:57.400
<v Speaker 1>So that's it, Yes, But executives, honestly, because I've studied

0:06:57.520 --> 0:07:00.800
<v Speaker 1>these people so much and written about them so much, and

0:07:00.839 --> 0:07:02.960
<v Speaker 1>I understand how venal they are and how short term

0:07:03.000 --> 0:07:08.680
<v Speaker 1>they think at times. But they're also I think looking

0:07:08.680 --> 0:07:13.240
<v Speaker 1>at data that we don't have very often, and they're

0:07:13.880 --> 0:07:18.040
<v Speaker 1>trying to forecast out a long time.

0:07:18.440 --> 0:07:20.480
<v Speaker 2>What do you think they have that we don't have, though,

0:07:20.560 --> 0:07:24.160
<v Speaker 2>because like with Microsoft for example, and I have probably

0:07:24.240 --> 0:07:26.400
<v Speaker 2>studied Satya Nadella too much at this point.

0:07:26.440 --> 0:07:29.920
<v Speaker 2>I really I learned too much about the growth mindset

0:07:29.960 --> 0:07:33.880
<v Speaker 2>and Carol Dweck and all of that nonsense. But the data,

0:07:34.800 --> 0:07:37.960
<v Speaker 2>if the data is there, they're acting very peculiar about

0:07:37.960 --> 0:07:40.920
<v Speaker 2>the data because they're only making thirteen billion dollars this

0:07:41.000 --> 0:07:43.480
<v Speaker 2>year from AI. Ten billion is ARR that

0:07:43.640 --> 0:07:46.200
<v Speaker 2>was from OpenAI, burning giraffes and whatever they put

0:07:46.240 --> 0:07:49.840
<v Speaker 2>into the machine for ChatGPT. But it feels

0:07:49.920 --> 0:07:52.480
<v Speaker 2>almost like they are reacting to what they hope will

0:07:52.480 --> 0:07:54.840
<v Speaker 2>happen or what may happen. I'm just trying to, it

0:07:54.880 --> 0:07:55.440
<v Speaker 2>might just.

0:07:55.440 --> 0:07:58.840
<v Speaker 1>Be, when everyone reacts to quarterly, like the problem,

0:07:58.840 --> 0:08:01.960
<v Speaker 1>our business, the entertainment business. The thing is that these people

0:08:02.000 --> 0:08:05.120
<v Speaker 1>are responding to quarterly earnings calls that they have to make,

0:08:05.160 --> 0:08:08.360
<v Speaker 1>and maybe the data isn't about the long term possible,

0:08:08.360 --> 0:08:10.520
<v Speaker 1>maybe the data is literally that we don't know that

0:08:10.520 --> 0:08:14.320
<v Speaker 1>they're looking at is their little cohort of people and

0:08:14.440 --> 0:08:18.120
<v Speaker 1>the exact moment they can exercise which kinds of options,

0:08:18.120 --> 0:08:20.360
<v Speaker 1>some of which they have to declare and maybe some

0:08:20.440 --> 0:08:22.440
<v Speaker 1>of which they are somehow able to flip on a

0:08:22.440 --> 0:08:23.440
<v Speaker 1>different market.

0:08:23.800 --> 0:08:26.640
<v Speaker 2>Like what Amazon basically did with Anthropic: they flipped their

0:08:26.680 --> 0:08:28.600
<v Speaker 2>investment into a certain kind of thing they could tax

0:08:28.640 --> 0:08:28.880
<v Speaker 2>deduct.

0:08:29.920 --> 0:08:31.960
<v Speaker 1>I mean, there's no doubt that there's heartlessness. But I

0:08:32.520 --> 0:08:35.080
<v Speaker 1>just because Drucker's here, I just want to say, you know,

0:08:35.200 --> 0:08:38.760
<v Speaker 1>even in the most depressing times or in moments where

0:08:39.320 --> 0:08:45.680
<v Speaker 1>the technology the platform seems brutal, people's humanity can transcend it.

0:08:46.240 --> 0:08:49.880
<v Speaker 1>And I remember in some dark days of Twitter, Drucker

0:08:50.400 --> 0:08:52.439
<v Speaker 1>was and it was really amazing man. And I remember,

0:08:52.480 --> 0:08:53.600
<v Speaker 1>you know, you know, we don't know each other well,

0:08:53.640 --> 0:08:55.280
<v Speaker 1>but we've known each other a long time, like

0:08:55.280 --> 0:08:59.760
<v Speaker 1>twenty years, and yeah, it's true. And I remember that

0:08:59.800 --> 0:09:01.920
<v Speaker 1>there were many nights you were like not sleeping that well,

0:09:02.360 --> 0:09:05.160
<v Speaker 1>sleeping at the wrong time, yeah, and like sleeping during

0:09:05.240 --> 0:09:08.160
<v Speaker 1>the day, up at night. But I remember him

0:09:08.320 --> 0:09:13.079
<v Speaker 1>really like very vulnerably talking about sadness and depression and

0:09:13.160 --> 0:09:20.200
<v Speaker 1>getting people to trust, sharing, and him actively trying to

0:09:20.240 --> 0:09:24.280
<v Speaker 1>like save people in a place where like where people

0:09:24.320 --> 0:09:27.680
<v Speaker 1>were so callous almost by profession on there by like

0:09:27.720 --> 0:09:29.560
<v Speaker 1>everything they wanted to do there, right. Well, everyone

0:09:29.640 --> 0:09:31.400
<v Speaker 1>was at home playing Grand Theft Auto, which, could you

0:09:31.400 --> 0:09:32.920
<v Speaker 1>guys fix that and get the next one out? But

0:09:33.280 --> 0:09:37.680
<v Speaker 1>everyone's at home playing Grand Theft Auto, uh, and being callous.

0:09:37.679 --> 0:09:40.240
<v Speaker 1>And he was like literally going, let me explain depression

0:09:40.280 --> 0:09:41.520
<v Speaker 1>to you and what you can do and what the

0:09:41.520 --> 0:09:44.840
<v Speaker 1>resources are and why you shouldn't kill yourself, which

0:09:44.840 --> 0:09:46.840
<v Speaker 1>I think is great. Of course you would lead with

0:09:46.840 --> 0:09:49.360
<v Speaker 1>empathy on this too and look at these assholes and

0:09:49.520 --> 0:09:52.360
<v Speaker 1>think about all the engineers, and it's the right way

0:09:52.400 --> 0:09:55.840
<v Speaker 1>to process it. I just think often it's not a binary.

0:09:55.559 --> 0:09:57.240
<v Speaker 2>No, and that makes sense. And that's the thing I've

0:09:57.240 --> 0:10:00.360
<v Speaker 2>been saying like a lot about the show is I'm

0:10:00.360 --> 0:10:02.240
<v Speaker 2>talking about the pigs and the arseholes and the scumbags

0:10:02.240 --> 0:10:03.719
<v Speaker 2>and all the different names and the voices I

0:10:03.679 --> 0:10:04.360
<v Speaker 1>Do for them.

0:10:04.480 --> 0:10:07.240
<v Speaker 2>But it's also about the fact that there is like

0:10:07.480 --> 0:10:09.280
<v Speaker 2>I'm pissed off. Casey Kagawa,

0:10:09.360 --> 0:10:11.240
<v Speaker 3>friend of the show, has said this to me a lot.

0:10:11.280 --> 0:10:15.280
<v Speaker 2>It's, I'm brokenhearted, dramatic, because things like what Mike did.

0:10:15.480 --> 0:10:16.680
<v Speaker 3>Actually, one of the reasons I read En

0:10:16.640 --> 0:10:18.599
<v Speaker 2>gadget every day as well, is it's like, yeah, there are a

0:10:18.640 --> 0:10:20.880
<v Speaker 2>lot of these financial horrors in there, but there's fun,

0:10:20.960 --> 0:10:24.240
<v Speaker 2>dorky shit online still. There's still, like, most of

0:10:24.280 --> 0:10:27.840
<v Speaker 2>my friends are from the internet, you know. It's like

0:10:27.880 --> 0:10:31.800
<v Speaker 2>I'm like the drill tweet, quietly crying. It's like being on

0:10:31.920 --> 0:10:35.119
<v Speaker 2>Like, it's like, everything I got is through emails.

0:10:35.360 --> 0:10:36.840
<v Speaker 3>But I still think there is a lot

0:10:36.720 --> 0:10:39.040
<v Speaker 2>of joy in this. And I think the central thing

0:10:39.160 --> 0:10:42.880
<v Speaker 2>about your article now is just like beneath this capitalism

0:10:43.000 --> 0:10:46.199
<v Speaker 2>crush is actually some really like wonderful things being built.

0:10:46.200 --> 0:10:48.280
<v Speaker 2>There are still wonderful games being built. Yeah, there's like

0:10:48.320 --> 0:10:50.960
<v Speaker 2>an entire economy on Minecraft, for better or for worse

0:10:51.000 --> 0:10:53.480
<v Speaker 2>about selling mods and stuff. There are people on Roblox,

0:10:53.920 --> 0:10:56.320
<v Speaker 2>other problems with Roblox obviously, who are like building games

0:10:56.320 --> 0:10:58.240
<v Speaker 2>in there and selling them. There's still some cool shit

0:10:58.480 --> 0:11:03.560
<v Speaker 2>happening in tech, it's just, because every three

0:11:03.600 --> 0:11:05.559
<v Speaker 2>months someone gets upset at them.

0:11:05.760 --> 0:11:07.240
<v Speaker 4>Or finds a new way to make money off it,

0:11:07.400 --> 0:11:09.760
<v Speaker 4>which you have to you know, shove into a new category,

0:11:09.840 --> 0:11:10.560
<v Speaker 4>but the new thing.

0:11:10.440 --> 0:11:12.800
<v Speaker 5>To chase, right that's where it is right now. I

0:11:12.840 --> 0:11:15.560
<v Speaker 5>think that what you were talking about, the like macro

0:11:15.640 --> 0:11:18.520
<v Speaker 5>picture of everyone sort of having that reaction, and I

0:11:18.559 --> 0:11:20.600
<v Speaker 5>think we will course correct in time. I think we're

0:11:20.840 --> 0:11:24.200
<v Speaker 5>right now at that stage in history where it's not

0:11:24.240 --> 0:11:27.079
<v Speaker 5>as cut and dried to me as the NFT crypto sort

0:11:27.120 --> 0:11:30.840
<v Speaker 5>of bubble, where like it was clearly about criminals.

0:11:31.280 --> 0:11:33.439
<v Speaker 1>Yeah, it was walking people.

0:11:33.200 --> 0:11:36.320
<v Speaker 5>over, knowingly, right. And I'd argue that on some level

0:11:36.360 --> 0:11:38.600
<v Speaker 5>these are criminals. There are criminals at play in this

0:11:39.000 --> 0:11:41.560
<v Speaker 5>scenario right now. I will say that with what Mike

0:11:41.679 --> 0:11:44.240
<v Speaker 5>was writing about in your article, the Microsoft thing was

0:11:44.280 --> 0:11:46.120
<v Speaker 5>all the more like I think when my team saw

0:11:46.120 --> 0:11:48.880
<v Speaker 5>the news last week, my instant reaction was, didn't they

0:11:48.920 --> 0:11:51.760
<v Speaker 5>just ratify a union contract? It was like their first ever

0:11:51.840 --> 0:11:53.920
<v Speaker 5>in the US, too. Like, to be saying that on

0:11:54.040 --> 0:11:56.440
<v Speaker 5>one hand, we really care about workers' rights and really

0:11:56.440 --> 0:11:58.440
<v Speaker 5>want to protect workers, and on the other you're like, and

0:11:58.440 --> 0:12:01.079
<v Speaker 5>those are their quality people, right And now we're like, oh,

0:12:01.080 --> 0:12:03.680
<v Speaker 5>game developers, man, nine thousand of you can go because

0:12:03.720 --> 0:12:05.440
<v Speaker 5>AI NPCs are all the rage right now and

0:12:05.440 --> 0:12:07.360
<v Speaker 5>they really make a lot of sense. AI NPCs

0:12:07.880 --> 0:12:10.720
<v Speaker 5>never said a bad thing, never said anything.

0:12:10.400 --> 0:12:13.080
<v Speaker 2>That was demoed last year?

0:12:13.960 --> 0:12:14.640
<v Speaker 4>Was it this year?

0:12:14.720 --> 0:12:15.360
<v Speaker 5>I can't remember.

0:12:15.640 --> 0:12:18.480
<v Speaker 2>They've done two. Nvidia wheels out a demo, it would

0:12:18.480 --> 0:12:21.320
<v Speaker 2>be like, the generative AI NPC is here, and within one

0:12:21.400 --> 0:12:24.280
<v Speaker 2>day someone has made it say slurs, or like it's

0:12:24.320 --> 0:12:27.319
<v Speaker 2>just said something insane. Microsoft Tay was the twenty

0:12:27.360 --> 0:12:30.520
<v Speaker 2>sixteen AI bot that learned from the Internet to be racist,

0:12:30.720 --> 0:12:33.760
<v Speaker 2>like kind of a proto Mecha Hitler situation.

0:12:33.920 --> 0:12:39.160
<v Speaker 4>Oh, that was Grok. Yeah, Tay, that was the Microsoft bot from,

0:12:39.240 --> 0:12:41.280
<v Speaker 4>yeah, twenty twelve or something.

0:12:41.880 --> 0:12:42.680
<v Speaker 3>I forgot about that.

0:12:43.000 --> 0:12:44.960
<v Speaker 5>But the thing is, back to Brian's point, is that

0:12:45.000 --> 0:12:47.680
<v Speaker 5>this keeps happening and then we keep course correcting back

0:12:47.760 --> 0:12:51.360
<v Speaker 5>like it feels like there is a systems response, and

0:12:51.480 --> 0:12:53.920
<v Speaker 5>Ed I think you're looking for the joy. I wouldn't

0:12:54.000 --> 0:12:58.319
<v Speaker 5>use the word intervening or interfering. I would say it's noise.

0:12:58.679 --> 0:13:01.680
<v Speaker 5>And the way for companies and people existing in this

0:13:01.720 --> 0:13:04.520
<v Speaker 5>industry to deal with that is to focus on what

0:13:04.559 --> 0:13:06.760
<v Speaker 5>you think is good. I find the irony in saying that,

0:13:06.800 --> 0:13:08.840
<v Speaker 5>which I think these tech bros that you're referring to, Brian,

0:13:08.880 --> 0:13:10.760
<v Speaker 5>they're also trying to focus on what they think is good,

0:13:10.960 --> 0:13:14.559
<v Speaker 5>so they're cutting out noise from their perspective. And I

0:13:14.600 --> 0:13:17.400
<v Speaker 5>don't know it's that movie that was just released featuring

0:13:17.400 --> 0:13:20.480
<v Speaker 5>those four dudes in the mountain lodge.

0:13:20.400 --> 0:13:22.920
<v Speaker 1>Oh, uh, Head Mountain?

0:13:22.640 --> 0:13:25.280
<v Speaker 5>Mountainhead. Mountainhead. Yes, so it feels like that.

0:13:25.320 --> 0:13:27.160
<v Speaker 5>It feels like everyone's got their little bubble, which is

0:13:27.200 --> 0:13:31.080
<v Speaker 5>also at the same time created maybe and supported by tech.

0:13:31.600 --> 0:13:33.959
<v Speaker 5>And if we continue to operate in silos, I don't know.

0:13:33.960 --> 0:13:34.679
<v Speaker 5>I feel like if we.

0:13:34.679 --> 0:13:37.480
<v Speaker 1>Only if we only think of those and often they

0:13:37.520 --> 0:13:40.040
<v Speaker 1>are men in those roles. They're not all, but often

0:13:40.080 --> 0:13:43.600
<v Speaker 1>they are. If you think of some of those you said, bros.

0:13:44.200 --> 0:13:47.680
<v Speaker 1>But if we only reduce them to those guys running

0:13:47.679 --> 0:13:52.240
<v Speaker 1>around trying to kill someone in a song where I

0:13:52.280 --> 0:13:54.680
<v Speaker 1>think in a way it allows in a way, it

0:13:54.720 --> 0:13:58.400
<v Speaker 1>allows us to like not worry about them because we

0:13:58.400 --> 0:13:58.840
<v Speaker 1>can dismiss them.

0:13:58.840 --> 0:13:59.960
<v Speaker 3>All right, that's fair.

0:14:00.080 --> 0:14:03.040
<v Speaker 1>Some of them are smarter, like just raw synthesizing power.

0:14:03.640 --> 0:14:06.120
<v Speaker 1>Some of those people, not all of them, they're smarter

0:14:06.200 --> 0:14:08.640
<v Speaker 1>than everyone in this building combined, a couple of them. And

0:14:08.679 --> 0:14:10.920
<v Speaker 1>I'm not saying that that makes them good or anything

0:14:11.000 --> 0:14:13.120
<v Speaker 1>in any way good or in any way good for now,

0:14:13.160 --> 0:14:15.040
<v Speaker 1>because even if you say, like I agree with you

0:14:15.120 --> 0:14:18.520
<v Speaker 1>that some of them may believe that they're doing this,

0:14:18.640 --> 0:14:22.040
<v Speaker 1>that they're doing good, right, But if they're thinking in

0:14:22.080 --> 0:14:26.000
<v Speaker 1>a thousand year chunks, that's really bad for us. Yuval

0:14:26.040 --> 0:14:27.880
<v Speaker 1>said this, and Yuval, I think, is amazing. And

0:14:27.920 --> 0:14:30.720
<v Speaker 1>he said, well, the problem is historians like me. He goes,

0:14:30.760 --> 0:14:32.440
<v Speaker 1>you may look at it now. He was on a podcast.

0:14:32.440 --> 0:14:34.040
<v Speaker 1>He said, you may look at it now and say,

0:14:34.040 --> 0:14:35.680
<v Speaker 1>but those things worked out. Okay, look where we are.

0:14:35.720 --> 0:14:37.080
<v Speaker 1>But as a historian, I have to look at the

0:14:37.080 --> 0:14:40.960
<v Speaker 1>cost of human life. Yes, that happened for all the intervals,

0:14:41.200 --> 0:14:44.440
<v Speaker 1>all the little intervals to get from there to here.

0:14:44.480 --> 0:14:47.000
<v Speaker 1>And that's really what you guys are just talking about, right?

0:14:47.080 --> 0:14:50.400
<v Speaker 1>So it's all the, you know, the devastation along the way.

0:14:50.440 --> 0:14:52.360
<v Speaker 1>But, can I just, because you might be

0:14:52.400 --> 0:14:55.880
<v Speaker 1>so jaded, but do you not think AI is like

0:14:56.040 --> 0:14:57.160
<v Speaker 1>mind-bogglingly great?

0:14:57.200 --> 0:14:57.680
<v Speaker 3>It's nice.

0:14:58.760 --> 0:15:00.480
<v Speaker 5>I can see the value. I think it's like far

0:15:00.520 --> 0:15:01.280
<v Speaker 5>off.

0:15:01.200 --> 0:15:03.600
<v Speaker 1>You don't? Wait, you really? I think it's like the single

0:15:03.720 --> 0:15:07.360
<v Speaker 1>I think it's, like, the single greatest invention in my life.

0:15:07.440 --> 0:15:09.520
<v Speaker 5>What is, what is your favorite thing that it has done?

0:15:09.960 --> 0:15:12.680
<v Speaker 1>I think the ability to have super high level conversations

0:15:12.720 --> 0:15:17.880
<v Speaker 1>about really esoteric stuff, about systems theory, right, you know it's

0:15:17.880 --> 0:15:20.320
<v Speaker 1>correct them? Well, you can, well, you have to

0:15:20.320 --> 0:15:21.600
<v Speaker 1>do a bit of work.

0:15:22.200 --> 0:15:24.080
<v Speaker 2>I don't want to talk to something that I have

0:15:24.160 --> 0:15:26.160
<v Speaker 2>to verify constantly. I talk to people.

0:15:26.200 --> 0:15:28.480
<v Speaker 5>Well, you have to do that with people to some extent.

0:15:28.560 --> 0:15:30.480
<v Speaker 5>You can't trust everything you hear from people.

0:15:30.280 --> 0:15:34.080
<v Speaker 1>The automobile, or is the horse-drawn carriage still your thing?

0:15:35.040 --> 0:15:37.000
<v Speaker 2>No, I'm just saying, no, I'm wondering what the point is.

0:15:37.240 --> 0:15:39.600
<v Speaker 5>He looks very uneasy, right. I mean, the point?

0:15:39.680 --> 0:15:40.800
<v Speaker 1>No, I just have jokes in my head.

0:15:41.120 --> 0:15:43.320
<v Speaker 3>No, I'm well, because I.

0:15:43.240 --> 0:15:45.480
<v Speaker 1>Think you can decry the industry and the way people

0:15:45.480 --> 0:15:48.120
<v Speaker 1>are using it. And I guarantee, I'm pretty sure I

0:15:48.120 --> 0:15:51.280
<v Speaker 1>was reading Eliezer Yudkowsky before anyone in this room.

0:15:51.400 --> 0:15:53.080
<v Speaker 1>I mean, how to actually change your mind as a

0:15:53.080 --> 0:15:55.200
<v Speaker 1>book that I was obsessed with giving as Christmas gifts.

0:15:55.240 --> 0:15:56.360
<v Speaker 3>I can't take your ds.

0:15:56.840 --> 0:15:58.960
<v Speaker 1>I just, it's, you gotta take him seriously. I

0:15:59.000 --> 0:16:00.800
<v Speaker 1>mean you have to take that guy's

0:16:00.840 --> 0:16:03.160
<v Speaker 1>brain seriously. He thinks everything you guys are worried about

0:16:03.160 --> 0:16:06.040
<v Speaker 1>he called out fifteen years ago in detail.

0:16:06.440 --> 0:16:09.880
<v Speaker 2>Okay, I mean, like, here's the thing. The automobile is

0:16:09.880 --> 0:16:11.800
<v Speaker 2>not a comparison because we knew it had to go forward,

0:16:11.840 --> 0:16:13.720
<v Speaker 2>side to side and everything like that. Like, we had an

0:16:13.760 --> 0:16:16.240
<v Speaker 2>actual use case for that with generative AI. The way

0:16:16.280 --> 0:16:19.240
<v Speaker 2>that these conversations happen and large language models can be

0:16:19.320 --> 0:16:22.000
<v Speaker 2>conversing with documents. It's one of the only use cases

0:16:22.240 --> 0:16:24.640
<v Speaker 2>that is actually remotely useful because you can actually verify

0:16:24.720 --> 0:16:27.119
<v Speaker 2>based on the parts of the document. Having a conversation

0:16:27.160 --> 0:16:29.720
<v Speaker 2>with one of these. Fine, it's a thing. I'm not

0:16:29.800 --> 0:16:32.360
<v Speaker 2>amazed by it because there have been chatbots doing this

0:16:32.440 --> 0:16:35.560
<v Speaker 2>with customer service since, like, what, twenty thirteen, twenty fourteen.

0:16:36.360 --> 0:16:38.240
<v Speaker 4>And the Eliza effect goes back to the sixties.

0:16:38.320 --> 0:16:41.440
<v Speaker 2>Yeah, and Eliza even then. Karen Hao's Empire

0:16:41.440 --> 0:16:42.720
<v Speaker 2>of AI has a great thing about Eliza.

0:16:42.760 --> 0:16:44.360
<v Speaker 3>What the creator was just like, why is everyone so

0:16:44.440 --> 0:16:46.760
<v Speaker 3>fucking impressed? But it's just like, really, just like, what

0:16:46.800 --> 0:16:47.440
<v Speaker 3>the fuck is this?

0:16:48.160 --> 0:16:51.040
<v Speaker 2>I think what it is is. I am just not

0:16:51.120 --> 0:16:54.800
<v Speaker 2>that impressed by it. Based on the larger discussion. Everyone

0:16:54.840 --> 0:16:57.320
<v Speaker 2>acts like this as the fucking future, and it just

0:16:57.360 --> 0:17:00.960
<v Speaker 2>feels like a growth of the past. Chat GPT has

0:17:01.000 --> 0:17:03.680
<v Speaker 2>become, for better or for worse, what Google could

0:17:03.720 --> 0:17:06.199
<v Speaker 2>have potentially been. It's insane the attention.

0:17:06.280 --> 0:17:07.680
<v Speaker 1>I agree with you, but like, okay, I got a

0:17:07.720 --> 0:17:11.760
<v Speaker 1>tick bite. Okay, and you can literally the second you

0:17:11.760 --> 0:17:15.679
<v Speaker 1>get a tick bite, everybody says antibiotics, you gotta go.

0:17:15.720 --> 0:17:17.520
<v Speaker 1>If you can't find the thing. I got a picture.

0:17:17.560 --> 0:17:21.400
<v Speaker 1>I put it on there. The AI was immediately able

0:17:21.400 --> 0:17:23.440
<v Speaker 1>to say yes. I could verify it after, because I called.

0:17:23.520 --> 0:17:25.560
<v Speaker 1>It was able to say, this is the kind of

0:17:25.560 --> 0:17:27.360
<v Speaker 1>tick it is, this is the area you're in, this kind

0:17:27.400 --> 0:17:29.800
<v Speaker 1>of tick. You don't need antibiotics. You're not gonna get

0:17:29.880 --> 0:17:32.560
<v Speaker 1>Lyme disease, here's why. I sent it to my doctor, who

0:17:32.680 --> 0:17:36.679
<v Speaker 1>gave feedback, and they agreed. It would be harder to

0:17:36.720 --> 0:17:36.879
<v Speaker 1>do it.

0:17:36.880 --> 0:17:39.919
<v Speaker 2>Why didn't you send it to your doctor before you asked it? Well.

0:17:39.760 --> 0:17:42.240
<v Speaker 1>He doesn't. His job isn't to, like, identify what tick it is.

0:17:42.440 --> 0:17:44.359
<v Speaker 3>Oh, oh, sorry, I misunderstood what you said.

0:17:44.960 --> 0:17:47.159
<v Speaker 5>Yeah, I want more clarity on the doctor thing. I

0:17:47.240 --> 0:17:49.080
<v Speaker 5>raised my hand as you were talking because I'm like,

0:17:49.080 --> 0:17:52.199
<v Speaker 5>did you trust the chat GPT answer and leave it

0:17:52.240 --> 0:17:52.520
<v Speaker 5>at that?

0:17:52.680 --> 0:17:52.800
<v Speaker 2>Right?

0:17:52.960 --> 0:17:56.320
<v Speaker 1>Then, when it said that, I searched

0:17:56.320 --> 0:17:59.000
<v Speaker 1>for what it said and compared and it was right.

0:17:59.600 --> 0:17:59.879
<v Speaker 1>But it was.

0:18:02.440 --> 0:18:04.720
<v Speaker 4>Out of curiosity and ed you might be able to

0:18:04.760 --> 0:18:07.600
<v Speaker 4>answer this question. Is that And I don't actually know

0:18:07.640 --> 0:18:10.120
<v Speaker 4>the answer. Is that iterative AI or is that generative AI?

0:18:10.320 --> 0:18:13.800
<v Speaker 2>So we're using ChatGPT, so it's a mix,

0:18:15.680 --> 0:18:18.520
<v Speaker 2>so that would be generative okay, and so that would

0:18:18.560 --> 0:18:21.359
<v Speaker 2>still be that. And by the way, that sounds like a

0:18:21.520 --> 0:18:22.000
<v Speaker 2>use case.

0:18:22.359 --> 0:18:23.720
<v Speaker 3>It is the growth of so what you.

0:18:23.640 --> 0:18:24.720
<v Speaker 5>Say, you know, I was going to say, I think

0:18:24.760 --> 0:18:26.320
<v Speaker 5>I think a version of Google could have done that

0:18:26.359 --> 0:18:28.280
<v Speaker 5>for you in the past, too, before ChatGPT. What

0:18:28.359 --> 0:18:31.200
<v Speaker 5>ChatGPT, the generative side of it, is providing is

0:18:31.240 --> 0:18:33.920
<v Speaker 5>the LLM, the natural language interface, pulling a

0:18:33.960 --> 0:18:35.960
<v Speaker 5>lot of different sources of data and putting that together

0:18:36.040 --> 0:18:37.680
<v Speaker 5>for you. That is ChatGPT. You could have done

0:18:37.720 --> 0:18:40.840
<v Speaker 5>that with Google or I think there were apps right

0:18:40.840 --> 0:18:42.760
<v Speaker 5>that you could do a photo uh sort.

0:18:42.560 --> 0:18:43.800
<v Speaker 2>Of recognition for a while.

0:18:44.280 --> 0:18:46.359
<v Speaker 5>Yeah, and to my knowledge, parts of Google

0:18:46.400 --> 0:18:49.639
<v Speaker 5>and Google Health researchers with their AI divisions were working

0:18:49.640 --> 0:18:52.280
<v Speaker 5>on apps that could identify different things like skin things,

0:18:52.320 --> 0:18:53.600
<v Speaker 5>so like is that a bug bite or is that

0:18:53.640 --> 0:18:56.800
<v Speaker 5>eczema, that kind of thing, through the Pixel phones. I don't

0:18:56.840 --> 0:18:59.640
<v Speaker 5>think they've ever broadly released it, but they were experimenting

0:18:59.720 --> 0:19:02.640
<v Speaker 5>way before chat GPT was even a thing. I would

0:19:02.800 --> 0:19:05.280
<v Speaker 5>argue that I think the LLM portion of this is

0:19:05.320 --> 0:19:07.119
<v Speaker 5>more about how it converses with you and how it

0:19:07.240 --> 0:19:09.800
<v Speaker 5>like understands what you're actually concerned about. So if you

0:19:09.840 --> 0:19:12.000
<v Speaker 5>didn't give it the exact words of look up this

0:19:12.080 --> 0:19:13.960
<v Speaker 5>thing and should I see a doctor? Even if you

0:19:14.040 --> 0:19:16.800
<v Speaker 5>didn't input this see-your-doctor request, it might, you know,

0:19:16.840 --> 0:19:18.080
<v Speaker 5>divine that that's what you mean.

0:19:18.119 --> 0:19:19.880
<v Speaker 3>This is the thing like chat GPT.

0:19:20.080 --> 0:19:21.600
<v Speaker 2>I don't think would have been a big deal if

0:19:21.640 --> 0:19:24.760
<v Speaker 2>Google had actually innovated on search at all, if they

0:19:24.800 --> 0:19:27.639
<v Speaker 2>had added something to it. Google Search has been the

0:19:27.680 --> 0:19:32.080
<v Speaker 2>same for like fifteen, twenty years, except worse. What you

0:19:32.119 --> 0:19:35.920
<v Speaker 2>were describing there is valid. It's inference. Basically, it infers the

0:19:36.000 --> 0:19:39.000
<v Speaker 2>understanding from the image and all this and then spits

0:19:39.040 --> 0:19:39.560
<v Speaker 2>out an answer.

0:19:39.680 --> 0:19:40.520
<v Speaker 3>It's the use case.

0:19:40.720 --> 0:19:42.240
<v Speaker 2>I think that in the way you use it that

0:19:42.359 --> 0:19:46.200
<v Speaker 2>was responsible. The problem is at scale that is not

0:19:46.280 --> 0:19:47.920
<v Speaker 2>going to work out so great because.

0:19:47.800 --> 0:19:49.480
<v Speaker 1>I believe you know way more about this than I.

0:19:52.200 --> 0:19:54.359
<v Speaker 2>No. I'm genuinely glad you brought this up because it's like,

0:19:54.800 --> 0:19:59.160
<v Speaker 2>but what about that is extrapolating out to the greater

0:19:59.240 --> 0:20:02.480
<v Speaker 2>AI replacing jobs thing, because that's where my principal problem is.

0:20:02.520 --> 0:20:04.800
<v Speaker 2>People are taking what is what Google should have been

0:20:04.840 --> 0:20:08.000
<v Speaker 2>in twenty seventeen and turning it into, this is going

0:20:08.000 --> 0:20:10.680
<v Speaker 2>to replace half of workers. A quote from Dario

0:20:10.760 --> 0:20:12.480
<v Speaker 2>Amodei that was fucking made up.

0:20:12.359 --> 0:20:14.120
<v Speaker 3>Which he said off the top of his head. It's

0:20:14.119 --> 0:20:14.640
<v Speaker 3>not a bird.

0:20:14.720 --> 0:20:19.640
<v Speaker 2>Sorry, just picturing Wario himself. Oh, Wario Amodei.

0:20:19.920 --> 0:20:22.119
<v Speaker 5>I think what you're also bringing up, Brian, is that

0:20:22.240 --> 0:20:25.120
<v Speaker 5>there is this marketing I guess problem with AI, which

0:20:25.119 --> 0:20:27.480
<v Speaker 5>is that like AI has existed for a very long time,

0:20:27.520 --> 0:20:29.639
<v Speaker 5>and the current version that everyone's really obsessed with is

0:20:29.680 --> 0:20:32.439
<v Speaker 5>gen AI, and generative AI is all about, like, what

0:20:32.560 --> 0:20:35.239
<v Speaker 5>it can generate for you, using large language models, using art,

0:20:35.280 --> 0:20:37.720
<v Speaker 5>like creating art, creating videos. All of that stuff is

0:20:37.760 --> 0:20:40.800
<v Speaker 5>the current iteration of AI we're all talking about. But

0:20:40.840 --> 0:20:43.000
<v Speaker 5>the previous stuff has existed before, and this is all

0:20:43.000 --> 0:20:45.000
<v Speaker 5>like just more of the same. I think that's why

0:20:45.040 --> 0:20:46.879
<v Speaker 5>so many of us in the industry are so frustrated

0:20:46.920 --> 0:20:50.639
<v Speaker 5>with it, because there's a misconception. There's also this idea

0:20:50.680 --> 0:20:52.880
<v Speaker 5>that the jobs that it's trying to replace are in

0:20:52.920 --> 0:20:55.359
<v Speaker 5>that field of coming up with art, music and words

0:20:55.359 --> 0:20:58.080
<v Speaker 5>that are, A, not jobs that pay well to begin with,

0:20:58.200 --> 0:21:01.240
<v Speaker 5>but are, B, like parts of our life that we

0:21:01.280 --> 0:21:02.879
<v Speaker 5>don't want replaced, right? We don't want it.

0:21:03.840 --> 0:21:06.080
<v Speaker 1>When I talk about Yudkowsky, the reason I do is

0:21:06.119 --> 0:21:08.400
<v Speaker 1>that he was somebody who, he's right

0:21:08.440 --> 0:21:12.240
<v Speaker 1>on your side in a way, and his thing always was, this

0:21:12.440 --> 0:21:15.000
<v Speaker 1>is not going to be a net good. First, that's

0:21:15.000 --> 0:21:18.040
<v Speaker 1>not even he and I mean I remember, I'm sure

0:21:18.040 --> 0:21:21.359
<v Speaker 1>you remember when Astro Teller wrote Exegesis. I remember

0:21:21.359 --> 0:21:23.399
<v Speaker 1>reading that book like the day it came out, and

0:21:23.440 --> 0:21:26.000
<v Speaker 1>it really did freak me out about what was possible.

0:21:26.040 --> 0:21:28.120
<v Speaker 1>Now we're not even there yet, right. It isn't even

0:21:28.119 --> 0:21:31.040
<v Speaker 1>where that, but that version of AI.

0:21:31.119 --> 0:21:34.280
<v Speaker 2>And Yudkowsky, he's, it's Yudkowsky.

0:21:34.400 --> 0:21:37.520
<v Speaker 3>Sorry. I also find him quite distasteful.

0:21:37.560 --> 0:21:40.240
<v Speaker 2>But on top of that, he's a fucking AGI doomer,

0:21:40.480 --> 0:21:42.920
<v Speaker 2>and he's just, yeah, sure, if a frog

0:21:42.960 --> 0:21:45.920
<v Speaker 2>had wings, the frog could fly. It's like, yeah,

0:21:45.960 --> 0:21:47.800
<v Speaker 2>if the computer wakes up and does this, this could

0:21:47.800 --> 0:21:49.399
<v Speaker 2>be scary. He's one of the few people that seems

0:21:49.400 --> 0:21:50.919
<v Speaker 2>to really think about what AGI could do.

0:21:51.240 --> 0:21:53.520
<v Speaker 2>But also we're so far off from it that

0:21:53.640 --> 0:21:55.359
<v Speaker 2>I can't see him as anything but a grifter,

0:21:55.480 --> 0:21:57.439
<v Speaker 2>because all he is doing is grifting.

0:21:57.720 --> 0:21:59.800
<v Speaker 1>It's just he's been on it long before there was grift.

0:22:00.119 --> 0:22:03.959
<v Speaker 1>He was on it as a nonprofit. No he no,

0:22:04.640 --> 0:22:07.200
<v Speaker 1>I mean, think about what I've written. You think I

0:22:07.240 --> 0:22:18.000
<v Speaker 1>know nonprofits? What's the matter with you? Yeah, but he's

0:22:18.800 --> 0:22:20.800
<v Speaker 1>I have a different view. Now, I'm not saying all

0:22:20.840 --> 0:22:23.879
<v Speaker 1>his opinions are correct. What I'm saying is

0:22:23.920 --> 0:22:28.040
<v Speaker 1>that that's somebody who flagged a bunch of these potential

0:22:28.560 --> 0:22:32.600
<v Speaker 1>issues a long time ago, and of course it shouldn't

0:22:32.760 --> 0:22:36.040
<v Speaker 1>and it's horrible that these people in charge are so

0:22:36.240 --> 0:22:40.320
<v Speaker 1>willing to slough it off the moment they think there's

0:22:40.359 --> 0:22:42.359
<v Speaker 1>this much of an edge. But also because of what

0:22:42.400 --> 0:22:44.960
<v Speaker 1>I study in life, I can't be. So, we've all

0:22:45.000 --> 0:22:46.880
<v Speaker 1>of us who write about this stuff in fiction, right,

0:22:46.920 --> 0:22:48.919
<v Speaker 1>you do it because okay, and you guys have to

0:22:48.920 --> 0:22:50.960
<v Speaker 1>report on something you have to. I can take sort

0:22:51.000 --> 0:22:52.560
<v Speaker 1>of like my partner, I could take what's in there

0:22:53.040 --> 0:22:57.480
<v Speaker 1>and try to dramatize what might happen, and always would

0:22:57.560 --> 0:23:00.480
<v Speaker 1>try to point out exactly that these motherfuckers do all

0:23:00.520 --> 0:23:03.239
<v Speaker 1>this shit. It's not surprising to me. It's horrific, but

0:23:03.280 --> 0:23:04.680
<v Speaker 1>not surprising.

0:23:15.960 --> 0:23:17.359
<v Speaker 3>What bothers me as well with it?

0:23:17.440 --> 0:23:20.240
<v Speaker 2>And I think you're completely right, And also like I'm

0:23:20.280 --> 0:23:22.679
<v Speaker 2>not completely saying everything you're saying is wrong, Like you

0:23:22.680 --> 0:23:24.600
<v Speaker 2>actually are well read on this, which is nice because

0:23:24.640 --> 0:23:27.000
<v Speaker 2>a lot of people who mentioned the AGI stuff don't

0:23:27.000 --> 0:23:27.520
<v Speaker 2>read anything.

0:23:28.800 --> 0:23:29.960
<v Speaker 3>But I think the thing is as well.

0:23:30.040 --> 0:23:32.320
<v Speaker 2>Is so much about this AI and AGI stuff is not

0:23:32.359 --> 0:23:34.679
<v Speaker 2>about what it can actually do. Yeah, the best fiction

0:23:34.880 --> 0:23:37.919
<v Speaker 2>about AGI, my girlfriend Sarah showed me the "Kill Switch" X-Files

0:17:37.960 --> 0:17:41.720
<v Speaker 2>episode. Fantastic AGI episode. It's scary because it's not

0:23:41.800 --> 0:23:44.000
<v Speaker 2>about the people making it. It's about the computer itself

0:23:44.040 --> 0:23:46.679
<v Speaker 2>and the intentions spilling into it. A lot of what

0:23:46.720 --> 0:23:49.640
<v Speaker 2>the AGI discussion now is, is it will automate jobs.

0:23:49.680 --> 0:23:51.640
<v Speaker 2>And that's the last time I'm going to think about

0:23:51.640 --> 0:23:56.800
<v Speaker 2>it before I say, here's Anthropic. And it's, and I

0:23:56.840 --> 0:23:58.679
<v Speaker 2>think what it is is there are discussions to be

0:23:58.720 --> 0:24:00.880
<v Speaker 2>had around what this stuff could do. If AI could

0:24:00.920 --> 0:24:02.520
<v Speaker 2>do the things that they're saying it could do, if

0:24:02.520 --> 0:24:04.719
<v Speaker 2>it could replace jobs, it would be doing it. But

0:24:04.800 --> 0:24:08.080
<v Speaker 2>also it would actually require a change in society. It's

0:24:08.119 --> 0:24:09.919
<v Speaker 2>almost as if they're like they want all of the

0:24:10.000 --> 0:24:13.480
<v Speaker 2>profits from all of the exposure and the ability to

0:24:13.520 --> 0:24:16.360
<v Speaker 2>lay people off without actually doing anything to earn it.

0:24:16.480 --> 0:24:18.439
<v Speaker 2>And it comes back to what you said about the

0:24:18.480 --> 0:24:21.520
<v Speaker 2>people running this shit aren't even trying to build AI.

0:24:21.880 --> 0:24:25.320
<v Speaker 2>They're not. Mark Zuckerberg's building a Manhattan-sized data

0:24:25.359 --> 0:24:27.680
<v Speaker 2>center to build superintelligence.

0:24:28.359 --> 0:24:30.200
<v Speaker 1>I wonder though, when you say, and you guys can speak

0:24:30.280 --> 0:24:34.440
<v Speaker 1>to this. I wonder, when you, when, when you

0:24:34.480 --> 0:24:36.320
<v Speaker 1>talk about how Google could have done this or should

0:24:36.359 --> 0:24:38.080
<v Speaker 1>have done this, And then it's a great point that

0:24:38.119 --> 0:24:42.080
<v Speaker 1>you made about that it is the way that it communicates,

0:24:42.080 --> 0:24:44.920
<v Speaker 1>because what I wonder is, you're all experts, you're all

0:24:45.040 --> 0:24:52.040
<v Speaker 1>native early adopters, native to not only the Internet, but

0:24:52.160 --> 0:24:56.240
<v Speaker 1>native to tech. And so, sure, could you have

0:24:56.520 --> 0:25:01.919
<v Speaker 1>used AltaVista to do, oh yeah, jeez, for fuck's sake, baby,

0:25:01.920 --> 0:25:03.760
<v Speaker 1>But I'm saying you could have used all that, but

0:25:03.920 --> 0:25:07.639
<v Speaker 1>for generations of people who aren't tech literate, I mean,

0:25:07.720 --> 0:25:12.240
<v Speaker 1>billions of people exactly, even if that was possible through

0:25:12.280 --> 0:25:14.840
<v Speaker 1>Google image search. But you ever try Google image? Watch

0:25:14.880 --> 0:25:16.639
<v Speaker 1>people try to do a Google image search.

0:25:17.200 --> 0:25:17.719
<v Speaker 3>I agree.

0:25:17.800 --> 0:25:22.560
<v Speaker 1>I'm thinking this is all making it friendly for people,

0:25:22.600 --> 0:25:25.159
<v Speaker 1>as you said, easy for them, and

0:25:25.200 --> 0:25:26.800
<v Speaker 1>I wonder if that's.

0:25:26.160 --> 0:25:27.560
<v Speaker 3>That's exactly what I'm saying.

0:25:27.680 --> 0:25:30.560
<v Speaker 2>That's why people like chat GPT. It's not because it's

0:25:30.560 --> 0:25:32.960
<v Speaker 2>this amazing product, it's because it does what people go

0:25:33.040 --> 0:25:34.080
<v Speaker 2>to Google for. That's hard.

0:25:34.600 --> 0:25:41.879
<v Speaker 1>But communicating well in an inviting, seductive way is very challenging. Yes,

0:25:42.560 --> 0:25:45.119
<v Speaker 1>So for tech people to do that for regular people

0:25:45.480 --> 0:25:48.040
<v Speaker 1>is the amazing thing. It's almost, when you're saying Google

0:25:48.119 --> 0:25:49.160
<v Speaker 1>invented this tech.

0:25:49.800 --> 0:25:52.439
<v Speaker 2>The twenty seventeen Attention Is All You Need

0:25:52.480 --> 0:25:53.880
<v Speaker 2>paper was eight Google scientists.

0:25:53.880 --> 0:25:54.760
<v Speaker 3>I just looked this up.

0:25:54.800 --> 0:25:57.400
<v Speaker 2>I thought it was several, but it's like, no, Shazeer.

0:25:57.400 --> 0:25:59.480
<v Speaker 2>They ended up paying like two billion dollars to Character.AI

0:26:00.040 --> 0:26:02.280
<v Speaker 2>just to bring him back. Google had this technology and

0:26:02.720 --> 0:26:06.040
<v Speaker 2>had this movement just been we're going to make inference

0:26:06.080 --> 0:26:09.119
<v Speaker 2>of meaning better exactly what you're talking about, it would

0:26:09.119 --> 0:26:11.960
<v Speaker 2>not be this big story. The thing is they need

0:26:12.000 --> 0:26:14.960
<v Speaker 2>it to be more than what you were describing. Our use cases, good, bad, whatever,

0:26:15.040 --> 0:26:17.320
<v Speaker 2>they are things that people use it for. The actual

0:26:17.320 --> 0:26:21.080
<v Speaker 2>things they're describing, though, a PhD level intelligence, it's

0:26:21.119 --> 0:26:23.520
<v Speaker 2>going to solve physics. And it's like, no, it's not.

0:26:23.840 --> 0:26:26.920
<v Speaker 2>But if they say, what if Google Search worked, they

0:26:26.960 --> 0:26:29.520
<v Speaker 2>can't make a trazillion dollars. They can't be like, this

0:26:29.680 --> 0:26:31.879
<v Speaker 2>is going to be everything we need. We're going to

0:26:31.920 --> 0:26:34.760
<v Speaker 2>build data centers. It's just because what you're describing, Brian

0:26:34.880 --> 0:26:36.760
<v Speaker 2>is the most evil thing.

0:26:36.800 --> 0:26:41.240
<v Speaker 1>And going back to Mike, er, yeah, because I think

0:26:41.280 --> 0:26:45.399
<v Speaker 1>what I think in all these things, what I've learned

0:26:45.960 --> 0:26:49.679
<v Speaker 1>just being curious is to look at the incentives.

0:26:49.880 --> 0:26:50.879
<v Speaker 3>Yes, that's what I mean.

0:26:50.960 --> 0:26:54.840
<v Speaker 1>And Google was disincentivized to make search

0:26:54.880 --> 0:26:58.359
<v Speaker 1>work better for the user. Yeah, because of the

0:26:58.400 --> 0:27:00.960
<v Speaker 1>profit agenda, right. Oh, sure, they should make

0:27:01.000 --> 0:27:03.680
<v Speaker 1>a fucking profit. But here's the thing, in a weird way,

0:27:03.800 --> 0:27:06.399
<v Speaker 1>them not making a profit is better for the users. Yes,

0:27:06.760 --> 0:27:09.960
<v Speaker 1>and Google wasn't. But I'm sure it's all in

0:27:09.960 --> 0:27:15.640
<v Speaker 1>the Google paper that nobody's fucking reading except I'm talking

0:27:15.680 --> 0:27:19.520
<v Speaker 1>about for the user. I'm trying to You're raising these

0:27:19.520 --> 0:27:21.560
<v Speaker 1>amazing questions, but if I try to think about how

0:27:21.560 --> 0:27:24.760
<v Speaker 1>to answer them, it's that these companies are like in

0:27:24.800 --> 0:27:26.879
<v Speaker 1>the early days of social media and all this stuff.

0:27:26.920 --> 0:27:29.200
<v Speaker 1>In the beginning, they try to super serve the user

0:27:29.280 --> 0:27:31.800
<v Speaker 1>to get you addicted to it so that then they

0:27:31.800 --> 0:27:33.879
<v Speaker 1>could. But the incentive structure is what you've got to

0:27:33.920 --> 0:27:36.760
<v Speaker 1>look at. And obviously Microsoft's incentive structure has been locked

0:27:36.760 --> 0:27:38.280
<v Speaker 1>in for a very long time, and that is for

0:27:38.359 --> 0:27:41.240
<v Speaker 1>the people who have the most stock in Microsoft to

0:27:41.280 --> 0:27:43.400
<v Speaker 1>make the most money, number one. That's the incentive structure.

0:27:43.119 --> 0:27:45.199
<v Speaker 5>Google too, to an extent, and that's why they

0:27:45.240 --> 0:27:47.040
<v Speaker 5>had that stuff for that long, but they never because

0:27:47.040 --> 0:27:48.600
<v Speaker 5>they never found a way to integrate it into all

0:27:48.600 --> 0:27:50.640
<v Speaker 5>the different parts of its business that matter. The ads part,

0:27:50.680 --> 0:27:51.240
<v Speaker 5>the search.

0:27:51.080 --> 0:27:52.680
<v Speaker 3>Parts do still so they monopoly.

0:27:52.720 --> 0:27:54.600
<v Speaker 5>They just they have a monopoly. They have no reason

0:27:54.600 --> 0:27:56.399
<v Speaker 5>to be better. But I don't know if you remember this

0:27:56.520 --> 0:27:59.840
<v Speaker 5>Google I/O twenty seventeen, twenty eighteen, even twenty nineteen, when

0:28:00.160 --> 0:28:03.159
<v Speaker 5>they first showed Kitchen Sink, which was their version of ChatGPT,

0:28:03.320 --> 0:28:06.160
<v Speaker 5>but it wasn't as conversational. It was you can ask

0:28:06.200 --> 0:28:08.600
<v Speaker 5>this app to come up with ways to learn about

0:28:08.640 --> 0:28:10.480
<v Speaker 5>a new hobby or to plan a thing for you.

0:28:10.640 --> 0:28:13.240
<v Speaker 5>Before ChatGPT was even known widely to everyone, I

0:28:13.240 --> 0:28:15.320
<v Speaker 5>think, AI wasn't really even a thing,

0:28:15.800 --> 0:28:18.920
<v Speaker 5>and it just wasn't a chat interface. And I think

0:28:18.960 --> 0:28:22.120
<v Speaker 5>to Brian's point, the incentive that chat GPT has brought

0:28:22.160 --> 0:28:23.840
<v Speaker 5>to people and to my parents, who by the way,

0:28:23.880 --> 0:28:26.919
<v Speaker 5>discovered chat GPT last year, very annoying to me. But

0:28:27.080 --> 0:28:30.200
<v Speaker 5>it's like it's so much easier. It brings them into

0:28:30.320 --> 0:28:32.600
<v Speaker 5>technology in a way that technology used to be kind

0:28:32.600 --> 0:28:34.680
<v Speaker 5>of looking down on people for not knowing things, and

0:28:35.040 --> 0:28:37.240
<v Speaker 5>you do away with that with the chatbot.

0:28:37.480 --> 0:28:40.600
<v Speaker 5>My parents not only like feel so hip now, which

0:28:40.680 --> 0:28:43.719
<v Speaker 5>sorry Mom and dad, you're not, but but also like

0:28:43.760 --> 0:28:47.240
<v Speaker 5>there's people who seek comfort in the companionship brought on

0:28:47.400 --> 0:28:51.720
<v Speaker 5>by AI chatbots. My parents, though, they are amazing. They

0:28:51.880 --> 0:28:55.000
<v Speaker 5>dance at their age and I love that. But the

0:28:55.040 --> 0:28:56.600
<v Speaker 5>thing is, if you look at the use of gen

0:28:56.680 --> 0:28:58.440
<v Speaker 5>AI from the last, and I think I talked

0:28:58.440 --> 0:29:01.000
<v Speaker 5>about this on that Kevin Roose episode you were on, but

0:29:01.800 --> 0:29:04.880
<v Speaker 5>the use of chat AI services has changed, right?

0:29:04.960 --> 0:29:08.080
<v Speaker 5>It used to be, like, very interest based, very search based.

0:29:08.160 --> 0:29:11.160
<v Speaker 5>Now companionship is among the top few uses. And that's

0:29:11.160 --> 0:29:13.360
<v Speaker 5>why people are drawn to it. And I think that

0:29:14.440 --> 0:29:16.400
<v Speaker 5>my last point that came up really when I was

0:29:16.480 --> 0:29:19.120
<v Speaker 5>like listening to you talk ed, is that like the

0:29:19.280 --> 0:29:22.400
<v Speaker 5>incentive for them to push towards like, yes, let's go

0:29:22.440 --> 0:29:25.000
<v Speaker 5>towards AGI. It's not just, like, laying off people.

0:29:25.040 --> 0:29:27.840
<v Speaker 5>It's also who can get there first. It's the race

0:29:27.880 --> 0:29:31.160
<v Speaker 5>of, like, it's the tech CEO ego thing,

0:29:31.240 --> 0:29:33.640
<v Speaker 5>and then from

0:29:33.640 --> 0:29:35.920
<v Speaker 5>the ego standpoint, then they push down to profits.

0:29:36.200 --> 0:29:39.840
<v Speaker 1>So many businesses are, like, so many businesses,

0:29:41.200 --> 0:29:44.240
<v Speaker 1>sort of, the industries, are already acting in bad faith.

0:29:44.360 --> 0:29:47.240
<v Speaker 1>Oh yes, yeah, like okay, look at them. Here's here's

0:29:47.560 --> 0:29:49.280
<v Speaker 1>I'll give you, not a use, like a real industry

0:29:49.280 --> 0:29:51.200
<v Speaker 1>that I think will be transformed by it. And I

0:29:51.200 --> 0:29:54.080
<v Speaker 1>think that's not a bad thing. Is like money management.

0:29:55.200 --> 0:30:00.640
<v Speaker 1>Money management, which is a billions of AI AI already

0:30:00.200 --> 0:30:06.120
<v Speaker 1>can replace delineate between Yeah, though, but well you can,

0:30:06.200 --> 0:30:08.600
<v Speaker 1>you can do that. But I'm saying, I know that

0:30:08.640 --> 0:30:12.080
<v Speaker 1>whole business. That whole business is kind of a front, money management.

0:30:12.160 --> 0:30:14.200
<v Speaker 1>I'm not talking about the high net worth I'm saying

0:30:14.440 --> 0:30:18.080
<v Speaker 1>in general, people who use money managers for their retirement

0:30:18.080 --> 0:30:22.880
<v Speaker 1>accounts, Wealthfront, all that stuff. But within those companies

0:30:23.240 --> 0:30:26.480
<v Speaker 1>their, their, their front facing layer of humans who are

0:30:26.480 --> 0:30:29.360
<v Speaker 1>just trying to keep you invested. They don't know that stuff.

0:30:29.400 --> 0:30:31.600
<v Speaker 1>They are not offering value really, and they're taking this

0:30:31.640 --> 0:30:34.560
<v Speaker 1>big percentage from regular people trying to save for

0:30:34.600 --> 0:30:37.560
<v Speaker 1>their retirement, and they're bleeding off money. And in the

0:30:37.680 --> 0:30:40.080
<v Speaker 1>end they're going I was listening to.

0:30:40.080 --> 0:30:42.720
<v Speaker 2>Like, No, I'm smiling because you were right already this

0:30:42.840 --> 0:30:44.960
<v Speaker 2>was the twenty fifteen through twenty twenty wealth.

0:30:45.040 --> 0:30:46.360
<v Speaker 1>No, I know that all that stuff, but I'm talking

0:30:46.400 --> 0:30:49.320
<v Speaker 1>about even now like big banks, like the big banks,

0:30:49.320 --> 0:30:51.840
<v Speaker 1>I'm not talking about their their AI front. I'm saying

0:30:51.880 --> 0:30:56.120
<v Speaker 1>that I was listening to a talk given by Josh

0:30:56.200 --> 0:30:59.760
<v Speaker 1>Brown and, uh, one of his partners, I think, okay.

0:31:00.000 --> 0:31:07.520
<v Speaker 1>And he was talking about how essentially all of the

0:31:07.560 --> 0:31:09.640
<v Speaker 1>back end of all that stuff, meaning you might still

0:31:09.640 --> 0:31:13.040
<v Speaker 1>have a person talking to the user, but everything

0:31:13.040 --> 0:31:15.560
<v Speaker 1>else is going to be done by AI. Just, just,

0:31:16.960 --> 0:31:18.240
<v Speaker 1>it was Michael Batnick, from his company.

0:31:18.240 --> 0:31:21.280
<v Speaker 2>He was talking because here's the thing with that is

0:31:21.880 --> 0:31:25.080
<v Speaker 2>from my knowledge of basically financial regulation, they don't want

0:31:25.640 --> 0:31:27.960
<v Speaker 2>LLMs touching much of this. There's a lot of

0:31:27.960 --> 0:31:31.440
<v Speaker 2>stuff within financial research happening now with gen AI, a

0:31:31.440 --> 0:31:34.760
<v Speaker 2>ton of companies doing like insanely high compute burn to

0:31:34.840 --> 0:31:38.960
<v Speaker 2>do these massive kind of like evaluations of stuff. I

0:31:38.960 --> 0:31:41.840
<v Speaker 2>don't know if anyone wants to touch the money with llms.

0:31:41.880 --> 0:31:44.920
<v Speaker 2>And they've actually been quite resistant to it, partly because

0:31:44.920 --> 0:31:46.760
<v Speaker 2>they don't know how they work, Like they truly don't know,

0:31:46.800 --> 0:31:49.200
<v Speaker 2>I know, the black box thing. Yeah, yeah, yeah. And

0:31:49.240 --> 0:31:53.040
<v Speaker 2>so it's like a lot of these things. Also, when

0:31:53.080 --> 0:31:55.720
<v Speaker 2>are they gonna happen though, because they've been saying this

0:31:55.800 --> 0:31:56.680
<v Speaker 2>for two years.

0:31:56.720 --> 0:31:58.400
<v Speaker 1>But do you think there's a scenario, But do you

0:31:58.400 --> 0:32:02.000
<v Speaker 1>think there's a scenario where this this goes this goes away.

0:32:02.120 --> 0:32:03.360
<v Speaker 3>I don't think it goes I.

0:32:05.080 --> 0:32:06.680
<v Speaker 1>Don't think it's gonna do you think it's going to

0:32:07.000 --> 0:32:09.960
<v Speaker 1>reveal itself as being fraudulent or what.

0:32:10.080 --> 0:32:14.400
<v Speaker 2>I'll explain. I think you're going to see, I'll call

0:32:14.440 --> 0:32:18.560
<v Speaker 2>my shot. So I believe that OpenAI will,

0:32:18.680 --> 0:32:21.480
<v Speaker 2>as an ongoing concern, eventually go into nothingness.

0:32:21.520 --> 0:32:23.840
<v Speaker 3>Matt Hughes, my editor, believes they'll become a patent troll. So

0:32:23.840 --> 0:32:25.000
<v Speaker 3>I actually think it's an amazing thing.

0:32:25.440 --> 0:32:27.920
<v Speaker 2>I think that what we experience of large language models

0:32:27.960 --> 0:32:30.280
<v Speaker 2>will vastly pull back. I think there will be rate

0:32:30.360 --> 0:32:33.800
<v Speaker 2>limits that, as there will be rate limits on GPT,

0:32:34.000 --> 0:32:36.600
<v Speaker 2>people are going to be horrifyingly sad because those.

0:32:36.440 --> 0:32:38.400
<v Speaker 1>Companions are going to go away, and they're not

0:32:38.440 --> 0:32:38.840
<v Speaker 1>going to go.

0:32:38.760 --> 0:32:41.560
<v Speaker 2>Away, they're just going to be much, much, much more limited.

0:32:41.800 --> 0:32:43.960
<v Speaker 2>And I think that everything we see today the kind

0:32:44.000 --> 0:32:45.960
<v Speaker 2>of, and you look in any of the threads behind

0:32:45.960 --> 0:32:48.800
<v Speaker 2>any of the serious, like, GPTs, they're all kind of

0:32:48.800 --> 0:32:51.160
<v Speaker 2>saying like, yeah, we know that the abundance, the free

0:32:51.240 --> 0:32:54.400
<v Speaker 2>ride is over. So no, it's not going away, but

0:32:54.440 --> 0:32:56.480
<v Speaker 2>you're not going to hear about it constantly, And everything

0:32:56.520 --> 0:32:59.360
<v Speaker 2>you use today is going to be so severely rate limited,

0:32:59.600 --> 0:33:03.280
<v Speaker 2>or those companies that are charging for generative AI things

0:33:03.400 --> 0:33:06.680
<v Speaker 2>beneath the surface, all of the API rates behind these companies,

0:33:06.720 --> 0:33:08.000
<v Speaker 2>so the things that you plug in to run the

0:33:08.040 --> 0:33:12.479
<v Speaker 2>models are vastly subsidized by big tech and by the companies.

0:33:13.080 --> 0:33:15.440
<v Speaker 1>AltaVista goes away, but Google takes over.

0:33:16.120 --> 0:33:19.560
<v Speaker 2>Well, Google Gemini exists, but the Gemini request perhaps don't

0:33:19.640 --> 0:33:21.080
<v Speaker 2>hit the LLM as much they.

0:33:21.080 --> 0:33:22.200
<v Speaker 1>Have using it as a parallel.

0:33:22.320 --> 0:33:27.800
<v Speaker 2>Yeah, but it won't be this big thing you hear

0:33:27.840 --> 0:33:28.720
<v Speaker 2>about all the time.

0:33:28.800 --> 0:33:29.520
<v Speaker 3>I think you were going to.

0:33:29.880 --> 0:33:31.360
<v Speaker 1>Of course, it'll just be the back end of lots

0:33:31.400 --> 0:33:31.760
<v Speaker 1>of stuff.

0:33:32.480 --> 0:33:34.800
<v Speaker 5>No, it would just be something that sits there at large scale.

0:33:34.840 --> 0:33:35.560
<v Speaker 2>It's also not.

0:33:35.520 --> 0:33:38.160
<v Speaker 3>Good as a back end. Large language models are not

0:33:38.200 --> 0:33:38.720
<v Speaker 3>good back ends.

0:33:38.560 --> 0:33:40.240
<v Speaker 1>They're good at talking to you.

0:33:41.640 --> 0:33:43.480
<v Speaker 2>They can divine stuff as well.

0:33:43.640 --> 0:33:46.240
<v Speaker 4>One thing we're kind of circling on the consumer level

0:33:46.240 --> 0:33:48.680
<v Speaker 4>that we're not talking about is it's still a novelty now.

0:33:48.680 --> 0:33:49.360
<v Speaker 3>I think it's going to.

0:33:49.360 --> 0:33:51.120
<v Speaker 4>Be continued to be used. I do think that people

0:33:51.160 --> 0:33:54.400
<v Speaker 4>are going to continue to make lists and scheduling and

0:33:54.520 --> 0:33:58.000
<v Speaker 4>summarize, summarize, like you know, what's a good vacation.

0:33:57.720 --> 0:33:59.520
<v Speaker 2>And there are on-device models that are able

0:33:59.560 --> 0:33:59.920
<v Speaker 2>to do that.

0:34:00.520 --> 0:34:02.280
<v Speaker 4>I just think that right now it's such a big

0:34:02.320 --> 0:34:04.880
<v Speaker 4>deal that everyone's using it because it's cool, because they

0:34:04.960 --> 0:34:07.000
<v Speaker 4>you know, your parents hear about it, because you're told

0:34:07.000 --> 0:34:09.600
<v Speaker 4>you should use it for work. And I do think

0:34:09.600 --> 0:34:11.239
<v Speaker 4>that it'll stick around. I do think there'll be a

0:34:11.280 --> 0:34:13.880
<v Speaker 4>contraction in the sense that you know, it'll be a

0:34:13.880 --> 0:34:15.920
<v Speaker 4>cool thing. But I think in fifteen years it's no

0:34:15.960 --> 0:34:18.360
<v Speaker 4>longer gonna be as funny to produce, like a picture

0:34:18.400 --> 0:34:21.360
<v Speaker 4>of a pig with like Mickey Mouse's head and three boobs.

0:34:21.680 --> 0:34:22.600
<v Speaker 4>And I feel like, now.

0:34:22.480 --> 0:34:26.480
<v Speaker 2>That's a big, big Some comedy is timeless, like, but

0:34:26.520 --> 0:34:26.839
<v Speaker 2>I just.

0:34:26.800 --> 0:34:29.000
<v Speaker 4>Don't imagine that in like fifteen years having the same

0:34:29.120 --> 0:34:32.680
<v Speaker 4>novelty as it does, or five years, ten years.

0:34:32.880 --> 0:34:35.200
<v Speaker 2>If you take away all the headlines, if you take

0:34:35.239 --> 0:34:37.400
<v Speaker 2>away all of the money and you actually look at

0:34:37.400 --> 0:34:40.960
<v Speaker 2>what's there, it is everything. What you're describing is probably

0:34:40.960 --> 0:34:44.120
<v Speaker 2>the most useful thing. It's like, do I know this, Okay,

0:34:44.120 --> 0:34:46.279
<v Speaker 2>this seems plausible. I'm gonna double check it. It's kind

0:34:46.280 --> 0:34:48.040
<v Speaker 2>of like what Encarta could have been. I don't

0:34:48.040 --> 0:34:51.240
<v Speaker 2>even mean that sarcastically. I as a very cool child,

0:34:51.360 --> 0:34:53.040
<v Speaker 2>I was very cool, would sit in Encarta for

0:34:53.120 --> 0:34:54.160
<v Speaker 2>hours reading stuff.

0:34:53.920 --> 0:34:55.560
<v Speaker 3>Because it was kind of good. Why you have access

0:34:55.600 --> 0:34:56.000
<v Speaker 3>to everything?

0:34:56.080 --> 0:34:57.959
<v Speaker 2>And I think human beings are curious, so of course

0:34:57.960 --> 0:35:00.640
<v Speaker 2>they're gonna talk to it if it talks back. I

0:35:00.760 --> 0:35:03.200
<v Speaker 2>just think right now, there is no business model. That's

0:35:03.200 --> 0:35:05.480
<v Speaker 2>the biggest one. The biggest one is there really is

0:35:05.520 --> 0:35:08.600
<v Speaker 2>no business model. Ads do not work, ads are not going.

0:35:08.800 --> 0:35:11.200
<v Speaker 2>How do you, how do you inject ads.

0:35:10.960 --> 0:35:12.600
<v Speaker 3>Within an LLM. Look at what happened with

0:35:12.640 --> 0:35:13.760
<v Speaker 3>Grok.

0:35:13.800 --> 0:35:16.239
<v Speaker 2>Happened because they tried to make it, like, let's

0:35:16.360 --> 0:35:19.239
<v Speaker 2>just crank up this racism dial, and like,

0:35:19.320 --> 0:35:21.160
<v Speaker 2>how do we mess with this system prompt. But the

0:35:21.200 --> 0:35:25.279
<v Speaker 2>thing is those subtle changes for even advertising will be bad.

0:35:25.320 --> 0:35:27.440
<v Speaker 2>Perplexity's been talking about their ads for a fucking year.

0:35:27.480 --> 0:35:30.920
<v Speaker 2>Not heard much of that, Aravind, and it's like, on

0:35:30.960 --> 0:35:34.080
<v Speaker 2>some level, regardless of how useful it is, the economics

0:35:34.120 --> 0:35:38.000
<v Speaker 2>do not make sense. They're nothing like an Uber. Uber, yes,

0:35:38.080 --> 0:35:40.279
<v Speaker 2>you well know Uber was a complete fucking and it

0:35:40.400 --> 0:35:42.799
<v Speaker 2>still barely makes sense. They're raising another one point two

0:35:42.840 --> 0:35:45.560
<v Speaker 2>billion dollars, but you can at least tell someone exactly

0:35:45.600 --> 0:35:48.719
<v Speaker 2>what uber is and why you'd use it and it's

0:35:48.840 --> 0:35:52.480
<v Speaker 2>just kind of chugging along through necessity. I don't know

0:35:52.520 --> 0:35:57.719
<v Speaker 2>how necessary ChatGPT is, or a large language model, or

0:35:57.840 --> 0:36:01.960
<v Speaker 2>Google Gemini. And Google is claiming they're doing efficiency stuff

0:36:02.160 --> 0:36:04.920
<v Speaker 2>that cuts costs. I think it will be heavily rate limited.

0:36:05.120 --> 0:36:08.760
<v Speaker 1>I'm certain you are all right about the business models

0:36:08.800 --> 0:36:11.759
<v Speaker 1>and the viability of this from a business standpoint. You

0:36:11.800 --> 0:36:13.640
<v Speaker 1>know so much more than I do. I'm learning and

0:36:13.640 --> 0:36:17.719
<v Speaker 1>it's fascinating as a user of it who is not

0:36:17.960 --> 0:36:21.040
<v Speaker 1>on the inside of the business. My prediction is

0:36:21.080 --> 0:36:22.360
<v Speaker 1>you're dead wrong.

0:36:22.239 --> 0:36:22.560
<v Speaker 4>No, Brian.

0:36:22.719 --> 0:36:26.360
<v Speaker 1>It's going to become the dominant thing in most

0:36:26.360 --> 0:36:29.600
<v Speaker 1>of society in lots of ways. I think people

0:36:29.600 --> 0:36:31.120
<v Speaker 1>are going to use it and it's going to be

0:36:31.200 --> 0:36:34.680
<v Speaker 1>part of them, like William Gibson predicted a really long

0:36:34.719 --> 0:36:37.239
<v Speaker 1>time ago. Like I think that it's fine. I

0:36:37.239 --> 0:36:39.480
<v Speaker 1>think it's bad at creative things, the thing people think

0:36:39.520 --> 0:36:41.120
<v Speaker 1>it's good at. I don't. It's bad at that. I

0:36:41.160 --> 0:36:43.360
<v Speaker 1>don't think it can write a story yet. None of

0:36:43.360 --> 0:36:45.640
<v Speaker 1>them can convincingly write a story like drugs are good,

0:36:46.239 --> 0:36:48.919
<v Speaker 1>Like there are lots of that stuff that's not there yet.

0:36:49.000 --> 0:36:51.200
<v Speaker 1>I have no idea. Again, you all know way better

0:36:51.239 --> 0:36:53.480
<v Speaker 1>than I did the science behind it, but you're asking

0:36:53.480 --> 0:36:58.439
<v Speaker 1>about working out before. You could, there are ninety five

0:36:58.480 --> 0:37:01.399
<v Speaker 1>percent of people who are trainers can't do as good

0:37:01.400 --> 0:37:04.320
<v Speaker 1>a job of programming it, and you could play different AIs

0:37:04.320 --> 0:37:06.520
<v Speaker 1>against each other, asking questions.

0:37:06.840 --> 0:37:09.360
<v Speaker 5>Also, ninety percent of them are influencers with no

0:37:09.480 --> 0:37:10.240
<v Speaker 5>serious backgrounds.

0:37:10.280 --> 0:37:11.680
<v Speaker 1>But I'm saying, even if you talk to scientists, but

0:37:12.280 --> 0:37:15.239
<v Speaker 1>because you can show it programming, it can track it,

0:37:15.239 --> 0:37:17.040
<v Speaker 1>it can adjust now. You may, you may, and I'm sure

0:37:17.040 --> 0:37:20.839
<v Speaker 1>you're correct. That's not the AI doing it. It's other

0:37:20.960 --> 0:37:23.000
<v Speaker 1>people could have done. But the way AI can

0:37:23.080 --> 0:37:29.239
<v Speaker 1>program and interact with you and allow you to catch up.

0:37:29.520 --> 0:37:31.319
<v Speaker 1>If someone asked me all day long, people are asking me,

0:37:31.719 --> 0:37:35.520
<v Speaker 1>like, the question about programming. Well, I can't program for you.

0:37:35.680 --> 0:37:39.040
<v Speaker 1>I don't know you well enough. But if we talk

0:37:39.160 --> 0:37:42.000
<v Speaker 1>generally about what your goals are, I could definitely talk

0:37:42.080 --> 0:37:46.520
<v Speaker 1>to ChatGPT or Claude and build something that you

0:37:46.520 --> 0:37:48.320
<v Speaker 1>could then iterate on.

0:37:47.719 --> 0:37:50.279
<v Speaker 2>A search engine. You're describing the iterations of search.

0:37:50.480 --> 0:37:52.920
<v Speaker 1>Yeah, but it's packaged in this way. Now go ahead.

0:37:53.080 --> 0:37:55.560
<v Speaker 5>The thing the three of us are going to tell

0:37:55.600 --> 0:37:57.200
<v Speaker 5>you is at different levels. I think it's coming to

0:37:57.200 --> 0:38:00.000
<v Speaker 5>you at the business model, the very, like, micro, right,

0:38:00.239 --> 0:38:02.319
<v Speaker 5>and sure, and yeah, that's the way kind of how

0:38:02.320 --> 0:38:04.279
<v Speaker 5>it's got to play out financially. And Drucker's coming in

0:38:04.320 --> 0:38:06.640
<v Speaker 5>with the medium level of the use case and everything,

0:38:06.640 --> 0:38:08.280
<v Speaker 5>and I'm going to tell you that at the top level,

0:38:08.360 --> 0:38:10.759
<v Speaker 5>I draw another parallel for you, which is two things

0:38:10.840 --> 0:38:15.000
<v Speaker 5>come to mind. One is how Bitcoin and crypto, everyone

0:38:15.040 --> 0:38:17.480
<v Speaker 5>found it very exciting, the novelty factor, and it ran,

0:38:17.560 --> 0:38:20.239
<v Speaker 5>and everyone wanted to make NFTs and crypto a thing.

0:38:20.360 --> 0:38:22.560
<v Speaker 5>And then now is kind of dialed back down to

0:38:22.640 --> 0:38:24.920
<v Speaker 5>a less fever pitch and more of a regular body

0:38:24.920 --> 0:38:28.279
<v Speaker 5>temperature pitch. The macro metaphorical level is what I'm going

0:38:28.320 --> 0:38:30.360
<v Speaker 5>to say. I think AI will dial back down to

0:38:30.440 --> 0:38:34.839
<v Speaker 5>that normal regulatory sort of body temperature. And then

0:38:34.960 --> 0:38:39.359
<v Speaker 5>to draw another parallel, Tinder, when everyone was making their app

0:38:39.400 --> 0:38:41.880
<v Speaker 5>the Tinder of this, the Tinder of real estate. And what we

0:38:41.880 --> 0:38:44.319
<v Speaker 5>were describing is like an interface that works really well

0:38:44.400 --> 0:38:47.800
<v Speaker 5>for something, like a use case. The ChatGPT model

0:38:48.200 --> 0:38:50.239
<v Speaker 5>is an interface that works really well for question and

0:38:50.320 --> 0:38:53.160
<v Speaker 5>answer, seeking help, assisting you with things. That might

0:38:53.200 --> 0:38:56.000
<v Speaker 5>never go away, that might just get built into every app.

0:38:56.040 --> 0:38:59.759
<v Speaker 5>As access to LLMS becomes easier for developers, they'll build

0:38:59.800 --> 0:39:01.719
<v Speaker 5>it in to the Bank of America chat bot, they'll build

0:39:01.760 --> 0:39:03.680
<v Speaker 5>it into everything, and so I think that's where it

0:39:03.719 --> 0:39:06.920
<v Speaker 5>balances out eventually over time, maybe through rate limits. I

0:39:06.960 --> 0:39:08.400
<v Speaker 5>don't know that that's going to be the way. I

0:39:08.400 --> 0:39:10.040
<v Speaker 5>think some consolidation might happen.

0:39:10.160 --> 0:39:12.360
<v Speaker 3>You have rate limits on actually accessing your API?

0:39:12.400 --> 0:39:14.319
<v Speaker 5>The question back and forth for the people who are

0:39:14.400 --> 0:39:16.759
<v Speaker 5>using ChatGPT, right, or yeah. I think that will

0:39:16.760 --> 0:39:19.200
<v Speaker 5>come too, But I think eventually we're talking like five

0:39:19.239 --> 0:39:21.239
<v Speaker 5>years with Drucker's. I think with the rate limits, maybe

0:39:21.239 --> 0:39:23.120
<v Speaker 5>in the short term, but I think even longer term

0:39:23.160 --> 0:39:27.520
<v Speaker 5>than that, we're seeing that might eventually go away. Those

0:39:27.560 --> 0:39:29.320
<v Speaker 5>apps may not even exist stand alone.

0:39:29.440 --> 0:39:31.880
<v Speaker 2>Yeah, and I think I actually don't know if I

0:39:31.920 --> 0:39:34.400
<v Speaker 2>fully disagree with you about everything you're saying. It's just

0:39:34.480 --> 0:39:36.880
<v Speaker 2>the scale of what we're talking about might be different.

0:39:37.120 --> 0:39:40.040
<v Speaker 2>Everyone having a large language model to access. Just saying

0:39:40.040 --> 0:39:43.000
<v Speaker 2>that this is the future we're talking about doesn't really

0:39:43.280 --> 0:39:46.680
<v Speaker 2>change that much. I don't like the economics effect. I

0:39:46.719 --> 0:39:48.880
<v Speaker 2>know I'm going to do the business bullshit thing I do,

0:39:49.000 --> 0:39:53.719
<v Speaker 2>but the economic effects are quite limited right now.

0:39:53.840 --> 0:39:54.040
<v Speaker 3>Now.

0:39:54.080 --> 0:39:55.959
<v Speaker 2>If you're saying everyone will have a large language model

0:39:56.000 --> 0:39:58.240
<v Speaker 2>will be hamstrung in some way or what have you. Fine,

0:39:58.280 --> 0:40:00.160
<v Speaker 2>I can buy that for good or for bad. I

0:40:00.160 --> 0:40:02.920
<v Speaker 2>could see that happening. I just don't think that it

0:40:03.080 --> 0:40:06.000
<v Speaker 2>goes much further than what we see today. And I

0:40:06.000 --> 0:40:10.280
<v Speaker 2>think what you're describing is what Google Search should have become.

0:40:10.760 --> 0:40:13.200
<v Speaker 2>And like that was what I remember when Bard came out.

0:40:13.200 --> 0:40:14.359
<v Speaker 3>I wrote about this and was kind of.

0:40:14.280 --> 0:40:17.080
<v Speaker 2>Like, surely what chat GPT is is what search was

0:40:17.120 --> 0:40:17.480
<v Speaker 2>meant to.

0:40:17.400 --> 0:40:19.879
<v Speaker 1>Be, right as the Coen Brothers said, you know, sure

0:40:19.880 --> 0:40:24.320
<v Speaker 1>if a frog, you know, had wings,

0:40:24.520 --> 0:40:26.800
<v Speaker 1>it wouldn't bump its ass hopping, you know

0:40:26.800 --> 0:40:27.200
<v Speaker 1>what I mean?

0:40:28.960 --> 0:40:31.799
<v Speaker 4>And so and I'll say that, you know, I think

0:40:31.840 --> 0:40:35.120
<v Speaker 4>that even as it's completely absorbed into our culture, we're

0:40:35.120 --> 0:40:37.279
<v Speaker 4>still going to be on the phone listening to an

0:40:37.320 --> 0:40:41.520
<v Speaker 4>AI list medical options, yeah, human, human operator, human, Like

0:40:41.560 --> 0:40:42.400
<v Speaker 4>I don't think that's.

0:40:42.239 --> 0:40:44.719
<v Speaker 2>Going away green seven seven one right, yeah, Like I

0:40:44.719 --> 0:40:45.160
<v Speaker 2>don't think.

0:40:45.480 --> 0:40:47.520
<v Speaker 4>I do think that people are going to be like, yeah, okay,

0:40:47.600 --> 0:40:49.600
<v Speaker 4>I'll talk you through my problems until I hit this

0:40:49.640 --> 0:40:51.479
<v Speaker 4>part with my insurance. And I just want a human

0:40:51.560 --> 0:40:54.320
<v Speaker 4>right right right now. And I don't think that's going.

0:40:54.120 --> 0:40:55.120
<v Speaker 3>Away, No, not at all.

0:40:55.200 --> 0:40:58.680
<v Speaker 2>I actually think we're really more on the same page. Yeah,

0:40:58.920 --> 0:41:01.880
<v Speaker 2>it seems, because it's like, my whole thing is what

0:41:01.920 --> 0:41:04.080
<v Speaker 2>you're describing is the use case. I think there are

0:41:04.080 --> 0:41:05.839
<v Speaker 2>real harms, but I think we kind of agree where

0:41:05.840 --> 0:41:08.480
<v Speaker 2>the dangers would be. My thing is is that people

0:41:08.520 --> 0:41:12.279
<v Speaker 2>are extrapolating from that to this insane level, like this

0:41:12.360 --> 0:41:16.320
<v Speaker 2>whole they keep talking about agents everywhere. You've got Matthew McConaughey.

0:41:15.800 --> 0:41:17.760
<v Speaker 5>But an agent is kind of what I'm describing,

0:41:17.760 --> 0:41:20.640
<v Speaker 5>which is like every every service has its own chat

0:41:20.680 --> 0:41:22.400
<v Speaker 5>bot more or less. They're just using different.

0:41:22.160 --> 0:41:24.399
<v Speaker 2>They're just using a different Yeah, And I mean that's

0:41:24.480 --> 0:41:26.480
<v Speaker 2>kind of what like Bank of America already has a

0:41:26.560 --> 0:41:27.800
<v Speaker 2>chat bot and it does it.

0:41:27.640 --> 0:41:28.279
<v Speaker 3>Does not work.

0:41:28.440 --> 0:41:31.080
<v Speaker 2>The worst is when you're trying to search for

0:41:31.080 --> 0:41:31.680
<v Speaker 2>a transaction.

0:41:31.880 --> 0:41:34.560
<v Speaker 1>He's talking about for sure, the human of course we

0:41:34.640 --> 0:41:38.600
<v Speaker 1>all do that. I guess I think it'll fool us.

0:41:38.920 --> 0:41:40.120
<v Speaker 4>That's fair, that's fair.

0:41:40.239 --> 0:41:42.799
<v Speaker 1>I don't think, right? It's already fooled a lot of people.

0:41:42.600 --> 0:41:45.320
<v Speaker 4>Very well. It's fooled people into killing themselves.

0:41:45.360 --> 0:41:47.720
<v Speaker 1>Oh god, well, that's that whole Character AI thing.

0:41:47.920 --> 0:41:50.879
<v Speaker 3>Of course Google paid two billion dollars for them.

0:41:51.080 --> 0:41:52.960
<v Speaker 1>I'm not making an argument that this is a

0:41:52.960 --> 0:41:58.160
<v Speaker 1>beneficent thing in any way, shape or form. Like

0:41:58.200 --> 0:41:59.879
<v Speaker 1>I said, I've been reading about this for so long,

0:42:00.200 --> 0:42:03.319
<v Speaker 1>but I do think that to just sort of That's

0:42:03.360 --> 0:42:05.960
<v Speaker 1>why I brought up the horse and buggy because no,

0:42:06.160 --> 0:42:09.520
<v Speaker 1>the people who wanted the horses right, they were right

0:42:09.560 --> 0:42:11.640
<v Speaker 1>about a lot of stuff about the harms that would do,

0:42:11.920 --> 0:42:16.160
<v Speaker 1>the pollution, the noise, the way it would take

0:42:16.200 --> 0:42:19.200
<v Speaker 1>us away from our communities, like they were right about

0:42:19.239 --> 0:42:22.359
<v Speaker 1>so much. What they were wrong about was the inevitable

0:42:22.440 --> 0:42:23.919
<v Speaker 1>march of the future in time.

0:42:24.160 --> 0:42:26.080
<v Speaker 5>That's I think the same. Before coming to this podcast,

0:42:26.080 --> 0:42:27.839
<v Speaker 5>I was reading, Mike, your piece, and I was like, oh, yeah,

0:42:28.120 --> 0:42:30.880
<v Speaker 5>are we like in the Industrial Revolution forgetting about the

0:42:30.880 --> 0:42:33.480
<v Speaker 5>agricultural revolution? Forgetting all the revolutions.

0:42:33.000 --> 0:42:35.640
<v Speaker 1>That came before. Because I've seen it all from when

0:42:35.680 --> 0:42:39.360
<v Speaker 1>I was, I remember when AOL showed up, and CompuServe,

0:42:39.400 --> 0:42:40.920
<v Speaker 1>and I remember, I was invited on a message

0:42:40.920 --> 0:42:42.920
<v Speaker 1>board when I was fourteen. I'm fifty nine, like this

0:42:43.000 --> 0:42:44.799
<v Speaker 1>guy I knew had a message board in New York

0:42:44.840 --> 0:42:47.640
<v Speaker 1>and had to dial up. And I mean, so I've

0:42:47.680 --> 0:42:49.960
<v Speaker 1>seen this. So I'm an early adopter of stuff, even

0:42:50.000 --> 0:42:52.279
<v Speaker 1>though I'm an old dude, I'm but not from a

0:42:52.360 --> 0:42:53.320
<v Speaker 1>tech side, from.

0:42:53.120 --> 0:42:53.839
<v Speaker 3>A user side.

0:42:55.560 --> 0:42:58.319
<v Speaker 1>Simply. This is just as a, just as a

0:42:58.400 --> 0:43:01.279
<v Speaker 1>user, meaning I don't know how it works or why, but

0:43:01.360 --> 0:43:03.440
<v Speaker 1>I can explain to you why people were so fascinated

0:43:03.600 --> 0:43:05.680
<v Speaker 1>by NFTs. I was on, you could find the old

0:43:05.680 --> 0:43:11.440
<v Speaker 1>tweets calling people out, you'd buy one? Are you fucking, I

0:43:11.440 --> 0:43:16.000
<v Speaker 1>don't know. Studying people is like literally like why I

0:43:16.000 --> 0:43:17.919
<v Speaker 1>started doing what I do. But like, no, of course

0:43:17.920 --> 0:43:20.400
<v Speaker 1>I recognize that as a con from moment one.

0:43:20.640 --> 0:43:21.759
<v Speaker 5>But this doesn't seem like a con.

0:43:22.080 --> 0:43:25.239
<v Speaker 1>But even when you say the bitcoin things today, yeah,

0:43:25.239 --> 0:43:30.560
<v Speaker 1>I know, so Bitcoin won. NFTs were always huckster devices

0:43:30.600 --> 0:43:34.359
<v Speaker 1>to separate suckers from their money. And look, Thorstein Veblen said,

0:43:34.400 --> 0:43:37.200
<v Speaker 1>and David Mamet quotes it all the time. Every profession

0:43:37.640 --> 0:43:41.560
<v Speaker 1>is a conspiracy against the laity. Every profession fucks over

0:43:41.719 --> 0:43:48.080
<v Speaker 1>the regular. Yes, that's the and there's no doubt about Yeah,

0:43:48.120 --> 0:43:49.600
<v Speaker 1>Mamet was ahead of that by twenty years. But you

0:43:49.640 --> 0:43:53.120
<v Speaker 1>gotta, you gotta look at it, and not me, Mamet

0:43:53.280 --> 0:43:58.160
<v Speaker 1>and then me, but but so uh, you got to

0:43:58.239 --> 0:44:02.000
<v Speaker 1>just understand that it is very active. Sure, people, I'm just.

0:44:01.960 --> 0:44:06.120
<v Speaker 2>Trying to bridge from what you're describing to the automobile,

0:44:06.160 --> 0:44:09.120
<v Speaker 2>which changed everything. I don't think large language models. I

0:44:09.120 --> 0:44:11.400
<v Speaker 2>think they're going to create harms. I think they are

0:44:11.400 --> 0:44:14.400
<v Speaker 2>going to be things that change. But it's like what

0:44:15.080 --> 0:44:17.800
<v Speaker 2>happens now, because what we have right now is basically

0:44:17.880 --> 0:44:20.000
<v Speaker 2>what we've had for two years. If you want to

0:44:20.040 --> 0:44:22.680
<v Speaker 2>email me about reasoning, please do, you all email as

0:44:22.760 --> 0:44:26.560
<v Speaker 2>much as you want. But the thing is you look

0:44:26.600 --> 0:44:28.719
<v Speaker 2>at this and people are going, okay, and then this

0:44:28.760 --> 0:44:32.160
<v Speaker 2>will happen. It's like that thing with an LLM

0:44:32.200 --> 0:44:34.560
<v Speaker 2>that goes and does something really basic, an LLM

0:44:34.560 --> 0:44:36.640
<v Speaker 2>that you tell to go and do a thing online.

0:44:37.040 --> 0:44:40.239
<v Speaker 2>They are bad, bad at it. There was a Salesforce

0:44:40.280 --> 0:44:43.840
<v Speaker 2>study that says like thirty five percent, they, like, it

0:44:43.920 --> 0:44:46.279
<v Speaker 2>was like thirty something percent they fail, or that

0:44:46.400 --> 0:44:48.040
<v Speaker 2>was only the ones they completed.

0:44:48.080 --> 0:44:49.520
<v Speaker 1>I don't think I've ever asked an AI to do

0:44:49.560 --> 0:44:51.480
<v Speaker 1>a task for me. That's what I'm saying. I've never asked

0:44:51.480 --> 0:44:53.200
<v Speaker 1>it to do it. Do one of those agent kind

0:44:53.200 --> 0:44:55.600
<v Speaker 1>of functions. I wouldn't agree with you. I wouldn't well,

0:44:55.680 --> 0:44:57.359
<v Speaker 1>I don't think it's there yet. I mean, for again,

0:44:57.400 --> 0:45:01.200
<v Speaker 1>as a user. But research, I've had good research.

0:45:00.880 --> 0:45:04.319
<v Speaker 5>Done, really well. I mean, AI and gen AI

0:45:04.400 --> 0:45:06.120
<v Speaker 5>has done a lot of good stuff in the medical

0:45:06.160 --> 0:45:08.479
<v Speaker 5>fields too, right, CRISPR and all of that stuff. There's

0:45:08.480 --> 0:45:11.600
<v Speaker 5>been a lot of discoveries about what sort of mutations

0:45:11.600 --> 0:45:13.800
<v Speaker 5>you find in certain types of cancer that like, I

0:45:13.800 --> 0:45:14.960
<v Speaker 5>don't think that gets done.

0:45:15.239 --> 0:45:18.120
<v Speaker 1>If I want to study an industry to consider writing about it,

0:45:18.800 --> 0:45:20.440
<v Speaker 1>I can ask, you have to ask good questions. I mean,

0:45:20.440 --> 0:45:22.640
<v Speaker 1>it's like in anything else, right, I can ask really

0:45:22.680 --> 0:45:25.920
<v Speaker 1>good questions of an AI and send it, you know,

0:45:26.280 --> 0:45:28.319
<v Speaker 1>for the two hundred dollars a month model, the one where

0:45:28.320 --> 0:45:31.560
<v Speaker 1>they'll do that research thing. And if I ask it

0:45:31.600 --> 0:45:34.920
<v Speaker 1>to do research and then you can, it'll like it

0:45:34.960 --> 0:45:36.960
<v Speaker 1>will come back and maybe you have to send it

0:45:37.000 --> 0:45:40.719
<v Speaker 1>back three times, but you have the speed and accuracy. Yeah,

0:45:40.760 --> 0:45:45.840
<v Speaker 1>but you can verify a lot of it very quickly because

0:45:45.840 --> 0:45:48.680
<v Speaker 1>you can. Like the book list thing, I can't fathom

0:45:48.719 --> 0:45:51.560
<v Speaker 1>being that irresponsible, like those idiots in the paper,

0:45:52.600 --> 0:45:54.279
<v Speaker 1>but you literally, all you have to say to it

0:45:54.320 --> 0:45:57.200
<v Speaker 1>is when it presents any kind of list, all you

0:45:57.239 --> 0:45:59.200
<v Speaker 1>say to it is go verify that list. Please.

0:45:59.280 --> 0:46:00.520
<v Speaker 3>How do you know what's right?

0:46:01.120 --> 0:46:04.920
<v Speaker 5>Or verify it myself? I mean I would at some point, but you.

0:46:04.800 --> 0:46:07.279
<v Speaker 1>Go verify the list. It immediately goes, You're right, I

0:46:07.320 --> 0:46:11.919
<v Speaker 1>was hallucinating these three books don't exist. That happens. Then

0:46:12.120 --> 0:46:14.319
<v Speaker 1>you go do it one more time and make sure

0:46:14.320 --> 0:46:16.879
<v Speaker 1>that these titles are available in these stores. Then they'll

0:46:16.920 --> 0:46:18.719
<v Speaker 1>give you links and then you can go look at

0:46:19.360 --> 0:46:20.120
<v Speaker 1>something that you're doing.

0:46:20.280 --> 0:46:22.319
<v Speaker 4>And I think where we're splitting here though, is you're

0:46:22.680 --> 0:46:25.520
<v Speaker 4>both of you are extraordinarily intelligent people who have done

0:46:25.520 --> 0:46:27.520
<v Speaker 4>a lot of research in your life, so you know

0:46:27.600 --> 0:46:29.160
<v Speaker 4>how to do those steps. You know how to be like,

0:46:29.200 --> 0:46:30.720
<v Speaker 4>I need to call this up, I need to search

0:46:30.760 --> 0:46:33.400
<v Speaker 4>this make sure. I do think that one of the problems,

0:46:33.400 --> 0:46:36.120
<v Speaker 4>again coming from like the low level consumer level, is

0:46:36.640 --> 0:46:39.760
<v Speaker 4>it's often being marketed as an impartial referee.

0:46:40.200 --> 0:46:42.680
<v Speaker 1>You're totally right, You're totally right. It's not that you

0:46:42.719 --> 0:46:44.359
<v Speaker 1>cannot you'll be fucked so bad.

0:46:45.200 --> 0:46:47.200
<v Speaker 2>But most people don't interact with it like you do,

0:46:47.320 --> 0:46:48.000
<v Speaker 2>is what I'm saying.

0:46:48.080 --> 0:46:49.600
<v Speaker 1>Well, but but they can.

0:46:49.640 --> 0:46:51.520
<v Speaker 4>They can, and I agree with you, they can. I

0:46:51.520 --> 0:46:54.040
<v Speaker 4>think it's almost but it's it's.

0:46:54.400 --> 0:46:56.080
<v Speaker 2>It doesn't lead them to do that. It's not like

0:46:56.120 --> 0:46:57.440
<v Speaker 2>they have things that guide them.

0:46:57.640 --> 0:46:57.839
<v Speaker 3>Right.

0:46:58.280 --> 0:47:00.239
<v Speaker 1>This is really interesting to hear, what you're saying about this.

0:47:00.440 --> 0:47:02.120
<v Speaker 4>Yeah, well, I mean I think that you know, when

0:47:02.160 --> 0:47:04.759
<v Speaker 4>you see people and we all saw the Grok Nazi thing,

0:47:05.040 --> 0:47:07.640
<v Speaker 4>but if you see people on Twitter, they don't use

0:47:07.800 --> 0:47:10.759
<v Speaker 4>they, I would say, fewer use it to do Nazi

0:47:10.880 --> 0:47:13.360
<v Speaker 4>shit as much as they go, hey Grok, is

0:47:13.400 --> 0:47:13.840
<v Speaker 4>this true?

0:47:13.960 --> 0:47:15.120
<v Speaker 3>Is this true? Is this true?

0:47:15.480 --> 0:47:18.200
<v Speaker 4>And depending on whose fingers are on which scale, the answer

0:47:18.239 --> 0:47:19.120
<v Speaker 4>is different each time.

0:47:19.600 --> 0:47:22.640
<v Speaker 1>But I think I understand why. I mean I did.

0:47:22.719 --> 0:47:26.880
<v Speaker 1>I haven't been on Twitter since, whatever that date is,

0:47:27.719 --> 0:47:30.800
<v Speaker 1>so I don't know, but I would never. Of course,

0:47:30.840 --> 0:47:34.160
<v Speaker 1>you can't just say let's go to the AI as

0:47:34.200 --> 0:47:37.640
<v Speaker 1>though that's a final arbiter and it's certainly not today.

0:47:37.920 --> 0:47:38.960
<v Speaker 3>But it's being positioned.

0:47:39.800 --> 0:47:41.920
<v Speaker 4>I think that's my problem is it's being positioned that way.

0:47:41.960 --> 0:47:43.880
<v Speaker 4>And I think you're absolutely right. I think you're absolutely

0:47:43.960 --> 0:47:46.279
<v Speaker 4>right how useful it is, especially if you have the

0:47:46.280 --> 0:47:47.920
<v Speaker 4>skill to use it. I think the problem is it

0:47:47.960 --> 0:47:50.920
<v Speaker 4>is being marketed as this is a catch all solution,

0:47:51.040 --> 0:47:54.239
<v Speaker 4>this is a panacea to your knowledge problems, and oh.

0:47:54.239 --> 0:47:54.920
<v Speaker 1>Yeah, you know what I mean.

0:47:56.960 --> 0:47:57.600
<v Speaker 3>Well, it is like.

0:47:57.560 --> 0:47:59.960
<v Speaker 1>Believing Rockstar Games that they're going to get

0:48:00.320 --> 0:48:05.120
<v Speaker 1>fucking Grand Theft Auto out, right. I mean, it's no doubt, no,

0:48:05.320 --> 0:48:07.839
<v Speaker 1>I mean, I literally looked at the actuarial table. If

0:48:07.840 --> 0:48:09.839
<v Speaker 1>you had to build one on my life expectancy versus

0:48:09.880 --> 0:48:13.360
<v Speaker 1>when the next iteration of GTA comes out, I might lose.

0:48:14.080 --> 0:48:15.640
<v Speaker 3>But here's the thing. Here's the thing.

0:48:15.640 --> 0:48:18.359
<v Speaker 2>Though, the research you're describing isn't an invalid use case.

0:48:18.400 --> 0:48:21.239
<v Speaker 2>What you're describing is how people use Google Search to

0:48:21.280 --> 0:48:23.520
<v Speaker 2>do research. They pull up a bunch of stuff, they

0:48:23.560 --> 0:48:25.160
<v Speaker 2>go through them, they look at it and they go,

0:48:25.400 --> 0:48:25.879
<v Speaker 2>is this right?

0:48:26.040 --> 0:48:26.760
<v Speaker 1>That's a slog?

0:48:27.160 --> 0:48:29.280
<v Speaker 3>It is a slog? Why that is a slog?

0:48:29.680 --> 0:48:31.600
<v Speaker 1>No, it is not. It's a pleasure. It's not. No,

0:48:31.640 --> 0:48:33.959
<v Speaker 1>if you're gonna be honest about it, it is a total pain.

0:48:34.480 --> 0:48:37.560
<v Speaker 2>I should be clear, I have actually used

0:48:37.560 --> 0:48:40.239
<v Speaker 2>these things. I've genuinely tried, because I love my

0:48:40.280 --> 0:48:43.040
<v Speaker 2>doodads and McGizmos, and I've really sat down and

0:48:43.080 --> 0:48:43.960
<v Speaker 2>been like, what am I missing?

0:48:44.239 --> 0:48:49.920
<v Speaker 1>You just became Paul McCartney. That was a Paul McCartney moment.

0:48:50.080 --> 0:48:52.560
<v Speaker 2>He has not accepted my invite for the show though. No,

0:48:53.120 --> 0:48:56.760
<v Speaker 2>I've not invited Macca. My mom would be so happy.

0:48:57.040 --> 0:48:57.719
<v Speaker 3>No, it's just.

0:48:59.360 --> 0:49:01.440
<v Speaker 2>I really feel like this keeps coming back to the

0:49:02.000 --> 0:49:04.239
<v Speaker 2>you were using it in a totally fine way. I'm

0:49:04.239 --> 0:49:05.880
<v Speaker 2>mad at the fact that everyone's like, and this is

0:49:05.880 --> 0:49:08.040
<v Speaker 2>the only thing you'll need. You can fully trust this,

0:49:08.040 --> 0:49:10.080
<v Speaker 2>this is the best thing. The information's the best. You're

0:49:10.080 --> 0:49:11.600
<v Speaker 2>looking at it and going this is a way of

0:49:11.640 --> 0:49:14.080
<v Speaker 2>digging through information and parsing stuff and having a conversation

0:49:14.080 --> 0:49:14.640
<v Speaker 2>with the information.

0:49:14.640 --> 0:49:17.800
<v Speaker 1>The lowest common denominator, in the broad society, yes, the lowest,

0:49:17.880 --> 0:49:21.440
<v Speaker 1>that's the lowest common denominator. But unfortunately, yes, you're right, our

0:49:21.560 --> 0:49:24.719
<v Speaker 1>educational systems are really fucked up. Disadvantaged people have no chance.

0:49:24.760 --> 0:49:27.680
<v Speaker 1>Mike went to a school that allowed people from

0:49:27.719 --> 0:49:29.680
<v Speaker 1>disparate areas to get in. We gotta, yeah, we gotta reform

0:49:29.719 --> 0:49:31.759
<v Speaker 1>the education system so that everyone has

0:49:31.800 --> 0:49:33.480
<v Speaker 1>a fair shot, I agree.

0:49:33.160 --> 0:49:34.799
<v Speaker 2>But there's a very basic... So let's do it. Yeah,

0:49:34.800 --> 0:49:36.560
<v Speaker 2>but there's actually a very basic thing we don't have

0:49:36.640 --> 0:49:38.719
<v Speaker 2>to wait for with that. There should be regulation that says that

0:49:38.760 --> 0:49:42.160
<v Speaker 2>these things need big fucking disclaimers that say, hey, check everything.

0:49:42.320 --> 0:49:44.840
<v Speaker 2>They won't do it because of the incentives we discussed, but

0:49:45.160 --> 0:49:47.880
<v Speaker 2>that would actually be, I think... The stealing is

0:49:47.920 --> 0:49:50.560
<v Speaker 2>also bad. I think the environmental damage is bad. But

0:49:50.600 --> 0:49:51.560
<v Speaker 2>I think we agree on that.

0:49:52.120 --> 0:49:55.239
<v Speaker 1>It's just, safeguards are always great, but then you gotta

0:49:55.239 --> 0:49:57.600
<v Speaker 1>figure out who decides what those safeguards are and who's

0:49:57.960 --> 0:49:59.839
<v Speaker 1>Like, you talk to Bill Gurley, he'll talk about regulatory creep.

0:50:00.200 --> 0:50:02.799
<v Speaker 1>So where do you want to I'm just saying, no, no, no,

0:50:02.920 --> 0:50:06.279
<v Speaker 1>I agree with you. He's a smart person and he's

0:50:06.560 --> 0:50:08.680
<v Speaker 1>you know, thoughtful about this stuff. And so who

0:50:08.719 --> 0:50:10.719
<v Speaker 1>do you want? Do you want to go... I mean, this,

0:50:10.840 --> 0:50:15.560
<v Speaker 1>you want the current government to do the guardrails? Who,

0:50:15.600 --> 0:50:16.120
<v Speaker 1>who's to do it?

0:50:16.160 --> 0:50:17.520
<v Speaker 3>But here's a very basic guardrail.

0:50:17.600 --> 0:50:20.120
<v Speaker 2>It's just a disclaimer that says everything with the generative

0:50:20.120 --> 0:50:21.319
<v Speaker 2>AI is blah blah blah, you know.

0:50:21.320 --> 0:50:22.960
<v Speaker 5>Kind of some of them have it now, but they're

0:50:23.000 --> 0:50:25.120
<v Speaker 5>all in preview right, and that's probably what they're coaching.

0:50:25.160 --> 0:50:30.680
<v Speaker 1>Again, they do say it right on there every single time, every

0:50:30.719 --> 0:50:34.760
<v Speaker 1>single time, every single time. I've checked today, actually, for sure.

0:50:35.160 --> 0:50:38.240
<v Speaker 5>Yeah, because they have been criticized. But to Brian's

0:50:38.280 --> 0:50:40.520
<v Speaker 5>larger point, there are guardrails put in place into a

0:50:40.560 --> 0:50:42.640
<v Speaker 5>lot of these. I'm most familiar with the Geminis and

0:50:42.680 --> 0:50:45.600
<v Speaker 5>the Apple intelligences and the Amazon ones of the world,

0:50:45.680 --> 0:50:49.400
<v Speaker 5>and their guardrails are around, like, CSAM, right,

0:50:49.520 --> 0:50:53.480
<v Speaker 5>child sexual abuse material, or, like, not presenting people's faces

0:50:53.600 --> 0:50:56.000
<v Speaker 5>or like trying to avoid photo realism because then you

0:50:56.080 --> 0:50:58.920
<v Speaker 5>get very deceptive, very quickly. You don't see any of

0:50:59.000 --> 0:51:00.840
<v Speaker 5>that in Grok. Maybe, it depends.

0:51:01.000 --> 0:51:04.719
<v Speaker 3>I just... Look, I just used the... Which one does?

0:51:04.719 --> 0:51:06.839
<v Speaker 1>Really it does. But I'm not gonna turn on my phone.

0:51:06.880 --> 0:51:09.600
<v Speaker 2>But I'm like, I'm just like, here's the thing. If

0:51:09.600 --> 0:51:11.960
<v Speaker 2>they're there sometimes and not others, that's also bad. Like

0:51:11.960 --> 0:51:14.600
<v Speaker 2>it's like it's it's because when you've got people who

0:51:14.680 --> 0:51:18.359
<v Speaker 2>are killing themselves, people are having... It was Miles Klee,

0:51:18.400 --> 0:51:20.120
<v Speaker 2>I think it was at Rolling Stone, wrote this piece

0:51:20.120 --> 0:51:24.120
<v Speaker 2>about people having psychotic reactions. Yeah, I agree, this... This

0:51:24.160 --> 0:51:27.000
<v Speaker 2>administration probably won't be regulating this. But the answer

0:51:27.080 --> 0:51:29.440
<v Speaker 2>being let's regulate nothing is terrifying.

0:51:29.480 --> 0:51:31.960
<v Speaker 1>Well, it's really, but it is confusing because if you

0:51:32.000 --> 0:51:37.680
<v Speaker 1>think about, let's say Facebook, right, there's no doubt that

0:51:38.000 --> 0:51:42.880
<v Speaker 1>Facebook was used in Myanmar to, uh, foment

0:51:43.080 --> 0:51:48.520
<v Speaker 1>a genocide. People were warned inside. Who knows where it

0:51:48.560 --> 0:51:52.759
<v Speaker 1>got to? It is very well documented. How could one

0:51:53.400 --> 0:51:56.399
<v Speaker 1>after the fact... Okay, we know it's really hard

0:51:56.520 --> 0:51:59.160
<v Speaker 1>to sort of figure out these use cases. And then,

0:51:59.200 --> 0:52:00.840
<v Speaker 1>should all social media have gone away, like some

0:52:00.880 --> 0:52:03.799
<v Speaker 1>people think it should? Would social media feel

0:52:03.560 --> 0:52:04.120
<v Speaker 3>Better about that.

0:52:04.200 --> 0:52:09.040
<v Speaker 2>If Andrew Bosworth, the CTO... Yeah, yeah, yeah, I mostly

0:52:09.080 --> 0:52:11.479
<v Speaker 2>say it for this, yeah, because also the...

0:52:11.600 --> 0:52:13.880
<v Speaker 1>But if I know... I'm a... Well, that's good. If

0:52:13.920 --> 0:52:16.240
<v Speaker 1>I know... He did. He did an

0:52:16.120 --> 0:52:19.279
<v Speaker 2>internal letter in twenty sixteen, twenty seventeen, where he said

0:52:19.320 --> 0:52:21.800
<v Speaker 2>that all things were justified for growth, including a terror attack,

0:52:22.200 --> 0:52:23.960
<v Speaker 2>and that's kind of how they approach everything.

0:52:24.080 --> 0:52:25.960
<v Speaker 1>No, it was monstrous. Like, when I saw that

0:52:26.000 --> 0:52:27.520
<v Speaker 1>Myanmar thing, it made me say, like, I should

0:52:27.520 --> 0:52:30.240
<v Speaker 1>never use Facebook. I mean, I think that is as

0:52:30.320 --> 0:52:31.840
<v Speaker 1>bad a thing as I've ever seen.

0:52:32.520 --> 0:52:35.000
<v Speaker 2>I mean literally, there is still an example of that.

0:52:35.320 --> 0:52:40.920
<v Speaker 2>Meta's LLM allowed children, and Jeff Horwitz reported this

0:52:40.920 --> 0:52:43.839
<v Speaker 2>in the Journal, allowed children to, like, have sexual conversations

0:52:43.840 --> 0:52:50.360
<v Speaker 2>with John Cena. Very peculiar. Like, it was, like, super clear.

0:52:53.680 --> 0:52:56.359
<v Speaker 1>You said with John Cena? That's what I said.

0:52:56.520 --> 0:53:01.640
<v Speaker 2>One, it was just John Cena... He's gone to jail

0:53:01.680 --> 0:53:02.520
<v Speaker 2>for one hundred years.

0:53:02.520 --> 0:53:04.120
<v Speaker 1>On The Bear, I was in a scene with John,

0:53:04.960 --> 0:53:07.520
<v Speaker 1>lovely guy, and he definitely wasn't doing that.

0:53:07.600 --> 0:53:10.120
<v Speaker 2>No, no, he seems like... I'm saying it was the

0:53:10.200 --> 0:53:13.239
<v Speaker 2>voice of John Cena you're allowed to have pedo conversations with.

0:53:13.480 --> 0:53:17.080
<v Speaker 2>But again it's this lack of restriction because no particular

0:53:17.120 --> 0:53:18.240
<v Speaker 2>technology is evil.

0:53:18.040 --> 0:53:19.360
<v Speaker 3>At it's it's.

0:53:34.200 --> 0:53:36.160
<v Speaker 5>I think what Brian was saying. And I don't know

0:53:36.200 --> 0:53:39.080
<v Speaker 5>if I'm misinterpreting you, but, like, there is some

0:53:39.160 --> 0:53:41.920
<v Speaker 5>fatigue at seeing these things, like all the pharmaceutical warnings

0:53:41.920 --> 0:53:43.879
<v Speaker 5>you get at the end of every commercial, every warning you're

0:53:43.880 --> 0:53:46.760
<v Speaker 5>gonna get from Gemini from now on, every Apple intelligence warning.

0:53:46.800 --> 0:53:50.040
<v Speaker 5>That's like notification summaries can be wrong sometimes, like sometimes

0:53:50.080 --> 0:53:51.919
<v Speaker 5>you see that I see them on my phone.

0:53:52.000 --> 0:53:54.640
<v Speaker 2>Are no, no, no, I believe you.

0:53:54.640 --> 0:53:58.680
<v Speaker 1>It's... I guess what I'm saying is, the reason why I

0:53:58.680 --> 0:54:03.680
<v Speaker 1>brought up Myanmar is, no, we as a society, unfortunately,

0:54:03.760 --> 0:54:06.240
<v Speaker 1>we default to the convenient, fun,

0:54:06.280 --> 0:54:06.799
<v Speaker 4>Easy way.

0:54:07.000 --> 0:54:09.759
<v Speaker 1>Yeah, that's exactly what we all should have done when Facebook...

0:54:10.160 --> 0:54:12.640
<v Speaker 1>And I'm not even blaming, exactly, like you can blame Boz.

0:54:12.640 --> 0:54:15.080
<v Speaker 1>I don't know enough to blame anybody there. We know

0:54:15.160 --> 0:54:17.759
<v Speaker 1>that the technology, that Facebook itself, was used by these

0:54:17.800 --> 0:54:21.120
<v Speaker 1>generals in Myanmar, right? Yes. And nobody took pause,

0:54:21.239 --> 0:54:25.880
<v Speaker 1>Like I think very few people left Facebook as a

0:54:25.880 --> 0:54:28.480
<v Speaker 1>result of that. I think. So I just don't know

0:54:28.520 --> 0:54:31.759
<v Speaker 1>what the fix is for these problems because we gravitate

0:54:31.960 --> 0:54:33.040
<v Speaker 1>like bees to honey.

0:54:33.239 --> 0:54:35.800
<v Speaker 5>But they're also like tools that can be used as weapons,

0:54:35.880 --> 0:54:38.560
<v Speaker 5>and it depends on the perpetrator and the person using

0:54:38.600 --> 0:54:40.839
<v Speaker 5>these right. And this is an age old question again

0:54:40.920 --> 0:54:43.919
<v Speaker 5>coming back to the industrial and agricultural revolution. This can

0:54:43.960 --> 0:54:46.120
<v Speaker 5>be just a tool for hacking a plant off of

0:54:46.120 --> 0:54:48.160
<v Speaker 5>its stalk. It can be used as a murder weapon.

0:54:48.680 --> 0:54:51.640
<v Speaker 5>Back to AI, can be very informative, very helpful for

0:54:51.640 --> 0:54:55.000
<v Speaker 5>people who need companionship. It can be used like people

0:54:55.040 --> 0:54:58.160
<v Speaker 5>will send me scam texts all the time. The technology

0:54:58.239 --> 0:55:00.440
<v Speaker 5>keeps catching up with them and filtering them out, so

0:55:00.440 --> 0:55:02.520
<v Speaker 5>they keep changing their spam. Bad actors are going to

0:55:02.600 --> 0:55:05.160
<v Speaker 5>bad act That's just the way it's going to be.

0:55:05.320 --> 0:55:08.160
<v Speaker 5>What can we do? I stopped using Facebook.

0:55:08.480 --> 0:55:10.880
<v Speaker 5>I try to educate my parents every single time they

0:55:10.960 --> 0:55:12.960
<v Speaker 5>use a ChatGPT answer with me. I'm like, Mom and Dad,

0:55:13.040 --> 0:55:15.880
<v Speaker 5>stop using that. But then they just keep using it

0:55:15.920 --> 0:55:18.319
<v Speaker 5>because they're the sort of person this is accessible to.

0:55:18.680 --> 0:55:22.480
<v Speaker 5>They will just use it for convenience, and they don't want

0:55:22.480 --> 0:55:24.560
<v Speaker 5>to do the extra work of maybe the Google search

0:55:24.600 --> 0:55:27.160
<v Speaker 5>method which you were saying, or they just want

0:55:27.200 --> 0:55:29.600
<v Speaker 5>something that's easy and they don't care if they get

0:55:29.640 --> 0:55:29.959
<v Speaker 5>it wrong.

0:55:30.000 --> 0:55:31.400
<v Speaker 1>Me. Yeah, I guess I love the idea of you

0:55:31.520 --> 0:55:34.160
<v Speaker 1>using your brains to figure out how to make these

0:55:34.200 --> 0:55:38.520
<v Speaker 1>things safer and more useful as you agitate within the industry.

0:55:38.560 --> 0:55:41.960
<v Speaker 1>I think trying to, trying to find a way for

0:55:42.080 --> 0:55:46.800
<v Speaker 1>them to disappear seems impossible. And again, like NFTs

0:55:46.840 --> 0:55:50.239
<v Speaker 1>were obviously going to disappear, but the underlying bitcoin thing

0:55:50.440 --> 0:55:53.960
<v Speaker 1>wasn't, because there's too much... The moment this election happened,

0:55:54.440 --> 0:55:57.319
<v Speaker 1>the moment this election happened, bitcoin was going to two

0:55:57.400 --> 0:55:59.720
<v Speaker 1>hundred. You knew bitcoin was going, because...

0:55:59.560 --> 0:56:02.600
<v Speaker 2>The thing about that is, we could have stopped crypto. I

0:56:02.640 --> 0:56:04.680
<v Speaker 2>was writing about it at the time, I was. And

0:56:04.719 --> 0:56:06.799
<v Speaker 2>the only way you are going to starve crypto is by

0:56:06.800 --> 0:56:10.200
<v Speaker 2>informing people about the inherently criminal aspect that underlies everything,

0:56:10.200 --> 0:56:12.000
<v Speaker 2>the fact that Tether is more than likely in the

0:56:12.040 --> 0:56:15.440
<v Speaker 2>hands of multiple different... I tried, and I failed, and

0:56:15.600 --> 0:56:18.200
<v Speaker 2>you know what, it sucks, But you try. You can

0:56:18.200 --> 0:56:20.520
<v Speaker 2>put ideas out there, you can see if people pick

0:56:20.600 --> 0:56:22.600
<v Speaker 2>them up. And I mean that's kind of what you

0:56:22.719 --> 0:56:25.360
<v Speaker 2>do there. In the case of crypto, it's such a

0:56:25.360 --> 0:56:28.240
<v Speaker 2>weird thing as well, because it's there, but it isn't.

0:56:28.400 --> 0:56:31.200
<v Speaker 2>It doesn't really do anything other than fund things or

0:56:31.239 --> 0:56:34.120
<v Speaker 2>be funded, and it just kind of exists, where it's

0:56:34.200 --> 0:56:36.080
<v Speaker 2>just, like... They kind of... They don't even... I kind

0:56:36.080 --> 0:56:37.640
<v Speaker 2>of respect the fact they don't even try and...

0:56:37.600 --> 0:56:40.759
<v Speaker 1>Be... Literally, you're talking Charlie Munger and Warren Buffett's book.

0:56:40.800 --> 0:56:42.759
<v Speaker 1>That's what they always say. I know, but that's what

0:56:42.760 --> 0:56:46.360
<v Speaker 1>they always said. Because of that. That's their whole point,

0:56:46.400 --> 0:56:48.680
<v Speaker 1>that, forget the blockchain, it doesn't really,

0:56:48.719 --> 0:56:50.520
<v Speaker 1>you know... Really? Oh yeah... do anything.

0:56:50.560 --> 0:56:52.560
<v Speaker 2>And the thing is, though there was real money in that,

0:56:52.680 --> 0:56:55.319
<v Speaker 2>which there isn't in generative AI. And I think that

0:56:55.600 --> 0:56:59.040
<v Speaker 2>what's funny is this convenience thing you're talking about may

0:56:59.080 --> 0:57:01.800
<v Speaker 2>actually be their downfall, because those five hundred million weekly

0:57:01.840 --> 0:57:04.440
<v Speaker 2>active ChatGPT users, they cost them billions of dollars,

0:57:04.640 --> 0:57:06.040
<v Speaker 2>it's probably all gonna evaporate.

0:57:06.160 --> 0:57:09.799
<v Speaker 1>No, it's fascinating, interesting, that you mean their popularity

0:57:09.880 --> 0:57:14.520
<v Speaker 1>is going to destroy, yeah, those companies. Not the underlying tech,

0:57:14.600 --> 0:57:17.480
<v Speaker 1>but the companies themselves. That's fascinating. I'm really, like,

0:57:17.520 --> 0:57:20.000
<v Speaker 1>I'm saying, you're teaching me something I had no idea

0:57:20.000 --> 0:57:20.320
<v Speaker 1>about that.

0:57:20.480 --> 0:57:22.960
<v Speaker 2>I'll explain it very simply, which is open Ai last

0:57:23.080 --> 0:57:26.040
<v Speaker 2>year spent nine billion dollars to lose five billion dollars.

0:57:26.240 --> 0:57:30.320
<v Speaker 2>Anthropic spent five point three billion dollars... No, sorry,

0:57:30.320 --> 0:57:34.480
<v Speaker 2>look, they spent like nine, seven, eight... They spent billions

0:57:34.560 --> 0:57:37.120
<v Speaker 2>to lose five point three billion dollars, of which a

0:57:37.240 --> 0:57:40.080
<v Speaker 2>chunk of that was just given to Amazon for servers.

0:57:40.600 --> 0:57:44.080
<v Speaker 2>It's very fucking weird. They lose money. Their conversion rate

0:57:44.200 --> 0:57:47.400
<v Speaker 2>on ChatGPT is awful. So of five hundred million, I

0:57:47.440 --> 0:57:50.920
<v Speaker 2>think they have fifteen point five to sixteen million paying subscribers.

0:57:51.160 --> 0:57:54.000
<v Speaker 2>They don't publish monthly active users because if they did,

0:57:54.120 --> 0:57:56.320
<v Speaker 2>you could do the math and see it's trash. And

0:57:56.440 --> 0:57:59.240
<v Speaker 2>on top of that, they just can't find a way

0:57:59.280 --> 0:58:01.280
<v Speaker 2>to make money. They lose so much money. So what's more

0:58:01.320 --> 0:58:03.800
<v Speaker 2>than likely is, yeah, these companies might die. LLMs will

0:58:03.840 --> 0:58:06.280
<v Speaker 2>hang around because there are use cases and Google has that,

0:58:06.440 --> 0:58:08.000
<v Speaker 2>Like, Jeff Dean at Google is one of the

0:58:08.080 --> 0:58:10.560
<v Speaker 2>least evil tech people. There are actually people there who

0:58:10.760 --> 0:58:12.520
<v Speaker 2>like the tech and give a shit about it, and

0:58:12.600 --> 0:58:15.920
<v Speaker 2>there's more efficiency stuff coming out of Google's models. The

0:58:16.080 --> 0:58:18.840
<v Speaker 2>thing is, what we see today, I do not believe

0:58:19.200 --> 0:58:21.560
<v Speaker 2>is sustainable. I think the most annoying scenario is going to

0:58:21.560 --> 0:58:24.400
<v Speaker 2>be that the longest-lived large language model with unlimited access

0:58:24.520 --> 0:58:26.760
<v Speaker 2>is going to be on fucking Meta because of their

0:58:27.000 --> 0:58:30.160
<v Speaker 2>unrestricted tripe that's on every fucking app. But I think

0:58:30.240 --> 0:58:32.640
<v Speaker 2>things like ChatGPT are just going to be limited.

0:58:32.840 --> 0:58:34.320
<v Speaker 2>You may still have people who use them in the

0:58:34.360 --> 0:58:34.960
<v Speaker 2>way, with Gemini.

0:58:35.000 --> 0:58:38.600
<v Speaker 1>But it's interesting what you were saying about maybe an

0:58:38.680 --> 0:58:41.400
<v Speaker 1>answer, and you think about the industrial revolution, is eventually

0:58:42.440 --> 0:58:44.960
<v Speaker 1>is that people are going to have to be trained.

0:58:45.640 --> 0:58:48.959
<v Speaker 1>Instead of firing nine thousand people, train nine thousand to

0:58:49.040 --> 0:58:52.560
<v Speaker 1>become prompt engineers. And, yeah, become prompt engineers. And the

0:58:52.640 --> 0:58:55.080
<v Speaker 1>fact that they didn't, I mean, look, no one is

0:58:55.160 --> 0:58:58.520
<v Speaker 1>less surprised than me by incentive structures making these motherfuckers

0:58:59.160 --> 0:59:05.160
<v Speaker 1>act like evil, completely non-caring, you know, monsters.

0:59:05.440 --> 0:59:08.280
<v Speaker 1>But if they can be convinced that there's a profit

0:59:08.400 --> 0:59:09.760
<v Speaker 1>motive in training people.

0:59:09.920 --> 0:59:10.920
<v Speaker 3>There's no profit.

0:59:11.400 --> 0:59:12.920
<v Speaker 1>No, but they can be convinced of it. No, but

0:59:13.280 --> 0:59:15.240
<v Speaker 1>I must be clear though, eventually... In all

0:59:15.280 --> 0:59:18.200
<v Speaker 1>these businesses there was no profit until... How

0:59:18.240 --> 0:59:19.640
<v Speaker 1>long did it take Amazon to become profitable?

0:59:19.640 --> 0:59:22.320
<v Speaker 2>Amazon took about eleven years with AWS, but AWS

0:59:22.440 --> 0:59:24.760
<v Speaker 2>was a concern that reduced costs for Amazon itself to

0:59:24.840 --> 0:59:25.160
<v Speaker 2>run there.

0:59:25.360 --> 0:59:27.000
<v Speaker 1>But in two thousand, a lot of people thought

0:59:27.000 --> 0:59:28.000
<v Speaker 1>it was never going to become.

0:59:27.840 --> 0:59:30.000
<v Speaker 3>Profitable, right, Yeah, but that's not the same.

0:59:31.080 --> 0:59:34.440
<v Speaker 2>Yeah, but the economics of compute, it's completely different economics.

0:59:34.640 --> 0:59:36.120
<v Speaker 5>So, Brian, how much would you pay to keep

0:59:36.200 --> 0:59:37.680
<v Speaker 5>using ChatGPT a month?

0:59:38.160 --> 0:59:41.760
<v Speaker 1>I'm a bad example. I'm older, and I've done well, and...

0:59:42.520 --> 0:59:44.000
<v Speaker 1>Do you know what I'm saying?

0:59:44.040 --> 0:59:44.960
<v Speaker 5>Do you pay anything right now?

0:59:45.000 --> 0:59:47.160
<v Speaker 1>I pay two hundred dollars a month.

0:59:47.200 --> 0:59:49.360
<v Speaker 5>Two hundred a month. Okay, so Google one's like two fifty.

0:59:49.720 --> 0:59:51.160
<v Speaker 3>Do you pay two hundred for.

0:59:51.240 --> 0:59:55.000
<v Speaker 1>For the research, for the supercharged research. The research on the

0:59:55.040 --> 0:59:57.280
<v Speaker 1>twenty-bucks-a-month one, though, is not the same level.

0:59:58.120 --> 0:59:59.800
<v Speaker 4>That's an interesting thing to point out, too, is that

1:00:00.000 --> 1:00:02.440
<v Speaker 4>you're getting better quality by paying for a higher tier.

1:00:02.280 --> 1:00:05.520
<v Speaker 1>I'm aware of it. Seriously, is there a quality difference if

1:00:05.560 --> 1:00:07.320
<v Speaker 1>you look at it? We could... Yeah. Oh my god,

1:00:09.080 --> 1:00:12.680
<v Speaker 1>they would tell you there isn't, at any amount. What

1:00:13.760 --> 1:00:14.240
<v Speaker 1>is the rate?

1:00:14.440 --> 1:00:16.200
<v Speaker 5>So they sell you a shit product to make you buy

1:00:16.240 --> 1:00:16.680
<v Speaker 5>the better one?

1:00:16.800 --> 1:00:18.360
<v Speaker 1>Well, I don't know if they sell a shit product, but you're

1:00:18.400 --> 1:00:21.840
<v Speaker 1>saying the base rate... Someone told me, someone I

1:00:21.880 --> 1:00:24.120
<v Speaker 1>trust a lot, a person who's tech savvy, was like,

1:00:24.720 --> 1:00:26.840
<v Speaker 1>this is when that happened. They used it for a

1:00:26.880 --> 1:00:29.840
<v Speaker 1>while and a couple of months ago, I was in

1:00:29.920 --> 1:00:31.680
<v Speaker 1>a meeting with someone and they were like, if I

1:00:31.720 --> 1:00:33.040
<v Speaker 1>was going to tell you to spend money on anything,

1:00:33.120 --> 1:00:34.320
<v Speaker 1>spend two hundred bucks.

1:00:34.280 --> 1:00:35.400
<v Speaker 3>Do you want to hear something crazy?

1:00:35.480 --> 1:00:35.600
<v Speaker 2>Though?

1:00:35.680 --> 1:00:37.960
<v Speaker 3>Yes, they lose money on every two hundred buck a

1:00:38.000 --> 1:00:38.600
<v Speaker 3>month customer.

1:00:38.760 --> 1:00:41.560
<v Speaker 1>I believe you, because I'm using it. Because when they

1:00:41.640 --> 1:00:43.800
<v Speaker 1>do that search, you're saying it's burning so much.

1:00:44.160 --> 1:00:44.800
<v Speaker 3>But that's the thing.

1:00:45.320 --> 1:00:48.560
<v Speaker 2>How likely do you think it is that that will continue? Because

1:00:48.720 --> 1:00:49.680
<v Speaker 2>the deep research that...

1:00:52.720 --> 1:00:54.680
<v Speaker 1>Well, of course there's a number very soon. I'm already

1:00:54.720 --> 1:00:54.880
<v Speaker 1>at that.

1:00:55.080 --> 1:00:56.640
<v Speaker 5>Yeah, you're at the very high end of it.

1:00:56.720 --> 1:00:59.040
<v Speaker 1>Yeah, yeah, for sure. Double this? I would, I would

1:00:59.080 --> 1:01:00.440
<v Speaker 1>not pay double that.

1:01:00.960 --> 1:01:02.480
<v Speaker 2>I would not pay... And I will be honest, and

1:01:02.560 --> 1:01:05.520
<v Speaker 2>this is not even me being, like, a hater or anything.

1:01:05.880 --> 1:01:08.720
<v Speaker 2>It may not be that cheap, because right now...

1:01:08.840 --> 1:01:10.640
<v Speaker 3>It's the big story of my newsletter, please read it.

1:01:11.920 --> 1:01:14.040
<v Speaker 2>Look, for a two hundred a month power user who's

1:01:14.040 --> 1:01:18.360
<v Speaker 2>already using it, they're losing the money. They lose so much

1:01:18.440 --> 1:01:21.120
<v Speaker 2>money on them that it's like four hundred, five hundred,

1:01:21.200 --> 1:01:24.080
<v Speaker 2>one thousand dollars a month. Claude Code right now, which

1:01:24.120 --> 1:01:27.440
<v Speaker 2>is slightly different because of the way they do context stuff... Nevertheless,

1:01:27.520 --> 1:01:31.240
<v Speaker 2>on the two hundred dollars Max user on Claude, they

1:01:31.280 --> 1:01:34.680
<v Speaker 2>could be losing... There was someone on Twitter,

1:01:34.880 --> 1:01:36.960
<v Speaker 2>they spent ten thousand dollars in compute on a two

1:01:37.040 --> 1:01:40.240
<v Speaker 2>hundred a month subscription. These are the majority of power users.

1:01:40.320 --> 1:01:43.280
<v Speaker 2>Power users go nuts, as you well know. This is

1:01:43.440 --> 1:01:45.760
<v Speaker 2>why I'm so pushy on the economics, because what we

1:01:45.840 --> 1:01:49.000
<v Speaker 2>are seeing today, it's like if every Uber weighed forty

1:01:49.120 --> 1:01:52.480
<v Speaker 2>thousand pounds and the fuel was giraffe blood. It was

1:01:52.640 --> 1:01:55.439
<v Speaker 2>just like this insane economics. And I sound like I'm kidding,

1:01:55.480 --> 1:01:59.200
<v Speaker 2>but the economics are completely bonkers. So as much use

1:01:59.200 --> 1:02:02.120
<v Speaker 2>as you're getting today, I just don't know how practically

1:02:02.160 --> 1:02:04.440
<v Speaker 2>they'll provide that. And there might be cheaper models, but

1:02:04.480 --> 1:02:06.200
<v Speaker 2>the cheaper models might not be able to

1:02:06.240 --> 1:02:08.520
<v Speaker 2>do it. You see, when you use o3... Yeah, I've got...

1:02:08.560 --> 1:02:12.000
<v Speaker 1>Yeah, yeah, So which takes a lot. Yeah, sometimes it

1:02:12.080 --> 1:02:13.200
<v Speaker 1>still takes such a long time.

1:02:13.320 --> 1:02:13.720
<v Speaker 3>That's the thing.

1:02:13.920 --> 1:02:16.000
<v Speaker 1>You can have a conversation and that's long.

1:02:16.080 --> 1:02:19.720
<v Speaker 2>How much of this is sustainable long term? And I

1:02:19.840 --> 1:02:22.080
<v Speaker 2>know the business stuff is annoying because you still have

1:02:22.160 --> 1:02:23.200
<v Speaker 2>your experience, which you like.

1:02:23.320 --> 1:02:25.720
<v Speaker 1>No, it's not annoying, it's fascinating. I'm fascinated about the

1:02:25.720 --> 1:02:26.840
<v Speaker 1>whole thing. This is just great.

1:02:27.080 --> 1:02:29.440
<v Speaker 2>It's just, the long term here, and I'm talking long

1:02:29.560 --> 1:02:32.560
<v Speaker 2>term as in eighteen months, is I don't know if we

1:02:32.680 --> 1:02:35.840
<v Speaker 2>will have deep research at any price point approaching the

1:02:35.880 --> 1:02:37.560
<v Speaker 2>one we have now in that period.

1:02:37.560 --> 1:02:39.640
<v Speaker 1>Well, the deep research is the only reason anyone should

1:02:39.640 --> 1:02:40.160
<v Speaker 1>pay the money.

1:02:40.160 --> 1:02:42.320
<v Speaker 2>And that's the thing, though, the only reason that people

1:02:42.320 --> 1:02:45.040
<v Speaker 2>should pay is the thing that is not sustainable and

1:02:45.520 --> 1:02:48.400
<v Speaker 2>has no path to profitability. So when you talk about how we

1:02:48.560 --> 1:02:51.720
<v Speaker 2>bring this towards, like, an AWS situation, AWS's lack

1:02:51.760 --> 1:02:54.600
<v Speaker 2>of profitability was built on infrastructure expansion. It was because

1:02:54.640 --> 1:02:57.400
<v Speaker 2>they were building the rails of cloud computing writ large. It

1:02:57.640 --> 1:02:59.880
<v Speaker 2>was never in the billions and billions a year with

1:03:00.120 --> 1:03:03.440
<v Speaker 2>no... There is no path to profitability here. There was

1:03:03.520 --> 1:03:05.320
<v Speaker 2>one model of open aies that they we can deliver

1:03:05.400 --> 1:03:08.080
<v Speaker 2>to Microsoft in twenty twenty three, called—

1:03:08.120 --> 1:03:10.200
<v Speaker 3>Arrakis, and.

1:03:12.320 --> 1:03:14.480
<v Speaker 2>They failed to do it. They have yet to discover

1:03:14.760 --> 1:03:17.880
<v Speaker 2>or make—because it may not be mathematically possible—

1:03:18.040 --> 1:03:20.480
<v Speaker 2>a really good large language model that can do the

1:03:20.600 --> 1:03:24.040
<v Speaker 2>kind of reasoning like that that would be reliable and

1:03:24.200 --> 1:03:25.000
<v Speaker 2>have the web search too.

1:03:25.040 --> 1:03:26.120
<v Speaker 1>And I want to say you do have to when

1:03:26.120 --> 1:03:27.840
<v Speaker 1>you talk about prompts—sorry, I was thinking about this

1:03:27.960 --> 1:03:30.080
<v Speaker 1>when I was learning about complexity theory. I read this

1:03:30.080 --> 1:03:31.960
<v Speaker 1>book by a guy named Neil Theise that I loved.

1:03:32.000 --> 1:03:34.960
<v Speaker 1>He's a professor at NYU. And I'm not good at physics,

1:03:35.320 --> 1:03:37.560
<v Speaker 1>and I really want to understand quantum physics, and so

1:03:37.720 --> 1:03:41.880
<v Speaker 1>I would ask questions, Right, do research on this and

1:03:41.960 --> 1:03:44.800
<v Speaker 1>then find a way to be able to explain it.

1:03:45.040 --> 1:03:47.760
<v Speaker 1>So go read these books, go find documentaries. And then

1:03:48.040 --> 1:03:51.880
<v Speaker 1>I just quickly realized at some point that

1:03:52.000 --> 1:03:55.120
<v Speaker 1>the AI hadn't—it was clear to me it hadn't

1:03:55.200 --> 1:03:57.960
<v Speaker 1>read something and it doesn't have access. But I could

1:03:57.960 --> 1:03:59.280
<v Speaker 1>figure out it hadn't. And I just said, like,

1:03:59.480 --> 1:04:01.360
<v Speaker 1>did you look at that video? Like it was a video?

1:04:01.400 --> 1:04:03.560
<v Speaker 1>I go, did you really look at that video that speech?

1:04:04.240 --> 1:04:06.560
<v Speaker 1>And then it immediately went no, you got me there.

1:04:06.600 --> 1:04:08.000
<v Speaker 1>I didn't look at the speech, but I'm gonna go

1:04:08.160 --> 1:04:10.920
<v Speaker 1>now find it a different way. So you do. You

1:04:11.120 --> 1:04:15.479
<v Speaker 1>do have to be vigilant, acculturated to having those kinds

1:04:15.520 --> 1:04:17.960
<v Speaker 1>of queries too. Like, you have to maybe have advanced

1:04:18.000 --> 1:04:22.240
<v Speaker 1>degrees or have trained yourself. So this is fascinating,

1:04:22.320 --> 1:04:26.200
<v Speaker 1>right because I take for granted the steps I take

1:04:26.240 --> 1:04:31.240
<v Speaker 1>to make my way through the world.

1:04:31.440 --> 1:04:32.120
<v Speaker 3>Yes, and.

1:04:34.080 --> 1:04:36.720
<v Speaker 1>Like the way that I would interrogate something like that

1:04:36.920 --> 1:04:39.000
<v Speaker 1>to get to an answer that's useful.

1:04:39.320 --> 1:04:41.640
<v Speaker 5>Yeah, as opposed to maybe my parents would be like, oh, okay,

1:04:41.640 --> 1:04:42.360
<v Speaker 5>you watched that video.

1:04:42.360 --> 1:04:46.160
<v Speaker 1>Okay, okay, and they might get wrong information from this.

1:04:46.360 --> 1:04:47.880
<v Speaker 1>I can't argue with that. You're right about it.

1:04:47.920 --> 1:04:50.400
<v Speaker 2>And I think that a lot of this grand theory

1:04:50.480 --> 1:04:53.680
<v Speaker 2>of this comes down to I think people have a

1:04:53.760 --> 1:04:55.760
<v Speaker 2>lot of questions and not a lot of people to

1:04:55.840 --> 1:04:58.400
<v Speaker 2>ask them to—questions about their life, how they're feeling.

1:04:58.520 --> 1:05:01.640
<v Speaker 2>And I think it's answered with the medical side, where

1:05:01.680 --> 1:05:04.360
<v Speaker 2>it's quite hard to ask a doctor a question

1:05:04.480 --> 1:05:07.200
<v Speaker 2>in any country, it's hard to know whether they—and

1:05:07.320 --> 1:05:10.120
<v Speaker 2>also doctors regularly make you feel annoying. I'm not talking

1:05:10.160 --> 1:05:10.840
<v Speaker 2>about my doctor.

1:05:10.880 --> 1:05:11.320
<v Speaker 3>He loves me.

1:05:12.000 --> 1:05:14.400
<v Speaker 2>But you can't go to a doctor regularly with little questions.

1:05:14.480 --> 1:05:17.600
<v Speaker 2>They don't have the time because they must maximize profits

1:05:17.760 --> 1:05:20.120
<v Speaker 2>or they are busy one of the two. Yeah, And

1:05:21.280 --> 1:05:23.760
<v Speaker 2>ultimately we are sitting there going I've got this weird

1:05:23.920 --> 1:05:26.400
<v Speaker 2>rash, or, like, my leg itched in this place

1:05:26.440 --> 1:05:27.800
<v Speaker 2>in the same day three days running.

1:05:27.840 --> 1:05:30.080
<v Speaker 3>What could that mean? And you can't really google that.

1:05:30.560 --> 1:05:33.880
<v Speaker 2>It's just you're dying but you can't really google that,

1:05:33.960 --> 1:05:36.720
<v Speaker 2>but ChatGPT or whatever can do an impressive impression of it,

1:05:37.160 --> 1:05:39.800
<v Speaker 2>or in your case, Brian and I think this is

1:05:39.880 --> 1:05:42.600
<v Speaker 2>reasonably lead you in a direction towards, like, a Mayo

1:05:42.680 --> 1:05:45.120
<v Speaker 2>Clinic article about a particular thing, something you could raise

1:05:45.160 --> 1:05:48.640
<v Speaker 2>to your doctor. That makes sense. People are lonely, people

1:05:48.800 --> 1:05:51.520
<v Speaker 2>just have weird questions. And I think that there is

1:05:51.680 --> 1:05:54.680
<v Speaker 2>partially a bad vibe where it's like everyone wants everything immediately,

1:05:54.880 --> 1:05:57.800
<v Speaker 2>we must have everything we want immediately. But also people

1:05:57.840 --> 1:06:01.200
<v Speaker 2>are curious, and we are more connected as people. We

1:06:01.320 --> 1:06:04.720
<v Speaker 2>are more decentralized as people. We don't meet people we

1:06:04.840 --> 1:06:09.760
<v Speaker 2>don't have—we are, at scale, overworked, underpaid, so

1:06:09.840 --> 1:06:11.800
<v Speaker 2>we don't have the time to be generous with our time.

1:06:12.000 --> 1:06:12.200
<v Speaker 3>Yeah.

1:06:12.280 --> 1:06:14.400
<v Speaker 1>I'll give you one other use case though, because and

1:06:14.520 --> 1:06:17.120
<v Speaker 1>it goes to the question of training. But you

1:06:17.160 --> 1:06:19.400
<v Speaker 1>could use this for—you said, your parents' dance or whatever.

1:06:19.920 --> 1:06:24.000
<v Speaker 1>So I'll tell you that if you are somebody who

1:06:24.120 --> 1:06:27.480
<v Speaker 1>does, like, deadlifting or squatting with a barbell.

1:06:27.520 --> 1:06:30.800
<v Speaker 1>As an example, if I put a thirty second clip

1:06:31.800 --> 1:06:36.480
<v Speaker 1>into ChatGPT and I say, please watch this, tell

1:06:36.560 --> 1:06:39.200
<v Speaker 1>me is this form at risk of injury? How should

1:06:39.200 --> 1:06:41.880
<v Speaker 1>I modify it? Is it good enough? What do you

1:06:41.920 --> 1:06:45.440
<v Speaker 1>think about the load of this weight? The answer it

1:06:45.480 --> 1:06:48.120
<v Speaker 1>will give is and I have checked it against like

1:06:48.200 --> 1:06:51.600
<v Speaker 1>the world. The answer it will give is outstanding. And

1:06:51.920 --> 1:06:53.720
<v Speaker 1>how would you get that in another way? I don't

1:06:53.760 --> 1:06:57.520
<v Speaker 1>think there is. And that's a small example. Well that's

1:06:57.600 --> 1:07:00.520
<v Speaker 1>different than search. There's an expert—no, that's—I'm searching

1:07:00.640 --> 1:07:02.280
<v Speaker 1>using a video. No, no, it's not.

1:07:02.400 --> 1:07:05.960
<v Speaker 5>The matching is a bit less sophisticated than the

1:07:06.240 --> 1:07:07.160
<v Speaker 5>regular Google search.

1:07:07.200 --> 1:07:07.880
<v Speaker 3>Oh I'm saying that.

1:07:08.000 --> 1:07:11.160
<v Speaker 1>No, it's—I'm merely matching it against a perceived perfect form.

1:07:11.200 --> 1:07:13.440
<v Speaker 1>It's looking at your femur. Literally, it'll go, with your

1:07:13.480 --> 1:07:16.200
<v Speaker 1>femur size, this is the kind of squat that you

1:07:16.240 --> 1:07:18.880
<v Speaker 1>could do, low bar versus high bar. Here's why,

1:07:19.000 --> 1:07:21.240
<v Speaker 1>here's what this looks like. You're over—I'm agreeing with you.

1:07:21.560 --> 1:07:24.240
<v Speaker 2>I'm just saying that search as a term has grown

1:07:24.280 --> 1:07:26.960
<v Speaker 2>to look at this image and compare it to these sources,

1:07:27.040 --> 1:07:28.600
<v Speaker 2>which theoretically is still search.

1:07:28.840 --> 1:07:34.000
<v Speaker 1>But then it gives you the language, right, the language for how

1:07:34.040 --> 1:07:34.480
<v Speaker 1>to fix it?

1:07:35.320 --> 1:07:37.640
<v Speaker 4>And I worry if you ask a bad question, like

1:07:37.720 --> 1:07:40.560
<v Speaker 4>if you, not ask a bad question, phrase a question incorrectly?

1:07:40.640 --> 1:07:41.440
<v Speaker 3>Sure, And.

1:07:43.000 --> 1:07:45.320
<v Speaker 4>I'm not saying you—if one phrases, let's say they

1:07:45.320 --> 1:07:47.200
<v Speaker 4>ask a fitness question, but they phrase it a little

1:07:47.200 --> 1:07:49.880
<v Speaker 4>bit weird, and the answer they get is harmful. It

1:07:49.960 --> 1:07:53.280
<v Speaker 4>almost—I'm worried about situations like that. And also it

1:07:53.320 --> 1:07:58.680
<v Speaker 4>feels like we're removing an element of human responsibility or accountability,

1:07:58.680 --> 1:08:01.160
<v Speaker 4>where it's like, well, the machine answered weird, rather than

1:08:01.320 --> 1:08:03.200
<v Speaker 4>like a doctor answered weird.

1:08:03.240 --> 1:08:05.560
<v Speaker 3>And I think that lots of corporations love that. They

1:08:05.600 --> 1:08:06.960
<v Speaker 3>love that because they're already doing it.

1:08:07.240 --> 1:08:10.880
<v Speaker 4>If I—if my AI accidentally denies your insurance, it—

1:08:10.960 --> 1:08:12.040
<v Speaker 3>Was the AI, that wasn't us.

1:08:12.120 --> 1:08:16.040
<v Speaker 2>It's the algorithm. It's the same algorithmic pass-it-on thing.

1:08:16.560 --> 1:08:18.760
<v Speaker 2>But I think you're right. And that is kind

1:08:18.800 --> 1:08:20.920
<v Speaker 2>of cool. I also have used o3, because I

1:08:21.000 --> 1:08:24.600
<v Speaker 2>do pay for this. I'm not a baseless hater. I

1:08:24.680 --> 1:08:27.640
<v Speaker 2>did ask, what's the distance between the bottom of this

1:08:27.760 --> 1:08:31.160
<v Speaker 2>photo frame and the floor, and it spent ten minutes

1:08:31.200 --> 1:08:33.560
<v Speaker 2>to give me a completely insanely wrong answer. They're not

1:08:33.680 --> 1:08:34.439
<v Speaker 2>good with numbers.

1:08:34.520 --> 1:08:37.160
<v Speaker 1>Oh, fascinating. The other day, in this example I just

1:08:37.200 --> 1:08:39.880
<v Speaker 1>gave you, it said, I can't. It just said, like,

1:08:39.960 --> 1:08:41.880
<v Speaker 1>I can't see, which is good. That was great.

1:08:42.080 --> 1:08:42.640
<v Speaker 3>It should do that.

1:08:43.240 --> 1:08:45.760
<v Speaker 5>That was great. Were both of you using the same—I

1:08:45.840 --> 1:08:46.599
<v Speaker 5>was using o3.

1:08:46.800 --> 1:08:52.040
<v Speaker 2>On ChatGPT Plus, though. So now I don't know

1:08:52.080 --> 1:08:54.080
<v Speaker 2>how I feel about giving Clammy Sammy two hundred.

1:08:54.000 --> 1:08:55.639
<v Speaker 1>Try Pro for a month and see what it does.

1:08:55.840 --> 1:08:57.639
<v Speaker 3>Yeah, you can write it off.

1:08:57.720 --> 1:09:00.720
<v Speaker 2>I don't want to write off, gave two hundred bucks to

1:09:00.720 --> 1:09:01.840
<v Speaker 2>Dario Amodei, here's hoping.

1:09:01.840 --> 1:09:02.960
<v Speaker 1>I would love you to do that and then call

1:09:03.040 --> 1:09:05.120
<v Speaker 1>me and say it's the same result.

1:09:05.200 --> 1:09:06.960
<v Speaker 2>No, I would love you to tell me I actually

1:09:07.040 --> 1:09:07.559
<v Speaker 2>really curious.

1:09:07.560 --> 1:09:09.120
<v Speaker 1>I would love you to say to me, dude, that's

1:09:09.360 --> 1:09:10.559
<v Speaker 1>just been twenty bucks the same.

1:09:11.240 --> 1:09:12.599
<v Speaker 5>Then you can save yourself one eighty a month.

1:09:12.680 --> 1:09:13.880
<v Speaker 1>Yeah, great news. Please.

1:09:14.120 --> 1:09:16.000
<v Speaker 5>What Mike was bringing up, I thought, was

1:09:16.040 --> 1:09:17.439
<v Speaker 5>going to be similar to the point I had been

1:09:17.520 --> 1:09:19.360
<v Speaker 5>mulling over when you were talking about the training stuff,

1:09:19.360 --> 1:09:21.560
<v Speaker 5>which is that we already deal with due to like

1:09:22.120 --> 1:09:25.400
<v Speaker 5>you know, capitalism or barriers to entry, an influx of

1:09:25.560 --> 1:09:27.920
<v Speaker 5>individuals who may not actually be fully equipped for the

1:09:28.040 --> 1:09:31.759
<v Speaker 5>jobs they purport to do. So, whether it be journalists

1:09:31.800 --> 1:09:34.840
<v Speaker 5>like myself or, like, again, fitness influencers or trainers that

1:09:34.960 --> 1:09:38.080
<v Speaker 5>say they have whatever types of physical health degrees that

1:09:38.240 --> 1:09:40.840
<v Speaker 5>are just the result of a ten hour course online

1:09:41.000 --> 1:09:43.680
<v Speaker 5>that sort of thing. We're already dealing with the like

1:09:43.880 --> 1:09:47.640
<v Speaker 5>quality dilution of like information coming from sources like that,

1:09:48.080 --> 1:09:50.880
<v Speaker 5>to throw AI into the mix is like making it

1:09:51.120 --> 1:09:53.640
<v Speaker 5>even worse, like harder than ever to tell what the

1:09:53.720 --> 1:09:55.640
<v Speaker 5>truth is. And I don't know about you all, but

1:09:55.720 --> 1:09:59.040
<v Speaker 5>I find myself gaslighting myself all the time now, regarding,

1:09:59.080 --> 1:10:00.920
<v Speaker 5>like, my own life, whether it's the truth in

1:10:01.000 --> 1:10:04.120
<v Speaker 5>the world, whether I'm being too sympathetic to multiple different perspectives.

1:10:04.520 --> 1:10:06.760
<v Speaker 5>I don't know what the truth is anymore. I can't

1:10:06.800 --> 1:10:09.679
<v Speaker 5>tell you what the cold, hard scientific truth of anything

1:10:09.800 --> 1:10:11.680
<v Speaker 5>ever is. And that's where it's led me.

1:10:11.840 --> 1:10:13.280
<v Speaker 1>You also have to be willing—I agree with that. That's

1:10:13.280 --> 1:10:15.000
<v Speaker 1>a brilliant point. I think one of the things I

1:10:15.040 --> 1:10:16.800
<v Speaker 1>would say to people if someone asked me, how do

1:10:16.920 --> 1:10:19.639
<v Speaker 1>you how should you communicate? Mm hmm, I would say,

1:10:19.640 --> 1:10:21.840
<v Speaker 1>And it's really painful for people because I've seen them

1:10:21.880 --> 1:10:25.160
<v Speaker 1>talk online about what they love about conversing with the AI.

1:10:26.040 --> 1:10:29.519
<v Speaker 1>You gotta say, click every toggle that says be mean,

1:10:30.040 --> 1:10:32.800
<v Speaker 1>tell me the truth, don't tell me I'm smart. Yeah,

1:10:32.960 --> 1:10:36.840
<v Speaker 1>Like, you've got to make it really

1:10:37.080 --> 1:10:40.759
<v Speaker 1>be withholding in that way if you want to actually

1:10:40.840 --> 1:10:43.960
<v Speaker 1>engage, so that you're not getting gaslit,

1:10:44.080 --> 1:10:46.360
<v Speaker 1>because yes, I agree that this is dangerous. The default

1:10:46.400 --> 1:10:48.920
<v Speaker 1>setting is to gaslight you. You got to actually go.

1:10:49.240 --> 1:10:51.519
<v Speaker 1>I don't need to hear that. Like, it'll glaze you.

1:10:51.640 --> 1:10:52.759
<v Speaker 3>I think the young people.

1:10:52.640 --> 1:10:57.200
<v Speaker 4>Glaze, sure. Before we run—and you didn't hear—

1:10:57.240 --> 1:10:58.800
<v Speaker 4>the producer just laughed out loud.

1:11:00.000 --> 1:11:02.120
<v Speaker 1>We talk about just one—I think they used to say

1:11:02.160 --> 1:11:06.120
<v Speaker 1>glaze—but can we just talk about one positive, purely

1:11:06.200 --> 1:11:09.200
<v Speaker 1>positive tech thing that happened this last week, July fourth?

1:11:09.880 --> 1:11:12.559
<v Speaker 1>Who's on TikTok and knows all about the antipasto party?

1:11:12.720 --> 1:11:16.840
<v Speaker 5>And was it the one that one person went to?

1:11:17.479 --> 1:11:19.439
<v Speaker 1>Yeah, okay, it's the greatest thing.

1:11:19.560 --> 1:11:19.840
<v Speaker 3>Please.

1:11:20.479 --> 1:11:20.760
<v Speaker 2>This is this?

1:11:21.000 --> 1:11:25.200
<v Speaker 1>Okay, this—they're in Texas. These people have a July

1:11:25.360 --> 1:11:28.120
<v Speaker 1>fourth party. There's a woman she's just moved there recently.

1:11:29.160 --> 1:11:32.400
<v Speaker 1>Her kid becomes friends with someone else's kid. And this woman, Sarah,

1:11:32.479 --> 1:11:35.519
<v Speaker 1>is the parent of one boy, says, come with me

1:11:35.600 --> 1:11:38.080
<v Speaker 1>over to these people's party. This woman

1:11:38.760 --> 1:11:42.880
<v Speaker 1>makes the apotheosis of all antipasto salads, the greatest salad

1:11:42.880 --> 1:11:46.200
<v Speaker 1>you've ever seen in your life, goes to their house. Yeah,

1:11:47.040 --> 1:11:49.920
<v Speaker 1>and these people are like who's this stranger in our house?

1:11:50.080 --> 1:11:52.000
<v Speaker 1>And they kick her out even though she brought this

1:11:52.120 --> 1:11:56.639
<v Speaker 1>incredible salad. She goes home and gets on TikTok and she's

1:11:56.680 --> 1:12:00.240
<v Speaker 1>crying and she's like, I brought this salad and they

1:12:00.320 --> 1:12:03.400
<v Speaker 1>kicked me out of their house. And the entire internet

1:12:03.520 --> 1:12:06.960
<v Speaker 1>found her, yeah, and loves her so much, and it's like,

1:12:07.800 --> 1:12:10.280
<v Speaker 1>it's an incredible story. Everybody I know is like, like

1:12:10.600 --> 1:12:13.040
<v Speaker 1>everybody of all ages, like nieces and nephews of mine

1:12:13.400 --> 1:12:16.599
<v Speaker 1>and then older people older than me are all sending

1:12:16.880 --> 1:12:19.200
<v Speaker 1>it, and it's an amazing thing. A huge community has

1:12:19.320 --> 1:12:23.760
<v Speaker 1>rallied to hate these people and to love her and

1:12:23.880 --> 1:12:28.120
<v Speaker 1>her homemade mozzarella and home-grown tomatoes that she brought

1:12:28.160 --> 1:12:28.559
<v Speaker 1>to their house.

1:12:28.439 --> 1:12:32.680
<v Speaker 5>Cherlynn Low, you saw something else? But no, I

1:12:32.720 --> 1:12:35.040
<v Speaker 5>saw something else altogether. It's really worth it. But, like,

1:12:35.120 --> 1:12:36.960
<v Speaker 5>Reddit does things like that, and that's the thing.

1:12:37.040 --> 1:12:39.320
<v Speaker 3>Reddit, like—this is—I like ending this on a

1:12:39.360 --> 1:12:41.280
<v Speaker 3>positive note. I love it. Reddit does that.

1:12:41.560 --> 1:12:44.120
<v Speaker 5>You can say something—well, yeah, no, community and social

1:12:44.280 --> 1:12:46.439
<v Speaker 5>forums like that, that's what the Internet is great for.

1:12:46.680 --> 1:12:48.920
<v Speaker 5>And there's not a lot of AI present in a

1:12:48.960 --> 1:12:49.760
<v Speaker 5>lot of that, and—

1:12:50.000 --> 1:12:52.800
<v Speaker 2>It's almost—Reddit especially right now has got good

1:12:52.800 --> 1:12:56.360
<v Speaker 2>because it isn't—the CEO keeps thinking of shoving

1:12:56.439 --> 1:12:57.840
<v Speaker 2>it places, but even on the.

1:12:57.840 --> 1:12:59.719
<v Speaker 3>Better Offline subreddit, nine thousand.

1:13:00.080 --> 1:13:03.519
<v Speaker 2>Now, hey guys. But it's great, because one of

1:13:03.520 --> 1:13:05.800
<v Speaker 2>my biggest stories I wrote recently was on Cursor and

1:13:06.000 --> 1:13:08.240
<v Speaker 2>them falling apart, and it was because someone on the

1:13:08.280 --> 1:13:10.719
<v Speaker 2>Reddit forum was just, like, looking through their stuff

1:13:10.800 --> 1:13:13.439
<v Speaker 2>and everyone had this full conversation about it.

1:13:13.720 --> 1:13:16.280
<v Speaker 2>You've got these people out there in this morass of

1:13:17.040 --> 1:13:20.160
<v Speaker 2>fake stuff or generated stuff or seo stuff, You've got

1:13:20.280 --> 1:13:23.479
<v Speaker 2>genuine people. There is still a joy to all of

1:13:23.560 --> 1:13:26.200
<v Speaker 2>this crap. I love Bluesky as well, but Reddit

1:13:26.320 --> 1:13:27.840
<v Speaker 2>has really just—I'm shocked.

1:13:27.880 --> 1:13:29.439
<v Speaker 1>I spend a lot of time on Reddit.

1:13:29.520 --> 1:13:30.000
<v Speaker 2>I'm shocked.

1:13:30.040 --> 1:13:33.360
<v Speaker 1>And a group—someone ever goes on, you've got to

1:13:33.360 --> 1:13:36.000
<v Speaker 1>pick your own. No, but I'm saying, you know, like,

1:13:36.240 --> 1:13:39.720
<v Speaker 1>but there where people have a hobby or a thing,

1:13:39.880 --> 1:13:44.160
<v Speaker 1>like whether it's music. You know, you're like I take stuff. Yeah,

1:13:44.240 --> 1:13:47.920
<v Speaker 1>it's great. And you must love, like, the Claude subreddit.

1:13:48.600 --> 1:13:52.320
<v Speaker 1>Every day someone's like, why does Claude—you must—I

1:13:52.360 --> 1:13:53.520
<v Speaker 1>love r slash Google.

1:13:53.360 --> 1:13:56.519
<v Speaker 2>Yes, I like r slash SaaS because it's all just

1:13:56.640 --> 1:14:01.200
<v Speaker 2>people, like, running SaaS. No, you're thinking of a different

1:14:01.280 --> 1:14:04.880
<v Speaker 2>SaaS one. I'm talking S a a S, software as a service.

1:14:05.120 --> 1:14:05.920
<v Speaker 3>Yeah, I'm a loser.

1:14:06.320 --> 1:14:09.280
<v Speaker 2>So no, you watch people being like my app has

1:14:09.320 --> 1:14:11.760
<v Speaker 2>been up for six months, it's made seven dollars. A

1:14:11.800 --> 1:14:12.960
<v Speaker 2>bunch of people are like yes.

1:14:13.200 --> 1:14:13.760
<v Speaker 5>And there are subs.

1:14:14.640 --> 1:14:17.639
<v Speaker 2>I kind of love that you've got these niches.

1:14:17.760 --> 1:14:19.720
<v Speaker 2>But I hate to say, I do need to call

1:14:19.800 --> 1:14:20.280
<v Speaker 2>this episode.

1:14:20.640 --> 1:14:22.000
<v Speaker 3>Brian. Where can people find you?

1:14:22.920 --> 1:14:23.080
<v Speaker 1>Oh?

1:14:23.800 --> 1:14:26.760
<v Speaker 3>Instagram, Yes, we'll have a link to you there as well.

1:14:27.520 --> 1:14:30.519
<v Speaker 4>You can find me on Instagram at Mike Drucker is Dead,

1:14:30.720 --> 1:14:33.519
<v Speaker 4>and on Blue Sky at Mike Drucker and by a

1:14:33.640 --> 1:14:36.599
<v Speaker 4>good Game, No Rematch. It is a book that's available digital,

1:14:36.640 --> 1:14:39.240
<v Speaker 4>hardcover or audio with the audio read by myself.

1:14:39.479 --> 1:14:43.479
<v Speaker 5>Hell yeah, I'm at engadget dot com or on threads

1:14:43.479 --> 1:14:45.519
<v Speaker 5>at Cherlynn on Instagram, that's c h e r

1:14:45.640 --> 1:14:46.439
<v Speaker 5>l y n n.

1:14:46.960 --> 1:14:49.920
<v Speaker 3>Type in Google the man who destroyed Google Search. That's me.

1:14:50.520 --> 1:14:52.800
<v Speaker 2>I'm Ed Zitron. Thank you so much for listening. As

1:14:52.880 --> 1:14:55.520
<v Speaker 2>ever, recording in the wonderful New York studio at iHeartRadio.

1:14:55.600 --> 1:14:58.080
<v Speaker 2>Daniel Goodman, of course, is our producer. Thank you so much, Daniel,

1:14:58.520 --> 1:15:08.960
<v Speaker 2>Thank you all for listening. Thank you for listening to

1:15:09.040 --> 1:15:12.000
<v Speaker 2>Better Offline. The editor and composer of the Better Offline

1:15:12.000 --> 1:15:14.680
<v Speaker 2>theme song is Matt Osowski. You can check out more of

1:15:14.720 --> 1:15:18.320
<v Speaker 2>his music and audio projects at mattosowski dot com, M

1:15:18.360 --> 1:15:22.479
<v Speaker 2>A T T O S O W S K I dot com.

1:15:23.200 --> 1:15:25.479
<v Speaker 2>You can email me at ez at Better Offline dot

1:15:25.560 --> 1:15:27.760
<v Speaker 2>com or visit Better Offline dot com to find more

1:15:27.800 --> 1:15:31.160
<v Speaker 2>podcast links and of course, my newsletter. I also really

1:15:31.200 --> 1:15:33.439
<v Speaker 2>recommend you go to chat dot wheresyoured dot at

1:15:33.520 --> 1:15:35.920
<v Speaker 2>to visit the Discord, and go to r slash Better

1:15:35.960 --> 1:15:39.160
<v Speaker 2>Offline to check out our reddit. Thank you so much

1:15:39.200 --> 1:15:43.000
<v Speaker 2>for listening. Better Offline is a production of cool Zone Media.

1:15:43.160 --> 1:15:46.000
<v Speaker 2>For more from cool Zone Media, visit our website cool

1:15:46.080 --> 1:15:49.400
<v Speaker 2>Zonemedia dot com, or check us out on the iHeartRadio app,

1:15:49.479 --> 1:16:08.439
<v Speaker 2>Apple Podcasts, or wherever you get your podcasts.