WEBVTT - The Cult of Failing Upwards

0:00:02.480 --> 0:00:08.080
<v Speaker 1>Cool Zone Media. Hello and welcome to Better Offline, Cool

0:00:08.160 --> 0:00:20.400
<v Speaker 1>Zone Media's happiest podcast. I'm your host, Ed Zitron. Well,

0:00:23.360 --> 0:00:26.680
<v Speaker 1>as I've run through in the last two episodes, managers

0:00:26.680 --> 0:00:30.360
<v Speaker 1>have poisoned tech's ability to innovate with a degenerative capitalism known

0:00:30.400 --> 0:00:33.640
<v Speaker 1>as the rot economy, pushing growth-at-all-costs metrics

0:00:33.800 --> 0:00:37.000
<v Speaker 1>on companies you love while isolating and removing those that

0:00:37.040 --> 0:00:39.600
<v Speaker 1>don't agree. And by the way, they're the same people

0:00:39.640 --> 0:00:43.360
<v Speaker 1>who actually build things and make good products and use

0:00:43.440 --> 0:00:47.840
<v Speaker 1>them as well. Nowhere is this more obvious than at Meta,

0:00:47.920 --> 0:00:51.519
<v Speaker 1>a company with leadership completely removed from any meaningful interaction

0:00:51.600 --> 0:00:55.720
<v Speaker 1>with their products or any value to society. Since two

0:00:55.720 --> 0:00:59.240
<v Speaker 1>thousand and nine, Facebook's core products have reliably become more

0:00:59.240 --> 0:01:02.840
<v Speaker 1>profitable at exactly the same rate they decay, with every

0:01:02.880 --> 0:01:08.080
<v Speaker 1>founder behind every product that Zuckerberg has acquired, including Instagram, Oculus,

0:01:08.160 --> 0:01:11.840
<v Speaker 1>and WhatsApp, leaving the company and almost immediately talking about

0:01:11.840 --> 0:01:14.760
<v Speaker 1>how much they hated working there. According to a New

0:01:14.840 --> 0:01:18.039
<v Speaker 1>York Times piece from twenty eighteen, Kevin Systrom, co-founder

0:01:18.080 --> 0:01:21.120
<v Speaker 1>of Instagram, only chose to quit the company after Mark

0:01:21.200 --> 0:01:24.480
<v Speaker 1>Zuckerberg became jealous of the app's success, taking the spotlight

0:01:24.560 --> 0:01:28.000
<v Speaker 1>away from Facebook itself, an app he kind of

0:01:28.080 --> 0:01:32.160
<v Speaker 1>stole from the Winklevosses. Systrom allegedly didn't really want to

0:01:32.240 --> 0:01:35.400
<v Speaker 1>leave Facebook, but felt that Zuckerberg was depriving Instagram of

0:01:35.480 --> 0:01:39.280
<v Speaker 1>resources and, I quote, seemed to want Instagram

0:01:39.319 --> 0:01:41.560
<v Speaker 1>to use its momentum to help the big Blue app,

0:01:41.840 --> 0:01:44.440
<v Speaker 1>which is an annoying way of describing a situation. It

0:01:44.480 --> 0:01:47.080
<v Speaker 1>feels like a convenient time to reveal that this was a

0:01:47.160 --> 0:01:52.240
<v Speaker 1>Kara Swisher piece. Despite Swisher's bloviating, it took TechCrunch's

0:01:52.320 --> 0:01:56.000
<v Speaker 1>Josh Constine to reveal the real reason that Systrom had left.

0:01:57.000 --> 0:02:00.680
<v Speaker 1>Facebook had replaced Instagram's VP of Product, Kevin Weil, who

0:02:00.800 --> 0:02:04.720
<v Speaker 1>everybody loved, with the former VP of Facebook News in

0:02:04.800 --> 0:02:08.160
<v Speaker 1>May twenty eighteen. You know, that great year for news,

0:02:08.800 --> 0:02:13.680
<v Speaker 1>and that man was named Adam Mosseri. He would take

0:02:13.720 --> 0:02:17.360
<v Speaker 1>over, and over the next six years he would absolutely

0:02:17.480 --> 0:02:21.679
<v Speaker 1>destroy everything that Systrom and his co-founder Mike Krieger

0:02:21.960 --> 0:02:27.040
<v Speaker 1>had built. According to Constine's reporting, Systrom had also clashed

0:02:27.040 --> 0:02:30.240
<v Speaker 1>with Chris Cox, Facebook's chief product officer at the time.

0:02:31.160 --> 0:02:35.040
<v Speaker 1>Constine described Mosseri as a Zuckerberg loyalist who was and

0:02:35.120 --> 0:02:38.040
<v Speaker 1>I quote, disappointed that he didn't get the head of

0:02:38.080 --> 0:02:41.160
<v Speaker 1>Facebook gig that went to Will Cathcart, who now heads

0:02:41.200 --> 0:02:46.080
<v Speaker 1>up WhatsApp. Over time, Mosseri and Zuckerberg moved to erode

0:02:46.080 --> 0:02:50.160
<v Speaker 1>Systrom's and Instagram's independence from Facebook, and eventually, I guess

0:02:50.200 --> 0:02:53.919
<v Speaker 1>it all became too much to bear. Systrom was there

0:02:53.960 --> 0:02:57.200
<v Speaker 1>to oversee the most damaging change to Instagram, though, which

0:02:57.280 --> 0:03:01.079
<v Speaker 1>was the introduction of the algorithmic feed in June twenty sixteen,

0:03:01.320 --> 0:03:04.680
<v Speaker 1>two years before he left. That horrified users who feared

0:03:04.720 --> 0:03:06.840
<v Speaker 1>that they would now not see posts from their friends,

0:03:07.200 --> 0:03:10.880
<v Speaker 1>a thing that almost immediately happened on both Instagram and Facebook,

0:03:10.919 --> 0:03:13.600
<v Speaker 1>which made a major change to its newsfeed algorithm in

0:03:13.639 --> 0:03:19.960
<v Speaker 1>twenty fifteen. Wait, wasn't that when Adam Mosseri... oh my god. Anyway,

0:03:21.880 --> 0:03:24.480
<v Speaker 1>a few months later, Instagram would try and clone the

0:03:24.520 --> 0:03:28.000
<v Speaker 1>functionality of Snapchat, a company that has had quite literally

0:03:28.080 --> 0:03:31.360
<v Speaker 1>one profitable quarter in its history, with the release of Stories.

0:03:32.160 --> 0:03:35.600
<v Speaker 1>By the way, that's exactly what it was called on Snapchat.

0:03:35.600 --> 0:03:39.080
<v Speaker 1>They just didn't care. Anyway. This move was illustrative both of

0:03:39.120 --> 0:03:42.240
<v Speaker 1>the lack of creativity within Instagram and Facebook, but also

0:03:42.280 --> 0:03:45.400
<v Speaker 1>of its future direction, with Stories serving as yet another

0:03:45.440 --> 0:03:48.360
<v Speaker 1>touch point for advertisers, and yet another thing that Mark

0:03:48.440 --> 0:03:51.960
<v Speaker 1>Zuckerberg would rip off from people who actually build things

0:03:51.960 --> 0:03:56.680
<v Speaker 1>and have ideas. Once Systrom left, Mosseri became the head

0:03:56.680 --> 0:03:59.680
<v Speaker 1>of Instagram, turning it into one of the most profitable

0:03:59.680 --> 0:04:03.080
<v Speaker 1>business units in history while destroying its basic functionality as

0:04:03.080 --> 0:04:05.480
<v Speaker 1>an app that showed you photos and videos from people

0:04:05.520 --> 0:04:08.480
<v Speaker 1>you chose to follow, doubling down on the algorithm's ability

0:04:08.480 --> 0:04:12.080
<v Speaker 1>to interrupt and annoy you and stop you from seeing

0:04:12.160 --> 0:04:14.880
<v Speaker 1>things that you want to see, pushing ads and sponsored

0:04:14.880 --> 0:04:18.280
<v Speaker 1>content seemingly at random. But the one thing you can

0:04:18.320 --> 0:04:21.440
<v Speaker 1>rely on is that it would do it a lot. Since

0:04:21.480 --> 0:04:25.599
<v Speaker 1>taking over Instagram, Adam Mosseri has, with the direct approval

0:04:25.640 --> 0:04:28.320
<v Speaker 1>and support of Mark Zuckerberg, turned the app into a

0:04:28.360 --> 0:04:32.080
<v Speaker 1>glorified ad network, devoid of any ability to innovate with

0:04:32.160 --> 0:04:35.440
<v Speaker 1>products like IGTV and Threads. By the way, not the

0:04:35.480 --> 0:04:37.719
<v Speaker 1>social network. It was a camera app built to compete

0:04:37.720 --> 0:04:41.000
<v Speaker 1>with Snapchat, which has also been shut down. Neither of

0:04:41.040 --> 0:04:44.760
<v Speaker 1>them found traction, and every change under Mosseri seems to

0:04:44.800 --> 0:04:49.280
<v Speaker 1>be a direct copy of either Snap or TikTok. It's

0:04:49.320 --> 0:04:52.800
<v Speaker 1>also important to remember and know what Adam Mosseri is.

0:04:53.440 --> 0:04:56.600
<v Speaker 1>Adam Mosseri is not a creator. He's not an engineer.

0:04:56.800 --> 0:05:00.080
<v Speaker 1>He's not a founder. He's a designer that found his

0:05:00.120 --> 0:05:04.760
<v Speaker 1>way into product management, escaping the doldrums of actually doing things,

0:05:04.800 --> 0:05:10.520
<v Speaker 1>into the beautiful pantheon of wearing suits and yelling at people. Okay, okay,

0:05:10.600 --> 0:05:12.640
<v Speaker 1>I don't know if Adam yelled at people, but he

0:05:12.680 --> 0:05:16.400
<v Speaker 1>definitely annoyed them. And in late twenty twenty, he made

0:05:16.520 --> 0:05:20.159
<v Speaker 1>arguably the worst change to Instagram yet, launching Reels, a

0:05:20.200 --> 0:05:23.320
<v Speaker 1>fifteen second video format for Instagram built to compete with

0:05:23.400 --> 0:05:29.080
<v Speaker 1>the ascendant, extremely algorithmic TikTok. Reels quickly

0:05:29.120 --> 0:05:32.240
<v Speaker 1>became the dominant form of content on both Facebook and Instagram,

0:05:32.240 --> 0:05:35.880
<v Speaker 1>flooding your feed with fifteen and eventually sixty second clips

0:05:35.880 --> 0:05:39.480
<v Speaker 1>that automatically play as you scroll by, each one engineered or

0:05:39.560 --> 0:05:41.720
<v Speaker 1>paid to get in the way of things that you

0:05:41.760 --> 0:05:44.680
<v Speaker 1>actually want to see. And I really want to be clear,

0:05:44.680 --> 0:05:47.840
<v Speaker 1>though there are people who are going to say, well, surely,

0:05:48.160 --> 0:05:52.400
<v Speaker 1>surely ed the fact that Reels was such a runaway success,

0:05:52.680 --> 0:05:58.119
<v Speaker 1>Well, that's proof that it was good, right? Wrong. Horribly wrong.

0:05:58.640 --> 0:06:04.240
<v Speaker 1>Facebook's algorithm controls everything on Instagram and Facebook. Now, Systrom's

0:06:04.279 --> 0:06:07.440
<v Speaker 1>worry, and that's Kevin Systrom, the founder of Instagram, his worry

0:06:08.120 --> 0:06:11.440
<v Speaker 1>was that Zuckerberg was trying to just turn Instagram into

0:06:11.480 --> 0:06:14.560
<v Speaker 1>an arm of Facebook, kind of a feature app, Like

0:06:14.600 --> 0:06:17.640
<v Speaker 1>you went on Instagram to do things with your Facebook account.

0:06:18.080 --> 0:06:22.120
<v Speaker 1>This is exactly what has happened. Instagram is just Facebook,

0:06:22.200 --> 0:06:25.799
<v Speaker 1>but with more visual media. It has the same logins,

0:06:25.920 --> 0:06:28.640
<v Speaker 1>it has the same problems. It also has no customer

0:06:28.680 --> 0:06:32.159
<v Speaker 1>support of any kind, like every social network. But don't worry.

0:06:32.400 --> 0:06:35.640
<v Speaker 1>Thanks to Adam goddamn Mosseri, you can now pay fifteen

0:06:35.680 --> 0:06:39.839
<v Speaker 1>dollars a month for verification on Instagram and Facebook, which

0:06:39.880 --> 0:06:43.640
<v Speaker 1>will also get you access to customer support. Great goddamn idea,

0:06:43.720 --> 0:06:47.680
<v Speaker 1>Adam. Burn in hell. Anyway, I'm not the only one

0:06:47.720 --> 0:06:50.400
<v Speaker 1>angry with Adam Mosseri. He's also one of the least

0:06:50.440 --> 0:06:53.560
<v Speaker 1>popular tech executives in history. And I'm not kidding. I

0:06:53.600 --> 0:06:57.360
<v Speaker 1>have been reading the tech media very deeply for, Jesus,

0:06:57.360 --> 0:07:01.039
<v Speaker 1>coming up on sixteen years. Bloody hell. Anyway, I've never

0:07:01.080 --> 0:07:04.320
<v Speaker 1>seen someone this unpopular other than maybe Elon Musk, and

0:07:04.400 --> 0:07:08.520
<v Speaker 1>even then, Elon Musk, who's a loathsome individual, has far

0:07:08.600 --> 0:07:13.920
<v Speaker 1>more admirers than Adam Mosseri, who mostly spends his time apologizing. No, seriously.

0:07:14.000 --> 0:07:16.600
<v Speaker 1>Since taking over Instagram, he's had to apologize for an

0:07:16.640 --> 0:07:19.360
<v Speaker 1>update that made Instagram's feed move sideways, and I'm not

0:07:19.440 --> 0:07:22.000
<v Speaker 1>kidding about that one. He's had to apologize for Instagram

0:07:22.120 --> 0:07:25.920
<v Speaker 1>censoring pro-Palestinian content. He's had to testify before Congress

0:07:26.120 --> 0:07:28.880
<v Speaker 1>about Instagram's harm to young people, and he's had to

0:07:28.920 --> 0:07:31.920
<v Speaker 1>tell people that Instagram is no longer a photo sharing app.

0:07:32.560 --> 0:07:37.960
<v Speaker 1>What even is the 'Gram, eh? Anyway, he's overseen so many

0:07:38.160 --> 0:07:42.640
<v Speaker 1>deeply unpopular changes that Kim Kardashian and Kylie Jenner, who

0:07:42.680 --> 0:07:45.800
<v Speaker 1>between them have over six hundred million followers, had to

0:07:45.840 --> 0:07:48.920
<v Speaker 1>beg him to stop Instagram from trying to be TikTok,

0:07:49.360 --> 0:07:52.520
<v Speaker 1>to which he responded that more and more of Instagram

0:07:52.560 --> 0:07:55.040
<v Speaker 1>is going to become video over time, and claimed that

0:07:55.240 --> 0:07:58.120
<v Speaker 1>it had cut back on recommended content, something I think

0:07:58.160 --> 0:08:01.880
<v Speaker 1>we can all agree was a blatant, fucking lie. I

0:08:01.920 --> 0:08:04.760
<v Speaker 1>find Adam Mosseri very annoying as well, because when you

0:08:04.880 --> 0:08:08.600
<v Speaker 1>watch his videos, he's always going, hey, guys, yeah, so yeah,

0:08:08.640 --> 0:08:11.920
<v Speaker 1>So the reason that Instagram's bad now is because, uh,

0:08:11.960 --> 0:08:13.840
<v Speaker 1>and you can see him in real time trying to

0:08:13.880 --> 0:08:16.960
<v Speaker 1>come up with a reason why it sucks that isn't

0:08:17.000 --> 0:08:18.600
<v Speaker 1>just, well, it makes, it makes us so, it

0:08:18.640 --> 0:08:20.320
<v Speaker 1>may it may have made me so rich. I have

0:08:20.320 --> 0:08:22.720
<v Speaker 1>a house in Kensington now. I'm so rich. It's so good.

0:08:23.680 --> 0:08:27.600
<v Speaker 1>He's just, he's wormlike. And I don't want to

0:10:27.600 --> 0:10:29.720
<v Speaker 1>get into personal insults, so I take that back. Adam Mosseri

0:10:29.880 --> 0:10:33.440
<v Speaker 1>is not wormlike. He is a coward, though. He is

0:08:33.480 --> 0:08:37.239
<v Speaker 1>the reason that Instagram sucks now. He is the architect

0:08:37.360 --> 0:08:40.840
<v Speaker 1>of the destruction of one of the most dominant places

0:08:40.880 --> 0:08:45.040
<v Speaker 1>on the web. These are his decisions. Mosseri, like many

0:08:45.080 --> 0:08:47.640
<v Speaker 1>of the most powerful men in tech, is a glorified

0:08:47.679 --> 0:08:52.320
<v Speaker 1>management consultant, incapable of creating anything of note. Mosseri has

0:08:52.360 --> 0:08:55.560
<v Speaker 1>already announced that Threads, Meta's dollar-store clone of Twitter,

0:08:56.000 --> 0:09:00.640
<v Speaker 1>is not for news and politics, making news organizations kind

0:09:00.640 --> 0:09:03.160
<v Speaker 1>of hesitant to invest in a platform that was made

0:09:03.200 --> 0:09:05.520
<v Speaker 1>by a company that has a rich history of screwing

0:09:05.600 --> 0:09:09.480
<v Speaker 1>over news organizations. Also, what the hell do I talk

0:09:09.480 --> 0:09:13.600
<v Speaker 1>about on social networks then, Adam? No? Nothing that's happening

0:09:14.200 --> 0:09:16.560
<v Speaker 1>in the government or the world? Well, oh wait, let me

0:09:16.600 --> 0:09:20.000
<v Speaker 1>answer that. On Threads, what people talk about is whatever's

0:09:20.040 --> 0:09:24.040
<v Speaker 1>making them angry that day, without much form or feature,

0:09:24.600 --> 0:09:26.880
<v Speaker 1>and they still talk about politics and news. It's just

0:09:26.960 --> 0:09:30.760
<v Speaker 1>not supported by the algorithm. It's just, it's the kind

0:09:30.760 --> 0:09:33.920
<v Speaker 1>of thing that you would make if you had no

0:09:34.040 --> 0:09:38.120
<v Speaker 1>idea how social networks worked, but you knew product management.

0:09:38.400 --> 0:09:40.480
<v Speaker 1>If you were like, alright people, what do we

0:09:40.520 --> 0:09:43.040
<v Speaker 1>want to see out of a social network? We want

0:09:43.080 --> 0:09:45.280
<v Speaker 1>to see lots of clicks, we want to see lots

0:09:45.320 --> 0:09:49.240
<v Speaker 1>of scrolling. Yeah, we definitely want people interacting and engaging.

0:09:49.520 --> 0:09:52.839
<v Speaker 1>That's what makes good social networking, right? And I think

0:09:52.840 --> 0:09:54.439
<v Speaker 1>we can all agree at this point that Twitter was

0:09:54.480 --> 0:09:56.839
<v Speaker 1>a mistake. It was like a text-based platform. I

0:09:56.840 --> 0:09:59.800
<v Speaker 1>don't think Biz Stone and Jack Dorsey are particularly gifted product

0:09:59.800 --> 0:10:03.000
<v Speaker 1>people, but they were smart enough to leave it

0:10:03.080 --> 0:10:05.880
<v Speaker 1>the hell alone. If you look at the Twitter files,

0:10:05.880 --> 0:10:08.640
<v Speaker 1>which, by the way, are very funny because Matt

0:10:08.679 --> 0:10:12.280
<v Speaker 1>Taibbi sold his soul to Elon Musk, the most deceptive

0:10:12.280 --> 0:10:17.240
<v Speaker 1>man alive other than Donald Trump. You didn't think

0:10:17.240 --> 0:10:19.520
<v Speaker 1>the leopards would eat your face, did you, mate? Anyway?

0:10:20.480 --> 0:10:22.720
<v Speaker 1>The Twitter files, all you can see in there is

0:10:22.720 --> 0:10:25.440
<v Speaker 1>Twitter's executives just being like, Oh, I don't want to

0:10:25.440 --> 0:10:27.160
<v Speaker 1>touch it, mate, I don't want to know. I don't

0:10:27.160 --> 0:10:28.839
<v Speaker 1>want to. If we mess with it, it's going to

0:10:28.920 --> 0:10:30.800
<v Speaker 1>make it bad. It's going to break. Everyone will be

0:10:30.840 --> 0:10:32.960
<v Speaker 1>so angry if we do anything. We shouldn't ban this person,

0:10:33.000 --> 0:10:37.199
<v Speaker 1>we shouldn't change this. Threads is this weird, hyper-optimized,

0:10:37.240 --> 0:10:41.040
<v Speaker 1>hyper-algorithmic crapfest. And all the people on there are people

0:10:41.040 --> 0:10:44.800
<v Speaker 1>who write comments on Instagram. It sucks, and there are

0:10:44.840 --> 0:10:46.880
<v Speaker 1>some good journalists on there. I'll pop in for that.

0:10:47.280 --> 0:10:49.360
<v Speaker 1>But it's a bad social network. And it's a bad

0:10:49.400 --> 0:10:51.320
<v Speaker 1>social network because it's made by a guy who doesn't

0:10:51.360 --> 0:10:55.360
<v Speaker 1>build products, unless you think of products as like financial vehicles,

0:10:55.360 --> 0:10:58.840
<v Speaker 1>in which case Adam Mosseri may be the most gifted man alive.

0:11:00.559 --> 0:11:06.200
<v Speaker 1>But let's be honest. Nowhere is Adam Mosseri's consultant mindset

0:11:06.600 --> 0:11:09.600
<v Speaker 1>more obvious than in his suggested plan to deal with

0:11:09.640 --> 0:11:13.840
<v Speaker 1>Instagram's hundreds of thousands of underage users by creating a

0:11:13.880 --> 0:11:16.960
<v Speaker 1>new type of family-centered account on Instagram that would

0:11:17.000 --> 0:11:22.360
<v Speaker 1>permit Meta to upsell Instagram to children under thirteen, a disgusting,

0:11:22.920 --> 0:11:26.560
<v Speaker 1>loathsome program that was planned as an alternative to instituting

0:11:26.720 --> 0:11:31.040
<v Speaker 1>stricter registration procedures, according to a lawsuit against Meta filed

0:11:31.080 --> 0:11:34.560
<v Speaker 1>by the Attorney General of New Mexico. Yeah, Adam, I

0:11:34.600 --> 0:11:38.200
<v Speaker 1>won't come at you without a link because I'm scared. Anyway. This

0:11:38.400 --> 0:11:40.480
<v Speaker 1>is the man running one of the most important tech

0:11:40.520 --> 0:11:42.959
<v Speaker 1>platforms in the world, a man bereft of morals or

0:11:43.040 --> 0:11:47.680
<v Speaker 1>qualifications or even ideas, a walking, talking figurehead that exists

0:11:47.679 --> 0:11:50.760
<v Speaker 1>only to spout vague platitudes about what social media can

0:11:50.840 --> 0:11:53.360
<v Speaker 1>or will do, as the profits of making it harder

0:11:53.400 --> 0:11:57.079
<v Speaker 1>to communicate with friends and family make him unfathomably rich.

0:11:58.080 --> 0:12:01.520
<v Speaker 1>He comes out every so often to babble about how social

0:12:01.600 --> 0:12:04.559
<v Speaker 1>is important, how the changes he's made that make Instagram

0:12:04.600 --> 0:12:07.760
<v Speaker 1>worse are actually good, and then disappears up his own asshole.

0:12:08.160 --> 0:12:10.559
<v Speaker 1>One time he responded to something I wrote in The Information,

0:12:11.120 --> 0:12:13.680
<v Speaker 1>and he responded with a bunch of typos, which is

0:12:13.760 --> 0:12:17.280
<v Speaker 1>very funny. But he also responded saying that Facebook planned

0:12:17.840 --> 0:12:20.959
<v Speaker 1>no layoffs and Instagram planned no layoffs. They laid people

0:12:20.960 --> 0:12:23.640
<v Speaker 1>off six months later. It's just, this is the guy.

0:12:23.760 --> 0:12:27.679
<v Speaker 1>These are the guys in power now. Well, much of

0:12:27.720 --> 0:12:30.240
<v Speaker 1>the blame for the state of Instagram and Facebook can

0:12:30.360 --> 0:12:32.920
<v Speaker 1>obviously be laid at the feet of Mark Zuckerberg. It's

0:12:33.280 --> 0:12:36.520
<v Speaker 1>important to remember Mark sucks, and Mark is the one

0:12:36.559 --> 0:12:39.040
<v Speaker 1>who made the original Facebook after he stole it from

0:12:39.040 --> 0:12:43.319
<v Speaker 1>the Winklevoss brothers. But nevertheless, it's also important to understand

0:12:43.600 --> 0:12:46.360
<v Speaker 1>the sheer level of damage that Adam Mosseri has done

0:12:46.360 --> 0:12:51.360
<v Speaker 1>to the world. Instagram is now a truly awful product.

0:12:51.440 --> 0:12:55.440
<v Speaker 1>It's terrible, and Mosseri's only response to the pain and

0:12:55.480 --> 0:12:58.280
<v Speaker 1>frustration of his users is to tell them that he

0:12:58.360 --> 0:13:01.880
<v Speaker 1>intends to make it shittier. That was his actual response

0:13:01.920 --> 0:13:05.720
<v Speaker 1>when Kylie Jenner and Kim Kardashian said, hey, stop making

0:13:05.720 --> 0:13:08.320
<v Speaker 1>Instagram like TikTok, he said, I'm actually gonna make it

0:13:08.360 --> 0:13:11.360
<v Speaker 1>more like TikTok, you fucking assholes. Okay, that's not exactly

0:13:11.360 --> 0:13:13.120
<v Speaker 1>what he said. He just said there'd be more video.

0:13:13.679 --> 0:13:16.200
<v Speaker 1>They've claimed they're rolling things back, but it's all nonsense.

0:13:16.240 --> 0:13:19.640
<v Speaker 1>It's all lies. And this is the management consultant mindset

0:13:19.640 --> 0:13:23.960
<v Speaker 1>that dominates tech. They trap users in these terrible experiences,

0:13:24.080 --> 0:13:27.000
<v Speaker 1>and they do so because they have giant monopolies, and

0:13:27.040 --> 0:13:29.440
<v Speaker 1>then they make their products worse and worse and worse

0:13:29.480 --> 0:13:31.600
<v Speaker 1>once they know that their users can't or won't go

0:13:31.679 --> 0:13:37.360
<v Speaker 1>anywhere else. And what's insane about this is Instagram could

0:13:37.400 --> 0:13:41.000
<v Speaker 1>have probably made Facebook tens of billions of dollars without sucking.

0:13:41.480 --> 0:13:43.720
<v Speaker 1>There are honest ways to do a business like this

0:13:44.440 --> 0:13:47.640
<v Speaker 1>if they'd really, actually invested in algorithms. And I kind

0:13:47.640 --> 0:13:50.320
<v Speaker 1>of hinted at this in previous episodes, and made it so

0:13:50.440 --> 0:13:54.000
<v Speaker 1>it was just very good at showing you things you like. Hell,

0:13:54.200 --> 0:13:59.120
<v Speaker 1>they'd have made TikTok. Why do you think TikTok has done well?

0:13:59.400 --> 0:14:03.760
<v Speaker 1>Because its algorithm, while extremely weird and unsettling, is

0:14:03.840 --> 0:14:07.040
<v Speaker 1>actually very good at showing people things they'd find interesting.

0:14:07.400 --> 0:14:11.120
<v Speaker 1>It's invasive, it's weird, it's bad. But guess what? Meta's

0:14:11.120 --> 0:14:13.880
<v Speaker 1>worth hundreds of billions of dollars. They're putting tens of

0:14:13.920 --> 0:14:16.760
<v Speaker 1>billions of dollars into the metaverse. Still, Reality Labs is

0:14:16.760 --> 0:14:19.280
<v Speaker 1>still burning, what, ten, fifteen billion dollars a quarter or

0:14:19.280 --> 0:14:22.120
<v Speaker 1>a year. It's an insane amount of money. How about

0:14:22.160 --> 0:14:25.360
<v Speaker 1>feeding that into the algorithm so your experience doesn't suck ass?

0:14:26.200 --> 0:14:29.560
<v Speaker 1>I'm being vulgar, and I've been quite vulgar in this episode,

0:14:29.560 --> 0:14:32.760
<v Speaker 1>and I apologize, I really do. I shouldn't swear so much.

0:14:32.800 --> 0:14:34.920
<v Speaker 1>My mother tells me this, my father tells me this,

0:14:35.600 --> 0:14:39.960
<v Speaker 1>my shrink tells me this. But anyway, I just find this

0:14:40.080 --> 0:14:43.680
<v Speaker 1>all so annoying. I find it frustrating because the bad

0:14:43.680 --> 0:14:47.120
<v Speaker 1>guys keep winning and the reason they keep winning is

0:14:47.160 --> 0:14:49.360
<v Speaker 1>nobody points at them and says how bad they are,

0:14:49.800 --> 0:14:52.720
<v Speaker 1>or at least they don't do it enough. Really, people

0:14:52.760 --> 0:14:55.280
<v Speaker 1>should walk out of Instagram, and we should stop using

0:14:55.480 --> 0:14:59.080
<v Speaker 1>Instagram and Facebook. I think I use Instagram like

0:14:59.160 --> 0:15:02.200
<v Speaker 1>once every couple of days to look at my friend's very fat dog,

0:15:02.240 --> 0:15:05.360
<v Speaker 1>which I do enjoy. I might just text him and say,

0:15:05.360 --> 0:15:06.960
<v Speaker 1>can you just send me pictures of your dog?

0:15:07.000 --> 0:15:10.640
<v Speaker 1>But that's kind of weird. These apps have become part

0:15:10.680 --> 0:15:14.640
<v Speaker 1>of the social fabric, and people like Adam Mosseri are aware.

0:15:14.960 --> 0:15:18.400
<v Speaker 1>Same with Mark Zuckerberg. They know exactly how well they've done,

0:15:18.400 --> 0:15:21.760
<v Speaker 1>and they have. They are a success story. An evil

0:15:21.840 --> 0:15:26.360
<v Speaker 1>success story, but they're a success story. You are on Instagram,

0:15:26.680 --> 0:15:29.560
<v Speaker 1>you are on Twitter, you are on email. You are

0:15:29.640 --> 0:15:35.320
<v Speaker 1>on these platforms because that's where people exist online. There

0:15:35.400 --> 0:15:38.000
<v Speaker 1>are people that I can only speak to on Instagram.

0:15:38.040 --> 0:15:40.920
<v Speaker 1>They're just bad at text, but they love Instagram, and

0:15:41.000 --> 0:15:42.920
<v Speaker 1>it sucks. I don't want to use these platforms, and

0:15:43.000 --> 0:15:45.440
<v Speaker 1>I imagine you don't want to either, which is why it's

0:15:45.440 --> 0:15:48.040
<v Speaker 1>important to talk about who messed them up. It's important

0:15:48.080 --> 0:15:51.720
<v Speaker 1>to say Adam Mosseri a hundred times and keep saying it.

0:15:52.600 --> 0:15:55.800
<v Speaker 1>That's the only way that history will know who has

0:15:55.880 --> 0:16:01.520
<v Speaker 1>done this damage. And these management consultant types they're everywhere,

0:16:02.840 --> 0:16:05.840
<v Speaker 1>They're everywhere, and they're inspiring people to be like them,

0:16:06.080 --> 0:16:10.480
<v Speaker 1>to be ruthless assholes, to be terrible product developers, but

0:16:10.640 --> 0:16:14.880
<v Speaker 1>excellent businessmen. And there is a middle ground, a sustainable ground,

0:16:15.040 --> 0:16:18.720
<v Speaker 1>one which doesn't involve burning cash or burning customers or

0:16:18.880 --> 0:16:23.280
<v Speaker 1>in the case of Instagram, shortly adding generative AI to everything,

0:16:24.120 --> 0:16:27.320
<v Speaker 1>to Facebook, to Instagram. And by the way, here's a

0:16:27.320 --> 0:16:31.840
<v Speaker 1>crazy story for you. Recently, a generative AI on Facebook

0:16:31.880 --> 0:16:35.480
<v Speaker 1>responded to a group saying that it had a gifted child.

0:16:36.720 --> 0:16:39.200
<v Speaker 1>And you may think I'm mistaking something. No, no, it

0:16:39.240 --> 0:16:41.640
<v Speaker 1>said this whole thing about having a gifted child. This

0:16:41.800 --> 0:16:44.440
<v Speaker 1>is what happens when people who don't know product, who

0:16:44.480 --> 0:16:48.440
<v Speaker 1>don't build anything, who don't understand anything, get the keys

0:16:48.480 --> 0:16:51.560
<v Speaker 1>to the kingdom. They fuck it up and they'll keep

0:16:51.560 --> 0:16:56.920
<v Speaker 1>doing so, and their massive success only serves to inspire

0:16:57.000 --> 0:17:02.160
<v Speaker 1>generations of useless founders to create profitable pain boxes over

0:17:02.280 --> 0:17:05.479
<v Speaker 1>useful products. And as we speak, the most quirked up

0:17:05.560 --> 0:17:09.360
<v Speaker 1>loathsome one of them all is rising to power. I'm

0:17:09.400 --> 0:17:22.159
<v Speaker 1>talking about Sam goddamn Altman. Now if you don't know

0:17:22.240 --> 0:17:25.200
<v Speaker 1>who that is. Sam Altman is the CEO of OpenAI,

0:17:25.440 --> 0:17:30.680
<v Speaker 1>which is a very unprofitable revenue-generating company building software

0:17:30.720 --> 0:17:33.800
<v Speaker 1>to do something, and generative AI might do something. You

0:17:33.840 --> 0:17:36.000
<v Speaker 1>should go back and listen to the episodes about that.

0:17:36.880 --> 0:17:40.240
<v Speaker 1>But I want to tell you how Sam Altman got started,

0:17:40.800 --> 0:17:43.800
<v Speaker 1>and I want to let you know how shit Sam

0:17:43.840 --> 0:17:46.919
<v Speaker 1>Altman has been at his job in the past. In

0:17:46.960 --> 0:17:50.320
<v Speaker 1>two thousand and five, Altman, a Stanford dropout, co-founded

0:17:50.359 --> 0:17:54.200
<v Speaker 1>a company called Loopt. That's L-O-O-P-T. That's what companies were

0:17:54.200 --> 0:17:56.800
<v Speaker 1>called back then, and it was a location-based social

0:17:56.840 --> 0:18:00.240
<v Speaker 1>networking app that raised over thirty million dollars from

0:18:00.520 --> 0:18:04.680
<v Speaker 1>Y Combinator and VCs like Sequoia Capital and NEA. Seven

0:18:04.760 --> 0:18:07.879
<v Speaker 1>years later, Altman would flog Loopt to a publicly traded

0:18:07.920 --> 0:18:10.800
<v Speaker 1>financial services company called Green Dot, best known for their

0:18:10.840 --> 0:18:14.679
<v Speaker 1>prepaid cards, for a remarkable forty-three point four million dollars,

0:18:14.920 --> 0:18:17.879
<v Speaker 1>despite the fact that the app didn't find traction or

0:18:18.080 --> 0:18:21.880
<v Speaker 1>revenue. Altman got quite rich from the Loopt deal, despite

0:18:21.920 --> 0:18:24.080
<v Speaker 1>the fact that a group of senior employees urged the

0:18:24.119 --> 0:18:27.560
<v Speaker 1>board on two separate occasions to fire Altman for what

0:18:27.640 --> 0:18:31.359
<v Speaker 1>they described as deceptive and chaotic behavior. According to The

0:18:31.359 --> 0:18:35.840
<v Speaker 1>Wall Street Journal. Altman would almost immediately become a partner

0:18:35.880 --> 0:18:39.119
<v Speaker 1>at Y Combinator, surprising a lot of people after working

0:18:39.119 --> 0:18:42.320
<v Speaker 1>there part time before being made president by co founder

0:18:42.359 --> 0:18:46.560
<v Speaker 1>Paul Graham in twenty fourteen. Yet, behind the scenes, according

0:18:46.560 --> 0:18:49.679
<v Speaker 1>to reporting by Elizabeth Dwaskin and Natasha Tiku of The

0:18:49.720 --> 0:18:53.000
<v Speaker 1>Washington Post, Altman was well known, and I quote,

0:18:53.359 --> 0:18:56.439
<v Speaker 1>for an absenteeism that rankled his peers and some of

0:18:56.480 --> 0:18:59.840
<v Speaker 1>the startups he was supposed to nurture. Altman was also

0:19:00.119 --> 0:19:03.120
<v Speaker 1>double-dipping in Y Combinator startups by investing through

0:19:03.160 --> 0:19:06.120
<v Speaker 1>Alt Capital, a venture firm founded with his brother Jack,

0:19:06.320 --> 0:19:09.639
<v Speaker 1>with one source quoted by the Post describing Altman's tenure

0:19:09.880 --> 0:19:13.800
<v Speaker 1>as the school of loose management that's all about prioritizing

0:19:13.920 --> 0:19:17.680
<v Speaker 1>what's in it for me. Altman became wildly rich during

0:19:17.720 --> 0:19:20.800
<v Speaker 1>his tenure at Y Combinator, using his connections and his

0:19:20.920 --> 0:19:24.040
<v Speaker 1>cult of personality to make early investments in companies like

0:19:24.080 --> 0:19:27.199
<v Speaker 1>Gusto and Optimizely, which was acquired in twenty eighteen for

0:19:27.240 --> 0:19:31.240
<v Speaker 1>one point one six billion dollars, and Patreon and Asana

0:19:31.680 --> 0:19:34.640
<v Speaker 1>and Reddit. He probably made a couple hundred million there,

0:19:34.720 --> 0:19:39.360
<v Speaker 1>really depressing. In twenty fifteen, he founded OpenAI, at the time,

0:19:39.400 --> 0:19:44.400
<v Speaker 1>a nonprofit organization dedicated to building responsible artificial intelligence applications.

0:19:45.560 --> 0:19:49.240
<v Speaker 1>Yet it's really important to note that Altman is not,

0:19:49.440 --> 0:19:52.160
<v Speaker 1>and was not ever an engineer or a technologist. He

0:19:52.200 --> 0:19:55.240
<v Speaker 1>did code, but he was, in every case, from what

0:19:55.320 --> 0:19:58.159
<v Speaker 1>I can find, a figurehead and a fundraiser that was

0:19:58.200 --> 0:20:01.719
<v Speaker 1>able to convince actual academics and engineers like Durk Kingma

0:20:01.840 --> 0:20:06.120
<v Speaker 1>and Wojciech Zaremba and Ilya Sutskever to do the actual

0:20:06.119 --> 0:20:08.800
<v Speaker 1>work at OpenAI, while he sent masturbatory emails

0:20:08.840 --> 0:20:11.679
<v Speaker 1>to Elon Musk, who only ever donated fifty million dollars

0:20:11.760 --> 0:20:14.359
<v Speaker 1>of the one hundred million he claimed he'd invested in OpenAI.

0:20:14.760 --> 0:20:17.640
<v Speaker 1>You know what the thing is, Sam, at the very least,

0:20:17.680 --> 0:20:19.679
<v Speaker 1>can you make sure that Elon Musk pays up? Are

0:20:19.680 --> 0:20:23.440
<v Speaker 1>you that much of a cunt? Anyway, I really want

0:20:23.480 --> 0:20:25.879
<v Speaker 1>you to remember that Sam Altman was an absentee parent

0:20:25.960 --> 0:20:28.760
<v Speaker 1>for the first few years of open AI. He split

0:20:28.800 --> 0:20:31.840
<v Speaker 1>his efforts, in a way that was actually very similar

0:20:31.880 --> 0:20:36.159
<v Speaker 1>to Elon Musk, across multiple other investments and enterprises like

0:20:36.200 --> 0:20:39.439
<v Speaker 1>the two funds he'd built inside of Y Combinator for

0:20:39.520 --> 0:20:42.639
<v Speaker 1>him to run. In twenty nineteen, according to reports at

0:20:42.680 --> 0:20:45.840
<v Speaker 1>the time, Altman would step down from Y Combinator amid

0:20:45.880 --> 0:20:48.760
<v Speaker 1>a series of changes at the accelerator, a story that

0:20:49.280 --> 0:20:51.720
<v Speaker 1>much of the industry press just simply chose not to

0:20:51.760 --> 0:20:55.639
<v Speaker 1>look into. Though I will give Eric Newcomer some credit

0:20:55.680 --> 0:20:58.560
<v Speaker 1>for calling people out for this kind of thing, in vain.

0:20:59.160 --> 0:21:02.399
<v Speaker 1>Yet the Washington Post's reporting revealed that Y Combinator founder

0:21:02.400 --> 0:21:06.000
<v Speaker 1>Paul Graham, best known for writing extremely annoying tweets

0:21:06.000 --> 0:21:09.840
<v Speaker 1>and very long and quite boring blogs. That's my job, Paul. Anyway,

0:21:09.880 --> 0:21:13.679
<v Speaker 1>he flew from the United Kingdom to San Francisco to personally

0:21:13.920 --> 0:21:16.359
<v Speaker 1>kick Sam Altman out, though he blamed his wife for

0:21:16.359 --> 0:21:20.360
<v Speaker 1>some reason, and anyway due to Altman continuing to focus

0:21:20.359 --> 0:21:24.280
<v Speaker 1>on his own personal projects and press over his thing

0:21:24.280 --> 0:21:28.440
<v Speaker 1>at Y Combinator. This is the story of the man

0:21:28.480 --> 0:21:31.680
<v Speaker 1>whom New York Magazine called the Oppenheimer of our Age

0:21:32.000 --> 0:21:35.440
<v Speaker 1>in a meandering piece that frames Altman's vagueness about AI

0:21:35.800 --> 0:21:39.080
<v Speaker 1>as some sort of big secret, a hidden truth, when

0:21:39.119 --> 0:21:41.879
<v Speaker 1>I think the truth might be far simpler. Sam Altman

0:21:41.960 --> 0:21:45.760
<v Speaker 1>is yet another fucking management consultant. In a piece published

0:21:45.800 --> 0:21:48.840
<v Speaker 1>in early twenty twenty one, Sam Altman proposed the concept

0:21:48.840 --> 0:21:51.520
<v Speaker 1>of Moore's Law for Everything, referring to the principle that

0:21:51.560 --> 0:21:55.040
<v Speaker 1>the number of transistors on an integrated circuit doubles approximately

0:21:55.040 --> 0:21:57.600
<v Speaker 1>every two years, leading to an exponential increase in computing

0:21:57.640 --> 0:21:59.680
<v Speaker 1>power in the process. And if I got that wrong,

0:21:59.720 --> 0:22:03.760
<v Speaker 1>I think it's okay. Anyway, Moore's Law for

0:22:03.840 --> 0:22:06.840
<v Speaker 1>Everything is, in essence a utopian take on the impact

0:22:06.840 --> 0:22:10.320
<v Speaker 1>of AI, noting that as machines usurp the roles of

0:22:10.400 --> 0:22:13.199
<v Speaker 1>humans in the supply chain, the prices of goods and

0:22:13.240 --> 0:22:16.360
<v Speaker 1>services and thus the cost of living will go down exponentially,

0:22:16.560 --> 0:22:19.280
<v Speaker 1>something that has been proven wrong by a lot of history.

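[Editor's aside, not part of the episode audio: the arithmetic behind "doubling roughly every two years" is exponential, not linear, growth. A minimal sketch of what that observation implies; the function name and numbers here are mine, purely illustrative.]

```python
def transistors_after(years: float, start: float = 1.0,
                      doubling_period: float = 2.0) -> float:
    """Count after `years` of growth, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Ten years of doubling every two years is a 2**5 = 32x increase,
# which is why extrapolating the trend to all goods and services is such a leap.
print(transistors_after(10))  # 32.0
```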
0:22:19.880 --> 0:22:22.280
<v Speaker 1>The problem with this piece is it's kind of like

0:22:22.359 --> 0:22:25.439
<v Speaker 1>Sam Altman, in that it's a deeply complex bucket of nothing.

0:22:26.119 --> 0:22:29.600
<v Speaker 1>It's an extremely verbose screed that says things like AI

0:22:29.680 --> 0:22:32.240
<v Speaker 1>will lower the cost of goods and services because labor

0:22:32.320 --> 0:22:34.560
<v Speaker 1>is the driving cost at many levels of the supply chain,

0:22:35.119 --> 0:22:38.120
<v Speaker 1>and suggests things like that we should tax capital rather

0:22:38.160 --> 0:22:40.919
<v Speaker 1>than labor by creating an American equity Fund where companies

0:22:40.920 --> 0:22:42.600
<v Speaker 1>are forced to give up a percentage of their shares

0:22:42.640 --> 0:22:45.480
<v Speaker 1>to a nationally incorporated venture fund, an idea that makes

0:22:45.480 --> 0:22:47.720
<v Speaker 1>sense if you're in the business of flogging private companies

0:22:47.720 --> 0:22:52.200
<v Speaker 1>to public companies. Moore's Law for Everything is a remarkably

0:22:52.280 --> 0:22:55.520
<v Speaker 1>telling piece in that it frames Altman's worldview as one

0:22:55.520 --> 0:22:59.120
<v Speaker 1>where the only thing that can save mankind is startups,

0:22:59.280 --> 0:23:02.080
<v Speaker 1>and those should be funded in the way that Sam

0:23:02.160 --> 0:23:05.840
<v Speaker 1>Altman likes. And this piece, like everything

0:23:05.880 --> 0:23:09.960
<v Speaker 1>in Altman's universe, never actually connects Moore's Law to anything,

0:23:10.040 --> 0:23:12.600
<v Speaker 1>which I should add isn't so much of a law

0:23:12.600 --> 0:23:15.239
<v Speaker 1>as an observation, and has long ceased to be relevant as

0:23:15.280 --> 0:23:18.480
<v Speaker 1>the shrinkage of transistors has slowed in recent years.

0:23:18.960 --> 0:23:23.200
<v Speaker 1>In many respects, this comparison is actually eerily prophetic given

0:23:23.240 --> 0:23:25.840
<v Speaker 1>the slow down and even regression of improvement we've seen

0:23:25.880 --> 0:23:30.080
<v Speaker 1>in large language models like ChatGPT. And much like

0:23:30.160 --> 0:23:34.480
<v Speaker 1>ChatGPT, Altman is capable only of loosely approximating the

0:23:34.480 --> 0:23:38.240
<v Speaker 1>output requested, because at his core, he lacks any kind

0:23:38.240 --> 0:23:41.879
<v Speaker 1>of substance or technical history that you'd need to do so,

0:23:43.160 --> 0:23:47.160
<v Speaker 1>like ChatGPT, he's a very intelligent know-nothing that,

0:23:47.280 --> 0:23:50.840
<v Speaker 1>through deterministic measures, completely detached from the meaning of the

0:23:50.920 --> 0:23:53.440
<v Speaker 1>underlying ideas, picks the right words to say at the

0:23:53.560 --> 0:23:57.640
<v Speaker 1>right time. And by the way, this is the man

0:23:57.720 --> 0:24:02.320
<v Speaker 1>selling the artificial intelligence dream. He's a salesman, and he's

0:24:02.320 --> 0:24:05.760
<v Speaker 1>a salesman capable of superficial connections between ideas in a

0:24:05.800 --> 0:24:08.880
<v Speaker 1>way that's initially satisfying as long as you just don't

0:24:08.920 --> 0:24:12.520
<v Speaker 1>think about it too much. Altman's famed Startup Playbook, which

0:24:12.560 --> 0:24:14.840
<v Speaker 1>was published in twenty fifteen, is full of the kind

0:24:14.880 --> 0:24:19.040
<v Speaker 1>of obvious yet satisfying crap that you'd expect. He extols

0:24:19.080 --> 0:24:22.320
<v Speaker 1>the virtues of being flexible yet rigid. He advises that

0:24:22.359 --> 0:24:24.720
<v Speaker 1>you talk to your users and watch them use your product,

0:24:25.080 --> 0:24:28.000
<v Speaker 1>which is an exact quote, and he says that you

0:24:28.000 --> 0:24:31.119
<v Speaker 1>should try to improve your product five percent each week.

0:24:32.160 --> 0:24:34.240
<v Speaker 1>These are the kind of things that are very useful,

0:24:34.320 --> 0:24:37.679
<v Speaker 1>genuinely to an early twenties founder, and super impressive to

0:24:37.720 --> 0:24:40.840
<v Speaker 1>a mid fifties white venture capitalist that doesn't remember the

0:24:40.920 --> 0:24:43.160
<v Speaker 1>last time they worked a job that wasn't ten hours

0:24:43.160 --> 0:24:46.280
<v Speaker 1>a week of investing in Chuntley the SaaS for dog breeders.

0:24:47.160 --> 0:24:51.960
<v Speaker 1>They're the tech equivalent of Live, Laugh, Love. It does, however,

0:24:52.320 --> 0:24:56.640
<v Speaker 1>at one point, betray Sam Altman's real mindset that the

0:24:56.680 --> 0:24:59.959
<v Speaker 1>only universal job description of a CEO is to make

0:25:00.000 --> 0:25:05.760
<v Speaker 1>sure that the company wins. Altman's material contributions to

0:25:05.800 --> 0:25:08.880
<v Speaker 1>OpenAI are kind of hard to nail down. While it's

0:25:08.920 --> 0:25:12.080
<v Speaker 1>unfair to judge someone entirely by their emails, those that

0:25:12.119 --> 0:25:14.879
<v Speaker 1>I can find, such as the ones from Elon Musk's

0:25:14.920 --> 0:25:17.920
<v Speaker 1>lawsuit against OpenAI, feel like they could be from

0:25:18.200 --> 0:25:22.159
<v Speaker 1>any other managerial huckster, and they feel kind of as

0:25:22.200 --> 0:25:26.200
<v Speaker 1>specious as Elon Musk's. And by the way, you're able

0:25:26.240 --> 0:25:29.440
<v Speaker 1>to compare those because when Elon Musk sued OpenAI,

0:25:29.520 --> 0:25:31.800
<v Speaker 1>OpenAI published a bunch of emails, and he and

0:25:31.840 --> 0:25:34.680
<v Speaker 1>Altman are the same guy. They're both just sitting there going, yes, yes,

0:25:34.720 --> 0:25:36.560
<v Speaker 1>the future will be very good. It'll be very important

0:25:36.560 --> 0:25:38.760
<v Speaker 1>that we have technology in the future. Yes, AI will

0:25:38.800 --> 0:25:41.040
<v Speaker 1>be able to grow. And Sam Altman's like, yeah, dude, yeah,

0:25:41.040 --> 0:25:45.119
<v Speaker 1>that's great. That's crazy man. It's like Joe Rogan, except

0:25:45.160 --> 0:25:50.440
<v Speaker 1>they're worth billions of dollars. Altman blathers on about governance

0:25:50.480 --> 0:25:53.680
<v Speaker 1>structures and how OpenAI needs to create the first

0:25:53.720 --> 0:25:56.639
<v Speaker 1>general AI and use it for individual empowerment, which he

0:25:56.720 --> 0:25:59.960
<v Speaker 1>defines as the distributed version of the future that seems

0:26:00.119 --> 0:26:03.200
<v Speaker 1>the safest. Like I said, Musk and Altman are very

0:26:03.240 --> 0:26:07.200
<v Speaker 1>similar creatures, managers wearing engineering costumes, and both are credited

0:26:07.359 --> 0:26:11.119
<v Speaker 1>as having expertise in AI without actually appearing to have

0:26:11.119 --> 0:26:13.879
<v Speaker 1>written a single line of code in a decade. And

0:26:14.119 --> 0:26:16.919
<v Speaker 1>let me tell you something about most of the guys

0:26:16.920 --> 0:26:19.960
<v Speaker 1>who actually work deep in AI. You know what, They've

0:26:19.960 --> 0:26:25.200
<v Speaker 1>got academic papers, they have actual published things they've done.

0:26:25.359 --> 0:26:29.919
<v Speaker 1>They're not afraid to share their code. They're not asking,

0:26:29.960 --> 0:26:32.359
<v Speaker 1>as Elon Musk did when laying people off at Twitter

0:26:32.560 --> 0:26:35.080
<v Speaker 1>for people's most salient lines of code, because they actually

0:26:35.080 --> 0:26:37.399
<v Speaker 1>know how code works. I, by the way, do not.

0:26:37.880 --> 0:26:40.080
<v Speaker 1>I'm not going to pretend I do. But I'm also

0:26:40.240 --> 0:26:43.199
<v Speaker 1>not there selling you the future of AI. From what

0:26:43.280 --> 0:26:47.240
<v Speaker 1>I can tell, Altman has, like Prabhakar Raghavan, the villain

0:26:47.280 --> 0:26:51.720
<v Speaker 1>of the last episode, fallen upwards. He was a constant

0:26:51.800 --> 0:26:55.040
<v Speaker 1>source of frustration at Loopt due to his pursuit, and

0:26:55.080 --> 0:26:58.160
<v Speaker 1>I shit you not, of side projects, with The Wall

0:26:58.160 --> 0:27:01.600
<v Speaker 1>Street Journal reporting late last year that Altman once diverted

0:27:01.680 --> 0:27:05.000
<v Speaker 1>engineers to work on an unnamed gay dating app. As

0:27:05.000 --> 0:27:08.199
<v Speaker 1>I previously noted, Altman was fired from Y Combinator for

0:27:08.240 --> 0:27:12.679
<v Speaker 1>his absenteeism and, I quote, reputation for favoring personal priorities

0:27:12.680 --> 0:27:16.280
<v Speaker 1>over official duties, and The Wall Street Journal reports that by

0:27:16.320 --> 0:27:19.199
<v Speaker 1>early twenty eighteen, a year before he was fired, Altman

0:27:19.240 --> 0:27:22.640
<v Speaker 1>was barely present at y Combinator's headquarters, spending more time

0:27:22.640 --> 0:27:26.200
<v Speaker 1>at OpenAI, which rankled longtime partners at Y Combinator

0:27:26.359 --> 0:27:41.840
<v Speaker 1>who began losing faith in him as a leader. The

0:27:41.920 --> 0:27:45.040
<v Speaker 1>Journal's piece does reveal a little more about why Altman

0:27:45.200 --> 0:27:48.200
<v Speaker 1>was fired as CEO of OpenAI, something that happened

0:27:48.320 --> 0:27:51.160
<v Speaker 1>last November and was very weird if you didn't see

0:27:51.160 --> 0:27:54.560
<v Speaker 1>it happen. He was fired for like three days, and

0:27:54.600 --> 0:27:58.680
<v Speaker 1>then a bunch of managers like Reid Hoffman and Brian Chesky

0:27:58.760 --> 0:28:02.280
<v Speaker 1>of Airbnb, and Satya Nadella, the king of managers, the CEO

0:28:02.320 --> 0:28:05.840
<v Speaker 1>of Microsoft, got together and bullied a nonprofit into putting

0:28:05.880 --> 0:28:09.720
<v Speaker 1>him back. Super happy story, But one of the reasons

0:28:09.760 --> 0:28:12.320
<v Speaker 1>that he was fired was because Sam Altman is a

0:28:12.320 --> 0:28:17.520
<v Speaker 1>pretty atrocious manager. Founding, and sadly now former, OpenAI

0:28:17.600 --> 0:28:21.200
<v Speaker 1>board member Ilya Sutskever described to the board, when calling

0:28:21.240 --> 0:28:25.560
<v Speaker 1>for Altman's removal, a long running pattern of Altman's tendency

0:28:25.600 --> 0:28:29.000
<v Speaker 1>to pit employees against one another, or promised resources and

0:28:29.080 --> 0:28:33.400
<v Speaker 1>responsibilities to two different executives at the same time, yielding conflicts.

0:28:33.840 --> 0:28:36.720
<v Speaker 1>More worryingly, the Journal reports that other members of the

0:28:36.760 --> 0:28:41.120
<v Speaker 1>board had heard similar concerns from senior OpenAI executives. By

0:28:41.160 --> 0:28:45.200
<v Speaker 1>the way, anyone who brought Sam Altman back is a

0:28:45.200 --> 0:28:48.920
<v Speaker 1>scumbag, we all know it. They also feared that

0:28:48.960 --> 0:28:51.920
<v Speaker 1>Sam Altman would use his influence in Silicon Valley once fired,

0:28:52.440 --> 0:28:56.000
<v Speaker 1>something that almost immediately came true when Sam Altman ran

0:28:56.040 --> 0:28:59.240
<v Speaker 1>as I mentioned, to Brian Chesky of Airbnb, who then

0:28:59.320 --> 0:29:02.160
<v Speaker 1>called Satya Nadella of Microsoft, which sparked a chain of

0:29:02.200 --> 0:29:05.120
<v Speaker 1>events that restored Altman as CEO of OpenAI and

0:29:05.160 --> 0:29:07.400
<v Speaker 1>led to the removal of the non-believers like Ilya

0:29:07.440 --> 0:29:13.520
<v Speaker 1>Sutskever from the board entirely. This, this is Silicon Valley's king.

0:29:14.200 --> 0:29:17.560
<v Speaker 1>This is the guy that people think is the Oppenheimer

0:29:17.600 --> 0:29:22.160
<v Speaker 1>of our age. This is the king of Silicon Valley now,

0:29:22.640 --> 0:29:26.120
<v Speaker 1>a multi billionaire who's actually a lobbyist role playing as

0:29:26.160 --> 0:29:31.719
<v Speaker 1>a founder, a diplomat masquerading as a technologist, A charming, capricious, abusive,

0:29:31.920 --> 0:29:35.840
<v Speaker 1>and untrustworthy man that has proven time and time again

0:29:36.080 --> 0:29:39.640
<v Speaker 1>that his only reliable trait is that whatever happens must

0:29:39.680 --> 0:29:43.680
<v Speaker 1>benefit Sam Altman. This also explains why so little of

0:29:43.720 --> 0:29:47.280
<v Speaker 1>Sam Altman's promises about AI make sense and why

0:29:47.320 --> 0:29:50.840
<v Speaker 1>OpenAI has been so unashamed in steamrolling and plagiarizing

0:29:50.880 --> 0:29:55.120
<v Speaker 1>the entire world. Altman has created nothing other than wealth

0:29:55.120 --> 0:29:58.640
<v Speaker 1>for himself and other rich guys, helping elevate and protect

0:29:58.720 --> 0:30:02.400
<v Speaker 1>existing power structures and the ideologies of men like Microsoft

0:30:02.480 --> 0:30:06.840
<v Speaker 1>CEO Satya Nadella and LinkedIn founder and career manager Reid Hoffman.

0:30:07.200 --> 0:30:10.200
<v Speaker 1>And let's not forget open AI's new board member Larry

0:30:10.240 --> 0:30:14.480
<v Speaker 1>Goddamn Summers, and it all kind of makes sense why

0:30:14.560 --> 0:30:18.240
<v Speaker 1>generative AI doesn't really help anyone other than those who want

0:30:18.280 --> 0:30:21.560
<v Speaker 1>to sell something. When the center of attention at a

0:30:21.600 --> 0:30:24.080
<v Speaker 1>company isn't really on the product or the tech, but

0:30:24.160 --> 0:30:27.000
<v Speaker 1>the idea of what the product could do, very little

0:30:27.040 --> 0:30:30.320
<v Speaker 1>about the company's culture is focused on building useful things

0:30:30.320 --> 0:30:35.080
<v Speaker 1>for real people. When leadership is dominated by managers that

0:30:35.120 --> 0:30:37.320
<v Speaker 1>haven't touched a line of code in decades or talked

0:30:37.320 --> 0:30:40.560
<v Speaker 1>to a normal person in decades, nobody steering the ship

0:30:40.680 --> 0:30:43.280
<v Speaker 1>has the ability to judge whether software is good or useful,

0:30:43.400 --> 0:30:47.120
<v Speaker 1>or valuable, or does anything other than help you raise

0:30:47.200 --> 0:30:51.960
<v Speaker 1>venture capital, of course, and this is the direct result

0:30:52.280 --> 0:30:56.240
<v Speaker 1>of Silicon Valley's corruption by the managerial sect. While Prabhakar

0:30:56.400 --> 0:31:00.480
<v Speaker 1>Raghavan may be a decorated computer scientist and academic, he

0:31:00.640 --> 0:31:03.920
<v Speaker 1>arguably oversaw the destruction of Yahoo, formerly one of the

0:31:03.920 --> 0:31:07.240
<v Speaker 1>web's most dominant search engines, and failed upwards into a

0:31:07.240 --> 0:31:09.960
<v Speaker 1>managerial role that allowed him to take over and now

0:31:10.080 --> 0:31:14.480
<v Speaker 1>arguably ruin Google's search product, chasing away Ben Gomes, a

0:31:14.560 --> 0:31:18.480
<v Speaker 1>hero and a man responsible for actually building things. Adam

0:31:18.560 --> 0:31:22.440
<v Speaker 1>Mosseri was, and always will be, a manager making calls

0:31:22.440 --> 0:31:25.000
<v Speaker 1>about products he's had no hand in building, and has

0:31:25.120 --> 0:31:28.480
<v Speaker 1>architected the outright destruction of a social network used by

0:31:28.480 --> 0:31:33.840
<v Speaker 1>billions of people. And Sam Altman, a career failure famous

0:31:33.840 --> 0:31:37.480
<v Speaker 1>for making himself rich and popular and upsetting and hurting

0:31:37.520 --> 0:31:40.240
<v Speaker 1>the people he works with, is on course to become

0:31:40.280 --> 0:31:44.040
<v Speaker 1>the most toxic manager of them all. If left unchecked,

0:31:44.240 --> 0:31:47.320
<v Speaker 1>OpenAI will perpetuate one of the largest thefts in history,

0:31:47.480 --> 0:31:49.880
<v Speaker 1>looting the Internet and using it to train models that

0:31:49.960 --> 0:31:52.320
<v Speaker 1>have yet to prove their necessity, other than as a

0:31:52.320 --> 0:31:56.280
<v Speaker 1>symbol that Silicon Valley has still fucking got it, even

0:31:56.360 --> 0:32:01.160
<v Speaker 1>though it unquestionably doesn't when it comes to generative AI. Yet,

0:32:01.200 --> 0:32:04.840
<v Speaker 1>because Altman, like every manager, is so thoroughly divorced from

0:32:04.880 --> 0:32:09.000
<v Speaker 1>actual production, he's only succeeded in generating unsustainable hype and

0:32:09.040 --> 0:32:12.440
<v Speaker 1>making vague promises that the people who do the actual

0:32:12.480 --> 0:32:16.840
<v Speaker 1>work at OpenAI likely know that they can't keep. And

0:32:16.840 --> 0:32:21.160
<v Speaker 1>it's frustrating because, as I said before, bad guys keep winning.

0:32:21.520 --> 0:32:24.920
<v Speaker 1>There are people in Silicon Valley making real products. There

0:32:24.920 --> 0:32:28.640
<v Speaker 1>are people out there who are doing good things, but

0:32:28.720 --> 0:32:31.560
<v Speaker 1>Silicon Valley will continue to suffer as long as we

0:32:31.720 --> 0:32:34.840
<v Speaker 1>entrust the future to management consultants and showmen who don't

0:32:34.840 --> 0:32:38.760
<v Speaker 1>build things. Just look at Humane, a company that raised

0:32:38.880 --> 0:32:41.640
<v Speaker 1>hundreds of millions of dollars to make a voice activated

0:32:41.680 --> 0:32:45.360
<v Speaker 1>AI-powered pin that the ultra popular YouTuber Marques Brownlee

0:32:45.400 --> 0:32:49.400
<v Speaker 1>called the worst product he'd ever reviewed. One might wonder

0:32:49.440 --> 0:32:52.000
<v Speaker 1>how a company would willingly launch a seven hundred dollars

0:32:52.040 --> 0:32:55.600
<v Speaker 1>product that overheated within minutes of use and repeatedly failed

0:32:55.720 --> 0:32:59.320
<v Speaker 1>to answer basic queries. And the answer is actually really simple.

0:33:00.400 --> 0:33:04.520
<v Speaker 1>Humane was founded by Bethany Bongiorno, a former management consultant at PwC,

0:33:04.680 --> 0:33:08.200
<v Speaker 1>and Imran Chaudhri, a former director of design from Apple,

0:33:08.200 --> 0:33:11.080
<v Speaker 1>who refers to himself as an inventor and innovator

0:33:11.280 --> 0:33:13.760
<v Speaker 1>that was fired in twenty seventeen for sending out an

0:33:13.800 --> 0:33:16.320
<v Speaker 1>email about his planned exit from the company that suggested

0:33:16.320 --> 0:33:20.800
<v Speaker 1>that Apple could no longer innovate. Well, mate, look at

0:33:20.840 --> 0:33:23.440
<v Speaker 1>the Humane Pin. Do you think you innovated? Just to

0:33:23.480 --> 0:33:25.320
<v Speaker 1>be clear, if you haven't seen this thing, you should

0:33:25.320 --> 0:33:27.440
<v Speaker 1>really look it up. It's really funny. It clips on

0:33:27.640 --> 0:33:30.920
<v Speaker 1>and you are meant to use it to project a

0:33:31.040 --> 0:33:33.800
<v Speaker 1>laser thing onto your hand to make phone calls or

0:33:33.840 --> 0:33:36.120
<v Speaker 1>take photos. It's got a little camera in it. Here's

0:33:36.160 --> 0:33:38.560
<v Speaker 1>the problem. It overheats after a few minutes because of

0:33:38.560 --> 0:33:41.640
<v Speaker 1>the laser. And on top of that, the queries don't

0:33:41.680 --> 0:33:44.600
<v Speaker 1>work half the time. And when they do, who really cares?

0:33:45.120 --> 0:33:48.040
<v Speaker 1>It's just ChatGPT, except it's seven hundred dollars with a

0:33:48.080 --> 0:33:51.760
<v Speaker 1>twenty-four-dollar-a-month subscription. And this is what

0:33:51.840 --> 0:33:54.640
<v Speaker 1>happens when you're insulated from real people's problems, and when

0:33:54.640 --> 0:33:57.680
<v Speaker 1>you don't participate in the process that might actually solve them,

0:33:58.160 --> 0:34:03.400
<v Speaker 1>you become fundamentally disconnected from any real value creation. Silicon

0:34:03.480 --> 0:34:07.640
<v Speaker 1>Valley is atrophying as a result of lazy, disconnected vcs

0:34:07.640 --> 0:34:12.480
<v Speaker 1>and power players elevating men like Sam Altman, and incumbents

0:34:12.480 --> 0:34:15.680
<v Speaker 1>helping career consultants dictate the actions of those who actually

0:34:15.719 --> 0:34:19.440
<v Speaker 1>build software and hardware. If the tech industry wants to

0:34:19.600 --> 0:34:22.520
<v Speaker 1>escape the public's ire, it should push back against this

0:34:22.719 --> 0:34:26.600
<v Speaker 1>managerial poison and talk to real people with real problems

0:34:26.600 --> 0:34:29.920
<v Speaker 1>and focus on solving those before creating yet another growth

0:34:29.960 --> 0:34:34.320
<v Speaker 1>centric bullshit machine. And if the Valley really wants to change,

0:34:34.640 --> 0:34:38.360
<v Speaker 1>it needs to stop empowering those who have failed upwards

0:34:38.440 --> 0:34:41.760
<v Speaker 1>just because they say the right things. It feels good.

0:34:41.840 --> 0:34:44.040
<v Speaker 1>I get it that we have a guy out there

0:34:44.320 --> 0:34:47.560
<v Speaker 1>who's saying, yeah, I can help the Valley be worth money.

0:34:48.280 --> 0:34:51.680
<v Speaker 1>But as we speak, Nvidia's down ten percent. By

0:34:51.680 --> 0:34:53.759
<v Speaker 1>the time this episode's out, I truly don't know where

0:34:53.760 --> 0:34:57.000
<v Speaker 1>it will be. But I'm worried. I'm worried that the

0:34:57.040 --> 0:35:00.320
<v Speaker 1>tech industry is going to start sputtering because everyone's

0:35:00.360 --> 0:35:02.839
<v Speaker 1>put their eggs in the AI basket. But I do

0:35:02.920 --> 0:35:06.040
<v Speaker 1>have some hope. A lot of the startups I talked

0:35:06.080 --> 0:35:10.200
<v Speaker 1>to are only slightly touching generative AI. They don't seem

0:35:10.320 --> 0:35:13.680
<v Speaker 1>firmly embedded in it. Maybe this is just anecdotal, Maybe

0:35:13.680 --> 0:35:16.440
<v Speaker 1>I just know good people. I don't know, but my

0:35:16.600 --> 0:35:19.560
<v Speaker 1>thought here is the fact that the zero interest rate

0:35:19.600 --> 0:35:22.160
<v Speaker 1>era has kind of ended, that venture capitalists can't just

0:35:22.239 --> 0:35:26.800
<v Speaker 1>get hundreds of billions of dollars quite so easily. Means

0:35:26.800 --> 0:35:29.720
<v Speaker 1>that they're not so quick to invest in this stuff.

0:35:30.719 --> 0:35:33.600
<v Speaker 1>But I think a reckoning's coming, and I don't know

0:35:33.680 --> 0:35:35.759
<v Speaker 1>if it will be from people, but I think it's

0:35:35.800 --> 0:35:39.400
<v Speaker 1>going to be kind of a larger effect. It's going

0:35:39.440 --> 0:35:41.480
<v Speaker 1>to be one where you see that these products just

0:35:41.520 --> 0:35:43.960
<v Speaker 1>don't get adopted, like I mentioned in the AI episode,

0:35:44.400 --> 0:35:46.760
<v Speaker 1>and I think you're going to see these big, nasty,

0:35:46.920 --> 0:35:51.360
<v Speaker 1>overfunded consumer AI companies fall apart. But like I said before,

0:35:52.040 --> 0:35:54.440
<v Speaker 1>I'm afraid that OpenAI is going to start falling apart,

0:35:55.080 --> 0:35:57.920
<v Speaker 1>even if it is mostly part of Microsoft. I'm afraid

0:35:57.960 --> 0:36:00.040
<v Speaker 1>of the knock on effects on the stock market. And

0:36:00.120 --> 0:36:04.200
<v Speaker 1>I know, oh, the stock market is a game only rich people play. No,

0:36:04.320 --> 0:36:07.360
<v Speaker 1>that's people's pensions. Regular people do actually

0:36:07.400 --> 0:36:09.919
<v Speaker 1>invest in the stock market, and regular people watch Jim

0:36:10.000 --> 0:36:12.360
<v Speaker 1>Kramer as he goes you need to invest in AI.

0:36:13.000 --> 0:36:15.200
<v Speaker 1>That man does not know a goddamn thing. By the way,

0:36:16.400 --> 0:36:19.040
<v Speaker 1>a lot of people who've responded to my AI episode

0:36:19.040 --> 0:36:20.879
<v Speaker 1>have said, yeah, it's good that AI's failing. It's good

0:36:20.880 --> 0:36:23.880
<v Speaker 1>that the tech industry is falling apart, and it feels

0:36:23.920 --> 0:36:26.120
<v Speaker 1>good to see bad people fail. But the thing I

0:36:26.160 --> 0:36:29.600
<v Speaker 1>need to caution you about is management consultants are also

0:36:29.719 --> 0:36:36.000
<v Speaker 1>really good at avoiding blame. Prabhakar Raghavan? He destroyed Yahoo,

0:36:36.160 --> 0:36:38.239
<v Speaker 1>or at least watched it happen, and he's now the

0:36:38.239 --> 0:36:40.800
<v Speaker 1>head of Google Search and he's destroying that too. Sam

0:36:40.800 --> 0:36:44.640
<v Speaker 1>Altman has messed up so many times, and yet here

0:36:44.680 --> 0:36:47.160
<v Speaker 1>he is. He's the king of OpenAI. He gets

0:36:47.200 --> 0:36:49.880
<v Speaker 1>these glossy stories. He can be interviewed by anyone anywhere.

0:36:50.200 --> 0:36:52.759
<v Speaker 1>These people keep winning, but you want to know how

0:36:52.760 --> 0:36:58.160
<v Speaker 1>they get defeated? Sunlight. Talk about them. Talk about Adam Mosseri,

0:36:58.480 --> 0:37:02.120
<v Speaker 1>talk about Sam Altman. Tell these stories to your friends, say

0:37:02.120 --> 0:37:05.800
<v Speaker 1>the name Prabhakar Raghavan. As much as you can blame

0:37:05.880 --> 0:37:08.719
<v Speaker 1>these people for their actions, I can't say it will

0:37:08.800 --> 0:37:12.719
<v Speaker 1>change much. But at least the wider society will know

0:37:13.160 --> 0:37:17.360
<v Speaker 1>who to blame for destroying the Internet. Thank you for listening.

0:37:27.800 --> 0:37:30.200
<v Speaker 1>Thank you for listening to Better Offline. The editor and

0:37:30.239 --> 0:37:33.400
<v Speaker 1>composer of the Better Offline theme song is Mattosowski. You

0:37:33.440 --> 0:37:35.680
<v Speaker 1>can check out more of his music and audio projects

0:37:35.840 --> 0:37:39.319
<v Speaker 1>at mattosowski dot com. M A T T O S

0:37:39.400 --> 0:37:43.480
<v Speaker 1>O W S K I dot com. You can email me at

0:37:43.480 --> 0:37:46.000
<v Speaker 1>ez at betteroffline dot com or check out better

0:37:46.040 --> 0:37:48.479
<v Speaker 1>offline dot com to find my newsletter and more links

0:37:48.480 --> 0:37:52.120
<v Speaker 1>to this podcast. Thank you so much for listening. Better

0:37:52.160 --> 0:37:54.759
<v Speaker 1>Offline is a production of Cool Zone Media. For more

0:37:54.800 --> 0:37:58.680
<v Speaker 1>from Cool Zone Media, visit our website coolzonemedia dot com,

0:37:58.760 --> 0:38:00.680
<v Speaker 1>or check us out on the iHeartRadio app,

0:38:00.760 --> 0:38:03.200
<v Speaker 1>Apple Podcasts, or wherever you get your podcasts.