WEBVTT - Monologue: Why We Need Tech Criticism More Than Ever

0:00:02.160 --> 0:00:07.119
<v Speaker 1>Zone Media. Hello and welcome to this week's Better Offline Monologue.

0:00:07.200 --> 0:00:17.159
<v Speaker 1>I'm Ed Zitron, and I'm your host. As a reminder, you

0:00:17.200 --> 0:00:19.800
<v Speaker 1>can buy Better Offline merchandise now. You'll find a link

0:00:19.840 --> 0:00:21.639
<v Speaker 1>to it in the episode notes. You all seem to

0:00:21.640 --> 0:00:24.240
<v Speaker 1>really like it. It's cool stuff. But I know some

0:00:24.280 --> 0:00:26.680
<v Speaker 1>of you have also said that I've done too

0:00:26.840 --> 0:00:30.080
<v Speaker 1>much AI stuff recently, and I wanted to take a

0:00:30.120 --> 0:00:32.080
<v Speaker 1>little time to explain why I've been doing so, in

0:00:32.120 --> 0:00:34.960
<v Speaker 1>part because I'm not going to change. The generative AI

0:00:35.040 --> 0:00:38.199
<v Speaker 1>boom is about far more than artificial intelligence, or

0:00:38.280 --> 0:00:40.760
<v Speaker 1>cloud storage or data centers or what part of Hawaii

0:00:40.800 --> 0:00:43.760
<v Speaker 1>Mark Zuckerberg will buy next. I believe this movement is

0:00:43.760 --> 0:00:46.199
<v Speaker 1>symbolic of a greater rot in the tech industry and

0:00:46.240 --> 0:00:49.280
<v Speaker 1>indeed in media criticism, both inside and outside of the

0:00:49.320 --> 0:00:51.960
<v Speaker 1>tech media, and that the nature of criticism must indeed

0:00:52.040 --> 0:00:54.440
<v Speaker 1>change to meet this moment. Now, a few of the

0:00:54.480 --> 0:00:57.080
<v Speaker 1>most consistent critiques of my work are around my tone.

0:00:57.080 --> 0:00:59.200
<v Speaker 1>I'm a hater, I'm a cynic, I'm a skeptic, I'm

0:00:59.360 --> 0:01:02.840
<v Speaker 1>too emotional, I've gone overboard. The local townspeople should throw

0:01:02.880 --> 0:01:05.280
<v Speaker 1>tomatoes at me. I should be treated in the way

0:01:05.319 --> 0:01:08.240
<v Speaker 1>of Shrek and exiled from society, things like that. But

0:01:08.360 --> 0:01:10.560
<v Speaker 1>in all seriousness, it seems the only way my work

0:01:10.600 --> 0:01:14.240
<v Speaker 1>is reliably critiqued is to suggest that my emotions invalidate

0:01:14.240 --> 0:01:18.880
<v Speaker 1>my criticisms somehow, and that saying fuck somehow weakens my arguments. Basically,

0:01:18.920 --> 0:01:22.679
<v Speaker 1>that giving a shit is invalidating when it comes to

0:01:22.720 --> 0:01:25.440
<v Speaker 1>criticizing something. And I find that putrid, by the way,

0:01:25.480 --> 0:01:27.760
<v Speaker 1>but it's how things are, and I kind of get it.

0:01:28.120 --> 0:01:31.200
<v Speaker 1>Most financial tech criticism is expected to be dry and clinical,

0:01:31.240 --> 0:01:33.600
<v Speaker 1>and any emotional content should be positive or at the

0:01:33.720 --> 0:01:36.600
<v Speaker 1>very least optimistic, and I do not believe this tone

0:01:36.600 --> 0:01:38.800
<v Speaker 1>is sufficient for the seriousness of the matters at hand.

0:01:39.440 --> 0:01:42.080
<v Speaker 1>We are, as I have said repeatedly, several years into

0:01:42.120 --> 0:01:44.880
<v Speaker 1>a hysterical, illogical bubble, one launched off the back of

0:01:44.920 --> 0:01:48.160
<v Speaker 1>two smaller yet no less hysterical bubbles, at times by

0:01:48.200 --> 0:01:49.960
<v Speaker 1>the same people, some of them writing for the New

0:01:50.000 --> 0:01:52.720
<v Speaker 1>York Times or The Verge. We as a society have

0:01:52.800 --> 0:01:55.600
<v Speaker 1>accepted the terms that the generative AI companies want us to.

0:01:56.040 --> 0:01:58.559
<v Speaker 1>That we must destroy the environment, that we must steal

0:01:58.560 --> 0:02:01.000
<v Speaker 1>from millions of people, that we must burn billions of

0:02:01.040 --> 0:02:03.720
<v Speaker 1>dollars all in pursuit of a vague and specious outcome

0:02:03.760 --> 0:02:06.920
<v Speaker 1>that will never really arrive. This isn't to say that

0:02:06.960 --> 0:02:09.880
<v Speaker 1>we've had much choice in said acceptance. The media has

0:02:09.919 --> 0:02:13.400
<v Speaker 1>repeatedly accepted and promoted these narratives, helped justify these costs,

0:02:13.520 --> 0:02:16.960
<v Speaker 1>and presented ridiculous narratives as sensible ones. The outcome has

0:02:16.960 --> 0:02:19.960
<v Speaker 1>been a massive transfer of wealth upwards, both into the

0:02:19.960 --> 0:02:23.000
<v Speaker 1>market capitalization of hyperscalers and into the pockets of Sam

0:02:23.000 --> 0:02:26.000
<v Speaker 1>Altman and other founders pretending generative AI will become some

0:02:26.040 --> 0:02:29.280
<v Speaker 1>sort of conscious intelligence or the next hypergrowth market, regardless

0:02:29.280 --> 0:02:32.239
<v Speaker 1>of whether either of these things is possible. The other outcome

0:02:32.240 --> 0:02:34.880
<v Speaker 1>has been the rise of many new kinds of grifters,

0:02:35.320 --> 0:02:38.240
<v Speaker 1>the Ethan Mollicks of the world that create scientific-sounding yet

0:02:38.280 --> 0:02:41.560
<v Speaker 1>specious reviews of AI models, the slop-filled AI newsletters

0:02:41.560 --> 0:02:44.480
<v Speaker 1>with fake subscriber counts, and the many, many AI influencers

0:02:44.639 --> 0:02:48.480
<v Speaker 1>that exist as extensions of AI companies' PR departments. Yet

0:02:48.480 --> 0:02:50.360
<v Speaker 1>the most worrying part has been the members of the

0:02:50.400 --> 0:02:52.600
<v Speaker 1>media that have allowed this to happen. This isn't to

0:02:52.600 --> 0:02:55.280
<v Speaker 1>say anybody is corrupt; even Kevin Roose and Casey Newton

0:02:55.280 --> 0:02:57.800
<v Speaker 1>are included here. I don't think anyone's corrupt, but the

0:02:57.840 --> 0:03:00.640
<v Speaker 1>media is unprepared for, unwilling, or unable to push back on

0:03:00.680 --> 0:03:03.840
<v Speaker 1>the narratives. The structures that hold up tech and business

0:03:03.840 --> 0:03:06.480
<v Speaker 1>media are not built to truly explain or criticize what's

0:03:06.520 --> 0:03:09.239
<v Speaker 1>happening in the tech industry or even the economy at large.

0:03:10.560 --> 0:03:12.760
<v Speaker 1>These structures are built not to ask is this real?

0:03:12.880 --> 0:03:15.000
<v Speaker 1>or will this work, but rather when this works, what will

0:03:15.040 --> 0:03:18.000
<v Speaker 1>it look like? They're built not to question whether an

0:03:18.040 --> 0:03:21.320
<v Speaker 1>industry is real at all, but to assume that

0:03:21.639 --> 0:03:24.040
<v Speaker 1>there are always risks in building anything, and thus the

0:03:24.080 --> 0:03:26.360
<v Speaker 1>most important part is discussing what the person in question

0:03:26.480 --> 0:03:30.840
<v Speaker 1>wants to happen. Media's desperation for objectivity regularly deprives the

0:03:30.880 --> 0:03:33.800
<v Speaker 1>reader of the value of being objective because objectivity is

0:03:33.840 --> 0:03:39.000
<v Speaker 1>being conflated with being passive. True objective journalism, which is impossible,

0:03:39.040 --> 0:03:41.000
<v Speaker 1>by the way, would say both that OpenAI

0:03:41.120 --> 0:03:43.360
<v Speaker 1>raised a bunch of money and loses billions a year,

0:03:43.400 --> 0:03:45.440
<v Speaker 1>and that to continue doing business they'll have to raise

0:03:45.520 --> 0:03:49.240
<v Speaker 1>unbelievable sums of money every year. Instead, articles about Open

0:03:49.240 --> 0:03:51.240
<v Speaker 1>AI just print that they raised money and why they

0:03:51.320 --> 0:03:54.360
<v Speaker 1>raised it, or that their products do something. No interest

0:03:54.400 --> 0:03:56.800
<v Speaker 1>in finding out what that something is, whether it's important,

0:03:56.800 --> 0:03:59.160
<v Speaker 1>and whether it will lead to any outcome, probably because

0:03:59.160 --> 0:04:02.320
<v Speaker 1>it won't. The initial consequences of this passivity are that

0:04:02.440 --> 0:04:05.440
<v Speaker 1>venture capitalists and a select few startup founders have become very,

0:04:05.600 --> 0:04:08.040
<v Speaker 1>very rich, the big tech firms have had something new

0:04:08.080 --> 0:04:10.480
<v Speaker 1>to hawk to their customers, and the access journalists have

0:04:10.520 --> 0:04:12.520
<v Speaker 1>had a new thing to pretend they care about. And

0:04:12.600 --> 0:04:14.680
<v Speaker 1>also a fallow tech industry has had something to get

0:04:14.680 --> 0:04:17.520
<v Speaker 1>excited about. None of this money has trickled down to

0:04:17.560 --> 0:04:20.119
<v Speaker 1>anybody other than the powerful, nor have any mass market

0:04:20.240 --> 0:04:24.040
<v Speaker 1>productivity gains been realized, nor has society improved as a result.

0:04:24.440 --> 0:04:26.480
<v Speaker 1>The only things that appear to have changed are that

0:04:26.520 --> 0:04:28.599
<v Speaker 1>these companies need more money, and they don't even need

0:04:28.640 --> 0:04:31.160
<v Speaker 1>to tell anybody why. The media just assumes they need

0:04:31.200 --> 0:04:34.080
<v Speaker 1>it to build powerful AI. I think it's fair to

0:04:34.120 --> 0:04:36.240
<v Speaker 1>say at this point that the generative AI boom hasn't

0:04:36.240 --> 0:04:38.320
<v Speaker 1>done much of anything other than create new ways for

0:04:38.360 --> 0:04:41.640
<v Speaker 1>software companies to sell software or access to models. There

0:04:41.680 --> 0:04:43.680
<v Speaker 1>are no killer apps, no major shifts in the way

0:04:43.680 --> 0:04:45.839
<v Speaker 1>we live our lives outside of innovations in fraud and

0:04:45.839 --> 0:04:48.240
<v Speaker 1>harms to our power grid, and whatever use cases there

0:04:48.279 --> 0:04:50.840
<v Speaker 1>may be for large language models are minuscule in comparison

0:04:50.839 --> 0:04:52.640
<v Speaker 1>to the way that AI is discussed in the media.

0:04:53.480 --> 0:04:56.760
<v Speaker 1>I am justifiably angry because I watch these bubbles form

0:04:56.760 --> 0:04:59.480
<v Speaker 1>again and again in exactly the same way at every

0:04:59.520 --> 0:05:02.120
<v Speaker 1>single time. With the metaverse, with cryptocurrency, and now with

0:05:02.160 --> 0:05:04.880
<v Speaker 1>generative AI, there have been obvious moments to say, yeah,

0:05:04.960 --> 0:05:07.279
<v Speaker 1>I get that this is what you say will happen,

0:05:07.440 --> 0:05:09.839
<v Speaker 1>but what's happening today doesn't suggest that it will happen

0:05:09.880 --> 0:05:12.440
<v Speaker 1>at all. And every time those moments have been missed,

0:05:12.800 --> 0:05:14.800
<v Speaker 1>and the media has opted instead to ask, but what

0:05:14.920 --> 0:05:18.200
<v Speaker 1>if it was true? When the media opts to trust

0:05:18.200 --> 0:05:20.359
<v Speaker 1>whatever comes out of the mouth of a powerful person,

0:05:20.400 --> 0:05:24.840
<v Speaker 1>the beneficiary is always, always the powerful person in question. Yet,

0:05:24.880 --> 0:05:26.680
<v Speaker 1>a better world would be one where the Altmans and

0:05:26.720 --> 0:05:29.520
<v Speaker 1>Amodeis have to actually justify themselves, show what the

0:05:29.520 --> 0:05:32.440
<v Speaker 1>models can do, give realistic projections, and know that they can't

0:05:32.440 --> 0:05:35.200
<v Speaker 1>just say whatever and get quoted automatically, because this kind

0:05:35.200 --> 0:05:38.440
<v Speaker 1>of accountability would make their current work of overstating their models' capabilities

0:05:38.440 --> 0:05:42.159
<v Speaker 1>and running unsustainable and destructive businesses impossible or at least

0:05:42.200 --> 0:05:45.280
<v Speaker 1>much more difficult. A better tech industry is one where

0:05:45.320 --> 0:05:48.159
<v Speaker 1>the products we hear about actually exist, where hype cycles

0:05:48.200 --> 0:05:50.640
<v Speaker 1>are built based on execution and outcomes rather than whether

0:05:50.720 --> 0:05:54.000
<v Speaker 1>something may or may not exist in the future. While

0:05:54.000 --> 0:05:56.279
<v Speaker 1>there is always a place in the tech media to dream,

0:05:56.279 --> 0:05:58.279
<v Speaker 1>to guess what might come next, to talk to people

0:05:58.320 --> 0:06:00.800
<v Speaker 1>inventing things, to talk to researchers, and I have no

0:06:00.839 --> 0:06:03.560
<v Speaker 1>problem with that, and it's fine to anticipate what the

0:06:03.640 --> 0:06:05.719
<v Speaker 1>effects of these things could be, so much of this

0:06:05.839 --> 0:06:08.200
<v Speaker 1>industry has become about those dreams, to the point that

0:06:08.240 --> 0:06:10.560
<v Speaker 1>innovation is far less relevant than whether you can convince

0:06:10.640 --> 0:06:14.320
<v Speaker 1>enough people that something might happen. Nevertheless, there is definitely

0:06:14.320 --> 0:06:17.320
<v Speaker 1>something missing in Better Offline, and that's excitement. The feedback

0:06:17.320 --> 0:06:19.320
<v Speaker 1>I got from my last monologue, about fitness and tech

0:06:19.400 --> 0:06:22.880
<v Speaker 1>I liked, was profound, and I look forward

0:06:22.880 --> 0:06:25.920
<v Speaker 1>to doing more episodes like that, especially these monologues on

0:06:25.960 --> 0:06:27.680
<v Speaker 1>the things that I use that I really enjoy, all

0:06:27.720 --> 0:06:29.800
<v Speaker 1>the cool stuff that tech is actually doing. It's going

0:06:29.880 --> 0:06:31.400
<v Speaker 1>to be a process to get there, but I think

0:06:31.440 --> 0:06:33.680
<v Speaker 1>you're really gonna like what you hear when I start

0:06:33.760 --> 0:06:36.480
<v Speaker 1>doing it. Don't worry, though, there's still so much Rot

0:06:36.480 --> 0:06:44.240
<v Speaker 1>Economy bullshit to unpack, as you'll hear in tomorrow's episode.