1 00:00:12,640 --> 00:00:16,520 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:16,880 --> 00:00:19,120 Speaker 1: I'm Oz Woloshyn, and today Cara Price and I will 3 00:00:19,120 --> 00:00:22,960 Speaker 1: bring you the headlines this week, including some surprisingly rapid 4 00:00:23,040 --> 00:00:26,760 Speaker 1: changes in the way people are using generative AI. Then 5 00:00:26,800 --> 00:00:29,760 Speaker 1: on Tech Support, we'll talk to the Washington Post's Naomi 6 00:00:29,840 --> 00:00:33,600 Speaker 1: Nix about the first week of the Meta antitrust trial. 7 00:00:33,960 --> 00:00:35,879 Speaker 2: This latest trial is another piece of 8 00:00:35,920 --> 00:00:39,199 Speaker 2: evidence that the companies haven't yet, to put it crudely, 9 00:00:39,200 --> 00:00:40,520 Speaker 2: gotten what they've paid for. 10 00:00:40,880 --> 00:00:48,080 Speaker 1: All of that on the Week in Tech. It's Friday, April eighteenth. 11 00:00:49,120 --> 00:00:50,640 Speaker 1: Hello Oz. Hello Cara. 12 00:00:51,080 --> 00:00:52,960 Speaker 3: So I was thinking about you this week. I was 13 00:00:52,960 --> 00:00:58,240 Speaker 3: seeing something all over Instagram, which is my preferred social media platform. 14 00:00:58,320 --> 00:01:01,280 Speaker 3: Your preferred social media platform has nothing to do with socializing. 15 00:01:01,360 --> 00:01:02,000 Speaker 3: It's called LinkedIn. 16 00:01:02,440 --> 00:01:05,319 Speaker 1: Well, maybe you're too old for TikTok and I'm too 17 00:01:05,319 --> 00:01:09,200 Speaker 1: old for Instagram. So yes, LinkedIn is mine, absolutely. 18 00:01:08,760 --> 00:01:11,280 Speaker 3: Right, at least you're not using Facebook. So on this 19 00:01:11,560 --> 00:01:15,600 Speaker 3: I saw action figures, yeah, that looked a lot like 20 00:01:15,680 --> 00:01:17,679 Speaker 3: my friends, like my friend who works at Tory Burch. 21 00:01:17,720 --> 00:01:20,560 Speaker 3: There was like there was an action figure of like 22 00:01:20,600 --> 00:01:23,479 Speaker 3: a Juul and an iced coffee and a Tory Burch handbag. 23 00:01:23,920 --> 00:01:26,440 Speaker 3: In a weird way, I've looked at it as a 24 00:01:26,480 --> 00:01:29,760 Speaker 3: sort of LinkedIn's own version of the Ghibli portraits that 25 00:01:29,760 --> 00:01:31,679 Speaker 3: we talked about a few weeks ago. You know, to 26 00:01:31,800 --> 00:01:34,360 Speaker 3: make these, you upload a picture of yourself to Chat 27 00:01:34,400 --> 00:01:37,440 Speaker 3: GPT and prompt the model to turn you into a 28 00:01:37,480 --> 00:01:39,560 Speaker 3: picture of a toy action figure. 29 00:01:39,880 --> 00:01:43,920 Speaker 1: Imagine going into a toy store in nineteen eighty four, 30 00:01:44,480 --> 00:01:48,040 Speaker 1: and you have plastic-packed action figurines, like, could be 31 00:01:48,040 --> 00:01:51,280 Speaker 1: a Barbie, could be a G.I. Joe, with some special 32 00:01:51,440 --> 00:01:54,560 Speaker 1: swag that the character has which reflects who the 33 00:01:54,640 --> 00:01:55,200 Speaker 1: character is. 34 00:01:55,320 --> 00:01:57,240 Speaker 3: Yeah, and I think in a weird way, it sort 35 00:01:57,280 --> 00:02:00,880 Speaker 3: of acts as this like pictorial resume of like this 36 00:02:00,920 --> 00:02:03,840 Speaker 3: is who I am: I am a podcast fundraiser, I have 37 00:02:04,040 --> 00:02:06,080 Speaker 3: a gym bag, I have Asics sneakers. 38 00:02:06,080 --> 00:02:06,960 Speaker 1: What are you talking about?
39 00:02:07,160 --> 00:02:10,760 Speaker 3: I am here asking for some money. I don't know 40 00:02:10,760 --> 00:02:11,600 Speaker 3: what I'm talking about. 41 00:02:11,960 --> 00:02:13,480 Speaker 1: Do you know what's very interesting? I used to work 42 00:02:13,520 --> 00:02:15,760 Speaker 1: for a branding agency. I know that you know that, 43 00:02:15,880 --> 00:02:20,919 Speaker 1: and our business cards were a pictorial representation of us 44 00:02:21,000 --> 00:02:24,680 Speaker 1: and then a few representations of things that were important 45 00:02:24,680 --> 00:02:27,160 Speaker 1: to us. So I had like a British flag and 46 00:02:27,240 --> 00:02:28,400 Speaker 1: like a tennis racket or. 47 00:02:28,320 --> 00:02:31,200 Speaker 3: Whatever, one visa, on a visa. 48 00:02:31,120 --> 00:02:33,120 Speaker 1: Exactly, you know. And there was like kind of a 49 00:02:33,160 --> 00:02:35,240 Speaker 1: talking point at the beginning of the meeting, and this 50 00:02:35,360 --> 00:02:38,760 Speaker 1: is kind of the LinkedIn meets ChatGPT version of that, 51 00:02:39,160 --> 00:02:40,360 Speaker 1: and it's going completely viral. 52 00:02:40,520 --> 00:02:43,639 Speaker 3: It's going completely viral and people love it. I don't know 53 00:02:43,680 --> 00:02:45,880 Speaker 3: if it's uncanny, I think it is uncanny valley, like 54 00:02:45,919 --> 00:02:49,440 Speaker 3: people love to see themselves represented in an animated format 55 00:02:49,639 --> 00:02:52,760 Speaker 3: because it's sort of like the last frontier of like 56 00:02:53,440 --> 00:02:56,000 Speaker 3: reality turned into surreality, one hundred percent. 57 00:02:56,040 --> 00:02:59,120 Speaker 1: I'm sure you've seen The Real Housewives as Pixar characters, 58 00:03:00,160 --> 00:03:01,120 Speaker 1: which is so good. 59 00:03:00,960 --> 00:03:02,800 Speaker 3: Which they're starting to look like anyway with all the 60 00:03:02,800 --> 00:03:05,160 Speaker 3: filters that you can use to post them on Instagram. 61 00:03:05,280 --> 00:03:07,920 Speaker 1: But you know what, this is what is driving something 62 00:03:07,960 --> 00:03:12,680 Speaker 1: extraordinary that happened, which is that ChatGPT, excluding games, became 63 00:03:12,720 --> 00:03:15,560 Speaker 1: the most downloaded app in the world last month. 64 00:03:15,720 --> 00:03:18,760 Speaker 3: Yeah, and actually at last week's TED conference, Sam Altman 65 00:03:18,880 --> 00:03:22,000 Speaker 3: let slip that the usage of ChatGPT had doubled 66 00:03:22,080 --> 00:03:26,560 Speaker 3: in just a few weeks, indicating it now has eight 67 00:03:26,680 --> 00:03:29,880 Speaker 3: hundred million weekly active users. 68 00:03:30,160 --> 00:03:32,079 Speaker 1: Yeah, I mean, it is an extraordinary number, and I 69 00:03:32,160 --> 00:03:36,119 Speaker 1: think you use the word indicating advisedly because he said 70 00:03:36,160 --> 00:03:39,280 Speaker 1: it was about ten percent of the world's population. You know, 71 00:03:39,320 --> 00:03:41,440 Speaker 1: at the beginning of this year, we were talking about 72 00:03:41,440 --> 00:03:44,840 Speaker 1: whether the business model for these generative AI companies would ever work. 73 00:03:45,200 --> 00:03:48,320 Speaker 1: There's a story in The Information this week that the revenues 74 00:03:48,440 --> 00:03:53,400 Speaker 1: at OpenAI are picking up dramatically.
They've grown thirty percent 75 00:03:53,520 --> 00:03:56,120 Speaker 1: so far this year, so that's in one quarter, to 76 00:03:56,160 --> 00:03:59,800 Speaker 1: around four hundred and fifteen million dollars per month in revenue. 77 00:03:59,400 --> 00:04:01,600 Speaker 3: Which is very interesting for a company that billed itself 78 00:04:01,600 --> 00:04:04,000 Speaker 3: as a nonprofit, no? But in some ways, I'm 79 00:04:04,000 --> 00:04:06,640 Speaker 3: not surprised because it feels like there's a new use 80 00:04:06,680 --> 00:04:09,960 Speaker 3: case for ChatGPT every week. I'm more invested in 81 00:04:09,960 --> 00:04:12,360 Speaker 3: it every week, and so I am sort of always 82 00:04:12,360 --> 00:04:17,080 Speaker 3: interested when I see the data reflected in my own personal 83 00:04:16,800 --> 00:04:18,960 Speaker 1: use. I mean, you get hooked on, oh my god, 84 00:04:19,120 --> 00:04:21,800 Speaker 1: what would my family photo look like as a Pixar movie? 85 00:04:22,120 --> 00:04:24,680 Speaker 1: And then it's like, huh, you know, what if 86 00:04:24,720 --> 00:04:28,280 Speaker 1: I put this legal contract through ChatGPT and asked 87 00:04:28,279 --> 00:04:29,880 Speaker 1: it to advise me on what were the key points 88 00:04:29,880 --> 00:04:31,760 Speaker 1: I should be paying attention to, which I did for 89 00:04:31,800 --> 00:04:34,159 Speaker 1: the first time last week after I used the image 90 00:04:34,160 --> 00:04:40,080 Speaker 1: generation feature. So I mean, it's a remarkable moment where 91 00:04:40,120 --> 00:04:45,120 Speaker 1: I think true consumer adoption is creating the business model, 92 00:04:45,600 --> 00:04:47,599 Speaker 1: or could be beginning to create the business model, which 93 00:04:47,600 --> 00:04:49,599 Speaker 1: is a question some people had, would that ever really 94 00:04:49,600 --> 00:04:52,920 Speaker 1: actually happen? But we are not the only people observing 95 00:04:53,000 --> 00:04:56,000 Speaker 1: what's happening. There was something actually fascinating I read last 96 00:04:56,040 --> 00:04:58,520 Speaker 1: week in the Harvard Business Review that might shock you. 97 00:04:58,560 --> 00:05:01,360 Speaker 3: So tell me about something fascinating that you actually 98 00:05:01,360 --> 00:05:04,000 Speaker 3: were able to find in the Harvard Business Review. 99 00:05:04,120 --> 00:05:08,200 Speaker 1: Well, the article was about AI use cases, and according 100 00:05:08,240 --> 00:05:11,680 Speaker 1: to Harvard Business Review, the most common reason for generative 101 00:05:11,880 --> 00:05:17,840 Speaker 1: AI usage over the last twelve months was therapy and companionship. 102 00:05:18,320 --> 00:05:20,919 Speaker 3: It's actually very funny you say this, and this is 103 00:05:20,960 --> 00:05:24,039 Speaker 3: not a setup. I actually met a woman the other 104 00:05:24,160 --> 00:05:27,880 Speaker 3: day who told me she'd created her own therapy bot, 105 00:05:28,240 --> 00:05:30,920 Speaker 3: calling it a GPT. And I've been seeing a ton 106 00:05:30,960 --> 00:05:34,080 Speaker 3: of stuff about using AI models for therapy everywhere, and 107 00:05:34,120 --> 00:05:36,479 Speaker 3: I was honestly starting to wonder if all these articles 108 00:05:36,480 --> 00:05:38,960 Speaker 3: on AI relationships were a bit blown out of proportion, 109 00:05:39,279 --> 00:05:41,120 Speaker 3: but it seems to be a very real trend.
110 00:05:41,279 --> 00:05:45,039 Speaker 1: Well, you've taken their endorsement; after slamming me for being fascinated 111 00:05:45,040 --> 00:05:47,400 Speaker 1: by the Harvard Business Review, you've been swayed by their 112 00:05:47,800 --> 00:05:49,000 Speaker 1: validation of your worldview. 113 00:05:49,120 --> 00:05:52,120 Speaker 3: I never judge where information comes from, unless it's the Harvard Business Review. 114 00:05:52,360 --> 00:05:56,680 Speaker 1: So this is a study and the methodology is quite interesting. 115 00:05:56,880 --> 00:05:58,760 Speaker 1: Some might question it, but I thought it was interesting. 116 00:05:58,960 --> 00:06:03,880 Speaker 1: They basically built a tool to scrape online forums, primarily 117 00:06:03,920 --> 00:06:06,960 Speaker 1: Reddit but also Quora and a few others, that scraped 118 00:06:07,000 --> 00:06:09,800 Speaker 1: for every single mention of how people are using AI, 119 00:06:10,240 --> 00:06:14,279 Speaker 1: categorized them, and then counted the number of posts about 120 00:06:14,360 --> 00:06:16,440 Speaker 1: each use case, you know, filtering out the garbage and 121 00:06:16,480 --> 00:06:18,960 Speaker 1: whatever else, to come up with a ranked list of 122 00:06:19,000 --> 00:06:21,400 Speaker 1: the top one hundred use cases that people are talking 123 00:06:21,400 --> 00:06:22,520 Speaker 1: about for how they use AI. 124 00:06:22,920 --> 00:06:27,680 Speaker 3: And so Reddit really is the treasure trove of this discovery. 125 00:06:28,040 --> 00:06:29,400 Speaker 1: I think it is, and I think you know there's 126 00:06:29,440 --> 00:06:32,200 Speaker 1: a reason why, which is because people post without using their 127 00:06:32,240 --> 00:06:35,080 Speaker 1: real names in a very unfiltered way on Reddit, and 128 00:06:35,120 --> 00:06:37,400 Speaker 1: so I think as a proxy for how people are 129 00:06:37,480 --> 00:06:40,320 Speaker 1: using technology, you could do a lot worse than Reddit. 130 00:06:40,800 --> 00:06:42,719 Speaker 1: And in fact, I also read in the Harvard Business 131 00:06:42,720 --> 00:06:46,039 Speaker 1: Review that today ten percent of Reddit's revenues actually come 132 00:06:46,160 --> 00:06:49,799 Speaker 1: from selling its user-generated content as training data to LLMs, 133 00:06:50,400 --> 00:06:51,839 Speaker 1: so you know that there is. 134 00:06:51,920 --> 00:06:53,679 Speaker 3: Just like our friends at The Atlantic, though, there's. 135 00:06:53,520 --> 00:06:56,240 Speaker 1: Gold in them hills. This is actually the second time 136 00:06:56,320 --> 00:06:58,920 Speaker 1: this study has been published in the Harvard Business Review. 137 00:06:58,920 --> 00:07:01,120 Speaker 1: It was published in twenty twenty four as well. What I 138 00:07:01,160 --> 00:07:05,239 Speaker 1: found really really interesting was what changed from twenty twenty 139 00:07:05,279 --> 00:07:08,840 Speaker 1: four to twenty twenty five. Thirty eight new use cases 140 00:07:08,880 --> 00:07:11,600 Speaker 1: have been added to the list, and last year's top 141 00:07:11,720 --> 00:07:15,320 Speaker 1: use case was generating ideas, which has now fallen down 142 00:07:15,360 --> 00:07:18,400 Speaker 1: to sixth place. This year, the second and third most 143 00:07:18,480 --> 00:07:21,760 Speaker 1: popular use cases were new entrants to the list. They 144 00:07:21,800 --> 00:07:24,080 Speaker 1: weren't in the top hundred mentions last year and now 145 00:07:24,080 --> 00:07:28,000 Speaker 1: they're in second and third place.
They were organizing my life, 146 00:07:28,400 --> 00:07:30,400 Speaker 1: followed by finding purpose. 147 00:07:31,000 --> 00:07:33,040 Speaker 3: You know, I think this speaks to something, which is 148 00:07:33,040 --> 00:07:36,280 Speaker 3: that people are lonely, and people don't know how to 149 00:07:36,320 --> 00:07:39,240 Speaker 3: talk to other real people about these things. And I think, 150 00:07:39,440 --> 00:07:42,000 Speaker 3: sort of like the Tinder game, right, that we 151 00:07:42,080 --> 00:07:46,880 Speaker 3: did last week, it's much easier to kind of test 152 00:07:47,000 --> 00:07:51,320 Speaker 3: these very human interactions on computers. 153 00:07:50,840 --> 00:07:53,320 Speaker 1: Absolutely, or in a computer, no judgment, a judgment-free zone. 154 00:07:53,440 --> 00:07:55,239 Speaker 3: The other thing that I would mention is that regular 155 00:07:55,280 --> 00:07:57,720 Speaker 3: therapy, the kind that you have with a human therapist, 156 00:07:58,160 --> 00:08:03,080 Speaker 3: will put your bank account into negative numbers, absolutely. And 157 00:08:03,160 --> 00:08:06,800 Speaker 3: so you may not want to constantly burden your friends 158 00:08:06,800 --> 00:08:10,240 Speaker 3: with your issues and be the Carrie Bradshaw in your 159 00:08:10,240 --> 00:08:12,320 Speaker 3: friend group. And so I can see why a person 160 00:08:12,360 --> 00:08:14,920 Speaker 3: would turn to AI to help them through a hard time. 161 00:08:15,320 --> 00:08:16,840 Speaker 3: I actually don't judge it at all. 162 00:08:17,160 --> 00:08:19,960 Speaker 1: No, no, me neither. But on the contrary, I mean, 163 00:08:20,400 --> 00:08:23,400 Speaker 1: I think what the HBR sort of pointed out as 164 00:08:23,400 --> 00:08:25,640 Speaker 1: the kind of larger takeaway in the twenty twenty four 165 00:08:25,720 --> 00:08:29,600 Speaker 1: versus twenty twenty five comparison was that in twenty twenty four, 166 00:08:29,920 --> 00:08:32,680 Speaker 1: the most popular use cases were all around quote unquote 167 00:08:32,720 --> 00:08:36,640 Speaker 1: technical assistance and troubleshooting, whereas this year they're in quote 168 00:08:36,679 --> 00:08:41,000 Speaker 1: unquote personal and professional support, which kind of mirrors what 169 00:08:41,040 --> 00:08:43,000 Speaker 1: we were talking about just a few minutes ago, which 170 00:08:43,040 --> 00:08:46,880 Speaker 1: is how AI has become more ubiquitous and therefore like 171 00:08:47,120 --> 00:08:49,560 Speaker 1: normal people are using it for more normal reasons. 172 00:08:49,840 --> 00:08:52,600 Speaker 3: I think, more ubiquitous and also like imbued with our 173 00:08:52,600 --> 00:08:55,240 Speaker 3: own humanity, as opposed to like a place where we 174 00:08:55,320 --> 00:08:57,280 Speaker 3: find how do I fix this thing? 175 00:08:57,480 --> 00:08:57,600 Speaker 2: Right? 176 00:08:58,040 --> 00:08:59,920 Speaker 3: I think it's changed from how do I fix this thing? to 177 00:09:00,160 --> 00:09:04,000 Speaker 3: how do I fix myself, which to me is both 178 00:09:04,080 --> 00:09:06,240 Speaker 3: a little bit scary and also a little bit exciting. 179 00:09:06,559 --> 00:09:10,600 Speaker 1: Absolutely, there is one really important exception to what we've 180 00:09:10,600 --> 00:09:14,200 Speaker 1: been talking about just now, which is that two use cases, 181 00:09:14,679 --> 00:09:20,080 Speaker 1: firstly generating code and secondly improving code, both had meteoric rises.
182 00:09:20,440 --> 00:09:24,040 Speaker 1: So generating code was down in lowly forty-seventh place 183 00:09:24,200 --> 00:09:27,080 Speaker 1: on last year's list and is now in fifth place. And 184 00:09:27,120 --> 00:09:30,040 Speaker 1: that brings me to my next headline, which is from Bloomberg, 185 00:09:30,160 --> 00:09:35,400 Speaker 1: under the headline AI Coding Assistant Cursor Draws a Million 186 00:09:35,480 --> 00:09:38,760 Speaker 1: Users Without Even Trying. So there's a hot new AI 187 00:09:38,800 --> 00:09:41,000 Speaker 1: startup in town, one that you've probably never heard of, 188 00:09:41,120 --> 00:09:43,320 Speaker 1: because I hadn't heard of it either until this 189 00:09:43,320 --> 00:09:47,240 Speaker 1: Bloomberg story. But also they don't advertise or market. Bloomberg 190 00:09:47,320 --> 00:09:51,160 Speaker 1: reports that Cursor hasn't spent a single dollar on paid marketing. 191 00:09:51,720 --> 00:09:54,800 Speaker 1: The startup behind it is Anysphere Inc., and they 192 00:09:54,840 --> 00:09:57,199 Speaker 1: make this AI-powered coding editor called Cursor. 193 00:09:57,559 --> 00:10:01,200 Speaker 3: Yeah, it's this popular tool for both formally trained computer 194 00:10:01,280 --> 00:10:05,560 Speaker 3: programmers and this thing that I love, vibe coders. A 195 00:10:05,600 --> 00:10:07,840 Speaker 3: coding editor, from what I understand, is a program that 196 00:10:07,880 --> 00:10:10,720 Speaker 3: does things like check your code for errors. Think of 197 00:10:10,720 --> 00:10:13,560 Speaker 3: it like spell check for coding, and then on top 198 00:10:13,640 --> 00:10:17,520 Speaker 3: of that, more recent AI coders like Cursor can also 199 00:10:17,640 --> 00:10:20,360 Speaker 3: suggest lines of code for you based on what you've 200 00:10:20,360 --> 00:10:21,240 Speaker 3: previously written. 201 00:10:21,400 --> 00:10:23,520 Speaker 1: So Gen one is like spell check and Gen two 202 00:10:23,679 --> 00:10:26,400 Speaker 1: is suggested replies for emails. 203 00:10:25,920 --> 00:10:28,840 Speaker 3: Exactly. And that's what vibe coding actually is. You just 204 00:10:28,880 --> 00:10:32,600 Speaker 3: like accept all of the suggestions versus having to correct 205 00:10:32,600 --> 00:10:35,439 Speaker 3: your code, sort of like writing a whole email in autocomplete, 206 00:10:35,440 --> 00:10:38,240 Speaker 3: which I do; when I send an email it says thank you comma 207 00:10:38,640 --> 00:10:41,840 Speaker 3: Jennifer exclamation point. That is an email that's written entirely 208 00:10:41,840 --> 00:10:44,600 Speaker 3: by autocomplete. And just to be clear, as of today, 209 00:10:44,720 --> 00:10:47,560 Speaker 3: vibe coding does not generate code at the level of 210 00:10:47,559 --> 00:10:48,120 Speaker 3: real coding. 211 00:10:48,320 --> 00:10:51,640 Speaker 1: Nonetheless, Cursor has quietly become one of the fastest growing 212 00:10:51,640 --> 00:10:55,559 Speaker 1: startups of all time. It's even used by programmers at 213 00:10:55,600 --> 00:10:59,600 Speaker 1: companies like OpenAI, Instacart, and Uber. Although most of the 214 00:10:59,600 --> 00:11:03,280 Speaker 1: revenue comes from individuals using it, not 215 00:11:03,360 --> 00:11:05,960 Speaker 1: through corporate subscriptions. They're getting their own subscription 216 00:11:06,080 --> 00:11:08,440 Speaker 1: to help them with their work.
As it turns out, 217 00:11:08,520 --> 00:11:10,680 Speaker 1: coders are willing to pay cash for a good user 218 00:11:10,679 --> 00:11:13,959 Speaker 1: interface and adaptability. The passion is real for the product. 219 00:11:13,960 --> 00:11:17,000 Speaker 1: Back in January, Anysphere reached one hundred million dollars in 220 00:11:17,080 --> 00:11:20,760 Speaker 1: annual recurring revenue. By March that number had doubled, and 221 00:11:20,840 --> 00:11:23,640 Speaker 1: over a million people are now using Cursor every single day. 222 00:11:24,000 --> 00:11:26,760 Speaker 3: I mean, we've talked about this many, many times, that 223 00:11:26,800 --> 00:11:29,880 Speaker 3: there's a lot of anxiety about AI replacing people's jobs. 224 00:11:30,080 --> 00:11:33,080 Speaker 3: Cursor is actually interesting because it's being used to help 225 00:11:33,120 --> 00:11:36,640 Speaker 3: people with their jobs. It's a productivity tool, and it's 226 00:11:36,720 --> 00:11:39,320 Speaker 3: what these AI companies have been parroting all along. 227 00:11:39,440 --> 00:11:41,760 Speaker 1: It's an interesting paradigm thing here where it's not like 228 00:11:42,080 --> 00:11:43,880 Speaker 1: here's a new product to make you redundant. It's like, 229 00:11:44,040 --> 00:11:46,040 Speaker 1: here's a new product that makes you better, so much 230 00:11:46,080 --> 00:11:47,920 Speaker 1: so that you pay your own money to use it 231 00:11:47,960 --> 00:11:49,320 Speaker 1: to make you better at your job. I mean that 232 00:11:49,400 --> 00:11:51,720 Speaker 1: is, I think, sort of a high watermark for what 233 00:11:51,760 --> 00:11:54,000 Speaker 1: we can hope for from AI use cases. 234 00:11:54,120 --> 00:11:56,360 Speaker 3: Yeah, and it's actually a concept that investors are falling 235 00:11:56,400 --> 00:11:58,560 Speaker 3: in love with. Anysphere, which is, as we said, 236 00:11:58,559 --> 00:12:01,560 Speaker 3: the parent company, has raised one hundred and seventy five 237 00:12:01,679 --> 00:12:04,400 Speaker 3: million dollars from the likes of Andreessen Horowitz as 238 00:12:04,440 --> 00:12:07,520 Speaker 3: well as one of OpenAI's co-founders, among many others. 239 00:12:07,720 --> 00:12:10,880 Speaker 3: Anysphere is in talks to raise more money at a valuation 240 00:12:10,960 --> 00:12:12,640 Speaker 3: of nearly ten billion. 241 00:12:13,080 --> 00:12:16,000 Speaker 1: Yeah, I mean it's a very competitive space. OpenAI and 242 00:12:16,160 --> 00:12:20,960 Speaker 1: Anthropic both seem to be eyeing the arena with their 243 00:12:21,040 --> 00:12:25,200 Speaker 1: own AI coding tools, but nothing beats user adoption. 244 00:12:25,320 --> 00:12:28,800 Speaker 1: So we'll see how this story unfolds for Cursor. We're 245 00:12:28,840 --> 00:12:30,880 Speaker 1: going to take a quick break now. When we come back, 246 00:12:31,480 --> 00:12:44,040 Speaker 1: Cara shares a rather surreal headline. Stay with us. 247 00:12:47,480 --> 00:12:50,719 Speaker 3: So we're back, and I'm warning you that our 248 00:12:50,760 --> 00:12:52,760 Speaker 3: next story is something that you're really gonna love. 249 00:12:52,840 --> 00:12:54,160 Speaker 1: Okay, tell me. So.
250 00:12:54,600 --> 00:12:58,320 Speaker 3: Google has a generative video model called Veo 2, and 251 00:12:58,360 --> 00:13:00,440 Speaker 3: the way it works is you type in a prompt 252 00:13:00,559 --> 00:13:03,959 Speaker 3: or feed it some media like an image, and Veo 253 00:13:03,960 --> 00:13:07,200 Speaker 3: 2 will generate a video based on these materials. So 254 00:13:07,320 --> 00:13:10,160 Speaker 3: late last year, Google announced that they were launching a 255 00:13:10,200 --> 00:13:13,439 Speaker 3: new version of the model, which includes a better grasp 256 00:13:13,480 --> 00:13:18,520 Speaker 3: of physics and more quote nuances of human movement and expression. 257 00:13:18,720 --> 00:13:22,480 Speaker 1: When a tech company tries to capture the nuance of humanity, 258 00:13:22,520 --> 00:13:24,560 Speaker 1: you always know there's going to be some 259 00:13:24,640 --> 00:13:26,240 Speaker 1: excitement in store. 260 00:13:26,080 --> 00:13:28,120 Speaker 3: Yet they still try to do it. The news is that 261 00:13:28,240 --> 00:13:31,480 Speaker 3: Google recently showed off the updated Veo 2 at the 262 00:13:31,559 --> 00:13:34,560 Speaker 3: Google Cloud Next conference, which we were not invited 263 00:13:34,720 --> 00:13:37,600 Speaker 1: to. Our invitations must have got lost in the mail. 264 00:13:37,640 --> 00:13:39,640 Speaker 3: And maybe they could have made a bigger splash if 265 00:13:39,640 --> 00:13:42,199 Speaker 3: they invited us. But they did want to make a splash, 266 00:13:42,240 --> 00:13:46,160 Speaker 3: so they decided to appeal to both you and the 267 00:13:46,280 --> 00:13:47,320 Speaker 3: art community. 268 00:13:46,960 --> 00:13:49,680 Speaker 1: High-risk bet; the art community are notoriously not such 269 00:13:49,720 --> 00:13:51,400 Speaker 1: big fans of the big tech world. 270 00:13:52,080 --> 00:13:53,360 Speaker 3: Are you going to ask me what they showed? 271 00:13:53,520 --> 00:13:54,520 Speaker 1: Yes. What did they show? 272 00:13:54,800 --> 00:13:59,880 Speaker 3: So they showed a trailer based on Salvador Dalí's unmade screenplay. 273 00:13:59,360 --> 00:14:02,679 Speaker 1: Salvador Dalí's unmade screenplay? What's that about? 274 00:14:02,840 --> 00:14:05,920 Speaker 3: So let me set the scene. I was not there, 275 00:14:05,920 --> 00:14:09,800 Speaker 3: but it's nineteen thirty seven. You're Salvador Dalí, you're in 276 00:14:09,840 --> 00:14:13,360 Speaker 3: your early thirties, you're my age, and you're hanging 277 00:14:13,360 --> 00:14:15,880 Speaker 3: out with Harpo Marx, one of the Marx Brothers, who 278 00:14:15,920 --> 00:14:16,440 Speaker 3: I adore. 279 00:14:16,520 --> 00:14:18,360 Speaker 1: This is a great collection of characters, and it's like, 280 00:14:18,600 --> 00:14:20,200 Speaker 1: you couldn't make it up that these people were hanging 281 00:14:20,200 --> 00:14:22,000 Speaker 1: out together and thinking how do we make a movie? 282 00:14:22,080 --> 00:14:24,520 Speaker 3: And my favorite character, who's of course the naysayer, also 283 00:14:24,560 --> 00:14:28,479 Speaker 3: factors into this. So you, as Dalí, write a fantastical 284 00:14:28,520 --> 00:14:30,960 Speaker 3: screenplay for the Marx Brothers to star in. But when 285 00:14:30,960 --> 00:14:34,800 Speaker 3: you bring it to MGM, Louis B. Mayer of MGM 286 00:14:35,200 --> 00:14:37,840 Speaker 3: kills the project and the screenplay is never realized as 287 00:14:37,840 --> 00:14:38,280 Speaker 3: a film.
288 00:14:38,840 --> 00:14:40,880 Speaker 1: Was there a reason for spiking it? 289 00:14:41,240 --> 00:14:44,800 Speaker 3: Well, in a world where now tentpole films are 290 00:14:44,840 --> 00:14:48,080 Speaker 3: very important, and where back then being able to sell a film 291 00:14:48,200 --> 00:14:52,160 Speaker 3: was very important, pitching a film called Giraffes on Horseback 292 00:14:52,320 --> 00:14:55,800 Speaker 3: Salad is not exactly something that they thought jumps off 293 00:14:55,840 --> 00:14:56,520 Speaker 3: the screen. 294 00:14:56,240 --> 00:14:58,880 Speaker 1: They weren't biting Dalí and the Marx Brothers' hands off for 295 00:14:58,920 --> 00:14:59,240 Speaker 1: this one. 296 00:14:59,320 --> 00:15:01,960 Speaker 3: They weren't jumping at the salad, forgive me, I'm sorry, 297 00:15:01,960 --> 00:15:04,800 Speaker 3: I couldn't help myself. But the movie itself was 298 00:15:05,360 --> 00:15:08,440 Speaker 3: kind of broadly about an aristocratic man who falls in 299 00:15:08,480 --> 00:15:12,040 Speaker 3: love with a woman from a world where dreams are reality, 300 00:15:12,360 --> 00:15:16,240 Speaker 3: and it's so surreal it was likely unfilmable at the time. 301 00:15:17,080 --> 00:15:19,080 Speaker 3: The other important piece I think that kind of made 302 00:15:19,080 --> 00:15:21,000 Speaker 3: me laugh is that some of the Marx Brothers like 303 00:15:21,120 --> 00:15:22,280 Speaker 3: didn't even think it was funny. 304 00:15:22,280 --> 00:15:26,440 Speaker 1: They were like, eh. So it never got made, not until. 305 00:15:26,240 --> 00:15:29,400 Speaker 3: Now. Until now, Google is trying to, at least; right now, 306 00:15:29,400 --> 00:15:32,280 Speaker 3: it's just a trailer, because this is a big feat. Uh, 307 00:15:32,520 --> 00:15:34,920 Speaker 3: it's a trailer that is produced by the Dalí Museum 308 00:15:35,200 --> 00:15:38,160 Speaker 3: and Goodby Silverstein & Partners. And I'm gonna play you 309 00:15:38,320 --> 00:15:40,720 Speaker 3: just a little sample of the trailer, and then you're 310 00:15:40,720 --> 00:15:43,400 Speaker 3: gonna tell me if you think this is true to 311 00:15:43,480 --> 00:15:45,120 Speaker 3: one of your, I don't know if he's one of 312 00:15:45,120 --> 00:15:48,280 Speaker 3: your favorite artists, but you do like him, Catalan culture. 313 00:15:48,440 --> 00:15:49,640 Speaker 3: So let's see. 314 00:15:51,080 --> 00:15:55,280 Speaker 1: Let me tell you about the strangest movie never made, 315 00:15:56,440 --> 00:15:59,360 Speaker 1: the one the world wasn't ready for, Giraffes on. 316 00:16:00,800 --> 00:16:01,480 Speaker 3: I called it. 317 00:16:03,280 --> 00:16:12,120 Speaker 2: Horseback Salad. Prepare yourself, because the impossible is coming to life. 318 00:16:13,560 --> 00:16:14,920 Speaker 3: Tell me a little bit about what you saw. 319 00:16:15,040 --> 00:16:19,360 Speaker 1: Well, there's a giraffe on fire. There is obviously Salvador 320 00:16:19,520 --> 00:16:23,600 Speaker 1: Dalí's voice recreated. It does have the kind of vibe 321 00:16:23,840 --> 00:16:28,000 Speaker 1: of a movie trailer.
However, the aesthetic is one that 322 00:16:28,120 --> 00:16:32,280 Speaker 1: I think would make Salvador Dalí personally turn in his grave, 323 00:16:32,680 --> 00:16:36,200 Speaker 1: because it is basically what you might imagine one of 324 00:16:36,240 --> 00:16:40,680 Speaker 1: the camps at Burning Man might reinterpret Salvador Dalí through 325 00:16:40,720 --> 00:16:43,520 Speaker 1: the lens of. It is so on-the-nose quote 326 00:16:43,600 --> 00:16:45,960 Speaker 1: unquote surrealist, with people wearing like funny hats. 327 00:16:46,080 --> 00:16:48,640 Speaker 3: It looks like clouds, people on fire. 328 00:16:48,880 --> 00:16:51,560 Speaker 1: It's got a very Burning Man vibe. 329 00:16:51,600 --> 00:16:54,640 Speaker 3: It also has a very Google commercial vibe, yes, which 330 00:16:54,680 --> 00:16:57,200 Speaker 3: I think isn't the greatest thing ever. But yeah, I 331 00:16:57,200 --> 00:17:00,520 Speaker 3: think fever dream is how I'd describe it. And actually, 332 00:17:00,600 --> 00:17:03,640 Speaker 3: as you said, the narration is supposed to be Salvador Dalí, 333 00:17:07,119 --> 00:17:08,280 Speaker 3: there's enough of his audio. 334 00:17:08,280 --> 00:17:10,560 Speaker 1: Does he actually sound like that? Who knows? Did we 335 00:17:10,560 --> 00:17:11,640 Speaker 1: in fact check his voice? 336 00:17:11,720 --> 00:17:13,480 Speaker 3: We did. I did listen to it. Well, I did 337 00:17:13,520 --> 00:17:17,680 Speaker 3: listen to him on a radio program a little bit later, 338 00:17:18,000 --> 00:17:21,080 Speaker 3: and he does sound a little bit like that, you know. 339 00:17:21,320 --> 00:17:23,000 Speaker 3: But I always think about him twirling his. 340 00:17:23,080 --> 00:17:25,480 Speaker 1: Mustache, like a character in Ratatouille. 341 00:17:25,680 --> 00:17:29,240 Speaker 3: There's no evidence to me that this is exactly what 342 00:17:29,280 --> 00:17:32,000 Speaker 3: Salvador Dalí sounds like. But I think they definitely tried 343 00:17:32,000 --> 00:17:33,840 Speaker 3: to recreate his voice, which we know by now is 344 00:17:33,920 --> 00:17:34,919 Speaker 3: like... what? 345 00:17:35,200 --> 00:17:36,960 Speaker 1: They tried to do it. They probably did do it. Yeah. 346 00:17:37,320 --> 00:17:40,159 Speaker 3: According to ARTnews, this isn't the first time that 347 00:17:40,200 --> 00:17:43,280 Speaker 3: there's actually been an AI-generated Dalí. The same Dalí 348 00:17:43,320 --> 00:17:46,560 Speaker 3: Museum that's co-producing this movie actually had an exhibition 349 00:17:46,680 --> 00:17:50,560 Speaker 3: called Ask Dalí that allowed museumgoers to talk to him, 350 00:17:50,920 --> 00:17:54,360 Speaker 3: meaning to talk to an AI created based on his voice. 351 00:17:54,720 --> 00:17:57,560 Speaker 3: At the time, the museum COO said that if these 352 00:17:57,560 --> 00:18:00,720 Speaker 3: technologies had been around when Dalí was alive, he would 353 00:18:00,760 --> 00:18:01,720 Speaker 3: have played around with them. 354 00:18:02,240 --> 00:18:06,880 Speaker 1: My take was, it's not horrible, but it's pretty horrible. 355 00:18:07,160 --> 00:18:10,320 Speaker 1: And I don't disagree that Dalí would have played around 356 00:18:10,320 --> 00:18:14,520 Speaker 1: with AI tools if he were around today, but I 357 00:18:14,520 --> 00:18:17,040 Speaker 1: don't think he would have used them like this.
I mean, 358 00:18:17,080 --> 00:18:19,480 Speaker 1: the whole point of Dalí as a painter was that 359 00:18:19,560 --> 00:18:22,120 Speaker 1: he made the medium new. He basically reinvented the medium 360 00:18:22,119 --> 00:18:26,480 Speaker 1: of painting through his incredibly interesting and unforgettable surrealist take. 361 00:18:26,600 --> 00:18:29,080 Speaker 1: So I think if he did use generative AI tools, 362 00:18:29,280 --> 00:18:33,439 Speaker 1: he would have come up with a considerably more interesting application than this, 363 00:18:33,480 --> 00:18:35,959 Speaker 1: which ultimately feels a bit derivative. You would imagine if 364 00:18:36,000 --> 00:18:38,679 Speaker 1: an artist like Dalí used generative AI, he would have 365 00:18:38,720 --> 00:18:41,160 Speaker 1: done so to critique or to push the medium forward, 366 00:18:41,280 --> 00:18:43,840 Speaker 1: or to make us really think, versus to kind of, 367 00:18:44,600 --> 00:18:45,040 Speaker 1: you know. 368 00:18:45,440 --> 00:18:48,200 Speaker 3: Make a gimmick, make a gimmick. Yeah, that's very true. 369 00:18:48,480 --> 00:18:49,920 Speaker 3: Can we run through the rest of the headlines? 370 00:18:50,000 --> 00:18:54,160 Speaker 1: Yes. And can I start? Yes. So Ars Technica reports 371 00:18:54,160 --> 00:18:57,399 Speaker 1: that Nvidia is producing AI chips for the first time 372 00:18:57,680 --> 00:19:01,080 Speaker 1: outside of Taiwan. Blackwell chips are now being made 373 00:19:01,119 --> 00:19:06,560 Speaker 1: at TSMC, that's the Taiwan Semiconductor Manufacturing Company, at their 374 00:19:06,560 --> 00:19:10,560 Speaker 1: plant in Phoenix, Arizona, and other companies in the state 375 00:19:10,640 --> 00:19:13,920 Speaker 1: will test and package these chips. Nvidia also announced that 376 00:19:13,960 --> 00:19:18,560 Speaker 1: they plan to build complete supercomputers on US soil, and 377 00:19:18,600 --> 00:19:20,720 Speaker 1: there are reports of a promise of up to five hundred 378 00:19:20,800 --> 00:19:24,760 Speaker 1: billion dollars of investment in US AI infrastructure having been 379 00:19:24,760 --> 00:19:28,439 Speaker 1: agreed to at a dinner between Jensen Huang and you 380 00:19:28,560 --> 00:19:33,680 Speaker 1: know who recently at Mar-a-Lago. In return, Huang reportedly 381 00:19:33,720 --> 00:19:37,760 Speaker 1: hoped to avoid yet more stringent export controls on chips 382 00:19:37,800 --> 00:19:41,360 Speaker 1: to China, but seemingly that hope has not materialized. 383 00:19:41,440 --> 00:19:43,040 Speaker 3: I do wonder what's going to happen there, so please 384 00:19:43,119 --> 00:19:45,119 Speaker 3: keep me posted. I also do wonder if the food 385 00:19:45,200 --> 00:19:46,320 Speaker 3: is good at Mar-a-Lago. 386 00:19:46,440 --> 00:19:48,119 Speaker 1: Yeah, I bet it's old school. 387 00:19:48,560 --> 00:19:53,120 Speaker 3: Well, it's food that has to befit a discussion about semiconductors, 388 00:19:53,400 --> 00:19:56,359 Speaker 3: so it can't stand out too much. So if you 389 00:19:56,400 --> 00:20:00,000 Speaker 3: are a big enough nerd to be covetous of Samsung 390 00:20:00,000 --> 00:20:04,960 Speaker 3: technology, but you're loyal to the Apple ecosystem, 391 00:20:04,960 --> 00:20:08,880 Speaker 3: there may be a phone for you.
392 00:20:09,240 --> 00:20:11,640 Speaker 3: 9to5Mac reports that there are rumors of, 393 00:20:12,359 --> 00:20:17,280 Speaker 3: and this is very mind-blowing, a foldable iPhone, which 394 00:20:17,280 --> 00:20:22,560 Speaker 3: has been dubbed, very surprisingly, the iPhone Fold. Be like, 395 00:20:22,600 --> 00:20:27,120 Speaker 3: I need the Fold Pro. I have to say this, 396 00:20:27,560 --> 00:20:30,000 Speaker 3: and I hope it's not too embarrassing. If I do 397 00:20:30,080 --> 00:20:32,800 Speaker 3: get this, I will be walking around saying my iPhone 398 00:20:32,800 --> 00:20:34,760 Speaker 3: don't jiggle jiggle, it folds. 399 00:20:35,520 --> 00:20:35,720 Speaker 1: But no. 400 00:20:35,840 --> 00:20:38,800 Speaker 3: According to the one report, the rumored phone may look 401 00:20:38,840 --> 00:20:41,720 Speaker 3: like a normal iPhone but expand to roughly the size 402 00:20:41,760 --> 00:20:45,520 Speaker 3: of an iPad Mini when unfolded to its full size. 403 00:20:45,560 --> 00:20:49,959 Speaker 3: It also might feature a Touch ID-enabled power button. 404 00:20:50,760 --> 00:20:53,439 Speaker 3: The most important thing in the reporting here is that 405 00:20:53,600 --> 00:20:55,720 Speaker 3: I plan to use it as a picnic blanket in 406 00:20:55,760 --> 00:20:58,359 Speaker 3: Central Park if it ever does come to fruition and 407 00:20:58,440 --> 00:21:01,000 Speaker 3: doesn't overheat. I hate an overheated phone. 408 00:21:01,160 --> 00:21:02,639 Speaker 1: Yeah. You have to wonder what this one is. Is it 409 00:21:02,680 --> 00:21:04,200 Speaker 1: really going to come out, or is it gonna be 410 00:21:04,200 --> 00:21:07,719 Speaker 1: another one of these Apple projects since Steve Jobs, may 411 00:21:07,760 --> 00:21:10,159 Speaker 1: he rest in peace, passed, that have never seen the 412 00:21:10,240 --> 00:21:10,760 Speaker 1: light of day. 413 00:21:10,880 --> 00:21:14,200 Speaker 3: And speaking of Dalí rolling over in his grave, this 414 00:21:14,240 --> 00:21:17,440 Speaker 3: is the stone pillow in Steve Jobs's grave. 415 00:21:17,680 --> 00:21:20,320 Speaker 1: I always like to talk about tech stories that inspire 416 00:21:20,359 --> 00:21:23,440 Speaker 1: me and capture my imagination, and there was a report 417 00:21:23,440 --> 00:21:26,840 Speaker 1: in the Washington Post this week about scientists who have 418 00:21:26,920 --> 00:21:32,080 Speaker 1: been using technology to uncover the remains of past civilizations 419 00:21:32,440 --> 00:21:35,160 Speaker 1: in the Amazon. So, this team of archaeologists in South 420 00:21:35,160 --> 00:21:39,960 Speaker 1: America is using lidar, or light detection and ranging, which 421 00:21:40,000 --> 00:21:43,639 Speaker 1: is basically a laser sensor that can see through dense 422 00:21:43,680 --> 00:21:47,560 Speaker 1: forests from above, either with planes or drones, to find 423 00:21:47,760 --> 00:21:51,239 Speaker 1: hidden structures beneath the canopy. You know, you don't need 424 00:21:51,280 --> 00:21:53,680 Speaker 1: to be Indiana Jones anymore. You can just fly drones. 425 00:21:54,640 --> 00:21:58,480 Speaker 1: The team has found a lost Portuguese colony and ceramics made 426 00:21:58,520 --> 00:22:01,440 Speaker 1: by an indigenous society. And what I find particularly cool 427 00:22:01,440 --> 00:22:04,520 Speaker 1: here is that these findings are being used to protect 428 00:22:04,560 --> 00:22:07,920 Speaker 1: the rainforest from clearing and logging.
You can't protect the 429 00:22:08,000 --> 00:22:10,720 Speaker 1: rainforest on its own terms, sadly, but if it has 430 00:22:10,920 --> 00:22:14,000 Speaker 1: archaeologically significant ruins beneath, it turns out you can. 431 00:22:14,480 --> 00:22:17,040 Speaker 3: Finally, if you live in Silicon Valley, you may have 432 00:22:17,080 --> 00:22:20,800 Speaker 3: heard eerily familiar voices when crossing the street. I'm obsessed with 433 00:22:20,840 --> 00:22:23,840 Speaker 3: this story. Palo Alto 434 00:22:23,920 --> 00:22:27,960 Speaker 3: Online reports that last weekend some crosswalk buttons seem to have been hacked, 435 00:22:28,400 --> 00:22:30,840 Speaker 3: which is something I would notice immediately. I just wish 436 00:22:30,880 --> 00:22:33,399 Speaker 3: they were hacked by Elmo. Some people cross the street 437 00:22:33,440 --> 00:22:35,760 Speaker 3: to the sound of Elon Musk and Mark Zuckerberg, who 438 00:22:35,800 --> 00:22:36,960 Speaker 3: I now call Mark Zinfandel. 439 00:22:37,680 --> 00:22:39,800 Speaker 1: So you press the button and rather than a walk 440 00:22:39,840 --> 00:22:40,480 Speaker 1: sign is on. 441 00:22:41,080 --> 00:22:44,720 Speaker 3: It's like, walk sign is on. I don't know, I 442 00:22:44,760 --> 00:22:47,000 Speaker 3: don't know if that was a Zuckerberg or an Elon Musk. 443 00:22:47,240 --> 00:22:50,840 Speaker 3: I can't really do a Musk impersonation, but some in 444 00:22:50,920 --> 00:22:54,320 Speaker 3: Silicon Valley heard an impression of Elon Musk welcoming them 445 00:22:54,359 --> 00:22:57,720 Speaker 3: to Palo Alto, while others heard fake Mark Zuckerberg say 446 00:22:58,200 --> 00:23:01,840 Speaker 3: it's normal to feel uncomfortable, even violated, as we forcefully 447 00:23:01,840 --> 00:23:04,879 Speaker 3: insert AI into every facet of your conscious experience. And 448 00:23:04,960 --> 00:23:06,879 Speaker 3: I just want to assure you you don't need to 449 00:23:06,920 --> 00:23:10,000 Speaker 3: worry because there's absolutely nothing you can do to stop it. 450 00:23:10,600 --> 00:23:13,760 Speaker 1: I love this guerrilla anti-tech marketing campaign. I have 451 00:23:13,800 --> 00:23:19,000 Speaker 1: no idea how you hack a crosswalk light, but props 452 00:23:19,000 --> 00:23:19,560 Speaker 1: to these people. 453 00:23:19,640 --> 00:23:21,920 Speaker 3: If there's any place it's going to happen, it's Palo Alto. 454 00:23:22,200 --> 00:23:26,800 Speaker 1: I like your Zuck impression, Elizabeth Holmes. It reminds me of 455 00:23:26,800 --> 00:23:30,119 Speaker 1: one of the most iconic ever Zuckerberg moments, when he 456 00:23:30,200 --> 00:23:33,040 Speaker 1: was testifying before the Senate a few years ago and 457 00:23:33,119 --> 00:23:38,200 Speaker 1: was asked how Facebook's business works, to which he responded, Senator, we run ads. 458 00:23:38,280 --> 00:23:42,760 Speaker 1: Mark Zuckerberg is back in the spotlight this week because 459 00:23:42,920 --> 00:23:46,080 Speaker 1: Meta is facing an antitrust trial brought by the Federal 460 00:23:46,160 --> 00:23:49,399 Speaker 1: Trade Commission, and Zuckerberg has been on the stand this 461 00:23:49,480 --> 00:23:52,840 Speaker 1: week in Washington.
Naomi Nix of the Washington Post has 462 00:23:52,880 --> 00:24:12,680 Speaker 1: more on the trial when we come back. Stay with us. So, Cara, 463 00:24:12,680 --> 00:24:14,600 Speaker 1: it seems like every week that we make this show, 464 00:24:14,960 --> 00:24:17,440 Speaker 1: there are headlines we cover, and then there's the kind 465 00:24:17,480 --> 00:24:20,399 Speaker 1: of headline that kind of emerges as the biggest 466 00:24:20,400 --> 00:24:21,040 Speaker 1: story of the week. 467 00:24:21,320 --> 00:24:24,359 Speaker 3: Right, last week was tariff mania, which is the gift 468 00:24:24,400 --> 00:24:25,679 Speaker 3: that keeps giving. 469 00:24:26,600 --> 00:24:30,000 Speaker 1: And this week it's Meta's day in court, or rather 470 00:24:30,320 --> 00:24:33,240 Speaker 1: many weeks in court. And the trial's actually been nearly 471 00:24:33,280 --> 00:24:34,359 Speaker 1: five years in the making. 472 00:24:34,760 --> 00:24:36,560 Speaker 3: It's been one of those news stories that's been like 473 00:24:36,600 --> 00:24:38,280 Speaker 3: in the back of my mind, like I will just 474 00:24:38,320 --> 00:24:40,639 Speaker 3: be going about my business and then say, oh wait, 475 00:24:41,119 --> 00:24:43,040 Speaker 3: did the government sue Meta? What happened to that? 476 00:24:43,320 --> 00:24:45,359 Speaker 1: Yeah? And it's also, I mean, when you and I 477 00:24:45,359 --> 00:24:48,280 Speaker 1: first started working on kind of tech journalism together in 478 00:24:48,359 --> 00:24:51,960 Speaker 1: twenty eighteen, twenty nineteen, the idea that the government would 479 00:24:51,960 --> 00:24:57,480 Speaker 1: ever effectively regulate technology companies was at best a fantasy 480 00:24:57,560 --> 00:24:59,920 Speaker 1: or a pipe dream, and now it seems to be 481 00:25:00,240 --> 00:25:04,520 Speaker 1: happening in real time, albeit slowly. And this also comes 482 00:25:04,520 --> 00:25:07,360 Speaker 1: at a time when people are questioning whether the judiciary 483 00:25:07,400 --> 00:25:11,000 Speaker 1: will continue to be, you know, a functioning pillar of government. 484 00:25:11,080 --> 00:25:13,840 Speaker 1: But right now, at least in the realm of business, 485 00:25:13,920 --> 00:25:16,840 Speaker 1: it sure is. Back in twenty twenty, the Federal Trade 486 00:25:16,880 --> 00:25:20,240 Speaker 1: Commission sued Meta, as you said, and if you recall, 487 00:25:20,480 --> 00:25:24,480 Speaker 1: Meta had previously purchased Instagram and WhatsApp in the years 488 00:25:24,560 --> 00:25:27,720 Speaker 1: leading up to this. The FTC argued that Meta had 489 00:25:27,720 --> 00:25:31,920 Speaker 1: acquired these companies specifically in order to strangle competition, which 490 00:25:31,920 --> 00:25:33,480 Speaker 1: is illegal under antitrust laws. 491 00:25:33,680 --> 00:25:36,200 Speaker 3: Yeah, the lawsuit began on Monday in federal court, and 492 00:25:36,240 --> 00:25:38,919 Speaker 3: over the next many weeks, DC will be filled with 493 00:25:38,960 --> 00:25:43,760 Speaker 3: some star witnesses, including Meta CEO Mark Zuckerberg, who took 494 00:25:43,840 --> 00:25:45,040 Speaker 3: the stand this week. 495 00:25:45,160 --> 00:25:47,879 Speaker 1: Here to help us understand how we got here and what's going 496 00:25:47,960 --> 00:25:51,440 Speaker 1: to come next is Naomi Nix.
She's a staff writer 497 00:25:51,560 --> 00:25:54,280 Speaker 1: at The Washington Post, where she covers social media companies, 498 00:25:54,600 --> 00:25:57,520 Speaker 1: particularly Meta. So Naomi, I can imagine this has been 499 00:25:57,560 --> 00:25:59,800 Speaker 1: an absolutely crazy week for you. 500 00:26:00,119 --> 00:26:03,120 Speaker 2: Yes, it's been very busy, but it's been very interesting. 501 00:26:03,560 --> 00:26:06,359 Speaker 1: I can only imagine. But take us back to twenty 502 00:26:06,440 --> 00:26:10,400 Speaker 1: twenty when the FTC sued Meta and how this all began. 503 00:26:11,160 --> 00:26:11,520 Speaker 3: Yeah. 504 00:26:11,600 --> 00:26:13,240 Speaker 2: So remember at the time, there was a lot of 505 00:26:13,280 --> 00:26:16,959 Speaker 2: conversation among regulators about whether big tech companies had gotten 506 00:26:17,000 --> 00:26:21,280 Speaker 2: too big, right, whether they had stifled competition from upstarts, 507 00:26:21,320 --> 00:26:26,399 Speaker 2: whether they were preferencing their products over potential competitors, and 508 00:26:26,440 --> 00:26:28,199 Speaker 2: so a lot of big tech companies were sort of 509 00:26:28,240 --> 00:26:31,840 Speaker 2: wrapped up in that uproar, and Meta was one of them. 510 00:26:32,040 --> 00:26:35,760 Speaker 2: And back then the FTC, under the tail end of 511 00:26:35,880 --> 00:26:42,040 Speaker 2: Donald Trump's administration, sued Meta, seeking to break the company 512 00:26:42,080 --> 00:26:46,800 Speaker 2: up, apart from Instagram and WhatsApp, and we've been sort 513 00:26:46,800 --> 00:26:49,359 Speaker 2: of locked into this antitrust battle ever since. 514 00:26:50,080 --> 00:26:52,160 Speaker 1: So this is a continuation of that same case. 515 00:26:52,720 --> 00:26:53,160 Speaker 3: It is. 516 00:26:53,320 --> 00:26:57,840 Speaker 2: So what happened was the judge initially put out a 517 00:26:57,920 --> 00:27:00,000 Speaker 2: ruling that said, you know, I don't think the FTC 518 00:27:00,480 --> 00:27:05,639 Speaker 2: has given enough evidence to establish that there's really a 519 00:27:05,680 --> 00:27:08,840 Speaker 2: case here. And so when the Biden administration took over, 520 00:27:08,960 --> 00:27:12,560 Speaker 2: Lina Khan was chair of the FTC, the commission filed 521 00:27:12,560 --> 00:27:16,000 Speaker 2: an amended lawsuit, and at that point the courts allowed 522 00:27:16,000 --> 00:27:18,359 Speaker 2: it to proceed, and you know, they said, look, I 523 00:27:18,359 --> 00:27:21,080 Speaker 2: think you now have established enough evidence that there is 524 00:27:21,320 --> 00:27:24,560 Speaker 2: a potential case that Meta has a monopoly in the 525 00:27:24,600 --> 00:27:26,399 Speaker 2: personal social networking market. 526 00:27:26,920 --> 00:27:29,760 Speaker 1: So this is potentially bipartisan enforcement, which I want to 527 00:27:29,800 --> 00:27:32,040 Speaker 1: come back to. But just first off, as somebody who 528 00:27:32,040 --> 00:27:36,840 Speaker 1: hasn't been following this that closely, why was Meta allowed 529 00:27:36,880 --> 00:27:41,040 Speaker 1: to buy Instagram in twenty twelve and WhatsApp in twenty 530 00:27:41,160 --> 00:27:43,960 Speaker 1: fourteen, and didn't the regulators, in a sense, like, isn't 531 00:27:44,000 --> 00:27:46,560 Speaker 1: this a case of closing the stable door after the horse 532 00:27:46,600 --> 00:27:49,080 Speaker 1: has gone, if they didn't block these transactions at the time?
533 00:27:49,359 --> 00:27:51,920 Speaker 2: That's certainly a case and an argument that Meta has 534 00:27:51,960 --> 00:27:54,639 Speaker 2: been making. They're like, you know, one of the dangers, 535 00:27:54,640 --> 00:27:57,639 Speaker 2: they argue, is that the message the FTC 536 00:27:57,840 --> 00:28:00,720 Speaker 2: is sending to the general marketplace is that no 537 00:28:00,880 --> 00:28:05,520 Speaker 2: merger is final. That after an antitrust regulator deems, like, yeah, 538 00:28:05,560 --> 00:28:08,240 Speaker 2: you're allowed to buy that company, a decade after 539 00:28:08,280 --> 00:28:10,840 Speaker 2: the fact they can decide to change their minds. And 540 00:28:10,880 --> 00:28:13,480 Speaker 2: in some ways that is what's happening here. But I 541 00:28:13,520 --> 00:28:17,040 Speaker 2: do think the politics around antitrust law, and the 542 00:28:17,119 --> 00:28:21,359 Speaker 2: conversation about whether the United States' antitrust laws are 543 00:28:21,560 --> 00:28:25,960 Speaker 2: up to date with the current technology industry, is one 544 00:28:26,000 --> 00:28:30,399 Speaker 2: of the reasons both Republican- and Democratic-nominated antitrust 545 00:28:30,400 --> 00:28:33,560 Speaker 2: regulators have been willing to take a chance on this case. 546 00:28:34,240 --> 00:28:37,439 Speaker 3: So you've been in the courtroom the past few days. 547 00:28:38,160 --> 00:28:41,480 Speaker 3: What is the FTC's argument against Meta right now? 548 00:28:41,720 --> 00:28:44,800 Speaker 2: They're making a couple of points. One is they're saying, look, 549 00:28:45,200 --> 00:28:48,880 Speaker 2: we actually think that Meta has a monopoly in what 550 00:28:49,000 --> 00:28:52,640 Speaker 2: the FTC has decided is called a personal social networking 551 00:28:52,720 --> 00:28:57,800 Speaker 2: market. That's mostly a market of tech platforms that 552 00:28:57,880 --> 00:29:02,120 Speaker 2: are designed to facilitate personal communication among your friends and families. 553 00:29:02,600 --> 00:29:04,600 Speaker 2: And so what the FTC is saying is they actually 554 00:29:04,600 --> 00:29:09,280 Speaker 2: think that Meta's next biggest competitor is Snapchat. They don't 555 00:29:09,280 --> 00:29:12,160 Speaker 2: include TikTok in that, they don't include YouTube in that, 556 00:29:12,200 --> 00:29:15,600 Speaker 2: they don't include X in that. They're saying that Meta 557 00:29:15,680 --> 00:29:18,480 Speaker 2: has a monopoly in that market. And they're saying that 558 00:29:18,520 --> 00:29:23,800 Speaker 2: the company acted anti-competitively when it bought WhatsApp and Instagram. 559 00:29:24,120 --> 00:29:26,720 Speaker 2: And they rely on a lot of internal emails to 560 00:29:26,760 --> 00:29:30,000 Speaker 2: suggest that the goal that Mark Zuckerberg had at the 561 00:29:30,040 --> 00:29:36,280 Speaker 2: time wasn't necessarily to improve Instagram or to help users 562 00:29:36,280 --> 00:29:38,680 Speaker 2: have more choice in the social media market, but to 563 00:29:38,760 --> 00:29:42,280 Speaker 2: neutralize a potential competitor. And they're saying that all those 564 00:29:42,320 --> 00:29:45,959 Speaker 2: actions have hurt consumers, that if Meta hadn't done these things, 565 00:29:46,240 --> 00:29:49,280 Speaker 2: that we as Internet users would have more vibrant social 566 00:29:49,320 --> 00:29:51,080 Speaker 2: media options at our disposal.
567 00:29:51,640 --> 00:29:53,920 Speaker 3: What is Meta's counterargument here? 568 00:29:54,440 --> 00:29:57,040 Speaker 2: They say that the FTC misses the mark when it 569 00:29:57,080 --> 00:30:00,600 Speaker 2: defines the marketplace, that they're not just competing against Snapchat, 570 00:30:00,640 --> 00:30:03,680 Speaker 2: that they are competing against TikTok and YouTube and all 571 00:30:03,760 --> 00:30:09,000 Speaker 2: these other Internet platforms, and that that ecosystem is robust 572 00:30:09,440 --> 00:30:10,640 Speaker 2: and vibrant. 573 00:30:10,480 --> 00:30:12,040 Speaker 1: The attention economy, so to speak. 574 00:30:12,120 --> 00:30:16,360 Speaker 2: Yeah, yeah, exactly, exactly. You know, Mark Zuckerberg earlier sort 575 00:30:16,400 --> 00:30:21,400 Speaker 2: of talked to the court about how when TikTok was rising, 576 00:30:21,480 --> 00:30:24,240 Speaker 2: the company really had to double down and introduce its 577 00:30:24,280 --> 00:30:27,800 Speaker 2: own product called Instagram Reels, right, and that that has 578 00:30:27,840 --> 00:30:32,040 Speaker 2: been a competitive space for the company ever since, as 579 00:30:32,080 --> 00:30:34,680 Speaker 2: a piece of evidence to show that, like, this industry 580 00:30:34,720 --> 00:30:35,960 Speaker 2: is quite thriving. 581 00:30:36,120 --> 00:30:39,800 Speaker 1: Now without getting too inside baseball, I mean, the history 582 00:30:39,840 --> 00:30:43,800 Speaker 1: of antitrust in the last few years, before Lina 583 00:30:43,920 --> 00:30:46,760 Speaker 1: Khan came in as the FTC chair under President Biden, 584 00:30:47,280 --> 00:30:50,560 Speaker 1: was basically all around consumer price, like harm was defined 585 00:30:50,640 --> 00:30:54,720 Speaker 1: as consumers having to pay more because of monopoly. However, 586 00:30:55,160 --> 00:30:57,680 Speaker 1: Meta's services are free. And I think one of the 587 00:30:57,680 --> 00:31:01,240 Speaker 1: interesting things here is the FTC's position vis-à-vis 588 00:31:01,240 --> 00:31:04,240 Speaker 1: the tech industry was considered to be a big reason 589 00:31:04,360 --> 00:31:08,000 Speaker 1: why the tech industry broke for Trump so dramatically in 590 00:31:08,000 --> 00:31:10,640 Speaker 1: the most recent election. So there's a kind of irony 591 00:31:10,680 --> 00:31:13,520 Speaker 1: here around the kind of the FTC's theory of the 592 00:31:13,600 --> 00:31:16,560 Speaker 1: case being so consistent with how the last administration 593 00:31:16,760 --> 00:31:19,960 Speaker 1: viewed regulating monopolies. Can you speak a little bit about 594 00:31:19,720 --> 00:31:22,360 Speaker 2: that? Yeah, I mean it's interesting. There's both sort of 595 00:31:22,360 --> 00:31:25,320 Speaker 2: a political element here and a legal element. It is 596 00:31:25,360 --> 00:31:28,400 Speaker 2: true that, like, the courts have traditionally relied on price 597 00:31:28,840 --> 00:31:31,400 Speaker 2: as a measure of harm, and what they're doing here 598 00:31:31,440 --> 00:31:33,680 Speaker 2: is kind of novel. They're saying, actually, we think harm 599 00:31:33,720 --> 00:31:36,200 Speaker 2: can be measured in the fact that, you know, Meta 600 00:31:36,280 --> 00:31:39,920 Speaker 2: maybe didn't improve users' privacy. You know, one of the 601 00:31:40,000 --> 00:31:43,440 Speaker 2: points they've made in their opening arguments
was that after 602 00:31:43,560 --> 00:31:48,280 Speaker 2: Cambridge Analytica, consumers were really unhappy with Facebook and yet 603 00:31:48,320 --> 00:31:51,240 Speaker 2: they still used it, which means they didn't have enough options. 604 00:31:51,600 --> 00:31:53,920 Speaker 2: But I think there is a political element here, to 605 00:31:54,000 --> 00:31:57,440 Speaker 2: your point, which is that the tech companies, and Meta and Mark 606 00:31:57,480 --> 00:32:01,000 Speaker 2: Zuckerberg in particular, made a big bet on Trump, right, 607 00:32:01,040 --> 00:32:03,600 Speaker 2: in the election, and you know we've reported and other 608 00:32:03,680 --> 00:32:06,080 Speaker 2: news outlets have reported that, you know, he went to 609 00:32:06,080 --> 00:32:09,520 Speaker 2: the White House in hopes of encouraging the Trump administration 610 00:32:09,600 --> 00:32:13,640 Speaker 2: to encourage the FTC to resolve the lawsuit before a trial, 611 00:32:13,880 --> 00:32:17,480 Speaker 2: but was ultimately unsuccessful. To drive this point home, 612 00:32:17,600 --> 00:32:21,280 Speaker 2: remember back to when Trump was first elected in twenty sixteen, 613 00:32:21,880 --> 00:32:25,520 Speaker 2: how the tech industry reacted then, right, like we saw 614 00:32:25,600 --> 00:32:30,280 Speaker 2: them come together and form Forward, the immigration reform organization. 615 00:32:30,840 --> 00:32:34,880 Speaker 2: There was really this sort of groundswell of activity by 616 00:32:34,880 --> 00:32:39,440 Speaker 2: the tech industry, both worker-led but also CEO-led, to 617 00:32:39,480 --> 00:32:41,640 Speaker 2: be willing to kind of stand their ground in the 618 00:32:41,680 --> 00:32:45,760 Speaker 2: face of potential attacks from the Trump administration. And this 619 00:32:45,880 --> 00:32:48,760 Speaker 2: time around we saw a much different tone. Right, So, Mark 620 00:32:48,800 --> 00:32:52,880 Speaker 2: Zuckerberg called Trump a badass during the campaign over how 621 00:32:52,920 --> 00:32:55,600 Speaker 2: he handled the shooting attempt on his life. The company 622 00:32:55,600 --> 00:32:58,200 Speaker 2: gave a million dollars to the inauguration committee, he dined 623 00:32:58,240 --> 00:33:01,920 Speaker 2: with Trump, and then I think in January he backed 624 00:33:02,000 --> 00:33:05,000 Speaker 2: up that rhetoric with a lot of policy. They scrapped 625 00:33:05,040 --> 00:33:09,160 Speaker 2: the fact-checking program, the DEI programs. He said in 626 00:33:09,200 --> 00:33:11,960 Speaker 2: a video that they were hoping to partner with the 627 00:33:11,960 --> 00:33:16,760 Speaker 2: Trump administration to go after international regulators who are introducing 628 00:33:16,920 --> 00:33:20,560 Speaker 2: what he termed, like, onerous regulations. And so the 629 00:33:20,600 --> 00:33:25,800 Speaker 2: tone has been really, really remarkably positive and has attracted 630 00:33:25,840 --> 00:33:32,080 Speaker 2: some compliments from Trump allies and Trump himself. But the 631 00:33:32,160 --> 00:33:35,680 Speaker 2: compliments only go so far. What the company obviously wants 632 00:33:35,880 --> 00:33:41,120 Speaker 2: is regulations to change, and so far that hasn't 633 00:33:41,280 --> 00:33:46,000 Speaker 2: been an outcome of this olive branch from Mark Zuckerberg 634 00:33:46,040 --> 00:33:49,640 Speaker 2: and Meta.
And I think this latest trial is another 635 00:33:49,760 --> 00:33:52,920 Speaker 2: piece of evidence that the companies haven't yet, to put 636 00:33:52,920 --> 00:33:54,560 Speaker 2: it crudely, gotten what they've paid for. 637 00:33:55,000 --> 00:33:56,880 Speaker 3: So can you tell us a little bit about the 638 00:33:56,920 --> 00:34:00,360 Speaker 3: atmosphere in the courtroom this week? You know, we're taping midday. 639 00:34:00,400 --> 00:34:03,360 Speaker 3: What have the first few days been like in the courtroom? 640 00:34:03,040 --> 00:34:04,719 Speaker 1: Cara, sorry to jump on you, but also we all 641 00:34:04,760 --> 00:34:08,080 Speaker 1: saw the photo of Mark Zuckerberg arriving in what looked 642 00:34:08,080 --> 00:34:11,200 Speaker 1: like a presidential limousine and was in beast mode wearing 643 00:34:11,239 --> 00:34:14,480 Speaker 1: his own Meta Ray-Bans, looking very much the part. 644 00:34:14,760 --> 00:34:17,480 Speaker 3: I heard the Succession theme music as he arrived 645 00:34:17,480 --> 00:34:19,040 Speaker 3: at the courtroom. 646 00:34:19,560 --> 00:34:23,160 Speaker 2: It has been, you know... on the point about Mark's look, 647 00:34:23,520 --> 00:34:26,560 Speaker 2: it's been interesting because he's had to look at videos 648 00:34:26,719 --> 00:34:28,800 Speaker 2: of, like, himself talking about some of these mergers in the past, 649 00:34:28,840 --> 00:34:31,160 Speaker 2: and so you're seeing sort of like the new look, 650 00:34:31,320 --> 00:34:33,239 Speaker 2: you know, looking back at the old version of him. 651 00:34:33,560 --> 00:34:37,840 Speaker 2: But particularly I think on Tuesday, there was a really 652 00:34:38,920 --> 00:34:44,799 Speaker 2: heated exchange I think between the FTC lawyer and Mark Zuckerberg, 653 00:34:45,000 --> 00:34:48,879 Speaker 2: and we saw repeatedly the lawyer really try to pin 654 00:34:49,160 --> 00:34:53,360 Speaker 2: Mark Zuckerberg down to essentially admit to some of the 655 00:34:53,400 --> 00:34:56,279 Speaker 2: reasoning that he laid out in his emails at the 656 00:34:56,320 --> 00:34:58,560 Speaker 2: time for wanting to buy Instagram, you know, because the 657 00:34:58,600 --> 00:35:02,360 Speaker 2: reality is he does talk about it in those competitive terms. 658 00:35:02,400 --> 00:35:04,720 Speaker 2: Back then, Meta at the time was trying to build 659 00:35:04,719 --> 00:35:08,960 Speaker 2: its own camera app and it wasn't going so well. 660 00:35:09,160 --> 00:35:12,560 Speaker 2: They were disappointed with the results of that program, and 661 00:35:12,600 --> 00:35:15,160 Speaker 2: so at the time, Mark is like, maybe we should 662 00:35:15,200 --> 00:35:17,359 Speaker 2: just, like, buy Instagram. 663 00:35:17,560 --> 00:35:19,359 Speaker 3: Right, I was going to say, in the scheme of things, 664 00:35:19,360 --> 00:35:21,640 Speaker 3: it's actually much better that they just spent a cool 665 00:35:21,680 --> 00:35:24,400 Speaker 3: billion dollars to have one of the most popular apps 666 00:35:24,440 --> 00:35:25,240 Speaker 3: in the app store. 667 00:35:25,600 --> 00:35:31,080 Speaker 2: Yeah, and it is also rapidly becoming their key to 668 00:35:31,360 --> 00:35:34,960 Speaker 2: retaining a really important audience, which is young people, in 669 00:35:35,000 --> 00:35:36,239 Speaker 2: a way that Facebook is not.
670 00:35:37,239 --> 00:35:39,960 Speaker 3: Can you talk a little bit more about some of 671 00:35:40,040 --> 00:35:43,279 Speaker 3: the other high-profile cases against big tech that are 672 00:35:43,280 --> 00:35:45,800 Speaker 3: going on right now? Like, where do these lawsuits stand? 673 00:35:46,120 --> 00:35:49,080 Speaker 2: Yeah, I mean, I think the biggest example is Google, right? 674 00:35:49,320 --> 00:35:51,840 Speaker 2: You know, right, last year a federal court essentially 675 00:35:51,920 --> 00:35:54,719 Speaker 2: ruled that the Justice Department was right to say that 676 00:35:54,960 --> 00:35:58,080 Speaker 2: Google violated antitrust law because it had created 677 00:35:58,120 --> 00:36:02,160 Speaker 2: these sort of restrictive contracts with Apple and other device 678 00:36:02,239 --> 00:36:05,799 Speaker 2: makers to essentially require them to install Google as the 679 00:36:05,800 --> 00:36:09,680 Speaker 2: default search engine on their smartphones. And what the judge 680 00:36:09,680 --> 00:36:12,680 Speaker 2: said is, like, you know, this prevented rivals from competing 681 00:36:13,239 --> 00:36:17,840 Speaker 2: on a level playing field. And so Google had argued, look, actually, 682 00:36:17,840 --> 00:36:20,399 Speaker 2: our search engine has lots of competition. People are looking 683 00:36:20,480 --> 00:36:23,120 Speaker 2: for information in lots of different places, whether it's on 684 00:36:23,160 --> 00:36:26,840 Speaker 2: Amazon and TikTok and Reddit and other Internet, you know, search engines. 685 00:36:27,120 --> 00:36:30,439 Speaker 2: But essentially the judge ruled against them, and so where 686 00:36:30,440 --> 00:36:32,719 Speaker 2: we are now is we have to figure out, like, 687 00:36:32,800 --> 00:36:37,440 Speaker 2: what the remedy is to that antitrust violation. I think, 688 00:36:37,600 --> 00:36:39,040 Speaker 2: you know, one of the things that's a little bit 689 00:36:39,080 --> 00:36:41,239 Speaker 2: easier about that case than this case is that it 690 00:36:41,320 --> 00:36:45,160 Speaker 2: probably was easier for the Justice Department to establish that 691 00:36:45,239 --> 00:36:49,840 Speaker 2: Google had maintained a monopoly in the search business, in 692 00:36:49,880 --> 00:36:52,360 Speaker 2: a way that I think the discussion around what defines 693 00:36:52,400 --> 00:36:55,319 Speaker 2: social media and what defines the social media market for 694 00:36:55,400 --> 00:36:58,080 Speaker 2: Meta is a little bit more fraught. But this is 695 00:36:58,120 --> 00:37:03,200 Speaker 2: like one of the biggest wins, right, for antitrust regulators 696 00:37:03,239 --> 00:37:06,160 Speaker 2: against a big tech company in the two decades since the 697 00:37:06,200 --> 00:37:11,040 Speaker 2: Microsoft case. And whether that will continue, whether the Justice 698 00:37:11,120 --> 00:37:14,480 Speaker 2: Department and the FTC and antitrust regulators around the world 699 00:37:14,520 --> 00:37:16,600 Speaker 2: will be able to continue to rack up wins, is an 700 00:37:16,640 --> 00:37:17,320 Speaker 2: open question.
701 00:37:18,120 --> 00:37:20,720 Speaker 1: What I find so personally fascinating about this story 702 00:37:20,800 --> 00:37:24,440 Speaker 1: is it has both the drama of the tick-tock of 703 00:37:24,480 --> 00:37:27,840 Speaker 1: week-by-week politics and of courtroom testimony and stuff, 704 00:37:27,880 --> 00:37:31,320 Speaker 1: but it also exists in this kind of grander sweep 705 00:37:31,440 --> 00:37:33,560 Speaker 1: of the history of the twentieth century. I mean, you 706 00:37:33,600 --> 00:37:37,640 Speaker 1: had IBM subject to antitrust action in the sixties, 707 00:37:37,680 --> 00:37:41,560 Speaker 1: which led to them unbundling their hardware and software business, 708 00:37:42,000 --> 00:37:45,719 Speaker 1: which created a software marketplace, which in turn allowed Microsoft 709 00:37:45,760 --> 00:37:49,320 Speaker 1: to rise. Microsoft in turn was subject to antitrust litigation in 710 00:37:49,400 --> 00:37:52,080 Speaker 1: the late nineties, early two thousands, which meant that they 711 00:37:52,160 --> 00:37:55,440 Speaker 1: couldn't force Internet Explorer down the throats of consumers using 712 00:37:55,480 --> 00:37:59,200 Speaker 1: the Windows operating system. This in turn allowed Google to emerge, 713 00:37:59,440 --> 00:38:02,960 Speaker 1: and now Google and Meta are facing antitrust action of 714 00:38:03,000 --> 00:38:04,839 Speaker 1: their own. So I was wondering if you can kind 715 00:38:04,840 --> 00:38:08,000 Speaker 1: of reflect on where this sits in the history of 716 00:38:08,160 --> 00:38:08,960 Speaker 1: tech and government. 717 00:38:09,800 --> 00:38:12,880 Speaker 2: I think at a time when the industry is really 718 00:38:12,960 --> 00:38:19,320 Speaker 2: focused on artificial intelligence and generative artificial intelligence, these cases 719 00:38:19,480 --> 00:38:21,840 Speaker 2: that are going on now really have the power to 720 00:38:21,920 --> 00:38:25,680 Speaker 2: kind of reset the competition around who's going to rise 721 00:38:25,680 --> 00:38:28,360 Speaker 2: and fall in this marketplace, who's going to have the 722 00:38:28,400 --> 00:38:32,440 Speaker 2: ability to make the right deals, make the right mergers, 723 00:38:33,000 --> 00:38:35,839 Speaker 2: in order to assume top place in the AI race, 724 00:38:35,880 --> 00:38:38,279 Speaker 2: which they all want to do. And I think it's 725 00:38:38,320 --> 00:38:43,439 Speaker 2: also unclear how a spinoff would affect the company, right? 726 00:38:43,760 --> 00:38:47,040 Speaker 2: There was a really poignant moment on Tuesday in which 727 00:38:47,560 --> 00:38:50,359 Speaker 2: Mark Zuckerberg was being asked about an internal message he'd 728 00:38:50,400 --> 00:38:53,440 Speaker 2: sent in twenty eighteen where he said, and I'm just 729 00:38:53,480 --> 00:38:55,600 Speaker 2: going to read the quote, "As calls to break up 730 00:38:55,640 --> 00:38:58,680 Speaker 2: the big tech companies grow, there is a non-trivial 731 00:38:58,840 --> 00:39:01,840 Speaker 2: chance that we will be forced to spin out Instagram 732 00:39:01,880 --> 00:39:04,840 Speaker 2: and perhaps WhatsApp in the next five years." And he 733 00:39:04,960 --> 00:39:08,520 Speaker 2: later said that while most companies resist these kinds of breakups, 734 00:39:09,200 --> 00:39:13,040 Speaker 2: companies actually perform better after they split up.
And if 735 00:39:13,040 --> 00:39:15,840 Speaker 2: you think about it, people might have written Microsoft off for 736 00:39:15,960 --> 00:39:19,560 Speaker 2: dead, right, years ago after their antitrust battle. Is anyone 737 00:39:19,600 --> 00:39:24,360 Speaker 2: really saying that now? No, this is just one part 738 00:39:24,400 --> 00:39:26,200 Speaker 2: of that story. 739 00:39:26,280 --> 00:39:27,960 Speaker 3: This is the first week of what might be a 740 00:39:27,960 --> 00:39:31,120 Speaker 3: two-month trial, and we are nowhere near a ruling. Just 741 00:39:31,160 --> 00:39:34,000 Speaker 3: in your opinion, as somebody who is attending these things, 742 00:39:34,040 --> 00:39:36,400 Speaker 3: like, what do we think could happen, and what are 743 00:39:36,440 --> 00:39:38,720 Speaker 3: the things that you're looking out for? 744 00:39:39,040 --> 00:39:40,879 Speaker 2: I think one thing I'm going to be really paying 745 00:39:40,880 --> 00:39:46,400 Speaker 2: attention to is how the judge asks questions about market size, 746 00:39:46,880 --> 00:39:50,680 Speaker 2: because I think the definition of the market is one 747 00:39:50,680 --> 00:39:53,439 Speaker 2: of the hardest parts. The FTC has to prove 748 00:39:53,960 --> 00:39:57,640 Speaker 2: that the market that Meta operates in is this sort of 749 00:39:57,680 --> 00:40:01,480 Speaker 2: personal social networking market, and I'm not sure that the 750 00:40:01,520 --> 00:40:04,880 Speaker 2: tech industry has understood it that way, and certainly Meta 751 00:40:04,920 --> 00:40:08,560 Speaker 2: hasn't understood it that way. Zuckerberg's own lawyer asked him 752 00:40:08,800 --> 00:40:10,799 Speaker 2: if he had even heard of that phrase before the 753 00:40:10,880 --> 00:40:12,040 Speaker 2: FTC's lawsuit. 754 00:40:12,400 --> 00:40:13,000 Speaker 1: He said no. 755 00:40:13,960 --> 00:40:17,520 Speaker 2: And so I think how the judge asks questions about 756 00:40:18,000 --> 00:40:20,879 Speaker 2: how the marketplace is structured and defined is, like, one 757 00:40:20,920 --> 00:40:23,160 Speaker 2: issue that I'm definitely going to be paying attention to. 758 00:40:23,880 --> 00:40:27,279 Speaker 2: Another thing is just, like, honestly, these top executives that 759 00:40:27,280 --> 00:40:30,000 Speaker 2: we're expected to hear from. We're expecting to hear from 760 00:40:30,040 --> 00:40:33,800 Speaker 2: Sheryl Sandberg, we're expecting to hear from Instagram head Adam Mosseri. 761 00:40:34,560 --> 00:40:38,600 Speaker 2: We might expect to hear from Sequoia Capital and Google 762 00:40:38,760 --> 00:40:42,640 Speaker 2: and whatnot. And so I am very curious to see 763 00:40:42,640 --> 00:40:45,080 Speaker 2: how they talk about how they were thinking about these 764 00:40:45,120 --> 00:40:48,799 Speaker 2: mergers at the time. If there is some more relenting 765 00:40:49,280 --> 00:40:54,000 Speaker 2: from Meta executives about their true motivations for making these purchases, 766 00:40:54,040 --> 00:40:57,160 Speaker 2: I think that'll be interesting to watch. And then, like, 767 00:40:57,440 --> 00:41:01,760 Speaker 2: it's boring, but I think how the two sides talk about harm, 768 00:41:02,080 --> 00:41:05,440 Speaker 2: and how users are harmed or not harmed. And it 769 00:41:05,560 --> 00:41:08,680 Speaker 2: feels like this game of, like, what if, right? What 770 00:41:08,760 --> 00:41:12,799 Speaker 2: if these mergers hadn't happened? What would have happened to Instagram?
771 00:41:13,480 --> 00:41:15,799 Speaker 2: Would it be as popular as, like, Snapchat is right now? 772 00:41:15,880 --> 00:41:18,680 Speaker 2: Or would it be an even bigger global industry threat? 773 00:41:19,600 --> 00:41:22,359 Speaker 1: What a fascinating moment. Naomi, thank you for joining us. 774 00:41:22,360 --> 00:41:23,120 Speaker 2: Thanks for having me. 775 00:41:30,560 --> 00:41:32,640 Speaker 1: That's it for this week for Tech Stuff. I'm Oz 776 00:41:32,719 --> 00:41:34,240 Speaker 1: Valoshian and I'm Kara Price. 777 00:41:34,600 --> 00:41:37,719 Speaker 3: This episode was produced by Eliza Dennis and Victoria Domingez. 778 00:41:38,000 --> 00:41:41,040 Speaker 3: It was executive produced by me, Oz Valoshian, and Kate 779 00:41:41,080 --> 00:41:45,360 Speaker 3: Osborne for Kaleidoscope, and Katrina Norvel for iHeart Podcasts. Our 780 00:41:45,440 --> 00:41:48,840 Speaker 3: engineer is Bihied Frasier, and Kyle Murdoch mixed this episode. 781 00:41:49,000 --> 00:41:50,160 Speaker 3: He also wrote our theme song. 782 00:41:50,600 --> 00:41:53,879 Speaker 1: Join us next Wednesday for Tech Stuff: The Story, when 783 00:41:53,880 --> 00:41:56,920 Speaker 1: we'll share an in-depth conversation with the former editor of 784 00:41:56,960 --> 00:42:01,520 Speaker 1: The Financial Times, Lionel Barber, about his book Gambling Man. 785 00:42:01,840 --> 00:42:05,279 Speaker 1: It's all about the enigmatic founder of SoftBank, Masayoshi Son, 786 00:42:05,840 --> 00:42:08,879 Speaker 1: and we'll also talk about Masayoshi Son's relationships with Sam 787 00:42:08,960 --> 00:42:10,760 Speaker 1: Altman and the Stargate Project. 788 00:42:11,040 --> 00:42:13,839 Speaker 3: Please rate, review, and reach out to us at tech 789 00:42:13,880 --> 00:42:16,839 Speaker 3: stuff podcast at gmail dot com. We want to hear 790 00:42:16,880 --> 00:42:17,200 Speaker 3: from you.