1 00:00:00,040 --> 00:00:02,640 Speaker 1: Thanks for tuning into Tech Stuff. If you don't recognize 2 00:00:02,680 --> 00:00:05,440 Speaker 1: my voice, my name is Oz Woloshyn and I'm here 3 00:00:05,480 --> 00:00:08,640 Speaker 1: because the inimitable Jonathan Strickland has passed the baton to 4 00:00:08,720 --> 00:00:12,160 Speaker 1: Cara Price and myself to host Tech Stuff. The show 5 00:00:12,160 --> 00:00:15,040 Speaker 1: will remain your home for all things tech, and all 6 00:00:15,080 --> 00:00:18,360 Speaker 1: the old episodes will remain available in this feed. Thanks 7 00:00:18,400 --> 00:00:23,000 Speaker 1: for listening. Welcome to Tech Stuff, a production of iHeart 8 00:00:23,000 --> 00:00:29,200 Speaker 1: Podcasts and Kaleidoscope. I'm Oz Woloshyn. Today, Cara Price and 9 00:00:29,240 --> 00:00:32,280 Speaker 1: I will bring you the headlines of the week, including 10 00:00:32,400 --> 00:00:37,320 Speaker 1: new personalized pricing methods and Ohio's investment in AI weapons. 11 00:00:37,920 --> 00:00:40,440 Speaker 1: On today's Tech Support segment, we'll talk to 404 12 00:00:40,520 --> 00:00:43,960 Speaker 1: Media's Jason Koebler about Mark Zuckerberg's role in a 13 00:00:44,000 --> 00:00:47,159 Speaker 1: TikTok ban, and then we're back with another When Did 14 00:00:47,159 --> 00:00:50,040 Speaker 1: This Become a Thing? This time we're looking into the 15 00:00:50,120 --> 00:00:53,479 Speaker 1: history of the PayPal Mafia, all of that on This 16 00:00:53,520 --> 00:01:02,200 Speaker 1: Week in Tech. It's Friday, January twenty fourth. So, Cara, 17 00:01:02,200 --> 00:01:02,680 Speaker 1: we're back. 18 00:01:02,840 --> 00:01:05,039 Speaker 2: We are back, and so is TikTok. 19 00:01:04,640 --> 00:01:08,399 Speaker 1: And so is President Trump. That's right, a lot of 20 00:01:08,440 --> 00:01:10,039 Speaker 1: things returning. 21 00:01:10,120 --> 00:01:11,160 Speaker 2: It's the great return. 
22 00:01:11,360 --> 00:01:13,080 Speaker 1: So it's time for our news round up. Who's going 23 00:01:13,080 --> 00:01:13,600 Speaker 1: first today? 24 00:01:13,680 --> 00:01:17,680 Speaker 2: Me? Okay, you went first last time. 25 00:01:17,760 --> 00:01:18,880 Speaker 1: That's true. What have you been reading? 26 00:01:19,120 --> 00:01:21,840 Speaker 2: So you like to read Daily Mail and I like 27 00:01:21,920 --> 00:01:24,360 Speaker 2: to read Scientific American like it's US Weekly at the 28 00:01:24,400 --> 00:01:26,600 Speaker 2: nail salon. I don't get my nails done, but I 29 00:01:26,640 --> 00:01:30,120 Speaker 2: would read Scientific American if I did. And I saw 30 00:01:30,160 --> 00:01:35,520 Speaker 2: this thing that, in a way, I've always been paranoid 31 00:01:35,520 --> 00:01:38,080 Speaker 2: that this is going on, but this confirmed my paranoia. 32 00:01:38,200 --> 00:01:38,560 Speaker 1: Okay. 33 00:01:40,800 --> 00:01:44,520 Speaker 2: The FTC, the Federal Trade Commission, which was 34 00:01:44,680 --> 00:01:47,080 Speaker 2: run by Lina Khan, who we talk about often, has 35 00:01:48,240 --> 00:01:53,680 Speaker 2: released a report about something called surveillance pricing. What has 36 00:01:53,760 --> 00:01:58,400 Speaker 2: been found is that if you shop online, you may 37 00:01:58,400 --> 00:02:01,920 Speaker 2: be getting charged a different price for certain items than somebody 38 00:02:02,000 --> 00:02:04,360 Speaker 2: else might be at the same store. 39 00:02:04,680 --> 00:02:07,280 Speaker 1: So, in other words, the retailer knows that you might 40 00:02:07,360 --> 00:02:10,639 Speaker 1: be less price sensitive and therefore charges you more, correct? 41 00:02:10,680 --> 00:02:14,679 Speaker 1: And Lina Khan's FTC thinks this is not right. 42 00:02:15,280 --> 00:02:17,960 Speaker 2: I think they're actually more interested in looking into if 43 00:02:18,000 --> 00:02:21,680 Speaker 2: it's happening, and it clearly is happening. 
What they have 44 00:02:22,240 --> 00:02:26,160 Speaker 2: concluded is that this is similar to the technology being 45 00:02:26,200 --> 00:02:28,079 Speaker 2: used to feed you targeted ads, which we 46 00:02:28,160 --> 00:02:32,320 Speaker 2: talked a lot about on Sleepwalkers. And targeted marketing uses 47 00:02:32,400 --> 00:02:35,520 Speaker 2: things like our browsing history to serve us ads for 48 00:02:35,600 --> 00:02:38,200 Speaker 2: things that we were just thinking about, like, oh, I 49 00:02:38,200 --> 00:02:41,600 Speaker 2: need snow boots. Now I have these snow boots in my Instagram feed. 50 00:02:42,080 --> 00:02:45,280 Speaker 2: This would work similarly. You know, if you have a MasterCard, 51 00:02:45,280 --> 00:02:49,160 Speaker 2: for example, that stores a lot of information on you, 52 00:02:49,560 --> 00:02:52,079 Speaker 2: the types of things you spend money on, where you live, 53 00:02:52,520 --> 00:02:55,960 Speaker 2: your banking information. All of the information that is accessible 54 00:02:55,960 --> 00:02:59,120 Speaker 2: to companies via these third party providers, whether it be 55 00:02:59,160 --> 00:03:03,000 Speaker 2: a FICO credit score or a credit card company, is 56 00:03:03,080 --> 00:03:06,800 Speaker 2: now allowing retailers to make decisions on pricing that I 57 00:03:06,800 --> 00:03:10,360 Speaker 2: don't think the average American consumer is used to. I 58 00:03:10,360 --> 00:03:12,560 Speaker 2: think another really interesting thing that we didn't get in 59 00:03:12,639 --> 00:03:16,320 Speaker 2: the actual report and that the reporter from Scientific American 60 00:03:16,400 --> 00:03:19,280 Speaker 2: elaborated on is that this also might be happening in 61 00:03:19,280 --> 00:03:20,320 Speaker 2: brick and mortar stores. 62 00:03:20,720 --> 00:03:21,520 Speaker 1: How is that possible? 
63 00:03:21,560 --> 00:03:24,400 Speaker 2: That could be because price tags used to be something 64 00:03:24,440 --> 00:03:26,000 Speaker 2: that, if you think about that thing that would, like, 65 00:03:26,000 --> 00:03:28,800 Speaker 2: put the price on, that in and of itself is 66 00:03:28,840 --> 00:03:32,520 Speaker 2: now becoming digitized, so you can actually change a price 67 00:03:33,000 --> 00:03:36,920 Speaker 2: that's on a computer tag, essentially, exactly, and those prices 68 00:03:36,960 --> 00:03:41,240 Speaker 2: are, you know, based on both expiration dates and consumer demand. 69 00:03:41,600 --> 00:03:44,839 Speaker 1: Luckily, billionaires never do their own shopping, that's right. 70 00:03:45,240 --> 00:03:48,920 Speaker 2: But Walmart, which is actually not being investigated, claims that 71 00:03:49,440 --> 00:03:53,280 Speaker 2: its digital price tags can be updated remotely within minutes. 72 00:03:53,640 --> 00:03:55,960 Speaker 2: It's just an amazing thing to think about, which is 73 00:03:56,000 --> 00:03:58,920 Speaker 2: like, you usually go into a store, you trust the 74 00:03:58,920 --> 00:04:01,720 Speaker 2: price of the thing. You understand that inflation might affect 75 00:04:01,720 --> 00:04:04,320 Speaker 2: the thing, you understand that something might be on sale. 76 00:04:04,680 --> 00:04:07,960 Speaker 2: You know, this is different. This is literally real time 77 00:04:08,200 --> 00:04:13,080 Speaker 2: price, not gouging, but price changing. And there's one other 78 00:04:13,160 --> 00:04:17,279 Speaker 2: really interesting thing, which is these Instacart smart shopping carts. 79 00:04:17,520 --> 00:04:20,560 Speaker 2: And these are actual shopping carts that have scanners and 80 00:04:20,600 --> 00:04:24,440 Speaker 2: screens on them. These are screens that advertise to you 81 00:04:24,520 --> 00:04:28,400 Speaker 2: while you're shopping. 
Wow, and then they connect directly to 82 00:04:28,520 --> 00:04:30,239 Speaker 2: your credit card payment system. 83 00:04:30,400 --> 00:04:33,160 Speaker 1: I'm actually looking at it now, and the tagline is you'll 84 00:04:33,200 --> 00:04:35,359 Speaker 1: feel like a magician each time you add an item 85 00:04:35,400 --> 00:04:36,200 Speaker 1: and pay on the cart. 86 00:04:36,400 --> 00:04:43,320 Speaker 2: This idea of, like, frictionless shopping is not all about ease. 87 00:04:43,760 --> 00:04:45,960 Speaker 1: No, it's about selling you more stuff. I mean, I 88 00:04:45,960 --> 00:04:50,039 Speaker 1: would guess the beginning of fixed prices is probably a 89 00:04:50,520 --> 00:04:53,880 Speaker 1: late nineteenth century phenomenon, and before that, like, it was 90 00:04:53,880 --> 00:04:54,839 Speaker 1: a haggle every time. 91 00:04:55,120 --> 00:04:57,320 Speaker 2: That's right, and you know what you were just saying, 92 00:04:57,320 --> 00:05:00,240 Speaker 2: that was actually mentioned in the Scientific American article. They 93 00:05:00,279 --> 00:05:06,239 Speaker 2: went back to, like, early BC bazaars, when people would 94 00:05:06,279 --> 00:05:10,240 Speaker 2: just, in Yiddish we call it handlen, where it was 95 00:05:10,279 --> 00:05:12,599 Speaker 2: like there was no set price. It was on the 96 00:05:12,600 --> 00:05:15,320 Speaker 2: basis of what somebody was deciding something cost that day, 97 00:05:15,400 --> 00:05:17,760 Speaker 2: or how much could you negotiate or finagle. 98 00:05:18,120 --> 00:05:20,680 Speaker 1: And now we're kind of returning to the point where 99 00:05:20,760 --> 00:05:23,760 Speaker 1: it's a haggle every time, except we're conditioned to believe 100 00:05:23,839 --> 00:05:27,360 Speaker 1: that the prices are fixed and our opponent, slash merchant, 101 00:05:27,400 --> 00:05:29,960 Speaker 1: has perfect information about us and we have no information 102 00:05:30,000 --> 00:05:33,320 Speaker 1: about them. 
So this sounds pretty unfair. 103 00:05:33,520 --> 00:05:35,800 Speaker 2: I want to say that this is not totally negative, right? 104 00:05:35,960 --> 00:05:39,080 Speaker 2: Like, demographic data can actually benefit people who are looking 105 00:05:39,080 --> 00:05:42,640 Speaker 2: for financial aid, for example. And so in the FTC report, 106 00:05:43,120 --> 00:05:45,599 Speaker 2: it wasn't like, oh my god, this is the worst 107 00:05:45,600 --> 00:05:47,480 Speaker 2: thing ever, we must change this, we must take this away. 108 00:05:47,600 --> 00:05:51,000 Speaker 2: It was simply a probe into what are the ways 109 00:05:51,000 --> 00:05:54,360 Speaker 2: in which data that can then be harnessed by the 110 00:05:54,400 --> 00:05:58,680 Speaker 2: power of machine learning is affecting the way we shop. 111 00:05:59,640 --> 00:06:04,640 Speaker 2: It is possible that, using our demographic data, using our 112 00:06:04,680 --> 00:06:08,800 Speaker 2: credit card information, it's possible that something might cost more 113 00:06:08,839 --> 00:06:11,520 Speaker 2: for me than it might cost for you. And I 114 00:06:11,520 --> 00:06:14,760 Speaker 2: think, again, it's just something to be aware of, and I'm, 115 00:06:14,839 --> 00:06:17,080 Speaker 2: you know, as we report on this show, very aware of, 116 00:06:17,120 --> 00:06:21,760 Speaker 2: like, how are small little uses of machine learning actually 117 00:06:21,839 --> 00:06:26,200 Speaker 2: affecting the daily lives of Internet users, which we basically 118 00:06:26,240 --> 00:06:27,279 Speaker 2: all are at this point. 119 00:06:27,480 --> 00:06:30,080 Speaker 1: Yeah, and you know, the bargain we've had with the 120 00:06:30,120 --> 00:06:33,000 Speaker 1: tech companies for some time is we give them our 121 00:06:33,080 --> 00:06:36,320 Speaker 1: data and they give us their free services. 
But if 122 00:06:36,320 --> 00:06:38,640 Speaker 1: it starts to become a world in which, like, we're 123 00:06:38,640 --> 00:06:40,719 Speaker 1: actually charged on the back end for the way we 124 00:06:40,720 --> 00:06:43,120 Speaker 1: give up our data, it makes you think very differently 125 00:06:43,160 --> 00:06:45,800 Speaker 1: about, you know, Internet privacy, VPNs and all those kinds 126 00:06:45,800 --> 00:06:46,120 Speaker 1: of things. 127 00:06:46,320 --> 00:06:49,760 Speaker 2: Yeah, and this is, I think, the FTC being like, 128 00:06:49,880 --> 00:06:52,800 Speaker 2: how free is technology? You know, how free is it 129 00:06:52,839 --> 00:06:54,839 Speaker 2: to us? And like, what is the cost? What is 130 00:06:54,880 --> 00:06:56,640 Speaker 2: the cost that we pay, the hidden cost that we 131 00:06:56,680 --> 00:07:01,320 Speaker 2: pay as technology users? Is it just a friction free 132 00:07:01,360 --> 00:07:04,720 Speaker 2: shopping experience? And the answer is usually no. So what's 133 00:07:04,760 --> 00:07:06,440 Speaker 2: up with you this week? What are you talking about? 134 00:07:06,480 --> 00:07:08,680 Speaker 1: So last week you showed me that video of a 135 00:07:08,720 --> 00:07:12,960 Speaker 1: gun being remotely maneuvered and fired all by voice command 136 00:07:13,080 --> 00:07:15,920 Speaker 1: using ChatGPT, and then we got talking about how 137 00:07:16,000 --> 00:07:18,760 Speaker 1: OpenAI has gone back and forth about whether or 138 00:07:18,800 --> 00:07:22,880 Speaker 1: not their products are allowed to be used to develop weapons. Yes, yes, 139 00:07:23,000 --> 00:07:27,360 Speaker 1: I looked into this a bit. 
And in October last year, AFRICOM, 140 00:07:27,440 --> 00:07:31,120 Speaker 1: which is the US military command in Africa, who do 141 00:07:31,320 --> 00:07:35,640 Speaker 1: a lot of drone surveillance and, I think, strikes on terrorists, 142 00:07:36,160 --> 00:07:40,040 Speaker 1: signed this multi billion dollar contract with Microsoft that included 143 00:07:40,200 --> 00:07:44,960 Speaker 1: OpenAI tools. However, when The Intercept asked OpenAI 144 00:07:45,040 --> 00:07:47,720 Speaker 1: for comment, they denied they had a partnership with the US 145 00:07:47,840 --> 00:07:52,280 Speaker 1: military and referred questions up to Daddy Microsoft. 146 00:07:52,600 --> 00:07:55,280 Speaker 2: So they're really the Russian nesting doll. 147 00:07:55,480 --> 00:08:00,000 Speaker 1: This is the Matryoshka of evading responsibility for weapons development. 148 00:08:00,760 --> 00:08:04,280 Speaker 1: But then in December, OpenAI did in fact sign 149 00:08:04,400 --> 00:08:09,080 Speaker 1: a direct contract with a company that develops weapons, Anduril. 150 00:08:08,880 --> 00:08:10,320 Speaker 2: We talked about them last week. 151 00:08:10,560 --> 00:08:14,160 Speaker 1: The company is named after Aragorn's sword in Lord of 152 00:08:14,160 --> 00:08:18,240 Speaker 1: the Rings, in case you're wondering. I was, a lot. 153 00:08:18,120 --> 00:08:19,960 Speaker 2: An LOTR flex, good old Palmer Luckey. 154 00:08:20,120 --> 00:08:23,800 Speaker 1: That's right, Palmer Luckey. Sam Altman, who runs OpenAI, said, 155 00:08:24,080 --> 00:08:27,320 Speaker 1: quote, OpenAI builds to benefit as many people as 156 00:08:27,320 --> 00:08:31,760 Speaker 1: possible and supports US led efforts to ensure the technology 157 00:08:31,920 --> 00:08:36,480 Speaker 1: upholds democratic values. Unbelievable. First as tragedy, then as farce, 158 00:08:36,520 --> 00:08:42,000 Speaker 1: I think, on this upholding values with weapons. 
But Anduril, 159 00:08:42,040 --> 00:08:44,440 Speaker 1: as usual you're early to the story and were already talking 160 00:08:44,440 --> 00:08:48,120 Speaker 1: about it last week, have had quite a couple of months: obviously, 161 00:08:48,160 --> 00:08:51,360 Speaker 1: the big contract with OpenAI in December. This week, 162 00:08:51,520 --> 00:08:54,280 Speaker 1: the New York Times reported they're planning to build a 163 00:08:54,400 --> 00:08:57,920 Speaker 1: one billion dollar factory in Ohio, which is set to 164 00:08:57,960 --> 00:09:02,280 Speaker 1: bring four thousand jobs and will build tens of thousands 165 00:09:02,320 --> 00:09:06,640 Speaker 1: of autonomous systems and weapons over the coming years: swarming 166 00:09:06,760 --> 00:09:11,079 Speaker 1: cruise missiles for use abroad and, at home, surveillance towers. 167 00:09:11,320 --> 00:09:14,440 Speaker 1: Anduril's been part of the kind of border surveillance efforts 168 00:09:14,480 --> 00:09:19,000 Speaker 1: for some time. It's remarkable to me how this burst 169 00:09:19,040 --> 00:09:22,160 Speaker 1: of AI development over the last couple of years has, 170 00:09:22,200 --> 00:09:25,640 Speaker 1: in a sense, remodeled the American landscape. 171 00:09:25,840 --> 00:09:28,800 Speaker 2: Yeah, and I think it's not just a tech story anymore. 172 00:09:28,920 --> 00:09:31,200 Speaker 2: We're talking about American cities and infrastructure. 173 00:09:31,280 --> 00:09:33,600 Speaker 1: But I mean, people often... I remember the old question 174 00:09:33,760 --> 00:09:36,080 Speaker 1: was like, where is tech? The answer is, you can 175 00:09:36,120 --> 00:09:39,559 Speaker 1: see it now, because it's building huge buildings all over America. 176 00:09:40,920 --> 00:09:42,040 Speaker 1: You mentioned Palmer Luckey. 177 00:09:42,200 --> 00:09:47,800 Speaker 2: Do you know what he looks like? The dude from The Mask. 178 00:09:48,240 --> 00:09:52,080 Speaker 1: Yeah, it's pretty close. 
He always wears a Hawaiian shirt 179 00:09:52,080 --> 00:09:55,679 Speaker 1: and flip flops. He has a goatee, and he plays 180 00:09:55,720 --> 00:09:59,480 Speaker 1: Dungeons and Dragons. He's described his character in Dungeons and 181 00:09:59,559 --> 00:10:02,760 Speaker 1: Dragons as, quote, a chaotic neutral wizard. 182 00:10:04,160 --> 00:10:05,400 Speaker 2: It's a dating app profile. 183 00:10:05,800 --> 00:10:07,960 Speaker 1: It is. I don't know who you'd find with such 184 00:10:07,960 --> 00:10:10,040 Speaker 1: a profile, but it turns out you'd find a lot 185 00:10:10,080 --> 00:10:14,040 Speaker 1: of investors if you're a founder. He got his start, 186 00:10:14,280 --> 00:10:19,239 Speaker 1: Luckey, building VR headsets in his own garage, and then, interestingly, 187 00:10:19,280 --> 00:10:21,680 Speaker 1: given what he's doing now, went on to work in 188 00:10:21,720 --> 00:10:25,720 Speaker 1: this clinic that worked with vets suffering from PTSD, 189 00:10:25,880 --> 00:10:30,120 Speaker 1: using VR headsets for what's called exposure therapy to kind 190 00:10:30,120 --> 00:10:31,280 Speaker 1: of help them with their PTSD. 191 00:10:31,720 --> 00:10:35,120 Speaker 2: Wasn't he sort of integral in the creation of Oculus Rift? 192 00:10:35,160 --> 00:10:36,240 Speaker 2: That's how he made a lot of his money. 193 00:10:36,240 --> 00:10:39,079 Speaker 1: Well, that's right, he founded Oculus Rift, the VR headset, 194 00:10:39,120 --> 00:10:42,720 Speaker 1: which was acquired by Facebook, making him a very young 195 00:10:42,760 --> 00:10:45,160 Speaker 1: billionaire at the time, in his mid twenties. 196 00:10:45,440 --> 00:10:46,760 Speaker 2: They didn't figure out what to do with it, but 197 00:10:46,800 --> 00:10:47,880 Speaker 2: he doesn't care. 198 00:10:48,440 --> 00:10:50,840 Speaker 1: Well, and actually he left Facebook under something of a cloud 199 00:10:50,880 --> 00:10:54,360 Speaker 1: in twenty seventeen. 
Some claimed he'd been forced out because 200 00:10:54,400 --> 00:10:58,840 Speaker 1: of making a campaign contribution to Donald Trump's twenty sixteen 201 00:10:59,480 --> 00:11:04,840 Speaker 1: presidential campaign. Luckey then founded Anduril after leaving Facebook, with 202 00:11:04,960 --> 00:11:08,400 Speaker 1: support from none other than Peter Thiel. So right now, 203 00:11:08,480 --> 00:11:09,959 Speaker 1: it's a good moment to be Palmer Luckey. 204 00:11:10,440 --> 00:11:11,600 Speaker 2: I can't even make the joke. 205 00:11:11,920 --> 00:11:15,920 Speaker 1: What's the joke? It's Luckey, it's Luckey, that's the one. 206 00:11:16,400 --> 00:11:19,400 Speaker 1: He's been quite vocal about the threat of China, about 207 00:11:19,480 --> 00:11:23,320 Speaker 1: a potential for war in Taiwan, about how the war 208 00:11:23,360 --> 00:11:28,800 Speaker 1: in Ukraine has shown the potential for developing new autonomous weapons, 209 00:11:29,200 --> 00:11:31,760 Speaker 1: and kind of raised the alarm bell that the US 210 00:11:31,840 --> 00:11:35,280 Speaker 1: may be falling behind the curve in all of these areas. 211 00:11:35,880 --> 00:11:38,120 Speaker 1: What I find very striking, though, is just a few 212 00:11:38,200 --> 00:11:40,800 Speaker 1: years ago the whole of Silicon Valley was more or 213 00:11:40,880 --> 00:11:46,079 Speaker 1: less unified against military contracting, and now quite the opposite. 214 00:11:46,280 --> 00:11:50,160 Speaker 1: Palmer Luckey recently tweeted: after spending a few days at 215 00:11:50,240 --> 00:11:53,960 Speaker 1: CES twenty twenty five, it's clear that the vibe shift is real. 216 00:11:54,679 --> 00:11:57,600 Speaker 1: Everyone wants to help our military. Everyone wants to build. 217 00:11:58,120 --> 00:11:59,320 Speaker 1: Nobody's afraid. 218 00:12:01,120 --> 00:12:04,880 Speaker 2: Big beautiful buildings. That's what the president would say. 
We're 219 00:12:04,920 --> 00:12:07,080 Speaker 2: looking to build big beautiful buildings. 220 00:12:06,960 --> 00:12:09,119 Speaker 1: Right, and big beautiful weapons, I guess. 221 00:12:15,679 --> 00:12:17,720 Speaker 2: When we come back, we're joined by our friends at 222 00:12:17,720 --> 00:12:25,680 Speaker 2: 404 Media. Stay with us. 223 00:12:26,559 --> 00:12:29,760 Speaker 1: So, Cara, I know you were following the will 224 00:12:29,800 --> 00:12:32,240 Speaker 1: they won't they, so to speak, of the TikTok ban. 225 00:12:33,640 --> 00:12:35,439 Speaker 1: How did you feel the moment it went dark? 226 00:12:35,720 --> 00:12:38,200 Speaker 2: I was there the moment it went dark online, the 227 00:12:38,200 --> 00:12:40,400 Speaker 2: moment it went dark. Yes, on the app. I was 228 00:12:40,440 --> 00:12:43,360 Speaker 2: fine, because I somehow have not become addicted to TikTok, 229 00:12:43,360 --> 00:12:45,439 Speaker 2: but I have friends that were really in withdrawal and 230 00:12:45,640 --> 00:12:46,960 Speaker 2: using Reels as methadone. 231 00:12:48,480 --> 00:12:49,640 Speaker 1: How bad was it for them? 232 00:12:49,760 --> 00:12:52,800 Speaker 2: Very bad, very bad. Even... You're okay? I'm fine. 233 00:12:52,960 --> 00:12:54,360 Speaker 1: It's back now, so you're fine. But how did you 234 00:12:54,400 --> 00:12:55,200 Speaker 1: find it when it was off? 235 00:12:55,760 --> 00:12:57,360 Speaker 2: I was fine when it was off. It was just the 236 00:12:57,400 --> 00:13:03,240 Speaker 2: weird, like, reference to Trump in the messaging on the 237 00:13:03,280 --> 00:13:05,640 Speaker 2: app that I was like, what is going on here? 238 00:13:05,800 --> 00:13:08,360 Speaker 1: Yeah. So when you logged back on to TikTok, you 239 00:13:08,400 --> 00:13:11,439 Speaker 1: were greeted by a message which includes the line, as 240 00:13:11,440 --> 00:13:14,280 Speaker 1: a result of President Trump's efforts. 
TikTok is back in 241 00:13:14,320 --> 00:13:16,120 Speaker 1: the US. 242 00:13:16,160 --> 00:13:18,400 Speaker 2: It sounds like Trump wrote it, and it also sounds 243 00:13:18,440 --> 00:13:22,120 Speaker 2: like he did. It sounds like, Biden's bad, Trump is good, 244 00:13:22,760 --> 00:13:25,480 Speaker 2: use TikTok again because of Trump. 245 00:13:25,559 --> 00:13:28,280 Speaker 1: ByteDance, the Chinese parent company of TikTok, is sending a pretty 246 00:13:28,320 --> 00:13:32,400 Speaker 1: clear message there. But can I tell you who almost 247 00:13:32,480 --> 00:13:34,360 Speaker 1: certainly was not jumping for joy? 248 00:13:34,400 --> 00:13:35,320 Speaker 2: Who's that? 249 00:13:35,320 --> 00:13:38,360 Speaker 1: That is Mark Zuckerberg, and here to tell us 250 00:13:38,440 --> 00:13:41,760 Speaker 1: more about why not, on this week's Tech Support, is 251 00:13:41,800 --> 00:13:43,760 Speaker 1: 404 Media's Jason Koebler. 252 00:13:44,000 --> 00:13:46,439 Speaker 2: Welcome, Jason. It's good to have you with us again. 253 00:13:46,920 --> 00:13:49,640 Speaker 3: Yeah, this has been my favorite topic of late, so 254 00:13:49,840 --> 00:13:50,920 Speaker 3: stoked to talk about it. 255 00:13:51,360 --> 00:13:53,000 Speaker 2: So last week you reported on something that I think 256 00:13:53,080 --> 00:13:57,000 Speaker 2: was buried in the pre inauguration slash TikTok chaos. But 257 00:13:57,960 --> 00:14:01,200 Speaker 2: you reminded us that what Zuckerberg wanted most was 258 00:14:01,240 --> 00:14:03,640 Speaker 2: actually a TikTok ban. Why is this? 259 00:14:04,480 --> 00:14:10,320 Speaker 3: In many ways, TikTok has become what Instagram once was. 
260 00:14:10,800 --> 00:14:13,520 Speaker 3: The simple answer is that if TikTok were to be 261 00:14:13,559 --> 00:14:15,680 Speaker 3: banned in the United States, the most obvious place for 262 00:14:15,760 --> 00:14:19,040 Speaker 3: people to go would be Instagram and Instagram Reels, and 263 00:14:19,120 --> 00:14:24,720 Speaker 3: so a TikTok ban was very likely to benefit Meta 264 00:14:25,800 --> 00:14:27,440 Speaker 3: in an outsized way. 265 00:14:27,640 --> 00:14:30,320 Speaker 1: And how was Zuck agitating for this? 266 00:14:30,840 --> 00:14:34,800 Speaker 3: So it's very hard to say exactly how involved or 267 00:14:34,840 --> 00:14:39,960 Speaker 3: not involved Meta or Mark Zuckerberg himself were in pushing 268 00:14:40,080 --> 00:14:44,880 Speaker 3: for a TikTok ban. But Mark Zuckerberg has been sort 269 00:14:44,920 --> 00:14:49,400 Speaker 3: of beating this drum about competition from Chinese tech companies 270 00:14:49,560 --> 00:14:52,200 Speaker 3: since all the way back in twenty nineteen. He gave 271 00:14:52,240 --> 00:14:57,040 Speaker 3: this speech at Georgetown University where he sort of pulled 272 00:14:57,080 --> 00:14:59,920 Speaker 3: out this line where he says, you know, a decade ago 273 00:15:00,000 --> 00:15:04,640 Speaker 3: all of the top ten tech companies were American; today, 274 00:15:04,880 --> 00:15:07,880 Speaker 3: meaning in twenty nineteen, six of the top ten were Chinese. 275 00:15:08,440 --> 00:15:11,880 Speaker 3: And that was right before a big congressional hearing about 276 00:15:11,960 --> 00:15:16,160 Speaker 3: Facebook's data privacy practices. 
Pretty much anytime that he's pulled 277 00:15:16,200 --> 00:15:19,280 Speaker 3: in front of Congress in any way to talk about 278 00:15:19,360 --> 00:15:23,760 Speaker 3: Meta's monopoly or privacy issues, he starts talking about the 279 00:15:23,800 --> 00:15:27,280 Speaker 3: threat from China and how, if there is more regulation 280 00:15:27,680 --> 00:15:31,720 Speaker 3: from the US government, basically, like, a Chinese app will 281 00:15:31,720 --> 00:15:35,840 Speaker 3: come in to fill that vacuum. And so he really, 282 00:15:35,880 --> 00:15:40,360 Speaker 3: like, seeded the ground, in my opinion, for what ultimately 283 00:15:40,400 --> 00:15:43,760 Speaker 3: became the TikTok ban. The Washington Post also reported back 284 00:15:43,800 --> 00:15:46,600 Speaker 3: in twenty twenty two that Meta paid a firm called 285 00:15:46,680 --> 00:15:50,360 Speaker 3: Targeted Victory to push the narrative that TikTok was dangerous 286 00:15:50,720 --> 00:15:54,920 Speaker 3: for children, although Meta has denied directly lobbying on the 287 00:15:55,000 --> 00:15:55,680 Speaker 3: TikTok ban. 288 00:15:56,160 --> 00:15:58,640 Speaker 1: So I can imagine this we've got to beat China 289 00:15:59,000 --> 00:16:03,000 Speaker 1: message is resonant with the new president. But it 290 00:16:03,080 --> 00:16:05,800 Speaker 1: wasn't always plain sailing between Zuck and Trump. 291 00:16:06,000 --> 00:16:10,080 Speaker 3: Yeah, I mean, in the aftermath of January sixth, Facebook banned 292 00:16:10,200 --> 00:16:13,720 Speaker 3: Donald Trump, and Donald Trump spent time last year saying 293 00:16:13,720 --> 00:16:15,920 Speaker 3: that he wanted to put Mark Zuckerberg in jail. 294 00:16:15,720 --> 00:16:18,000 Speaker 1: But instead he put him hashtag front row. 295 00:16:18,280 --> 00:16:21,240 Speaker 3: Yeah, he has, you know, gone to Mar-a-Lago twice. 
296 00:16:21,360 --> 00:16:25,360 Speaker 3: He was at the inauguration and threw a party, an 297 00:16:25,360 --> 00:16:26,800 Speaker 3: inauguration party, in DC. 298 00:16:27,080 --> 00:16:27,760 Speaker 1: He threw a party? 299 00:16:27,800 --> 00:16:30,560 Speaker 3: Yeah, he threw a party. TikTok also threw a party 300 00:16:30,600 --> 00:16:31,880 Speaker 3: for Donald Trump. 301 00:16:32,080 --> 00:16:32,360 Speaker 1: Well. 302 00:16:32,440 --> 00:16:34,360 Speaker 2: And one of the things that's interesting is that Trump 303 00:16:34,400 --> 00:16:37,120 Speaker 2: tried to ban TikTok during his first term in office 304 00:16:37,600 --> 00:16:41,760 Speaker 2: with the same sort of China hawk argument, right? And 305 00:16:41,800 --> 00:16:45,000 Speaker 2: now he's stalled the ban by executive order for at 306 00:16:45,080 --> 00:16:49,360 Speaker 2: least seventy five days, and when asked during the signing 307 00:16:49,400 --> 00:16:51,880 Speaker 2: why he changed his mind about the app, Trump said, 308 00:16:52,640 --> 00:16:56,080 Speaker 2: because I got to use it, which seems to be 309 00:16:56,160 --> 00:16:57,520 Speaker 2: Mark Zuckerberg's nightmare. 310 00:16:58,840 --> 00:17:01,160 Speaker 3: I think that it is Mark Zuckerberg's nightmare. I think 311 00:17:01,200 --> 00:17:04,160 Speaker 3: that TikTok being banned would have been a huge prize 312 00:17:04,320 --> 00:17:07,680 Speaker 3: for Meta, and they were so, so close. I mean, 313 00:17:07,880 --> 00:17:12,199 Speaker 3: the app actually went down. In the lead up 314 00:17:12,200 --> 00:17:16,000 Speaker 3: to the ban, Meta started buying ads on TikTok to 315 00:17:16,080 --> 00:17:20,480 Speaker 3: advertise Instagram, for example, as an alternative. They started buying 316 00:17:20,520 --> 00:17:24,000 Speaker 3: ads on Google saying, you know, come to Reels. 
They 317 00:17:24,040 --> 00:17:29,360 Speaker 3: also rolled out this, like, affiliate program that very closely 318 00:17:29,440 --> 00:17:32,359 Speaker 3: mirrors how TikTok Shop works, so sort of trying to 319 00:17:32,520 --> 00:17:36,639 Speaker 3: entice influencers from TikTok to come to Instagram. And so, 320 00:17:36,880 --> 00:17:39,160 Speaker 3: I mean, there was, like, quite a lot of 321 00:17:39,720 --> 00:17:44,720 Speaker 3: thought put into Instagram as a replacement for TikTok. And 322 00:17:44,760 --> 00:17:47,640 Speaker 3: now there's still this seventy five day delay in terms 323 00:17:47,720 --> 00:17:51,560 Speaker 3: of whether the ban will actually be enacted. I think 324 00:17:51,600 --> 00:17:54,480 Speaker 3: that Trump has made it clear he doesn't really want 325 00:17:54,480 --> 00:17:59,080 Speaker 3: to ban this thing. Like, they are using Oracle servers 326 00:17:59,359 --> 00:18:03,160 Speaker 3: for American TikTok, more or less, which was part of 327 00:18:03,520 --> 00:18:07,479 Speaker 3: that earlier executive order that you referenced, where Trump wanted 328 00:18:07,520 --> 00:18:10,520 Speaker 3: to ban TikTok during his first term, and as part 329 00:18:10,600 --> 00:18:14,440 Speaker 3: of that, ByteDance agreed to move Americans' data to 330 00:18:15,200 --> 00:18:17,200 Speaker 3: Oracle controlled servers in 331 00:18:17,200 --> 00:18:18,920 Speaker 2: Texas, an American country. 332 00:18:18,640 --> 00:18:24,960 Speaker 3: So, an American company. And so Oracle shut down TikTok 333 00:18:25,080 --> 00:18:28,199 Speaker 3: servers on Saturday, which is why it went down. But 334 00:18:28,320 --> 00:18:31,960 Speaker 3: then on Sunday, I guess, Trump was able to cut 335 00:18:32,000 --> 00:18:37,359 Speaker 3: a deal with Oracle, seemingly, that convinced the company to 336 00:18:37,440 --> 00:18:39,400 Speaker 3: take the risk to put it back up. 
337 00:18:39,920 --> 00:18:41,960 Speaker 2: Well, and Larry Ellison then is at the White House 338 00:18:41,960 --> 00:18:43,640 Speaker 2: on Tuesday with Project Stargate. 339 00:18:43,880 --> 00:18:46,520 Speaker 3: Yeah, Larry Ellison, who is the CEO of Oracle, and 340 00:18:46,600 --> 00:18:49,359 Speaker 3: Trump are friends and they've been friends for a long time. 341 00:18:49,480 --> 00:18:51,560 Speaker 3: And so to be totally honest with you, a lot 342 00:18:51,560 --> 00:18:55,159 Speaker 3: of this seems almost reverse engineered to give Trump a 343 00:18:55,200 --> 00:18:59,000 Speaker 3: big win. The ban is incredibly unpopular, and it's incredibly 344 00:18:59,080 --> 00:19:04,080 Speaker 3: unpopular among TikTok users, who are a young demographic that, 345 00:19:04,200 --> 00:19:07,000 Speaker 3: you know, Trump has really gone after in this election. 346 00:19:07,240 --> 00:19:10,880 Speaker 3: And so I think being able to message like Biden 347 00:19:10,960 --> 00:19:13,600 Speaker 3: took your TikTok away, and I gave it back, even 348 00:19:13,640 --> 00:19:17,840 Speaker 3: if that is not exactly true, is a very powerful 349 00:19:17,880 --> 00:19:19,320 Speaker 3: thing for Trump to be able to say. 350 00:19:19,160 --> 00:19:23,240 Speaker 1: Jason, just to close. I'm always fascinated by the 351 00:19:23,320 --> 00:19:26,159 Speaker 1: law of unintended consequences. But there are a couple of 352 00:19:26,240 --> 00:19:31,280 Speaker 1: other apps which saw big, big inflows of users, one 353 00:19:31,280 --> 00:19:32,440 Speaker 1: of which was RedNote. 354 00:19:32,720 --> 00:19:36,159 Speaker 3: Yeah. So, RedNote is an app that is a 355 00:19:36,200 --> 00:19:41,240 Speaker 3: sort of Chinese competitor to TikTok. It's very popular in China. 356 00:19:41,600 --> 00:19:43,480 Speaker 3: I have a lot of Chinese American friends who have 357 00:19:43,520 --> 00:19:47,080 Speaker 3: been on it for years. It's very good for finding restaurants. 
358 00:19:47,119 --> 00:19:50,359 Speaker 3: From personal experience, I know that. But you know, in 359 00:19:50,440 --> 00:19:52,240 Speaker 3: advance of the TikTok ban, you had a lot of 360 00:19:52,280 --> 00:19:56,520 Speaker 3: Americans downloading and going over to RedNote. And it's 361 00:19:56,520 --> 00:19:59,320 Speaker 3: one of the few Chinese social media apps that is 362 00:19:59,320 --> 00:20:02,520 Speaker 3: available both in China and the United States. And so 363 00:20:02,680 --> 00:20:06,280 Speaker 3: you had a mix of people sort of ironically saying 364 00:20:06,400 --> 00:20:10,240 Speaker 3: I would rather give my data directly to the Chinese 365 00:20:10,320 --> 00:20:12,720 Speaker 3: government or to a Chinese app than give it to 366 00:20:12,760 --> 00:20:16,200 Speaker 3: Mark Zuckerberg. And then I think you also had people 367 00:20:16,200 --> 00:20:19,520 Speaker 3: who just didn't like the fact that the US government 368 00:20:19,640 --> 00:20:23,159 Speaker 3: was taking something away from them for this sort of 369 00:20:24,280 --> 00:20:29,879 Speaker 3: amorphous national security spying reason. I think that there are 370 00:20:29,960 --> 00:20:33,560 Speaker 3: very real concerns about TikTok and data privacy, but the 371 00:20:33,680 --> 00:20:39,919 Speaker 3: US government has not made public anything specific about, you know, 372 00:20:40,000 --> 00:20:42,640 Speaker 3: the types of spying that may or may not be happening. 373 00:20:42,920 --> 00:20:46,280 Speaker 2: I think you're right, Jason, that the US government has 374 00:20:46,320 --> 00:20:49,960 Speaker 2: not made a compelling enough argument for like why TikTok 375 00:20:50,000 --> 00:20:54,320 Speaker 2: should be banned. And you know, I think people don't 376 00:20:54,320 --> 00:20:57,320 Speaker 2: want to be TikTok refugees. Also because circling right back 377 00:20:57,320 --> 00:21:00,879 Speaker 2: to Zuck, Reels isn't as good. 
And you know, it's like, 378 00:21:00,960 --> 00:21:03,359 Speaker 2: if you want people to use Reels, make it as 379 00:21:03,400 --> 00:21:05,120 Speaker 2: good as TikTok. And the problem is, I don't 380 00:21:05,119 --> 00:21:06,240 Speaker 2: think that will ever be the case. 381 00:21:06,760 --> 00:21:09,879 Speaker 3: Yeah, I think you're right. And also, I mean, this 382 00:21:10,000 --> 00:21:12,080 Speaker 3: was a big argument ByteDance is making, but it's a 383 00:21:12,200 --> 00:21:16,520 Speaker 3: very real and important argument that hundreds of millions of 384 00:21:16,560 --> 00:21:21,960 Speaker 3: Americans were on TikTok, are on TikTok. You have millions 385 00:21:22,040 --> 00:21:25,359 Speaker 3: of small businesses, millions of influencers who make a living 386 00:21:25,440 --> 00:21:29,200 Speaker 3: on TikTok. And I think that this sort of fractured 387 00:21:30,040 --> 00:21:32,920 Speaker 3: Internet that we saw for only a few hours between 388 00:21:32,960 --> 00:21:35,240 Speaker 3: Saturday and Sunday, I found it to be like a 389 00:21:35,280 --> 00:21:37,879 Speaker 3: really chilling and weird thing that I'm not used to 390 00:21:37,920 --> 00:21:40,840 Speaker 3: as an American who, I'm like, I have access 391 00:21:40,880 --> 00:21:44,000 Speaker 3: to everything on the internet more or less, and suddenly 392 00:21:44,160 --> 00:21:46,119 Speaker 3: just sort of being cut off from that and not 393 00:21:46,160 --> 00:21:49,840 Speaker 3: even having a very easy circumvention. This was sort of 394 00:21:49,880 --> 00:21:53,240 Speaker 3: like a blueprint for how other types of bans could 395 00:21:53,240 --> 00:21:55,560 Speaker 3: work in the US, and I think that's pretty scary. 396 00:21:56,680 --> 00:21:59,040 Speaker 1: What will you be looking out for in the next 397 00:21:59,080 --> 00:22:01,280 Speaker 1: seventy five days and beyond on this story? 
398 00:22:01,640 --> 00:22:05,359 Speaker 3: I think that ByteDance is very unwilling to sell TikTok. 399 00:22:05,680 --> 00:22:08,800 Speaker 3: Whether that is because of pressure from the Chinese government 400 00:22:08,920 --> 00:22:11,400 Speaker 3: or because they have a good business and just don't 401 00:22:11,400 --> 00:22:16,280 Speaker 3: want to sell it, who knows. But I think, honestly, 402 00:22:16,320 --> 00:22:19,080 Speaker 3: the thing I'm most interested to see is whether this 403 00:22:19,280 --> 00:22:22,240 Speaker 3: just kind of fades into the background. And I'm wondering 404 00:22:22,320 --> 00:22:25,719 Speaker 3: if in seventy five days nothing changes and we just 405 00:22:25,800 --> 00:22:28,720 Speaker 3: sort of keep going, and you know, it will be 406 00:22:28,760 --> 00:22:31,840 Speaker 3: a question of whether the Justice Department is willing to, 407 00:22:33,280 --> 00:22:36,080 Speaker 3: you know, enforce a law that was enacted by Congress 408 00:22:36,080 --> 00:22:38,960 Speaker 3: and upheld by the Supreme Court sort of for the 409 00:22:39,080 --> 00:22:44,280 Speaker 3: like political interest of the President. Jason, thank you. Thank 410 00:22:44,320 --> 00:22:44,800 Speaker 3: you so much. 411 00:22:45,640 --> 00:22:49,359 Speaker 1: Coming up we'll explore a crucial first family in tech 412 00:22:49,760 --> 00:22:52,240 Speaker 1: with another installment of When Did This Become a Thing? 413 00:23:02,320 --> 00:23:05,440 Speaker 1: So Kara, ever since the election of President Trump, any 414 00:23:05,480 --> 00:23:07,680 Speaker 1: number of tech moguls have been showing up at Mar-a- 415 00:23:07,800 --> 00:23:10,920 Speaker 1: Lago and in Washington and have even taken on official 416 00:23:11,000 --> 00:23:14,360 Speaker 1: roles in the transition and the new administration. But there's one 417 00:23:14,400 --> 00:23:16,240 Speaker 1: group on everybody's lips. 418 00:23:16,440 --> 00:23:19,280 Speaker 2: The PayPal Mafia. 
419 00:23:18,520 --> 00:23:22,320 Speaker 1: PayPal Mafia, and so in today's When Did This Become 420 00:23:22,359 --> 00:23:24,720 Speaker 1: a Thing? We want to explore the history of this 421 00:23:24,840 --> 00:23:28,080 Speaker 1: powerful group to understand just how they went from smart 422 00:23:28,119 --> 00:23:30,840 Speaker 1: investors and founders to seats of power at the top 423 00:23:30,880 --> 00:23:34,399 Speaker 1: of government, business, and even social influence. Yeah. 424 00:23:35,240 --> 00:23:37,560 Speaker 2: I know these guys, but I sort of take the 425 00:23:37,600 --> 00:23:42,040 Speaker 2: PayPal Mafia for granted without really knowing what their stories are. 426 00:23:42,080 --> 00:23:44,360 Speaker 2: So I'm excited for us to do this, and you're 427 00:23:44,400 --> 00:23:45,280 Speaker 2: going to start us off. 428 00:23:45,440 --> 00:23:47,480 Speaker 1: So let me start just by going over some of 429 00:23:47,520 --> 00:23:50,719 Speaker 1: the members of the PayPal Mafia who are now closely 430 00:23:50,720 --> 00:23:54,600 Speaker 1: connected or actually inside the Trump administration. So we've got 431 00:23:54,640 --> 00:23:58,800 Speaker 1: Elon Musk running DOGE, the Department of Government Efficiency, which 432 00:23:58,840 --> 00:24:02,439 Speaker 1: tried to recruit applicants back in November, saying, quote, we 433 00:24:02,480 --> 00:24:06,359 Speaker 1: need super high-IQ small-government revolutionaries willing to work 434 00:24:06,480 --> 00:24:09,399 Speaker 1: eighty plus hours per week on unglamorous cost cutting. 435 00:24:10,080 --> 00:24:10,720 Speaker 1: That was a tweet. 436 00:24:10,880 --> 00:24:11,560 Speaker 2: Great, love it. 437 00:24:12,560 --> 00:24:15,960 Speaker 1: Then you have David Sacks, another podcast host, of the All 438 00:24:15,960 --> 00:24:21,120 Speaker 1: In Podcast, another South African, and Trump's new AI and Crypto Czar. 
439 00:24:21,840 --> 00:24:24,480 Speaker 1: This is an advisory position, so Sacks doesn't have to 440 00:24:24,560 --> 00:24:25,960 Speaker 1: quit his day job of being a VC. 441 00:24:26,240 --> 00:24:28,520 Speaker 2: Of being a VC? Yeah, I guess. Is it 442 00:24:28,560 --> 00:24:29,120 Speaker 2: a real job? 443 00:24:29,480 --> 00:24:31,680 Speaker 1: Well, it's the intersection of VC and podcasting. 444 00:24:31,720 --> 00:24:35,360 Speaker 2: It's obviously, in effect, on the path heading there. 445 00:24:36,200 --> 00:24:37,640 Speaker 1: And then of course at the center of it all 446 00:24:37,800 --> 00:24:40,760 Speaker 1: is Peter Thiel. He doesn't have an official capacity in 447 00:24:40,800 --> 00:24:44,560 Speaker 1: this administration, but he's the person who introduced Trump and JD 448 00:24:44,800 --> 00:24:48,080 Speaker 1: Vance in twenty twenty one, a key investor in Anduril, 449 00:24:48,720 --> 00:24:51,600 Speaker 1: and a significant donor to Trump and Vance for many years. 450 00:24:51,520 --> 00:24:54,000 Speaker 2: Millions and millions of dollars. And I want to 451 00:24:54,040 --> 00:24:56,719 Speaker 2: add that the guy who used to be CEO of 452 00:24:56,800 --> 00:25:01,320 Speaker 2: Thiel's personal philanthropic foundation, that is most famous for paying 453 00:25:01,400 --> 00:25:04,520 Speaker 2: kids not to go to college, has been tapped by 454 00:25:04,600 --> 00:25:09,639 Speaker 2: Trump to be Deputy Secretary of HHS, helping RFK 455 00:25:09,840 --> 00:25:12,800 Speaker 2: Junior MAHA, make America healthy. 456 00:25:12,520 --> 00:25:16,000 Speaker 1: Again, make America healthy again. So that's where some of 457 00:25:16,080 --> 00:25:19,480 Speaker 1: the leading lights of the mafia are today. 
But to 458 00:25:19,560 --> 00:25:22,640 Speaker 1: understand why we call them the PayPal Mafia, I need 459 00:25:22,680 --> 00:25:24,720 Speaker 1: to take you back to the fall of two thousand 460 00:25:24,760 --> 00:25:25,200 Speaker 1: and seven. 461 00:25:25,880 --> 00:25:27,479 Speaker 2: Cue MGMT. 462 00:25:29,440 --> 00:25:32,200 Speaker 1: Two thousand and seven was, of course, the golden age 463 00:25:32,200 --> 00:25:36,159 Speaker 1: of MGMT and prestige television. The Sopranos had just aired 464 00:25:36,200 --> 00:25:40,080 Speaker 1: its final episode, and then Fortune Magazine prints this photo. 465 00:25:40,640 --> 00:25:44,320 Speaker 1: Credit to Robyn Twomey for the indelible image. Kara, can 466 00:25:44,359 --> 00:25:46,400 Speaker 1: you pull it up and just describe it for anyone 467 00:25:46,720 --> 00:25:48,399 Speaker 1: who doesn't have it burned onto their eyes? 468 00:25:48,720 --> 00:25:53,440 Speaker 2: Well, red lacquer tabletops, a bunch of, I mean, I'm gay, 469 00:25:53,480 --> 00:25:57,919 Speaker 2: but I still understand a good looking man. And these 470 00:25:58,080 --> 00:26:01,560 Speaker 2: are a bunch of guys dressed like they belonged to 471 00:26:01,680 --> 00:26:04,240 Speaker 2: Leonardo DiCaprio's Pussy Posse. 472 00:26:04,760 --> 00:26:07,600 Speaker 1: You're also vegetarian. There's a steakhouse vibe here as well. 473 00:26:07,480 --> 00:26:10,280 Speaker 2: Big steakhouse vibe. There's an oil painting of Venice in 474 00:26:10,280 --> 00:26:14,960 Speaker 2: the background, a lot of whiskey and, most absurdly, poker chips. 475 00:26:14,960 --> 00:26:16,520 Speaker 2: But it's giving mafia. 476 00:26:16,680 --> 00:26:20,600 Speaker 1: There's also a baseball bat giving mafia. Do you recognize 477 00:26:20,640 --> 00:26:24,199 Speaker 1: anyone in the picture? No, well, two of them are 478 00:26:24,280 --> 00:26:26,800 Speaker 1: somewhat recognizable, Peter Thiel and Reid Hoffman. 
479 00:26:27,160 --> 00:26:29,800 Speaker 2: Yeah, Peter Thiel and Hoffman. They just look different now. That's 480 00:26:29,800 --> 00:26:30,800 Speaker 2: why I said that's true. 481 00:26:30,840 --> 00:26:33,880 Speaker 1: I mean, some time has passed. They're also 482 00:26:34,040 --> 00:26:36,440 Speaker 1: dressed kind of as if they're going to a costume party, 483 00:26:36,480 --> 00:26:37,879 Speaker 1: which doesn't. 484 00:26:37,600 --> 00:26:40,360 Speaker 2: Help. Definitely, which they seem to be enjoying. I don't 485 00:26:40,359 --> 00:26:41,520 Speaker 2: know if I enjoy it so much. 486 00:26:41,840 --> 00:26:45,080 Speaker 1: So in this photo is almost every single member of 487 00:26:45,119 --> 00:26:48,440 Speaker 1: the PayPal Mafia. They've been using this term in their 488 00:26:48,480 --> 00:26:51,840 Speaker 1: own circle, you know. Self-given names of groups 489 00:26:51,840 --> 00:26:54,960 Speaker 1: can be on the embarrassing side, yes, but it was 490 00:26:55,080 --> 00:26:58,480 Speaker 1: this photo, which is very iconic, that cemented them as 491 00:26:58,520 --> 00:27:01,400 Speaker 1: the PayPal Mafia as far as the wider public is concerned. 492 00:27:02,480 --> 00:27:05,440 Speaker 1: Right in the front are PayPal founders Peter Thiel and 493 00:27:05,920 --> 00:27:10,200 Speaker 1: Max Levchin, who Fortune called the don and the consigliere 494 00:27:10,400 --> 00:27:10,680 Speaker 1: of the. 495 00:27:10,640 --> 00:27:13,199 Speaker 2: Mafia. And Max is the consigliere. 496 00:27:13,640 --> 00:27:15,920 Speaker 1: To get the full picture of how they got inducted 497 00:27:15,920 --> 00:27:20,919 Speaker 1: into this elite steakhouse brotherhood. 498 00:27:19,080 --> 00:27:21,800 Speaker 2: Dream, by the way, I just have to say, we have. 
499 00:27:21,840 --> 00:27:24,360 Speaker 1: To go all the way back to the founding of PayPal, 500 00:27:24,840 --> 00:27:28,600 Speaker 1: which started with a different name, which was Confinity. Not 501 00:27:28,640 --> 00:27:31,920 Speaker 1: as good. Not as good, too much like con. Founded by 502 00:27:31,960 --> 00:27:35,159 Speaker 1: Thiel and Levchin. And then in two thousand they merged 503 00:27:35,200 --> 00:27:38,879 Speaker 1: with a company called x dot com, which was 504 00:27:38,880 --> 00:27:41,679 Speaker 1: an online banking platform. You know who that was founded 505 00:27:41,680 --> 00:27:46,119 Speaker 1: by. By two thousand and one, the combined company was 506 00:27:46,160 --> 00:27:49,760 Speaker 1: known as PayPal. This is way before Meta or Airbnb, 507 00:27:50,080 --> 00:27:51,600 Speaker 1: or way way before OpenAI. 508 00:27:52,480 --> 00:27:54,760 Speaker 2: Yeah, it's funny because I think back, I mean, two 509 00:27:54,800 --> 00:27:57,480 Speaker 2: thousand and one is nine eleven. I mean, 510 00:27:57,800 --> 00:28:00,600 Speaker 2: that's really early Internet. And I, you know, as a 511 00:28:00,600 --> 00:28:03,400 Speaker 2: young person, was not interfacing with PayPal. 512 00:28:03,560 --> 00:28:06,560 Speaker 1: This is more like Netscape. Oh yeah, a little bit. 513 00:28:06,600 --> 00:28:09,320 Speaker 1: Maybe Yahoo was even around then. Yeah. 514 00:28:09,240 --> 00:28:11,119 Speaker 2: Yahoo was around because I used to use Yahooligans. 515 00:28:11,400 --> 00:28:13,760 Speaker 1: But this was just a few years after Google was 516 00:28:13,760 --> 00:28:17,400 Speaker 1: founded in nineteen ninety eight. But there were some very 517 00:28:17,400 --> 00:28:22,000 Speaker 1: big differences between how these two future behemoths, Google and 518 00:28:22,040 --> 00:28:24,760 Speaker 1: PayPal, were run, and a lot of it had to 519 00:28:24,800 --> 00:28:28,440 Speaker 1: do with culture. 
As that two thousand and seven Fortune 520 00:28:28,520 --> 00:28:32,520 Speaker 1: article explains, a few of the PayPal Mafia had some 521 00:28:32,640 --> 00:28:37,520 Speaker 1: choice words about the omertà of the PayPal culture. The former 522 00:28:37,560 --> 00:28:40,920 Speaker 1: CFO of PayPal said, quote, the difference between Google and 523 00:28:40,960 --> 00:28:44,680 Speaker 1: PayPal was that Google wanted to hire PhDs and PayPal 524 00:28:44,760 --> 00:28:47,360 Speaker 1: wanted to hire people who got into PhDs and then 525 00:28:47,440 --> 00:28:50,600 Speaker 1: dropped out. It is, he said, a different temperament. 526 00:28:51,200 --> 00:28:53,600 Speaker 2: Well, right, and you can absolutely see how Peter Thiel 527 00:28:53,640 --> 00:28:57,160 Speaker 2: would later incentivize people to drop out of college with 528 00:28:57,200 --> 00:28:57,800 Speaker 2: the foundation. 529 00:28:58,040 --> 00:29:01,200 Speaker 1: And these weren't even PhD students, they were undergraduates. That's right. 530 00:29:02,280 --> 00:29:04,520 Speaker 1: But so you know, back in the early days of PayPal, 531 00:29:04,600 --> 00:29:09,560 Speaker 1: there was this concerted effort to not hire anyone affiliated 532 00:29:09,600 --> 00:29:13,440 Speaker 1: with any kind of establishment organization. Thiel and Levchin wanted 533 00:29:13,480 --> 00:29:17,640 Speaker 1: workaholics who weren't consultants, weren't MBAs, and weren't part of 534 00:29:17,680 --> 00:29:20,120 Speaker 1: the kind of elite establishment culture. 535 00:29:20,880 --> 00:29:24,360 Speaker 2: So like, who was the ideal profile for this? 536 00:29:24,720 --> 00:29:28,120 Speaker 1: Well, co founder Max Levchin said that hiring was sort 537 00:29:28,120 --> 00:29:31,760 Speaker 1: of a process of finding like minded people. 
He said 538 00:29:31,760 --> 00:29:35,360 Speaker 1: he was looking for, quote, someone just as geeky and 539 00:29:35,400 --> 00:29:37,920 Speaker 1: who doesn't get laid very often. For the most part, 540 00:29:37,960 --> 00:29:41,880 Speaker 1: this worked. Elon was quickly ousted because he tried to 541 00:29:41,960 --> 00:29:46,120 Speaker 1: mandate some kind of technological overhaul, but that bad blood 542 00:29:46,200 --> 00:29:49,040 Speaker 1: that once existed has been put to rest, and according 543 00:29:49,080 --> 00:29:52,680 Speaker 1: to Walter Isaacson, who wrote the most biography. Musk couch 544 00:29:52,720 --> 00:29:55,600 Speaker 1: surfs on PayPal mafia couches when he's traveling. 545 00:29:55,240 --> 00:29:58,880 Speaker 2: Between hotel That's how you have one hundred billion? Is 546 00:29:58,880 --> 00:29:59,440 Speaker 2: you couch service? 547 00:29:59,480 --> 00:30:02,480 Speaker 1: Exactly? That's right. There are stories back in the day 548 00:30:02,520 --> 00:30:05,720 Speaker 1: of disagreements turning to wrestling matches again. Another preview of 549 00:30:06,240 --> 00:30:09,000 Speaker 1: that that the yet to happen cage fight between Musk 550 00:30:09,080 --> 00:30:13,840 Speaker 1: and Zuckerberg. But it was it was quite the vibe. 551 00:30:13,720 --> 00:30:15,640 Speaker 2: And it's I mean, it's very funny because it is 552 00:30:15,680 --> 00:30:19,240 Speaker 2: an anti establishment vibe that's so clearly yearns to be 553 00:30:19,400 --> 00:30:20,800 Speaker 2: this anti establishment. 554 00:30:21,120 --> 00:30:23,960 Speaker 1: Right. It's like totally like teenage. It's like a teenage 555 00:30:24,080 --> 00:30:26,400 Speaker 1: sort of sleepover on the you know, with this money 556 00:30:26,440 --> 00:30:28,640 Speaker 1: and as she drinks as that's right, as you are. 
557 00:30:28,720 --> 00:30:30,440 Speaker 2: But they're like, it's not a sleepover, it's just a 558 00:30:30,480 --> 00:30:32,160 Speaker 2: gathering of boys. 559 00:30:32,640 --> 00:30:37,440 Speaker 1: But this mindset was really cultivated by the libertarian leaning 560 00:30:37,440 --> 00:30:42,240 Speaker 1: Peter Thiel, and there is this inherent sort of culture 561 00:30:42,280 --> 00:30:45,000 Speaker 1: and personality of the PayPal Mafia that may in some 562 00:30:45,040 --> 00:30:49,440 Speaker 1: sense help explain why these contrarians went on to be 563 00:30:49,480 --> 00:30:52,240 Speaker 1: some of the most successful entrepreneurs and VCs in Silicon 564 00:30:52,320 --> 00:30:56,120 Speaker 1: Valley history. For reference, David Sacks is thought to be 565 00:30:56,240 --> 00:31:00,440 Speaker 1: worth in the hundreds of millions, Levchin apparently about a billion, 566 00:31:00,800 --> 00:31:04,880 Speaker 1: Thiel closer to ten, and Musk in the hundreds of billions. 567 00:31:05,200 --> 00:31:07,080 Speaker 2: That money does not seem real to me, but whatever, 568 00:31:07,440 --> 00:31:08,920 Speaker 2: I don't exactly know how to wrap my head around 569 00:31:08,960 --> 00:31:11,520 Speaker 2: that level of money, but it does seem like PayPal 570 00:31:11,960 --> 00:31:15,360 Speaker 2: was such a giant unicorn success story that all of 571 00:31:15,400 --> 00:31:19,040 Speaker 2: these guys had enough money to keep seeding this huge 572 00:31:19,080 --> 00:31:22,520 Speaker 2: wave of big companies and apps that came as the 573 00:31:22,560 --> 00:31:26,000 Speaker 2: iPhone and App Store exploded and the digital economy boomed. 574 00:31:26,480 --> 00:31:28,400 Speaker 1: Yeah. I mean, there's a reason they call it the 575 00:31:28,440 --> 00:31:31,200 Speaker 1: PayPal Mafia. 
And if you've watched The Sopranos or any 576 00:31:31,600 --> 00:31:34,400 Speaker 1: mafia movie or TV series, you know, like when you 577 00:31:34,440 --> 00:31:35,960 Speaker 1: need a loan or when you need some money to 578 00:31:35,960 --> 00:31:38,720 Speaker 1: start your business, you go to the godfather or one 579 00:31:38,760 --> 00:31:40,960 Speaker 1: of the captains and they give you 580 00:31:40,960 --> 00:31:44,120 Speaker 1: some money to start your thing, and then they own you. 581 00:31:44,360 --> 00:31:45,760 Speaker 2: That's right, they own your ass. 582 00:31:46,640 --> 00:31:49,640 Speaker 1: And apparently, you know, this relied most heavily on the 583 00:31:49,640 --> 00:31:52,640 Speaker 1: wallet and mind of Peter Thiel, at least initially. 584 00:31:52,480 --> 00:31:55,960 Speaker 2: So PayPal is like the patriarch of a very wealthy family, 585 00:31:56,000 --> 00:31:59,720 Speaker 2: and the PayPal Mafia is the offspring of that family. 586 00:32:00,120 --> 00:32:02,520 Speaker 2: And when they all moved on, they put money into 587 00:32:02,520 --> 00:32:06,880 Speaker 2: different companies with similar principles to PayPal, or founded companies 588 00:32:06,920 --> 00:32:10,560 Speaker 2: based on their previous experience. And you know, now members 589 00:32:10,560 --> 00:32:13,240 Speaker 2: of the PayPal Mafia are some of the most influential 590 00:32:13,280 --> 00:32:15,360 Speaker 2: players in Silicon Valley and DC. 591 00:32:16,240 --> 00:32:19,440 Speaker 1: Yeah, and I think the whole tech industry in some 592 00:32:19,520 --> 00:32:22,520 Speaker 1: sense has taken on the ethos of the PayPal Mafia. 
593 00:32:22,520 --> 00:32:24,240 Speaker 1: I mean, we talked about how in the early days 594 00:32:24,280 --> 00:32:26,640 Speaker 1: Google was kind of opposed in some sense to the PayPal 595 00:32:26,720 --> 00:32:30,080 Speaker 1: Mafia and how they did business, and there's still some 596 00:32:30,120 --> 00:32:32,560 Speaker 1: of that, but suddenly this is now the dominant culture 597 00:32:32,560 --> 00:32:33,320 Speaker 1: in Silicon Valley. 598 00:32:34,080 --> 00:32:37,160 Speaker 2: And I think now, as I was just saying, traits 599 00:32:37,160 --> 00:32:42,000 Speaker 2: that will carry into the current administration. You know, if 600 00:32:42,040 --> 00:32:46,480 Speaker 2: that DOGE listing tells us anything. 601 00:32:46,320 --> 00:32:51,400 Speaker 1: Eighty plus hours a week, low pay, no sleep. 602 00:32:51,480 --> 00:32:53,920 Speaker 2: But I just think in terms of why it was 603 00:32:53,960 --> 00:32:58,840 Speaker 2: interesting is because you can just trace the roots of 604 00:32:59,800 --> 00:33:02,719 Speaker 2: so much of the wealth that has come out of 605 00:33:03,120 --> 00:33:06,920 Speaker 2: technology in the last twenty five years, and it's wealth 606 00:33:06,920 --> 00:33:08,440 Speaker 2: that is so concentrated. 607 00:33:08,880 --> 00:33:11,520 Speaker 1: Yeah. And to be fair, these guys are also good branders. 608 00:33:11,640 --> 00:33:14,560 Speaker 1: I mean, like if Goldman started talking about the Goldman 609 00:33:14,640 --> 00:33:17,840 Speaker 1: guys or the McKinsey Mafia, you would also see like 610 00:33:18,000 --> 00:33:22,480 Speaker 1: the fingerprints of a very specific group all over corporate America. 611 00:33:22,560 --> 00:33:25,160 Speaker 1: But in some sense, these guys weren't afraid to say 612 00:33:25,200 --> 00:33:28,320 Speaker 1: the quiet part out loud and to kind of claim ownership. 
613 00:33:29,040 --> 00:33:32,560 Speaker 1: What I find really interesting, as we look ahead, is 614 00:33:32,600 --> 00:33:35,640 Speaker 1: that, you know, it's easier to be on the outside 615 00:33:36,360 --> 00:33:39,960 Speaker 1: throwing rocks, it's harder to be on the inside picking 616 00:33:40,040 --> 00:33:43,720 Speaker 1: up shards of glass. So as this group has essentially 617 00:33:43,760 --> 00:33:47,680 Speaker 1: now become the mainstream and the establishment, how are they 618 00:33:47,680 --> 00:33:48,800 Speaker 1: going to deal with that? Right? 619 00:33:49,080 --> 00:33:51,520 Speaker 2: Because they don't want to be the establishment. I mean, 620 00:33:51,560 --> 00:33:55,480 Speaker 2: that's what's really interesting watching the inauguration, is like, these 621 00:33:55,480 --> 00:33:58,520 Speaker 2: are people who built themselves up as anti establishment figures. 622 00:33:59,520 --> 00:34:03,200 Speaker 2: You can't really be anti establishment and be that close 623 00:34:03,280 --> 00:34:07,800 Speaker 2: to the executive branch, you would think. Or maybe you can, 624 00:34:07,920 --> 00:34:11,200 Speaker 2: and it remains to be seen, but I just, yeah, 625 00:34:11,239 --> 00:34:15,160 Speaker 2: I don't know how much the libertarian ethos will really remain. 626 00:34:20,480 --> 00:34:23,279 Speaker 1: That's it for this week for Tech Stuff. I'm Oz 627 00:34:23,320 --> 00:34:25,040 Speaker 1: Voloshian, and I'm Kara Price. 628 00:34:25,360 --> 00:34:28,560 Speaker 2: This episode was produced by Victoria Dominguez, Lizzie Jacobs, and 629 00:34:28,600 --> 00:34:32,080 Speaker 2: Eliza Dennis. It was executive produced by me, Kara Price, 630 00:34:32,360 --> 00:34:35,960 Speaker 2: Oz Voloshian, and Kate Osbourne for Kaleidoscope, and Katrina Norvel 631 00:34:36,040 --> 00:34:39,640 Speaker 2: for iHeart Podcasts. The engineer is Biheied Fraser, and Jack 632 00:34:39,680 --> 00:34:42,640 Speaker 2: Insley mixed this episode. 
Kyle Murdoch wrote our theme song. 633 00:34:43,280 --> 00:34:46,440 Speaker 1: Join us next Wednesday for Tech Stuff: The Story, when 634 00:34:46,520 --> 00:34:49,919 Speaker 1: we share an in depth conversation with author Nathaniel Rich 635 00:34:50,360 --> 00:34:53,280 Speaker 1: about the psychological effects of colonizing Mars. 636 00:34:54,200 --> 00:34:56,839 Speaker 2: Please rate, review, and reach out to us at tech 637 00:34:56,880 --> 00:35:00,400 Speaker 2: Stuff podcast at gmail dot com. We do want to 638 00:35:00,400 --> 00:35:00,759 Speaker 2: hear from you.