Speaker 1 [00:00:13]: My phone was ringing and I could see that it was going to, like, the Google Voice number, and it was this guy in Canada whose day job is, like, IT-adjacent, and he sounded pretty frazzled and in a hurry.

Speaker 2 [00:00:31]: Emanuel Maiberg is a journalist and a co-founder at 404 Media. Two weeks ago, he got a phone call. Someone really wanted to tell him about this viral app, but this wasn't a pitch.

Speaker 1 [00:00:45]: And he was like, oh my god, thank god you picked up. There's something going down on 4chan. There's, like, a big security breach. I think this is a big deal. You really need to look at this. And he was talking pretty fast, like I couldn't fully understand what he was saying, but he sent me an email with some links to 4chan, and as soon as I got there, I could tell that, A, it was pretty bad, as he was saying, and B, it was already too late.

Speaker 2 [00:01:24]: From Kaleidoscope and iHeart Podcasts, this is Kill Switch.

Speaker 3 [00:01:29]: I'm Dexter Thomas, goodbye.

Speaker 2 [00:02:10]: A little while back, an app called Tea, like tea, the tea that you drink, started going viral, and pretty soon it was in the headlines. As the week went on, the headlines kept coming, but for all the wrong reasons. What is the Tea app, and how does it work?

Speaker 1 [00:02:28]: The Tea app is calling itself a women's dating safety app. I think a good way to think about it is, if you've heard of this Facebook phenomenon of Facebook groups called Are We Dating the Same Guy?, where it's like women are getting together on Facebook in order to find out if they're dating the same guy and if there are any other red flags.

Speaker 2 [00:02:53]: The reason that the Tea app went viral at first was because of what it was promising to provide, quote, dating safety tools that protect women. The idea was that women could ask other women if they'd had any bad or even dangerous experiences with a particular man, and that all this communication would happen on a quote, secure, anonymous platform.
Speaker 1 [00:03:15]: When you sign up for Tea, the app wants to verify that you're a woman, and the way that they elected to do that is it asks women to post selfies. The app uses this platform that Google owns where you set up and deploy your mobile app, and they misconfigured how that was used, and anyone could access a bunch of data, including all those selfies, or at least a large number of them, thousands. I think it was like seventy thousand or something. Selfies were available via this open Google Cloud compute instance.

Speaker 2 [00:03:53]: So, a quick, simplified explanation of what's happening here. Tea was using an app development platform that lets you store data in a couple of different ways. You can set it to private and require authentication to access the data, or you can set it to public and not require any authentication at all. And this is what happened. Tea's buckets were public, meaning that essentially anyone who knew the URL could access them. That spread on 4chan, and this is how those users discovered the vulnerability.
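[Editor's note: To make the "public bucket" point concrete, here is a minimal sketch of what "public and no authentication at all" means in practice. The bucket name is hypothetical, and Google Cloud Storage's public JSON listing endpoint stands in for whatever configuration Tea actually used; the point is that a world-readable bucket answers anyone who asks, with no credentials involved.]

```python
# Minimal sketch: listing a world-readable storage bucket anonymously.
# "some-app-user-uploads" is a hypothetical bucket name; the Google Cloud
# Storage JSON API is used here as a stand-in for Tea's actual setup.
import requests

BUCKET = "some-app-user-uploads"  # hypothetical

# If the bucket is set to public, this succeeds with no login, no API key,
# and no token -- just a URL that anyone can request.
resp = requests.get(
    f"https://storage.googleapis.com/storage/v1/b/{BUCKET}/o",
    params={"maxResults": 100},
)

if resp.ok:
    # Every object name (e.g. every uploaded selfie) is now enumerable,
    # and each object can then be downloaded by URL the same way.
    for item in resp.json().get("items", []):
        print(f"https://storage.googleapis.com/{BUCKET}/{item['name']}")
else:
    # A correctly locked-down bucket returns 401/403 here instead.
    print("Listing denied:", resp.status_code)
```

[Nothing in that sketch rises to "hacking": it is a single unauthenticated HTTP request, which is why the step-by-step instructions that circulated were so easy to follow.]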
Speaker 1 [00:04:25]: People were already downloading those images and sharing them and making them available elsewhere, and we immediately contacted the app. It took them a few hours to shut down this access, but by then the information was all out there and reposted elsewhere on the internet. And 4chan, I think I don't need to explain, is notoriously racist and sexist. There was, like, this gleeful rummaging through the data and sharing images of these women and saying how attractive or unattractive they were, and trying to get as much data on them as possible, in an attempt to humiliate them and humiliate women in general who used this app in order to identify and avoid men they thought were dangerous. And this was like a, haha, you thought that you would use this to do something against men, now it blew up in your face, and we're going to make your life hell for even trying to use an app like this.

Speaker 2 [00:05:35]: The data was more than just out there. It was being used.

Speaker 1 [00:05:41]: There was a version of the app a long time ago where, rather than ask for a selfie, it asked for users' IDs. So for some women, it wasn't just, like, their picture, which you might be able to use to find them in various ways, but some of them were like, here's a picture of my ID, and here's my address, and here's my full legal name and eye color and all of that. And people were like, you know, we can find them. It's just, like, a bunch of vile language about women, and trying to talk about it in a way that might scare someone.

Speaker 2 [00:06:14]: How many people's data was exposed?

Speaker 1 [00:06:17]: So at this point, I would say conservatively, seventy two thousand users.

Speaker 2 [00:06:24]: The Tea app has said that seventy two thousand images were exposed; thirteen thousand of those were selfies and IDs. This is identifying information that's accessible to anyone who knows where to look. Essentially, the back end of this app was left open.

Speaker 1 [00:06:40]: That's kind of the initial thing that people found. It obviously requires technical expertise in order to find that out and then use it, but not, like, a computer science degree level of expertise. In fact, when this first hit 4chan, they weren't only talking about the data and linking to it, they also offered detailed instructions on how to access the data yourself.

Speaker 2 [00:07:11]: This obviously is very bad, but it gets worse, because a few days later they found another breach, which takes it to another level.
Speaker 1 [00:07:23]: A security researcher gets in touch with us and says, hey, I could still access a lot of Tea app data after Tea said they fixed the issue. And that is true in the sense that they fixed the issue we reported on, but he found a completely separate issue that was even worse. And what happened there is, you need an API key in order to access all the data on the app on the back end, and what he realized is that every user gets a key in order to let them interface with the app as a user. But that key also allows them to, like, query the server for, like, other things they should not be able to get, and he was able to access a bunch of direct messages. Right? You should not be able to access the back end and see everyone else's messages with, like, your personal user API key. That is just a huge, huge error. So it's like, you join the app, you talk to other users, you discover that somebody has information about a man you're dating. You can take that conversation into a direct message conversation, where understandably you would think that conversation is private. Right? And he could access all of that. He managed to get his hands on one point one million direct messages, and those messages are as recent as the day of the initial breach.

Speaker 2 [00:08:55]: Okay. So, an API key is a pretty common way for websites or apps to identify and authenticate a user. Usually, the way this works is that your API key is associated with your individual account, and the only things it can unlock are the things that are connected, again, to your account. But in the case of the Tea app, each user's API key could give them access to other people's messages.
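[Editor's note: What's being described is what security folks call broken object-level authorization: the server verifies that the caller holds a valid key, but never verifies that the requested resource belongs to that caller. Below is a minimal sketch of that bug, with hypothetical routes and data rather than Tea's actual code.]

```python
# Minimal sketch of broken object-level authorization. Routes and data are
# hypothetical, not Tea's actual code. The server authenticates the key,
# but never checks that the requested mailbox belongs to the key's owner.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

API_KEYS = {"key-alice": "alice", "key-bob": "bob"}        # key -> account
MESSAGES = {"alice": ["dm one", "dm two"], "bob": ["dm three"]}

@app.route("/messages/<user>")
def get_messages(user):
    caller = API_KEYS.get(request.headers.get("X-Api-Key", ""))
    if caller is None:
        abort(401)  # authentication: is this a real key at all?
    # BUG: no authorization step. Any valid key can read anyone's DMs:
    #   curl -H "X-Api-Key: key-bob" http://localhost:5000/messages/alice
    # The entire fix is one comparison:
    #   if caller != user: abort(403)
    return jsonify(MESSAGES.get(user, []))

if __name__ == "__main__":
    app.run()
```

[The missing check is a single line, which is part of why this class of bug is so common, and so damaging at scale.]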
Speaker 2 [00:09:21]: Also, how does this breach happen? How does this stuff get leaked?

Speaker 1 [00:09:25]: So there's two parts of it. One is just the way that they used this Google platform in order to deploy their app. You should not be able to get into it without what's called a token in order to view the data. But just, like, the way they set it up, the token was visible to everyone, so you could kind of, like, ping the app and then see the token, and then use the token to look through all the information.
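[Editor's note: A hedged sketch of the failure mode Emanuel is describing, with an entirely hypothetical endpoint and field names (`.invalid` is a reserved, non-resolving domain): if an unauthenticated "ping" of the backend returns a privileged token, any caller can replay that token and read whatever it was supposed to guard.]

```python
# Sketch of the "token visible to everyone" failure mode. The endpoint and
# field names are hypothetical; this is the shape of the flaw, not Tea's code.
import requests

BASE = "https://api.example-tea-like-app.invalid"  # hypothetical backend

# Step 1: "ping the app" -- an endpoint anyone can call without logging in.
# If the response happens to include a privileged token, the gate is open.
config = requests.get(f"{BASE}/config").json()
token = config["access_token"]  # a credential the server simply hands out

# Step 2: replay the leaked token as if we were the app itself.
users = requests.get(
    f"{BASE}/users",
    headers={"Authorization": f"Bearer {token}"},
).json()
print(f"fetched {len(users)} records with a token we were simply given")
```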
Speaker 2 [00:09:57]: At this point, you might have noticed something: that neither me nor Emanuel have really used the word hack. We'll probably use it casually later, but there's a reason that we don't really say it often in this situation, and that's because when most people think of a hack, they're thinking, like, Neo in The Matrix. But if you've been listening to what we've been saying, you're probably thinking, wait a second, none of this really sounds all that hard. And you'd be right. A lot of this did not require any sophisticated knowledge at all. When the first breach was posted, people were writing easy-to-follow tutorials on 4chan so that anyone could download the data for themselves. To borrow a phrase that a previous guest used in a past episode, Tea was cartoonishly hackable.

Speaker 1 [00:10:46]: I was talking to my wife about this as I was reporting the story, and she was like, who hacked this? Who is the hacker that broke into this company? And that is a fair question, and I think you can say, nominally, maybe it was someone on 4chan who was looking around because they saw the app in the news and they were like, I don't know, fuck these women, and poked around, found a vulnerability, and then shared it, unfortunately, and he is the hacker. But here I think the metaphor is more like, you put all your money in the bank, and the bank left the door open and the vault door open and left no one in charge during the night, and somebody walked in and took all the money.

Speaker 2 [00:11:23]: I mean, to use your metaphor about the bank thing: the first issue that we discover is that the bank left no guards, they left the door open, they left it unlocked, and somebody walked in and took some money.

Speaker 4 [00:11:36]: That is bad.

Speaker 2 [00:11:37]: This is like somebody else walks in and finds out, hey, there's a key lying on the ground. I can get into this one person's vault. But wait a second, this also opens almost the entire rest of the bank. I can get into everybody's personal information also.

Speaker 1 [00:11:51]: Yeah, I would tweak that metaphor even a little to make it a little worse, where it's just like, I heard in the news this bank just got robbed, let me just check if they still left the door open. And he was like, yep, actually there's another, different door that's open, and I walked in and got more stuff.

Speaker 2 [00:12:10]: The metaphors keep getting worse, man. Like, I don't even want to play this metaphor game. Basically, suffice to say, hopefully anybody listening or watching understands this is really bad. Being able to get people's photos, that's not great. That's not good at all, right? Being able to get people's IDs, that's obviously terrible. Being able to get direct messages opens up an entirely different... it's just an entirely different kind of vulnerability here. So now we know what happened, and it's not good. But what could happen next is even worse.

Speaker 2 [00:13:05]: All right, let me paint the picture here. Imagine that you went outside for a walk and you're passing in front of an apartment building, and you look down on the street and you see this key that's labeled Apartment 123, and you pick it up and decide to give it a try. So you go up to apartment 123, put the key in the lock, and it opens.
Speaker 2 [00:13:24]: And then you get curious, and you decide to try that same key on apartment 124, and that door also opens. So does the door to apartment 125, and so does every apartment in every building in the city of Dallas, Texas. That's roughly the population of women whose data could have been exposed here. And it's not just what was leaked, it's how those photos, how those addresses, how those private messages could be used against these women.

Speaker 1 [00:13:53]: We've reported on many breaches and hacks over the years, and I would say this one is one of the worst ones I've seen. And there's a couple of things that I think really compound the issues. The first one, I think, is you make this app for women. All your users are women, and it's like, for their, I don't know, interests, and it's somehow, I wouldn't say that it's, like, hostile to men, but it's like, it's not a space for men, right? And some men, like these men on 4chan, find that offensive, the mere existence of this. One thing that people have done, now that all these thousands of selfies are floating out there: if people recall, like, the foundation myth of Facebook is that Mark Zuckerberg created this website called Facemash, and what it did is it took all the profile pictures from the Harvard facebooks and put two pictures of two women next to one another, and you decided who was hotter, and it created, like, a ranking of all the female students at Harvard. So somebody basically did something like that for the selfies in the app, and they did it... it collects all this data, all these votes, and then it presents, like, the top fifty women who were voted most attractive and, like, the bottom fifty, which is just, obviously, such a mean, terrible thing to do and a violation of privacy. And it's like, that alone, I think, is really awful and mean-spirited.
Speaker 2 [00:15:28]: One 4chan user took the selfies and IDs that they downloaded and created a website that they were calling Spilled Tea, which used that data to rank the users based on attractiveness. Someone else created a map on Google Maps that was supposedly showing the locations of the women who were affected by the breach. And I have to stress again, once you have the data, things like this are not hard to do.

Speaker 1 [00:15:52]: We just had, in the UK, this Online Safety Act pass that now requires internet platforms to verify that users are of a certain age to view certain mature content. And the way that a lot of platforms are doing this, for example, let's say Reddit, is it asks the user to upload a picture of their ID or a selfie in order to verify that they're an adult. But, like, part of the deal, in order to make users feel safe, is that Reddit doesn't see the image, and this third-party service that does the verification, they promise to only keep the image for seven days, and after that they just hold on to your verification and they don't have the image anymore. Anytime you're, like, handling sensitive information, that is a good way to do it. And it doesn't seem like Tea did that. So that is, like, bad security practices. That is, like, a pretty bad way to do things.

Speaker 2 [00:16:51]: Not only is it a bad way to do things, it's precisely the opposite of what the Tea app said it was doing. Their privacy policy said photos used for verification purposes were, quote, securely processed and stored only temporarily and will be deleted immediately following the completion of the verification process, end quote. Clearly, they were not doing that.
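[Editor's note: The data-minimization practice Emanuel describes is easy to express in code: verify once, persist only the verdict, and never write the image anywhere durable. All names here are hypothetical; this is the shape of the pattern, not any vendor's implementation.]

```python
# Sketch of data minimization for ID checks: verify once, keep only the
# verdict, never persist the image. All names are hypothetical.
import time

VERIFICATIONS: dict[str, dict] = {}  # user_id -> {"verified", "checked_at"}

def check_id(id_image: bytes) -> bool:
    # Placeholder for a real ID/selfie check, ideally performed by a
    # vendor that also deletes the image promptly (e.g. within days).
    return len(id_image) > 0

def verify_user(user_id: str, id_image: bytes) -> bool:
    verdict = check_id(id_image)
    # Persist only the outcome and a timestamp -- enough to honor the
    # verification later without retaining the sensitive document itself.
    VERIFICATIONS[user_id] = {"verified": verdict, "checked_at": time.time()}
    # The image only ever lived in this function's memory, so there is
    # nothing for a leaky bucket or an over-permissive API to expose.
    return verdict
```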
Speaker 1 [00:17:16]: The other aspect, which is much more serious, is, not only do you create this app for women, you then invite them to talk to each other over direct messages about the most sensitive things that can happen in a person's life. And you're talking about, like, husbands cheating. There were messages in the data that we got about women talking about their abortions. There were people talking about, like, criminal things that they are accusing other men of doing. And all that information is now out there. And not only is it out there, the nature of the conversation, paired with the type of data already out there, makes it really easy to identify specific people. And we obviously didn't share this information in any story, but imagine if you're, like, one of these scorned men. Imagine if you're just one of these 4chan dudes that is angry about the existence of this app. It's just, like, really horrible.

Speaker 2 [00:18:16]: What has the company said since all this came out?

Speaker 1 [00:18:21]: They apologized. They said initially they closed the security issue with the first hack. After the second hack, they said they fixed that issue as well, and they also turned off direct messages in the app, so, like, you can no longer DM in the app. And then they said they hired a cybersecurity firm to come and help them. I don't know who that is. They said they contacted law enforcement. I don't know which agency that is. They haven't said exactly.

Speaker 2 [00:18:50]: You just said something, that Tea said that they closed the security issue. How do you close the issue if... I mean, the data is just out now, right? The cat's out of the bag.

Speaker 1 [00:19:03]: Yeah. And this is what I meant. Like, from the very first moment that I saw the 4chan thread, I was like, it's too late. It's like, they can stop the access, but the information has been extracted from the app and shared elsewhere, so it's just out there forever, and the images are out there forever.

Speaker 2 [00:19:25]: And this is where the lawsuit comes in. One woman is suing Tea, and her lawyers say that she's hoping that other women will join it as a class action lawsuit.
Speaker 2 [00:19:36]: Just reading through the document, you can see one incredibly important aspect of why this data breach could be such a risk. So I'm gonna read from it, quote: many users are domestic violence survivors who have intentionally moved to new locations and taken steps to conceal their private information, including their new addresses and that of their minor children, in order to protect themselves from domestic violence perpetrators, end quote. As you can imagine, those ID cards that are out there now made that all irrelevant. The lawsuit then goes on to say that Tea, quote, calculated to increase its own profits at the expense of plaintiff and class members by utilizing cheaper, ineffective security measures, end quote.

Speaker 1 [00:20:24]: It's like a violation of a reasonable expectation of privacy. There was, like, an expectation of privacy, and this person talked about, like, very private things, and that trust and expectation were violated.

Speaker 2 [00:20:37]: And this is probably where you'd expect me to tell you that the app has shut down, everyone's suing, the party's over, it's all done. But that's not what's happening. The Tea app is still operational. They just announced that they've signed up over a million new users, again, after all these breaches, and some women are complaining that they still can't get access to the app. That's after the break.

Speaker 2 [00:21:12]: All right. I want to be really clear here. There are some people who do not believe that an app like the Tea app should exist at all, and there's a range of reasons for this. Maybe you think that there's a risk that someone might get on the app and say something that's either just frivolous or even a straight-up lie. Fair enough. But stick with me here, because I want to propose to you that it kind of doesn't matter what you personally think about the morality of what this app is offering, for two reasons.
Speaker 2 [00:21:43]: The first is that I think there's an issue that is actually a lot deeper than the supposed morality or immorality of what's happening here, but we'll get to that in a bit. Before that, the other reason is that this is not the first time that women have used technology in this way. Women have always had to find ways to warn other women about dangerous men that exist around them. Sometimes that meant talking to other women directly, but when technology became more available, of course they used that too. And that's something I wanted to talk to Sam Cole about. She's another journalist and a co-founder at 404 Media, and she's done a lot of reporting on the concept that it looks like Tea is based on: these Facebook groups that had titles like Are We Dating the Same Guy?

Speaker 5 [00:22:31]: A couple of years ago, these Facebook groups went pretty viral, where women were posting a picture of a guy and saying, I'm going to go on a Tinder date with him later, does he have any red flags? And red flags are, like, the code in the Facebook group for something's amiss here, watch out for some kind of bad behavior. And then people can comment on the posts and on the photos and say, yeah, this guy pressured me into sex, or ghosted me, or is married. There's a huge spectrum of what a red flag is in these communities. It was a much more localized and, I don't know, I guess more community-based situation than Tea, I would say.

Speaker 2 [00:23:19]: The concept, or we could say the promise, of Tea I think stems from a good thing. As the website says, quote, dating safety tools that protect women. And unfortunately, there is a real need for that.

Speaker 5 [00:23:35]: We have, like, information about how this works, and, like, how many people are affected by that type of violence and abuse on dates.
Speaker 5 [00:23:45]: I think it's, like, almost half. It's close to forty-five percent or something of college women who experience some kind of, like, verbal, physical, digital, even, like, stalking type of abuse as a part of dating.

Speaker 4 [00:23:57]: Right, which is so high.

Speaker 5 [00:23:59]: It's a really risky thing to go on what is essentially, like, a blind date, like, someone who's a stranger who you only know from pictures and text. It's just the extremes that we have to go to to stay safe, because we're scared of something, of the worst-case scenario happening, because we've heard it happened to other people all the time.

Speaker 2 [00:24:21]: To bring in some of the complaints that I've seen online: anybody could get on and post anything. They could lie and say anything that they want to say. How do the communities deal with that?

Speaker 5 [00:24:31]: It's something that the communities are very motivated to keep in check themselves, because they want to, you know, make sure there isn't actually, like, just straight-up false allegations happening. In the Facebook communities, there are rules that you have to agree to before you enter, and one of the rules is no libel, defamation, or copyright infringement. You have to be prepared to show proof of anything you say about somebody, and you're warned that you're posting at your own risk. One of the rules says, if you don't have any proof of allegations you make against someone posted in this group, you're in danger of being sued for libel and/or defamation. It's like, don't be mean-spirited, don't be judgmental, don't make comments based on somebody's looks or age or occupation or anything like that, don't make fun of people. It's not just a shit-talking group. The purpose of it, at least the stated purpose, is to protect women from going on dates with guys who are, like, known bad actors.

Speaker 2 [00:25:28]: If you really think about it, this all makes a lot of sense. These groups are focused on information.
Speaker 2 [00:25:34]: If someone's posting bad information, fake allegations, or, I don't know, if someone's complaining that some guy likes to chew with his mouth open or something like that, that makes it really inefficient if you just need to know if the person you're going to see tonight is dangerous or not. These groups are not for messing around. And that's where Tea starts to differ from these community groups, because Tea is promising that it's fun. And I'm speaking in the present tense here, because the Tea app did not shut down. It's still operational. They're still signing people up.

Speaker 5 [00:26:07]: It's so crazy.

Speaker 2 [00:26:08]: They are still posting on their Instagram. The audacity. I'm looking at this now. The caption on this: Welcome to the ultimate girls-only group chat in app form. Whether you're here to make new friends, share stories, or find support, Tea is where women across the US connect, share advice, and get each other. Tea is the girl's girl app where the girls get to yap. Welcome to the Tea Party.

Speaker 5 [00:26:33]: Dude, shoot me. I don't know what to say. It's just fucking crazy. It's fucking crazy. It's crazy that they are trying to recover and damage-control out of this as fast as possible when, like, your app has just been blown to smithereens by hackers and you're, like, exposing the data of all of your users, and you're like, we're fine, everything is fine, come on back, come on in if you're waiting to get in. And people are, right? Like, they're still trying to get in.

Speaker 2 [00:27:05]: And this is where I have to say that they have addressed this in some form, right? Ladies of Tea, we have an update regarding the cyber incident that took place last week and wanted to share it with you as soon as possible. Tea was born from the mission to empower, support, and amplify the voices of women navigating the modern dating world. Our mission remains the same. While we acknowledge this serious cyber incident, we also acknowledge that Tea is needed now more than ever.
Speaker 5 [00:27:34]: It's just wild. It's wild to, like, PR-speak your way out of such a disaster.

Speaker 2 [00:27:40]: One of the comments I was seeing was: y'all are focused on the wrong things, we do not care about the leak, work on approving us in the queue. There's people still complaining about, hey, okay, stop talking about this hack stuff, why have you not approved my account yet?

Speaker 5 [00:27:55]: It's just wild. We're so, like, desperate for convenient solutions at this point. Just get me what I want, immediately, right now. I don't care if there's a huge privacy disaster as a result of negligence from this app.

Speaker 2 [00:28:15]: By the way, this is also kind of important: the word tea. It has a deeper background in queer and Black circles, but now it's just kind of used as mainstream slang for gossip. Like, if someone says, what's the tea on our new coworker, that just means, hey, if you heard any gossip, please tell me. So there was an element of this app that was less about safety and more about promising fun, basically like any other social media app. It was built off of FOMO. Even if you don't have any real concerns about men around you, you don't want to get left out, which is really counter to what these original community groups were all about.

Speaker 1 [00:28:54]: If you pull up a lot of the images of the website and the promotional materials of the app, it's like a woman whispering in another woman's ear. All the promotional videos on social media are, like, other women inviting you to use the app and stuff like that. And I'm not saying this is why the app failed so terribly, but it's a guy. The CEO is a guy. It's a guy who started this app, and he put a bunch of women, like, as the face of it, and failed them, terribly. Failed them, just absolutely terribly.

Speaker 2 [00:29:29]: We're talking here about a man named Sean Cook, the founder of Tea.
Speaker 2 [00:29:35]: According to the company's website, he started the app, quote, after witnessing his mother's terrifying experience with online dating, not only being catfished, but unknowingly engaging with men who had criminal records.

Speaker 1 [00:29:46]: That's as with everything else. It's, like, something that started as, like, a community effort, a not-for-profit thing, and someone saw that and said, okay, this is very popular, and I can probably make some money off this by putting it on the App Store.

Speaker 2 [00:30:02]: And we do have to keep in mind here that this is not a grassroots organization. This is a company that needs to make money. Back in May, the founder got on a podcast and talked about his prospects for getting angel investment and making money on this.

Speaker 6 [00:30:17]: I've wanted us to just build something really powerful. We have been receiving a lot of business development interest in partnering and investing in Tea and acquiring Tea, and we do have some executives and some pretty renowned angel investors in the dating space.

Speaker 2 [00:30:35]: The founder was actually answering questions about stuff, you know, back when the Tea app was on the upswing. He was talking about, yeah, we've gotten interest from angel investors who are interested in us. There was monetization in the app. You could pay for features in the app. And any investor is not going to invest in your company unless you've already shown that you can make money, and the Tea app does. Through a subscription model, a user can pay fifteen dollars a month and get things like unlimited searches, reverse image searches, background checks, and phone number lookups. By the way, reverse image searches, phone number lookups, a lot of that stuff can be done with Google or tools that are available for very cheap or free. Tea was just packaging these things in an easy-to-use place, along with a centralized chat with other users. And that's where Tea's marketing comes in, where they make it look like the place that all your friends are hanging out.
Speaker 2 [00:31:31]: And in the early stages of an app, when you're raising money, that's what you need more than anything: users. And of course, there's a profit motive here to play down what's going on, because really, the way they're portraying it is: there was a hack, problem fixed. Something bad happened, and we fixed it. The cyber incident, quote unquote, is over now.

Speaker 5 [00:31:51]: And this is probably, like, great for them. It just shows that, like, we can fuck it up horrendously and people are still banging on the door, is, like, something that I would bring to an investor. Sure, look how popular this is, like, how much this is filling a need.

Speaker 2 [00:32:06]: We had two catastrophic leaks and people are still trying to get in.

Speaker 4 [00:32:09]: Yeah, that's so bleak.

Speaker 2 [00:32:11]: I had not thought about that, that they could probably still make money here. Honestly, why wouldn't they?

Speaker 5 [00:32:18]: We're in the fuck-around-and-find-out era of society.

Speaker 2 [00:32:21]: The fact that there are women who are still clamoring to get onto this app might be really strange to you, and you might be thinking, yo, what is wrong with these women? But also think about where the users are getting their information about the leak: from the app itself. Most of us really tend to trust the technology we use, especially when it's billing itself as a safety app, because, I mean, if you can't trust the safety app, what can you trust? And reading the Tea app's Instagram posts, it makes it sound like this whole thing wasn't really a big deal, and whatever happened, it's been taken care of by now. Which, again, isn't true, because all those photos, the names, the addresses, potentially messages, it's all out there now. It will never go away. And even for the people who have read up on all this, we have to think about the alternatives here.
Speaker 5 [00:33:12]: I guess people are kind of doing that math and saying, even though it's clearly not a safe place to trust with my data or with my information, I'm weighing that against my need for safety in the dating world. All of my information has already been exposed on the internet in the past by every big company. Every big health tech company, every major corporation has had some kind of big breach at this point. So what's another one? It's trying to solve a problem that started with, or is facilitated by, dating apps themselves. Like, being worried about going on a date with someone? Yeah, because you're cold-calling people on a dating app, not meeting them through your community or through in-person means beforehand. You're totally right, which is a new phenomenon. It's like a new thing.

Speaker 2 [00:34:02]: Yeah, we've now got an app to solve problems that another app caused. There's a lot about the situation that still does not make sense to me, and I'm not alone. It's so bad that I've legit seen people online saying, hey, I bet that this was some kind of con, trying to trick women into leaking all their information on their own. And look, I don't think that's the case, but I still am wondering, how could this happen? Is this just some kind of negligence? Is there any indication that Tea knew that their security practices were not up to standard before this?

Speaker 1 [00:34:37]: It's sort of like, did they know how bad of a job they were doing? I have not seen signs of that, but that is in itself an issue, right? If you're building an app that has one point six million users who are invited to talk about, like, the most intimate aspects of their life, and you don't even know that you're doing a bad job, I would say that is the problem. I think the question is more like, who was in charge of making sure that people's information was protected properly? Was that even a job at the company?

Speaker 4 [00:35:13]: Yeah?
597 00:35:15,360 --> 00:35:19,000 Speaker 4: the CEO wore? Or did they just pull 598 00:35:18,840 --> 00:35:21,319 Speaker 1: some solution off the shelf? Those are the kind of 599 00:35:21,400 --> 00:35:22,520 Speaker 1: questions that I'm asking. 600 00:35:22,840 --> 00:35:25,240 Speaker 2: I think those are questions everybody should be asking, because, 601 00:35:25,719 --> 00:35:28,319 Speaker 2: off the bat, it has never occurred to me to make 602 00:35:28,360 --> 00:35:30,400 Speaker 2: an app like this. But if I were to make 603 00:35:30,440 --> 00:35:35,319 Speaker 2: an app where vulnerable women are coming together to try 604 00:35:35,320 --> 00:35:38,279 Speaker 2: to protect each other, honestly, 4chan is going to 605 00:35:38,320 --> 00:35:40,960 Speaker 2: be my base case. It's going to be that somebody 606 00:35:40,960 --> 00:35:42,640 Speaker 2: on 4chan is going to get mad about this, 607 00:35:43,600 --> 00:35:46,320 Speaker 2: and I'm going to have that scenario in mind and say, okay, 608 00:35:46,440 --> 00:35:50,280 Speaker 2: how do we protect against this? Eventually somebody's gonna want 609 00:35:50,600 --> 00:35:53,000 Speaker 2: to attack this app. I'm making sure that this thing 610 00:35:53,040 --> 00:35:53,600 Speaker 2: is secure. 611 00:35:53,920 --> 00:35:57,680 Speaker 1: But what I really find offensive about Tea, and I think 612 00:35:57,760 --> 00:36:00,239 Speaker 1: why it's such a big story, why it's so shocking, is 613 00:36:01,280 --> 00:36:06,239 Speaker 1: this was presented to women as a solution for that. 614 00:36:06,480 --> 00:36:09,279 Speaker 1: Right? It's like, here's an app to do this thing 615 00:36:09,360 --> 00:36:14,120 Speaker 1: that is very complicated and risky and private. We made 616 00:36:14,160 --> 00:36:15,879 Speaker 1: an app for it. Boom, you don't have to worry 617 00:36:15,920 --> 00:36:19,799 Speaker 1: about it. And it had the opposite effect. Yeah, I guess 618 00:36:19,920 --> 00:36:22,560 Speaker 1: that's probably like a bigger lesson to take away here. 619 00:36:22,600 --> 00:36:26,360 Speaker 1: It's like, you need to be very suspicious of any 620 00:36:26,480 --> 00:36:33,200 Speaker 1: app or technology that is suggesting it has an easy 621 00:36:33,239 --> 00:36:35,640 Speaker 1: solution for a very complicated problem. 622 00:36:35,960 --> 00:36:38,440 Speaker 5: It's pretty clear that this is a conversation that needed 623 00:36:38,440 --> 00:36:40,520 Speaker 5: to be had, that unfortunately exploded the way that it 624 00:36:41,040 --> 00:36:43,839 Speaker 5: did as the result of a huge privacy violation. I think 625 00:36:43,880 --> 00:36:47,200 Speaker 5: it's a much deeper societal problem, the fact 626 00:36:47,239 --> 00:36:50,239 Speaker 5: that this exists and needs to exist. It makes a 627 00:36:50,239 --> 00:36:52,319 Speaker 5: lot of sense why these exist. I really wish they didn't 628 00:36:52,360 --> 00:36:52,560 Speaker 5: have to. 629 00:36:52,920 --> 00:36:55,920 Speaker 1: The real slap in the face is how, you know, 630 00:36:56,640 --> 00:36:59,880 Speaker 1: this app handled your data. They just left it, pretty much. 631 00:37:01,680 --> 00:37:03,600 Speaker 2: And this is where we come back to why I 632 00:37:03,640 --> 00:37:06,000 Speaker 2: don't really think it matters whether or not you like 633 00:37:06,080 --> 00:37:08,920 Speaker 2: the idea of an app where women talk about men 634 00:37:09,040 --> 00:37:12,279 Speaker 2: or not. Because, look, I've seen it, you probably have too.
635 00:37:12,360 --> 00:37:15,919 Speaker 2: There is a subset of the male population that's saying, oh, well, 636 00:37:15,920 --> 00:37:17,920 Speaker 2: it serves them right. If they're gonna get an app 637 00:37:18,040 --> 00:37:20,279 Speaker 2: and they're gonna be doxxing men and talking about men, 638 00:37:20,320 --> 00:37:21,080 Speaker 2: well, we're 639 00:37:20,880 --> 00:37:22,960 Speaker 4: gonna get you doxxed too. Serves you right. 640 00:37:23,360 --> 00:37:26,960 Speaker 2: And look, I'm gonna go ahead and leave my opinion 641 00:37:27,120 --> 00:37:30,680 Speaker 2: about that standpoint aside, because you probably already know what 642 00:37:30,719 --> 00:37:33,520 Speaker 2: that is. But let me just point out that it's 643 00:37:33,560 --> 00:37:36,120 Speaker 2: not just women getting doxxed here, because let's keep in 644 00:37:36,160 --> 00:37:40,080 Speaker 2: mind, women were on this app to talk about men. 645 00:37:40,320 --> 00:37:43,560 Speaker 2: So if there are indeed messages floating around, then there 646 00:37:43,560 --> 00:37:49,440 Speaker 2: are also potentially details, potentially serious allegations, about men that 647 00:37:49,480 --> 00:37:53,000 Speaker 2: are also floating around out in the open. Data leaks 648 00:37:53,280 --> 00:37:59,200 Speaker 2: do not discriminate. This is bad all around. But again, 649 00:37:59,360 --> 00:38:01,880 Speaker 2: this also comes down, I think, to a way that, 650 00:38:02,520 --> 00:38:05,759 Speaker 2: as a society, we've just started to trust every app 651 00:38:05,800 --> 00:38:10,360 Speaker 2: we use with incredibly sensitive information, and then we shrug 652 00:38:10,400 --> 00:38:13,600 Speaker 2: our shoulders when someone tells us to be careful. For me, 653 00:38:14,600 --> 00:38:18,680 Speaker 2: this part, at the very least, is really cut and dry, 654 00:38:18,719 --> 00:38:21,879 Speaker 2: whatever the SaaS marketing might have been. If the core 655 00:38:21,960 --> 00:38:25,760 Speaker 2: premise of a company is an app that provides its users safety, 656 00:38:26,160 --> 00:38:30,399 Speaker 2: and the founder's messaging is also all about safety, then 657 00:38:30,719 --> 00:38:33,840 Speaker 2: I would think that the response to a data breach 658 00:38:34,560 --> 00:38:38,640 Speaker 2: would be a lot more serious. Usually it would sound 659 00:38:38,760 --> 00:38:39,719 Speaker 2: something more like... 660 00:38:41,239 --> 00:38:44,239 Speaker 1: Hey, there's been a breach. Here are the steps that 661 00:38:44,280 --> 00:38:47,200 Speaker 1: we are taking. We have hired this company, they have 662 00:38:47,320 --> 00:38:51,080 Speaker 1: done an audit, they have found this information. We are 663 00:38:51,160 --> 00:38:54,680 Speaker 1: contacting users and letting them know who is affected. The 664 00:38:54,719 --> 00:38:57,120 Speaker 1: people who are affected are going to get this and 665 00:38:57,120 --> 00:38:59,520 Speaker 1: that service in order to try and protect their privacy. 666 00:38:59,640 --> 00:39:01,200 Speaker 4: Yeah, pretty standard. 667 00:39:00,760 --> 00:39:03,400 Speaker 1: Things. That should be the tone. It should not be 668 00:39:03,600 --> 00:39:07,360 Speaker 1: like, we are trying to grow massively still, girlies, and 669 00:39:08,239 --> 00:39:09,640 Speaker 1: get you all into this app.
670 00:39:10,400 --> 00:39:12,600 Speaker 2: The thing here that I want to get to is 671 00:39:12,640 --> 00:39:18,200 Speaker 2: that this is an app, and I think maybe it's 672 00:39:18,239 --> 00:39:22,720 Speaker 2: worth really making that distinction between a community of women 673 00:39:23,320 --> 00:39:27,719 Speaker 2: who are working together to protect each other, yeah, and 674 00:39:27,840 --> 00:39:32,200 Speaker 2: an app. Because we're recognizing here that there is a 675 00:39:32,360 --> 00:39:37,160 Speaker 2: problem of violence against women. This is a real problem. 676 00:39:37,200 --> 00:39:40,560 Speaker 2: This exists. Women experience this an order of magnitude 677 00:39:40,560 --> 00:39:44,719 Speaker 2: more than men do. Misogyny exists. These are real societal problems, 678 00:39:45,000 --> 00:39:46,920 Speaker 2: and I think we've gotten so used to the idea that we can 679 00:39:47,040 --> 00:39:48,920 Speaker 2: use an app to fix that, like an app to 680 00:39:48,960 --> 00:39:52,520 Speaker 2: fix racism, an app to fix sexism. That idea is, listen, 681 00:39:52,560 --> 00:39:53,520 Speaker 2: it's really seductive. 682 00:39:53,920 --> 00:39:55,719 Speaker 5: I do wish that we could solve these problems with 683 00:39:56,280 --> 00:39:59,520 Speaker 5: an app, but like, we've just lost a lot of 684 00:40:00,080 --> 00:40:03,160 Speaker 5: the skills that would have been required to solve this 685 00:40:03,200 --> 00:40:04,799 Speaker 5: in a way that would have been more private and 686 00:40:04,880 --> 00:40:09,560 Speaker 5: safe and effective, probably, which would be like knowing your neighbors, 687 00:40:09,640 --> 00:40:13,239 Speaker 5: knowing your community, trusting your friends, having a group of 688 00:40:13,239 --> 00:40:15,520 Speaker 5: people who you trust in. It would have been better 689 00:40:15,520 --> 00:40:19,279 Speaker 5: as like a group chat, but you can't really monetize 690 00:40:19,320 --> 00:40:23,919 Speaker 5: against that without turning it into an app or website or 691 00:40:24,239 --> 00:40:26,759 Speaker 5: like a Facebook group. Ideally, we would have done it 692 00:40:26,840 --> 00:40:32,440 Speaker 5: through in-person connections, but it's become such a fragmented 693 00:40:32,840 --> 00:40:35,640 Speaker 5: aspect of our society that we have to use these 694 00:40:35,960 --> 00:40:37,760 Speaker 5: apps to facilitate connection. 695 00:40:37,880 --> 00:40:43,040 Speaker 3: At this point, none of this had to happen. 696 00:40:43,600 --> 00:40:47,960 Speaker 2: Tea's inability to implement just basic security methods meant that 697 00:40:47,960 --> 00:40:51,920 Speaker 2: they violated the security and privacy of their users. And 698 00:40:52,080 --> 00:40:55,880 Speaker 2: Tea is still posting on their Instagram account. If you 699 00:40:55,920 --> 00:40:58,839 Speaker 2: look at their handle, the Tea Party Girls, they're talking about 700 00:40:58,880 --> 00:41:01,239 Speaker 2: how great and fun their app is and how they're 701 00:41:01,239 --> 00:41:04,840 Speaker 2: welcoming hundreds of thousands of new users. They're not really 702 00:41:05,000 --> 00:41:09,000 Speaker 2: talking about the breach anymore. It's all about the fun 703 00:41:09,080 --> 00:41:11,640 Speaker 2: that you can have if you join everyone else on 704 00:41:11,680 --> 00:41:15,520 Speaker 2: their app.
And I said, none of this had to happen, 705 00:41:15,600 --> 00:41:17,160 Speaker 2: but if you think about it, or at least if 706 00:41:17,160 --> 00:41:19,640 Speaker 2: I think about it, it kind of feels like it 707 00:41:19,680 --> 00:41:24,600 Speaker 2: was inevitable, because we're starting to get frustrated with these 708 00:41:24,719 --> 00:41:28,839 Speaker 2: complicated problems that exist in society, and then somebody comes 709 00:41:28,840 --> 00:41:31,320 Speaker 2: along and promises that they can fix it with an app, 710 00:41:31,360 --> 00:41:35,240 Speaker 2: and we sigh in relief and we tap the install button. 711 00:41:35,640 --> 00:41:40,439 Speaker 2: We have appified a solution to misogyny. It was never 712 00:41:40,520 --> 00:41:44,120 Speaker 2: going to work. And again, I think our individual opinion 713 00:41:44,160 --> 00:41:47,200 Speaker 2: on whether or not the premise behind Tea was good 714 00:41:47,360 --> 00:41:50,759 Speaker 2: or not doesn't really matter here, because I think two 715 00:41:50,840 --> 00:41:54,279 Speaker 2: things have definitely been proven here. First, just in the 716 00:41:54,320 --> 00:41:56,960 Speaker 2: way that this breach happened, I think it proves the 717 00:41:57,160 --> 00:42:00,279 Speaker 2: kind of just basic anger and hatred that women 718 00:42:00,360 --> 00:42:03,680 Speaker 2: continue to be up against every day just for existing. 719 00:42:04,640 --> 00:42:09,600 Speaker 2: And also, I think it's proven again society's complete inability 720 00:42:09,920 --> 00:42:11,520 Speaker 2: to take any of this stuff 721 00:42:11,680 --> 00:42:12,239 Speaker 4: seriously. 722 00:42:19,960 --> 00:42:22,560 Speaker 2: Thank you so much for checking out another episode of 723 00:42:22,640 --> 00:42:25,920 Speaker 2: kill switch. Big shout out to Sam and Emanuel from 724 00:42:25,960 --> 00:42:26,720 Speaker 2: 404 Media. 725 00:42:26,760 --> 00:42:27,640 Speaker 4: By the way, if 726 00:42:27,560 --> 00:42:29,840 Speaker 2: you're not already subscribed to 404 Media, please 727 00:42:30,400 --> 00:42:33,719 Speaker 2: check them out. We really appreciate their work, as always, 728 00:42:34,040 --> 00:42:35,919 Speaker 2: and let us know what you think about our work. 729 00:42:35,960 --> 00:42:38,719 Speaker 2: Hopefully you appreciate it too. And you know, if there's 730 00:42:38,760 --> 00:42:40,640 Speaker 2: anything you'd like us to cover, or you just want 731 00:42:40,640 --> 00:42:43,720 Speaker 2: to talk, you can hit us at killswitch at 732 00:42:43,800 --> 00:42:47,719 Speaker 2: kaleidoscope dot NYC, or you can find us on Instagram 733 00:42:47,760 --> 00:42:51,480 Speaker 2: at killswitchpod, and you can hit me directly at 734 00:42:51,680 --> 00:42:55,359 Speaker 2: dexdigi, that's d-e-x-d-i-g-i, on 735 00:42:55,440 --> 00:42:58,080 Speaker 2: Instagram or on Bluesky if that's your thing. And, 736 00:42:58,239 --> 00:43:00,560 Speaker 2: you know, leave us a review wherever you happen to 737 00:43:00,880 --> 00:43:04,520 Speaker 2: leave your podcast reviews. It helps other people find the show, 738 00:43:04,680 --> 00:43:08,520 Speaker 2: which in turn helps us keep doing our thing. This 739 00:43:08,560 --> 00:43:12,440 Speaker 2: thing is hosted by me, Dexter Thomas. It's produced by 740 00:43:12,480 --> 00:43:16,879 Speaker 2: Shina Ozaki, Darluk Potts and Kate Osborne.
The theme song 741 00:43:17,080 --> 00:43:20,240 Speaker 2: is by me and Kyle Murdoch, and Kyle also mixes 742 00:43:20,280 --> 00:43:24,400 Speaker 2: the show. From Kaleidoscope, our executive producers are Oz Woloshyn, 743 00:43:24,760 --> 00:43:29,759 Speaker 2: Mangesh Hattikudur and Kate Osborne. From iHeart, our executive producers 744 00:43:29,800 --> 00:43:33,360 Speaker 2: are Katrina Norvell and Nikki Ettore. Catch y'all on the 745 00:43:33,400 --> 00:43:46,040 Speaker 2: next one