Sammy J: Hey guys, it's Sammy J, and welcome to this week's episode of Let's Be Real. This week's episode is with journalist and tech reporter Laurie Segall. You guys, I have always been so fascinated by technology and how social media is affecting all of us, and I think it's really affecting all of us this year, especially as we are quarantined. Laurie Segall has been an incredible journalist, and she has a really interesting perspective because she has been on the ground of technology since Instagram, Facebook, and Twitter were just startups. She has been interviewing these founders for years, and she has an inside perspective that no one else has. I hope you enjoy the episode, and I cannot wait for your feedback.

Sammy J: Hi, Laurie, I am so excited to have you on my podcast. I am fascinated by what you do, because you have been on the ground since Instagram, Facebook, and Twitter were just startups. What's it like seeing the progression of these startups turning into billion-dollar companies?

Laurie Segall: I mean, by the way, when you say that, I'm like, I feel so old.

Sammy J: It's so recent, though. Like, this is so new.

Laurie Segall: I know, I think people forget that. I mean, it's funny, I'm writing a book right now, and I'm talking about my first interactions with the folks from Instagram. And by the way, they're my age. First of all, I know you're eighteen, so you're young, but I'm thirty-five years old. I'm not that old, right?

Sammy J: Not at all.

Laurie Segall: But at the time when I started covering technology, it was just a bunch of us, and we were all really young. And I remember interviewing Kevin Systrom, who had this app called Instagram, and I remember us being out on the West Side Highway, and it was just me and our camera guy. You know, I was still wearing the same black blazer that I wore on every single shoot, because I wasn't sure exactly how to dress yet for television.
And I remember being, you know, super interested in this idea of an app, and being like, well, why would people want to use this photo-sharing app? And you know, there were four people at the company at the time. It was before they sold to Facebook. And it was when I would still ask these founders, and I remember doing this, asking one of the founders of Twitter, why would people use Twitter? I remember saying this to the founder of Uber, Travis Kalanick: why would people get into a stranger's car? Because you forget all of these things. It was really interesting interviewing these founders before they turned into the multimillionaires and the billionaires, when they just felt very much like your peers, people who didn't really play by the rules and had these ideas of things that could change the world. And it was a really optimistic time. Very much in 2008, 2009, when I started covering technology, when I got into journalism as a young journalist, I was just attracted to weirdos, and it was just a bunch of weirdos. And so that's how I got into it.

Sammy J: When you talk to them now, the ones you were friendly with back then, do they still have that peer-like similarity, or do they act like CEOs and feel as though they are above you?

Laurie Segall: Right, that's a good question. Some change and some don't, right? It's like if you knew someone who was in a band that blew up: some remain very humble, and some change. I mean, that's an honest answer, I think. But I remember when the change started happening, which was super interesting to me. Like, I remember one day, I think it was Kevin from Instagram, when he came in and all of a sudden he had more PR handlers with him, and then there were just more layers and layers and layers.
And it used to be that you would just email one person and they would just respond back. I remember emailing, like, the founder of Uber just to set up the interview, and then all of a sudden there were more PR handlers and more PR handlers, and then all of a sudden there was a multibillion-dollar company. And so it was super interesting. I interviewed Mark Zuckerberg during Cambridge Analytica, which was such an interesting time in tech, and it was, I think, a time when we realized how much of our data was not as private as we thought, which was really terrifying. And so the whole ethos of technology changed, and it changed our culture, and it changed everything we did, and everything got a lot more complicated.

Sammy J: It really did. And I saw The Social Dilemma. Terrifying, but so important to watch. I've turned off all my notifications on all my apps except Gmail, because I need that for school. But it really makes you look at how far we've come as a society with technology. When you were, you know, in 2008 and 2009, did you ever think that technology would have gone to such a place where it would be such a factor in polarizing our country?

Laurie Segall: No, I didn't, but I did see it happening before it came, right? You know, it's funny, I was looking through my notes from a talk I gave to some students at NYU, and this must have been like 2012. It was a bunch of engineering students, and looking back, I saw this piece of paper, and I had written down things to look out for, and the number one thing I had written was empathy. Like, you've got to have it; we've got to start thinking about empathy really early on.
I remember being like, I think it's really important for these engineers to start thinking about the human impact, because there was always this kind of ones-and-zeros, "tech is going to change the world," black-and-white view of it, and sometimes the human thing just got lost. And every time I had a bad story about tech, I was kind of saying, okay, did you not see that this was going to happen? It was always these human questions. And so I remember saying to this room full of engineers, you really need to think long and hard about the human impact of the algorithms and what you're doing. And that was years before this happened. And I remember beginning to see the cracks in the system before it all happened, and saying, you know, tech is not taking responsibility for the content on its platforms, and the questions are becoming more and more complicated. And I remember sitting across from Mark Zuckerberg and saying, how does it feel to be editor-in-chief of, you know, the Internet, to some degree? And he didn't like that question. But I don't think I was ever mean in asking these questions; that has never been my style as a journalist. You know, they're tough but fair questions.

Sammy J: They're fair questions that we have a right to know the answers to.

Laurie Segall: Yeah. And I always thought it was important to ask founders, especially because I grew up with these founders and I believed in their products. And so I think I did see that polarization was going to happen, but to this degree? No. And could anyone have envisioned it? Probably not. But could tech have done a much better job? Yes. Could they still be doing a better job? Yes. Like, do I buy it when I hear the company line, "we take user safety very seriously"?
I mean, like, no.

Sammy J: I'm trying to find a balance of having a healthy relationship with my phone and with social media devices. But you know, we are in a global pandemic, and now our main way of communicating is through technology. So how have you tried to find a balance between living your life through a pandemic in a healthy way and technology?

Laurie Segall: Not well? Like, not well? I mean, if I'm being authentic, I could say I could be doing a much better job of it. I told you, as I'm writing this book, I had to delete Facebook and Instagram for three or four days, because I am not good at the middle ground, you know, I'm just not. And so I think that's going to be a problem for these companies, because I do think we're all in at this point, you know. And it's been programmed, as you probably saw in The Social Dilemma. Every product decision, whether it's the color of the notifications, is programmed to make our brain go buzz, to get us to look at it.

Sammy J: That part of The Social Dilemma was terrifying. So if they're programming things for our brain to respond to in a certain way, how do we take back control?

Laurie Segall: I think there will be a new conversation in a couple of years. I think tech is going to be a part of us; we're not going to be able to just turn it off. And maybe this is a little controversial to say, but we're in it. We've opted in. Technology is inherently a part of humanity now, and so to some degree, in this next phase, we've got to learn how to interact better with it. And we need new entrepreneurs to build better products with humanity first, and so that's a lot of responsibility.
I mean, I interviewed a guy years ago who did predictive data analytics to determine if something really bad was going to happen, like a suicide bombing or something awful. And I remember, in the middle of the interview (we never published this part of the interview), he was like a human algorithm; he didn't really have much social ability. And in the middle of the interview, he was like, you know, I looked at all your data, everything you've posted on Facebook and Twitter and Instagram over the last eight years. And he's like, and I know a lot about you. And I'm like, well, what do you know? And he was like, well, you're unhappy in your relationship and you're growing unhappy at your job. And I was at CNN for ten years. And I was like, well, by the way, I'll be honest, both of those things were kind of true, right?

Sammy J: Right where it hurts.

Laurie Segall: Yeah. And I remember being like, ah, I mean, it could be like a tarot card reading, right, where you're kind of like, okay, you are unhappy in this way, and you just kind of go with it. But even with a tarot card reading, you know what, that's what you're there to do; you're there to get this information in an interview. And he goes, totally. But, you know, when I left the job and I left the boyfriend at the time, I called him up and I said, how did you do that? And he said, every word you post, the time of day you post, the types of words, everything is an indicator. It creates this digital puzzle piece of who we are and what we're not saying. And so I think there's actually something powerful there, right? Like, I don't think it's completely, totally freaky. I'm sorry, I don't want to scare all your listeners.

Sammy J: No, no, this is important to know.
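[Editorial aside: a minimal, hypothetical sketch of the kind of signals the analyst describes, word choice and posting time as indicators that add up to a "digital puzzle piece." The posts, the word list, and the feature names are made up for illustration; his actual system is not public and nothing here comes from it.]

```python
# Toy illustration: turn each (timestamp, text) post into simple indicators,
# then aggregate them. Real systems would use far richer features and models.
from datetime import datetime

# Hypothetical posts standing in for years of social media history.
posts = [
    ("2019-03-02 01:14", "can't sleep, another long week at work"),
    ("2019-03-05 23:50", "same office, same deadlines, whatever"),
    ("2019-04-11 09:30", "excited to try something new this weekend"),
]

def extract_features(timestamp: str, text: str) -> dict:
    """One post -> simple indicators: hour posted, late-night flag, negative-word hits."""
    hour = datetime.strptime(timestamp, "%Y-%m-%d %H:%M").hour
    words = text.lower().split()
    negative_hits = sum(w in {"can't", "whatever", "tired", "same"} for w in words)
    return {"hour": hour, "late_night": hour >= 23 or hour <= 4, "negative_hits": negative_hits}

features = [extract_features(ts, txt) for ts, txt in posts]
late_night_share = sum(f["late_night"] for f in features) / len(features)
negativity = sum(f["negative_hits"] for f in features) / len(features)
print(f"late-night posting share: {late_night_share:.0%}, avg negative words per post: {negativity:.1f}")
```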
Laurie Segall: But advertisers are already doing this to us, right? So what if, in the future, we were able, to some degree, to take advantage of that information for ourselves? What if we had the capacity, down the line, to understand more about our own data, about our mental health and what it said? I only say this to say, as someone who has looked at the future and the zeitgeist and what people will be talking about down the line, I think there will be a new conversation around technology and how we take all of these things and make them work for us, as opposed to us working for them.

Sammy J: Yeah, take the power back.

Laurie Segall: Yeah, like, you know, I'm going to bring back my old punk rocker self, but yeah, let's take the power back in some capacity.

Sammy J: That's so interesting, because it makes sense, you know, that it is programmed to benefit the companies, and the data should be made to work so we can have that power. I'm actually in my statistics class, and we're doing this two-month project where we choose four metrics about ourselves, and we record the results every day for two months and then analyze the data to know more about ourselves. And one of them for me is checking my screen time, and it's been making me much more cautious. Two years ago, my sophomore year of high school, my anxiety got really bad and I shut off my phone for two weeks, because I realized I was spending like thirteen hours a day on my phone. And we're at a point in technology, I think, where when I was in that vulnerable moment, it was really easy to get mad at social media and technology for all of that. So when you say compiling our data so we can know more about our human behaviors, do you think that would benefit the people that are in vulnerable situations?
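[Editorial aside: a toy sketch of how a self-tracking project like the one just described could be summarized, assuming a hypothetical daily_metrics.csv with columns for screen time, sleep, exercise, and mood; the class's actual format and metrics are not specified in the conversation.]

```python
# Load two months of daily logs and summarize them.
import csv
from statistics import mean, correlation  # statistics.correlation requires Python 3.10+

# Hypothetical log with columns: date, screen_time_hrs, sleep_hrs, exercise_min, mood_1_to_10
with open("daily_metrics.csv", newline="") as f:
    rows = list(csv.DictReader(f))

screen_time = [float(r["screen_time_hrs"]) for r in rows]
mood = [float(r["mood_1_to_10"]) for r in rows]

print(f"days logged: {len(rows)}")
print(f"average screen time: {mean(screen_time):.1f} h/day")
print(f"screen time vs. mood correlation: {correlation(screen_time, mood):+.2f}")
```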
Laurie Segall: I mean, look, I think it could go both ways. And by the way, this is technology in a nutshell. I think having a lot of this data could be super helpful, and then I think it could also hurt us too, you know. And I think that is where ethics come in. And this is why the sweet spot of my career has always been ethics; I've always been screaming about ethics.

Sammy J: It's very underrated.

Laurie Segall: It is really underrated. It's not that sexy, but now it's becoming more sexy, because all these tech companies are under fire. But it's the theme that kind of goes missing, especially with something as powerful as technology. I don't even think, you know, since the time I started covering tech, that it's a beat anymore. It's just humanity. It's the way we love, it's politics, it's mental health, it's everything. I think it's a really good question, and I don't think it has an easy answer, and I think the best questions don't have easy answers.

Sammy J: I'm still figuring out what I want to do, but I really admire you as a journalist and as a reporter, because that's something I'm becoming really interested in. And, like I've said, you've been following this for so many years. When you interviewed Mark Zuckerberg about Cambridge Analytica, first of all, that interview was incredible, because that was like the first time I really saw him take accountability to some degree. How do you go into that interview, and did you expect to get those answers?

Laurie Segall: It's an interesting question. You know, with any high-pressure interview, first of all, I think in any interview it's always about the follow-up, right? I always know that they have the thing that they want to say, that they need to say, so you've just got to let them say it, and then you've got to listen, and then you've got to follow up.
That's always my key. And I also understand that you've got to know... I remember thinking, I had messaged Mark on Facebook during that time, which is the irony. People think these things are so difficult, and by the way, it is hard to get an interview with Mark, sure. But people also forget, with technology, you can just, you know... a lot of people have these booking departments that book for them, but wouldn't he be on it? It is his platform. I mean, doesn't it make sense, as someone who's covered technology for a really long time? And then I had followed up with this PR person. But I remember going out there, and I mean, by the way, it couldn't have been more dramatic, because a storm was coming into New York. I think my life is constantly a bit of a Seinfeld episode. I feel like if I were Anderson Cooper, it would have just been smooth, but, you know, you can imagine what went wrong. But I do remember going in and thinking, this is such a big moment. And it was the moment, for me, that technology had become society, because everyone cared. It wasn't just my inside-baseball tech friends that cared. Facebook had managed to piss off so many people with this; it was so many people's data, and their messaging had not been good, and there also just wasn't an understanding of what went wrong. And I just remember going in and thinking about how small the room felt, you know, and how big the story was.
It was this small, cold room, because Mark likes it really cold in interviews, so I was really freezing, and then we had the CNN countdown clock. But I just remember, I wanted him, and this is important to me, and I'm sure you understand this as someone who interviews people, I wanted him to feel like himself, because if he felt like himself, then he would say the things he needed to say.

Sammy J: You've got to make them feel comfortable.

Laurie Segall: Yeah, and, you know, it's always such a balance, because I wanted him to feel comfortable, but there was also a lot to speak to, and there was a lot of accountability in that moment. And so it's interesting, because we did the interview and we kept going, because we went past our allotted time. But he started, you know, we really started talking.

Sammy J: Did his PR person cut you off and say, it's time to go, or did they just let it happen?

Laurie Segall: Yeah, I mean, I think, because we were only supposed to chat for like twenty minutes, but I think he was really kind of getting his rhythm too, and so he went past what we were supposed to talk about. So we spoke probably for thirty minutes or even more. And, you know, when he answered, it was the first time he said that Facebook should be regulated, and he said he would be willing to testify. And so we made a lot of news, and by the way, a lot of that came from follow-ups and really kind of pushing him on some of these things, but also because we had the time, and we took the time. And by the way, I'm thinking, well, I think I have to be live in like an hour and a half, how are we going to do this? But, you know, it's also kind of like everything else disappears, and it's just you and that person.
And it's also understanding the context of what's around you, and not walking in... I've never been a gotcha person, and I don't believe in taking the cheap shot. But I also don't believe in just listening to people saying their sound bites. And I also think people are better understood when they get beyond their sound bites. I actually think they will be happier with the interview if they're able to actually say more, and you can push them and challenge them in the right way. So I think we got into a good cadence, and I think it was a historic moment in tech. I think that, you know, it's like Almost Famous, like the guy with his eyes wide open, where you've got to just, you know... everything changes, and you've got to ask the hard questions. And tech, for all the good, has created a lot of bad, and there's a lot of accountability to be had and a lot of complicated questions, and I want to be the one asking those questions.

Sammy J: Do you think, in your personal opinion, because you've been doing this for a long time, do you think it was meaningful? Or do you think it was just another way to take accountability but not truly mean it? Because that's what happens a lot.

Laurie Segall: I think it was meaningful. I think that Facebook was going through a huge transition at the time. I think that Facebook had to grow up, truthfully, and I think they were growing up. I think that for a very long time they didn't understand, you know, that they needed to be more open with the media and with other people, that kind of thing.
And I think that all changed after so much went wrong with the election, and, you know, with Cambridge Analytica it was one thing too many and people lost their patience. And so they had no choice; they had to start putting Mark out there, and Mark hadn't had to be out there before, you know. But it's interesting, because he hadn't done tons of press. So by the time he's starting to do press, I had this joke: it's kind of like puberty is painful, especially when it's public. You're going out there as one of the most powerful people in the world now, but you haven't been out there much. And so it was such a fascinating thing to be a part of, especially because I think I have a nuanced view of technology, just because I kind of grew up in it.

Sammy J: We have to take a quick break, but when we come back, we're going to be talking more about how we know what's true on the Internet, how we can take back the power, and the incredible documentary The Social Dilemma.

Sammy J: And we're back. What are your thoughts on when Mark Zuckerberg went before Congress and talked about his defense of not fact-checking people's posts? I mean, how do we make sure we get correct information out there when the founder of Facebook won't even ensure that?

Laurie Segall: Well, I think it's a hard question, right? Because Mark has the famous line that he doesn't want to be the arbiter of truth, right? And to a degree, do you trust Mark Zuckerberg to be fact-checking things? You know, and for a very long time... I think it's a larger conversation of what role and responsibility you want to put on technology.
And I think it's an important societal question, and it's one that has completely come to a head in the last couple of years, because for so long, when I interviewed tech founders, whether it was, you know, Jack Dorsey or Ev Williams from Twitter or any of these guys, it was: we're just the pipes, we're not responsible for the content on our platforms. Well, that's all changing now, you know. And now, because you have misinformation, you have so much stuff that has gone wrong, you have hate that has literally spilled offline, gone viral, and is changing user behavior, changing elections, you know, it's time for a change. And so I think there's a larger conversation now about what those laws should be and whether tech should be regulated, which I think the answer is a hard yes. But what kind of regulation is the right type of regulation, and who should be doing it, and what is right and what is wrong, exactly?

Sammy J: I think it's been interesting in this election to see, when Donald Trump tweets, you'll see Twitter will put a label on it. It makes me laugh so hard.

Laurie Segall: Yeah. You know, but that wasn't the case like four years ago. And even Facebook, it's interesting to see how they both handle it differently. But I mean, there's so much more pressure on it now. But misinformation, you know... I think in the time I've been covering tech, even this idea of truth has become so blurred. And what is truth anymore?
It's so scary to see that we are all living in these different realities, in this utopia of technology that was going to connect the world, and that these hippies that I knew, who were engineers, who were musicians with tattoos and thought they were going to change everything, have created some of the most profound problems that we will be facing, problems that we and our children will deal with, and that have fractured society in a way that is so fundamental. It is extraordinary, because they did not think of the human impact of the algorithms. And so, do I want Mark Zuckerberg fact-checking me? No. But do I think that they could have done a better job and a more consistent job and thought of these questions earlier? A thousand percent.

Sammy J: Yeah. And I think, like you're saying, it's really scary that, like in The Social Dilemma, you could search climate change in, like, New York and then search it somewhere else, in, like, Alabama, and completely different results will show up. So it really does blur what the truth is. To speak to that, how do we even find the truth now? How do we know what it is?

Laurie Segall: Right. Well, it's funny, because we have our version of the truth, right, what we believe is true, and then there are the facts. And then you have... I mean, by the way, I did a lot of work on QAnon, looking at QAnon and conspiracy theories and whatnot, which is this group that was amplified, by the way, by Facebook, of, you know, folks who just completely do not believe any other version; they believe their own version of what is true. And it's just extraordinary. Yeah, I mean, I think you have distrust in mainstream media now in a way that you have never, ever had before, and that's for many reasons, I believe. I think mainstream media has changed quite a bit.
And then you also have a president who has attacked mainstream media for the last four years. And so all of these things have come together, and the Internet has created this perfect storm, this moment in time amplified by technology.

Sammy J: It's so fascinating how much of a role technology plays in all of this, because someone might not think it at first glance. Something I find really interesting is that, like you said, mainstream media is very controversial now, but, for example, when Fox News declared that Biden is president, are they fake news now? It's like that blurring of what is personal reality.

Laurie Segall: Right. You know, I actually think, down the line... I have a company called Dot Dot Dot Media, right, and we're a niche kind of media company. But I actually think there's going to be a lot of room, and this isn't just me plugging it, I genuinely think, and you see it popping up all over the place, I think we're going to see a lot of alternative channels of people just living in their own version, you know, for better or for worse.

Sammy J: So how do we... because technology can be very scary, and I think we need to do everything we can. What can the people listening do that feel helpless? Because this is very scary, and it could go down a very dark path. How can we reverse that? How can we make it so technology can be used to impact society for the good, and the negative impact is much less?

Laurie Segall: Well, I think even being a little more cognizant of what you're reading and what you're tweeting and what you're sharing. I mean, I know that sounds so simple, but even Twitter, I think, recently made a product change where, instead of just retweeting, they ask you, do you want to quote something first and then retweet it?

Sammy J: Same with Instagram and its stories.
Laurie Segall: Yeah. You know, that product decision actually made a huge difference in people spreading misinformation. So, first of all, that can come from the top. Tech companies can do something, and they should be doing something; it shouldn't just be lip service. And then I think people have a responsibility too. But we are in a moment of pain. We are in a moment where people are so angry, where everyone is on their own side. I even think, you know, with this election and with people out celebrating or being unhappy... you know, I'm part-time with a new initiative for 60 Minutes, and I've spent time with militia groups and QAnon and the group called the Boogaloo Boys, who are carrying around AR-15s and are very anti-government.

Sammy J: Wow.

Laurie Segall: No, yeah, I know. But the thing is, I think people... and the pain is real, and people have lost jobs, and we're in a tough spot, and the Internet makes a lot of that worse. And I think empathy, I still go back to that line that I wrote, that I was trying to talk to entrepreneurs about. I think, you know, if we want to get out of the polarization, we've got to talk to each other to some degree, right? I think that's important, and not just staying in our own lane and staying on our own side. And that can be not just on the Internet, but in real life, believe it or not, imagine that. And listening, you know, even if you don't agree with it. I think tech companies have a huge responsibility. I think they will be regulated, and I think they need diversity. And I think they need more people that aren't white men, truthfully. And yeah, I mean, I think that's important too. And I think there's a lot of work to do.
I mean, I wish I could have one answer and wave my magic wand and fix it. And I think you need nuanced content. I've always believed in this. I was never a fan of just doing two minutes on TV of, like, okay, here it is.

Sammy J: So is that what you're trying to do with your media company, Dot Dot Dot?

Laurie Segall: For me, it's looking at technology, looking at humanity through that lens, looking at issues like the unintended consequences of tech, looking at issues like love and politics, and you name it. Mental health is a big one. Looking at these corner stories and the people that people kind of turn away from, and being able to tell their stories in a more interesting, nuanced way that's not people shouting at each other, that allows us to hit it from all sides. That's always been something I feel pretty strongly about: hearing those stories from the powerhouses in Silicon Valley, but also the people that are ignored. I got the name because I remember someone telling me I was the human equivalent of dot dot dot, like, you know, when you're texting someone and you're waiting, like, what are they going to say?

Sammy J: I love that.

Laurie Segall: Yeah. I feel like this moment is so dot dot dot, right? Like, we don't know what's going to happen. We're just kind of waiting for it to play out, and it's anxious, like, when I say there's so much to say and we just don't know what's going to be said. So I'm kind of okay with not tying the bow. I think that's the story of my life. I'm totally okay with not tying the bow.

Sammy J: That's hard for me, because I like to know things and I like to have things planned out.

Laurie Segall: So it's actually your worst nightmare.

Sammy J: It really is. That's why the time that we're living in is very scary for me, as someone who likes to know what's happening.
Yeah. Okay, we have to take one more quick break, but when we come back, I want to know the best interview, the worst interview, and the toughest interview you've ever had. We'll be right back.

Sammy J: And we're back. You've interviewed so many people, from Mark Zuckerberg to Bill Gates. What has been your toughest interview, your favorite interview, and, if you can answer, your worst interview?

Laurie Segall: My favorite interview, my worst interview... Well, I guess I'm going to talk about it in the book, so I guess I could give some hints at it. My least favorite interview: maybe I won't say his name, because that's going to come out in the book, but a founder of a big tech company that we all know. I remember asking him about women's safety, because it was at a time when the company was dealing with many issues; this was an online company. And I remember sitting in the newsroom with him, and I put him on camera before a lot of people did. He was one of those big tech multimillionaire, soon-to-be-billionaire founders. And I remember him taking off his mic and saying, "Laurie," like, "Laurie," and I was like, what? And he's like, I didn't know this was that kind of interview. And I was like, what do you mean, what kind of interview? And he was like, that kind of interview. And I remember him turning over and talking to his PR person and discussing whether he was going to leave. And I think, for me, that was the moment, how do I say it, when things got real. That was the moment where it was like, wait a second, I'm not here to be your mouthpiece, right? I've always been fair. I've always treated people with respect. I've always asked the hard questions, but I've always, you know, treated people with kindness and with respect.
I 615 00:30:54,400 --> 00:30:57,200 Speaker 1: think for me that was a pretty extraordinary 616 00:30:57,240 --> 00:31:00,680 Speaker 1: moment because it showed, it was when 617 00:31:00,840 --> 00:31:03,240 Speaker 1: the minnows turned to sharks. Like, when Silicon Valley, I 618 00:31:03,240 --> 00:31:06,480 Speaker 1: think it had officially kind of crossed over for me. 619 00:31:06,480 --> 00:31:08,720 Speaker 1: And not everyone in Silicon Valley was like that, right, 620 00:31:08,800 --> 00:31:11,840 Speaker 1: at all, by the way, at all. But the arrogance 621 00:31:12,040 --> 00:31:15,320 Speaker 1: there, that this person had a company that was 622 00:31:15,360 --> 00:31:19,360 Speaker 1: worth more than, like, Campbell's Soup, right, you know, and 623 00:31:19,760 --> 00:31:24,280 Speaker 1: that he didn't think that he needed to answer 624 00:31:24,400 --> 00:31:29,160 Speaker 1: basic questions, even with something like, we 625 00:31:29,240 --> 00:31:31,880 Speaker 1: take women's safety very seriously, like, you know. And 626 00:31:32,200 --> 00:31:34,160 Speaker 1: I think that was probably my worst. 627 00:31:34,600 --> 00:31:36,280 Speaker 1: I mean, he stayed for the rest of the interview, 628 00:31:36,360 --> 00:31:41,320 Speaker 1: but that got weird. Uh, let's see. Toughest? The tough 629 00:31:41,320 --> 00:31:43,240 Speaker 1: interviews are when people don't want to say something and 630 00:31:43,240 --> 00:31:45,240 Speaker 1: they need to say something, you know. The tough interviews 631 00:31:45,240 --> 00:31:47,400 Speaker 1: are a dance with the other person, and you're 632 00:31:47,400 --> 00:31:50,960 Speaker 1: just like, you feel like you're doing, like, yeah, 633 00:31:50,960 --> 00:31:53,400 Speaker 1: you're doing, like, a ninja dance and they're just, like, foxtrotting. 634 00:31:55,520 --> 00:31:58,680 Speaker 1: I love that comparison, you know, and it's like that's 635 00:31:58,720 --> 00:32:02,400 Speaker 1: that's this eats up and only specifically through that either 636 00:32:02,480 --> 00:32:04,240 Speaker 1: I don't don't think that at me, but I can't 637 00:32:04,320 --> 00:32:07,880 Speaker 1: right now. And then the good interviews for me, I 638 00:32:07,920 --> 00:32:10,760 Speaker 1: love interviews that enable people, to some degree, to take 639 00:32:10,760 --> 00:32:13,520 Speaker 1: their power back. Um, you know, I did a lot. 640 00:32:13,640 --> 00:32:16,680 Speaker 1: It wasn't just the tech founders that I interviewed, um, 641 00:32:16,720 --> 00:32:20,040 Speaker 1: that were my favorite interviews. Like, my favorite interviews were, 642 00:32:21,160 --> 00:32:23,360 Speaker 1: you know, women who have been, you know, I did 643 00:32:23,360 --> 00:32:25,320 Speaker 1: a whole special back in the day when people didn't 644 00:32:25,320 --> 00:32:27,440 Speaker 1: even know what the word revenge porn was, which 645 00:32:27,440 --> 00:32:30,680 Speaker 1: is, like, a horrible type of harassment, mostly against women, 646 00:32:31,640 --> 00:32:34,479 Speaker 1: where, like, men post images of naked women online. It's 647 00:32:34,520 --> 00:32:37,640 Speaker 1: like a form of, like, power and all sorts of 648 00:32:37,640 --> 00:32:39,520 Speaker 1: stuff, and it wasn't even in the dictionary at 649 00:32:39,520 --> 00:32:41,760 Speaker 1: the time, and women couldn't get their images off the 650 00:32:41,800 --> 00:32:44,040 Speaker 1: internet. It ruined lives.
And I remember interviewing a woman 651 00:32:44,120 --> 00:32:48,840 Speaker 1: named Nikki who was so scared to go 652 00:32:48,880 --> 00:32:50,520 Speaker 1: on camera, and when I did put her on camera, 653 00:32:50,600 --> 00:32:52,920 Speaker 1: she said, every time I look at someone, I wonder 654 00:32:52,960 --> 00:32:55,960 Speaker 1: if they've seen me naked. And I couldn't even envision 655 00:32:56,200 --> 00:32:58,200 Speaker 1: what that would feel like, and the shame and the 656 00:32:58,280 --> 00:33:03,280 Speaker 1: humiliation it would be for someone, um, you know, 657 00:33:04,520 --> 00:33:06,360 Speaker 1: and what that would do to your psyche, right, to 658 00:33:06,440 --> 00:33:10,920 Speaker 1: walk around and wonder if the people around you had, you know, 659 00:33:11,080 --> 00:33:13,640 Speaker 1: had seen you like that. And the type of 660 00:33:13,640 --> 00:33:16,520 Speaker 1: harassment it was, this guy 661 00:33:16,560 --> 00:33:18,520 Speaker 1: had taped her without her consent, all those types 662 00:33:18,560 --> 00:33:20,680 Speaker 1: of stuff. It was just such a 663 00:33:20,800 --> 00:33:23,760 Speaker 1: form of power. And when we did this interview, 664 00:33:24,640 --> 00:33:26,600 Speaker 1: it was almost like pointing the camera at her with 665 00:33:26,640 --> 00:33:30,200 Speaker 1: her consent, right? Like, she was able to take 666 00:33:30,280 --> 00:33:33,320 Speaker 1: that power back. And I really liked that. And 667 00:33:33,440 --> 00:33:37,320 Speaker 1: so, like, I've interviewed hackers and, um, victims 668 00:33:37,320 --> 00:33:41,000 Speaker 1: of, um, you know, of crimes and all sorts of stuff. 669 00:33:41,040 --> 00:33:44,080 Speaker 1: So I think those are probably more my favorite interviews. 670 00:33:44,200 --> 00:33:47,280 Speaker 1: And then some founders, yeah, founders I think 671 00:33:47,280 --> 00:33:50,600 Speaker 1: are also really interesting. Life is messy, and I like 672 00:33:50,640 --> 00:33:53,360 Speaker 1: people who talk about it, you know. It's so interesting. 673 00:33:54,000 --> 00:33:57,640 Speaker 1: I've been into San Quentin, the prison, and interviewed 674 00:33:57,760 --> 00:34:01,680 Speaker 1: a lot of prisoners. Um, they have a really 675 00:34:01,720 --> 00:34:04,120 Speaker 1: interesting startup program where they teach them how to code there. 676 00:34:04,160 --> 00:34:06,000 Speaker 1: But I've interviewed a lot of them about their crimes 677 00:34:06,000 --> 00:34:07,880 Speaker 1: and what makes them, you know, what makes them a 678 00:34:07,920 --> 00:34:11,360 Speaker 1: good entrepreneur also. So there's, like, a fine line between 679 00:34:11,360 --> 00:34:13,600 Speaker 1: that thing that makes you a good entrepreneur, and 680 00:34:13,800 --> 00:34:16,680 Speaker 1: also those skills, used in a 681 00:34:16,719 --> 00:34:19,160 Speaker 1: bad way, can be very, very bad. And so 682 00:34:19,239 --> 00:34:22,399 Speaker 1: I've always been a big believer in kind 683 00:34:22,400 --> 00:34:25,000 Speaker 1: of walking into situations that might make 684 00:34:25,000 --> 00:34:29,440 Speaker 1: you uncomfortable and going in in, like, a very real, genuine, 685 00:34:29,960 --> 00:34:33,360 Speaker 1: non-judgmental way and trying to understand human nature.
I 686 00:34:33,360 --> 00:34:36,200 Speaker 1: think technology has always just been my way into talking 687 00:34:36,200 --> 00:34:38,759 Speaker 1: about the human condition, if I do my job right. 688 00:34:39,080 --> 00:34:42,279 Speaker 1: You mentioned a book. Is that coming soon? Yes, I 689 00:34:42,320 --> 00:34:47,040 Speaker 1: have a book. It's coming, um, it's coming in August. 690 00:34:47,120 --> 00:34:48,799 Speaker 1: So hopefully things will be better and I can go 691 00:34:48,840 --> 00:34:51,279 Speaker 1: on book tour, and, you know, hopefully by then we'll, 692 00:34:51,320 --> 00:34:53,400 Speaker 1: well, you know, we'll be in 693 00:34:53,440 --> 00:34:56,040 Speaker 1: a better position as a society. But it's going to be 694 00:34:56,120 --> 00:35:00,840 Speaker 1: a lot of these stories. Yeah, I'll send it 695 00:35:00,880 --> 00:35:02,839 Speaker 1: to you. It's gonna be called Dot Dot Dot. 696 00:35:02,960 --> 00:35:05,600 Speaker 1: So, yeah, it's very on 697 00:35:05,640 --> 00:35:08,520 Speaker 1: brand. It's all about, um, you know, tech and 698 00:35:08,719 --> 00:35:11,760 Speaker 1: society and the second wave of tech and asking 699 00:35:11,800 --> 00:35:13,680 Speaker 1: for what you want. And it's my story, but 700 00:35:13,719 --> 00:35:16,520 Speaker 1: it's really kind of the story of, um, of kind 701 00:35:16,520 --> 00:35:18,680 Speaker 1: of this second wave of tech. So it 702 00:35:18,719 --> 00:35:21,880 Speaker 1: should be hopefully exciting. Well, I am so excited. And 703 00:35:21,920 --> 00:35:24,759 Speaker 1: like we've talked about, it has been a crazy year, 704 00:35:24,760 --> 00:35:27,399 Speaker 1: and I've been feeling very helpless, like many others. Yeah. 705 00:35:27,800 --> 00:35:30,160 Speaker 1: And so for this season, I wanted to 706 00:35:30,200 --> 00:35:33,560 Speaker 1: highlight a charity each episode so we can bring attention 707 00:35:33,600 --> 00:35:35,440 Speaker 1: to some good in the world. So I was wondering 708 00:35:35,480 --> 00:35:37,600 Speaker 1: if there's a charity that you're passionate about that we 709 00:35:37,640 --> 00:35:41,680 Speaker 1: should highlight and talk about. You know, I've 710 00:35:41,719 --> 00:35:44,600 Speaker 1: always supported Charity: Water, but I feel like, I mean, 711 00:35:44,640 --> 00:35:46,799 Speaker 1: I also feel like there's room for other ones. 712 00:35:46,800 --> 00:35:48,239 Speaker 1: So if I come up with another one, I'm going 713 00:35:48,280 --> 00:35:50,920 Speaker 1: to send you some other links, if that's okay too. Okay, 714 00:35:50,960 --> 00:35:52,480 Speaker 1: we will include it and I will link it in 715 00:35:52,560 --> 00:35:56,799 Speaker 1: my bio and we'll mention it. Okay, wonderful, wonderful, um, 716 00:35:56,800 --> 00:35:59,920 Speaker 1: and thank you. I appreciate your time. It's been fun, awesome. 717 00:36:00,000 --> 00:36:01,520 Speaker 1: It was so great to meet you too. Thank you so 718 00:36:01,600 --> 00:36:04,040 Speaker 1: much for taking the time. Yeah, I appreciate it, and 719 00:36:04,080 --> 00:36:06,080 Speaker 1: I will be listening. I love your stuff, so I'll 720 00:36:06,120 --> 00:36:11,439 Speaker 1: be listening to all your interviews. All right, guys, thank 721 00:36:11,480 --> 00:36:13,920 Speaker 1: you so much for listening to this week's episode of 722 00:36:14,040 --> 00:36:16,840 Speaker 1: Let's Be Real.
As always, don't forget to subscribe to 723 00:36:16,840 --> 00:36:19,439 Speaker 1: the podcast if you haven't already, and leave a comment, because 724 00:36:19,440 --> 00:36:22,200 Speaker 1: I always love to hear your feedback. And don't forget 725 00:36:22,239 --> 00:36:24,560 Speaker 1: to follow me on Instagram at It's Sammy Jaye. That's 726 00:36:24,600 --> 00:36:26,759 Speaker 1: I T S S A M M Y J A 727 00:36:27,000 --> 00:36:30,440 Speaker 1: Y E. And also go follow Lorie Siegel on all 728 00:36:30,520 --> 00:36:33,480 Speaker 1: her social media. It's Lorie Siegel. And please check out 729 00:36:33,520 --> 00:36:36,759 Speaker 1: her multimedia company, Dot Dot Dot. It's really cool and 730 00:36:36,800 --> 00:36:39,040 Speaker 1: I think you'll like it. And if you're still here, 731 00:36:39,080 --> 00:36:42,120 Speaker 1: I have a little sneak peek for you, because next 732 00:36:42,120 --> 00:36:45,920 Speaker 1: week's guest is singer, songwriter, and dancer Tate McRae, and it's 733 00:36:45,920 --> 00:36:48,799 Speaker 1: an awesome conversation and I am so ready for you 734 00:36:48,800 --> 00:36:52,560 Speaker 1: to listen to it next Tuesday. Stay tuned. Bye guys,