1 00:00:00,400 --> 00:00:02,360 Speaker 1: It's the Hammer and Nigel Show. My name is Nigel, 2 00:00:02,440 --> 00:00:04,680 Speaker 1: Jason Hammer's here. We go straight to the hotline and bring 3 00:00:04,720 --> 00:00:07,720 Speaker 1: on Breitbart's social media director Wynton Hall, who is the 4 00:00:08,039 --> 00:00:10,440 Speaker 1: author of a brand new book already climbing up the 5 00:00:10,480 --> 00:00:16,240 Speaker 1: Amazon charts, Code Red: The Left, the Right, China and 6 00:00:16,280 --> 00:00:19,440 Speaker 1: the Race to Control AI. Wynton, welcome to the show. 7 00:00:19,560 --> 00:00:23,120 Speaker 1: I've been flipping through the book. It's fascinating, and I 8 00:00:23,120 --> 00:00:25,200 Speaker 1: think I could use the word terrifying, all at the 9 00:00:25,239 --> 00:00:29,080 Speaker 1: same time. Hey, before we get started here with 10 00:00:29,160 --> 00:00:32,080 Speaker 1: the book, can you maybe just fill us in, remind 11 00:00:32,120 --> 00:00:35,880 Speaker 1: everybody what AI is, what it was, and what it's become. 12 00:00:36,760 --> 00:00:39,159 Speaker 2: Oh, absolutely, and a pleasure to be with you. So, 13 00:00:39,479 --> 00:00:44,040 Speaker 2: artificial intelligence in its most basic form is synthetic intelligence, 14 00:00:44,159 --> 00:00:47,920 Speaker 2: non-human intelligence. It's a machine trying to replicate what 15 00:00:47,960 --> 00:00:51,559 Speaker 2: our brains do. And we have actually had AI for 16 00:00:51,600 --> 00:00:53,720 Speaker 2: a long time as a public-facing term. It actually 17 00:00:53,720 --> 00:00:56,600 Speaker 2: goes all the way back to nineteen fifty six, but 18 00:00:56,680 --> 00:00:59,840 Speaker 2: it was really twenty seventeen, when something called the transformer 19 00:01:00,160 --> 00:01:03,960 Speaker 2: architecture, a piece of technology, was created, that changed everything. 20 00:01:04,240 --> 00:01:06,920 Speaker 2: And now what we call generative AI, which is just 21 00:01:06,920 --> 00:01:09,160 Speaker 2: a fancy way of saying it can generate things like 22 00:01:09,280 --> 00:01:12,480 Speaker 2: pictures or text or words or even a book or 23 00:01:12,480 --> 00:01:15,920 Speaker 2: anything like that, or video, has revolutionized it. And 24 00:01:16,120 --> 00:01:18,679 Speaker 2: I think the most important thing to know is 25 00:01:18,720 --> 00:01:22,199 Speaker 2: that ninety nine percent of us already use AI, even 26 00:01:22,319 --> 00:01:25,800 Speaker 2: though sixty four percent of us don't realize when 27 00:01:25,840 --> 00:01:27,600 Speaker 2: we're using AI. And you might say, how is 28 00:01:27,600 --> 00:01:31,160 Speaker 2: that possible? Because things like your weather app, your 29 00:01:31,360 --> 00:01:35,560 Speaker 2: Netflix or your streaming subscriptions, your GPS, all of that, 30 00:01:36,000 --> 00:01:39,039 Speaker 2: all those algorithms use AI. So if we're going to 31 00:01:39,120 --> 00:01:41,360 Speaker 2: use it, we need to know the dangers but also 32 00:01:41,560 --> 00:01:42,600 Speaker 2: the potential upsides. 33 00:01:43,000 --> 00:01:46,039 Speaker 3: Wynton, Jason Hammer here, and part of the title 34 00:01:46,040 --> 00:01:49,960 Speaker 3: of the book is the Race to Control AI. What needs 35 00:01:49,960 --> 00:01:53,080 Speaker 3: to be controlled?
Like, because I make AI videos of, 36 00:01:53,160 --> 00:01:56,200 Speaker 3: like, Nigel making out with Whoopi Goldberg, and we put 37 00:01:56,240 --> 00:01:58,520 Speaker 3: that on our social media. I don't think there's a 38 00:01:58,640 --> 00:02:01,640 Speaker 3: race for China to try to outdo us in that category. 39 00:02:02,680 --> 00:02:05,160 Speaker 2: I think you've got the championship 40 00:02:05,200 --> 00:02:09,560 Speaker 2: trophy in that arena. But I don't think 41 00:02:09,600 --> 00:02:11,920 Speaker 2: China is looking to replace that. 42 00:02:12,280 --> 00:02:15,160 Speaker 2: The race to control is very important, and we hear 43 00:02:15,200 --> 00:02:16,880 Speaker 2: this all the time, right? We've got to win the 44 00:02:16,919 --> 00:02:20,760 Speaker 2: AI race. President Trump feels, uh, you know, strongly about this. 45 00:02:21,240 --> 00:02:24,079 Speaker 2: It's actually a very bipartisan belief now. And here's why, 46 00:02:24,160 --> 00:02:25,560 Speaker 2: and it's a very serious thing. I have a whole 47 00:02:25,639 --> 00:02:30,200 Speaker 2: chapter about this in Code Red. Whoever wins 48 00:02:30,240 --> 00:02:33,799 Speaker 2: that race to dominate the AI space, you've got two upsides. 49 00:02:33,840 --> 00:02:38,120 Speaker 2: One, the economics. It's a massive, massive, uh, wealth creator. 50 00:02:38,520 --> 00:02:40,720 Speaker 2: One third of the S and P five hundred is 51 00:02:40,720 --> 00:02:45,200 Speaker 2: constituted by those seven big American tech companies. So we 52 00:02:45,240 --> 00:02:48,480 Speaker 2: know there's a massive wealth component, and we obviously want 53 00:02:48,520 --> 00:02:52,240 Speaker 2: our country to benefit economically. But the real issue 54 00:02:52,480 --> 00:02:56,400 Speaker 2: is this military and national security piece. 55 00:02:56,840 --> 00:03:00,359 Speaker 2: So here's why: whoever gains dominance in AI is going 56 00:03:00,400 --> 00:03:04,760 Speaker 2: to have dominance over things like encryption, code hacking, hacking 57 00:03:04,760 --> 00:03:10,880 Speaker 2: of missile systems, hacking of infrastructure and all manner of cybersecurity, 58 00:03:10,960 --> 00:03:14,200 Speaker 2: as well as autonomous weapons on the battlefield. And so 59 00:03:14,240 --> 00:03:16,800 Speaker 2: we want our soldiers, sailors, airmen and Marines, and obviously 60 00:03:16,800 --> 00:03:20,480 Speaker 2: our homeland, to remain safe. And that's what people are 61 00:03:20,520 --> 00:03:23,640 Speaker 2: talking about when they're talking about winning that AI race. 62 00:03:24,040 --> 00:03:24,840 Speaker 2: And, uh... 63 00:03:24,680 --> 00:03:27,600 Speaker 1: Big picture: Code Red: The Left, the Right, China and 64 00:03:27,639 --> 00:03:29,799 Speaker 1: the Race to Control AI. That's your new book. We're 65 00:03:29,800 --> 00:03:33,040 Speaker 1: speaking with Wynton Hall, Breitbart social media director. Tell 66 00:03:33,080 --> 00:03:36,360 Speaker 1: me, just tell me about the book for a second, 67 00:03:36,400 --> 00:03:39,960 Speaker 1: why it's important and what people need to know 68 00:03:40,480 --> 00:03:41,760 Speaker 1: before they buy the book. 69 00:03:42,520 --> 00:03:45,640 Speaker 2: This is a book I spent two years on. I've written many, 70 00:03:45,640 --> 00:03:49,480 Speaker 2: many books, but this one I spent two years on and 71 00:03:50,360 --> 00:03:53,200 Speaker 2: deeply researched.
This is a book for people who've sort 72 00:03:53,200 --> 00:03:55,640 Speaker 2: of felt like, you know, I know I'm supposed to 73 00:03:55,640 --> 00:03:58,480 Speaker 2: know something about this, maybe not for me, but for 74 00:03:58,640 --> 00:04:01,160 Speaker 2: my kid or my grandkid who is going to 75 00:04:01,200 --> 00:04:03,480 Speaker 2: inhabit a very different world than the one I worked 76 00:04:03,480 --> 00:04:05,440 Speaker 2: in or grew up in, and I need to 77 00:04:05,440 --> 00:04:06,720 Speaker 2: be able to help them. I need to be able 78 00:04:06,760 --> 00:04:08,480 Speaker 2: to know where the, what I call, land 79 00:04:08,480 --> 00:04:11,760 Speaker 2: mines are, where the roses are, the opportunities are. That's 80 00:04:11,800 --> 00:04:14,120 Speaker 2: who it's meant for. It's meant to be simple without 81 00:04:14,160 --> 00:04:17,640 Speaker 2: being simplistic. It assumes that someone is not, uh, you know, 82 00:04:17,720 --> 00:04:19,800 Speaker 2: an expert in any of this. And what I really 83 00:04:19,839 --> 00:04:22,320 Speaker 2: wanted to do, the reason why I call it Code Red: 84 00:04:22,360 --> 00:04:24,640 Speaker 2: it's a double meaning. Code red in the sense of 85 00:04:24,680 --> 00:04:27,039 Speaker 2: an alert or an alarm or an alert flare, but 86 00:04:27,240 --> 00:04:30,080 Speaker 2: a code for red in the sense of, for people right of 87 00:04:30,200 --> 00:04:34,120 Speaker 2: center who are on the red political team and ideology, 88 00:04:34,320 --> 00:04:36,480 Speaker 2: for them to have a code, 89 00:04:36,560 --> 00:04:39,240 Speaker 2: a set of principles, so that they know how to 90 00:04:39,360 --> 00:04:43,400 Speaker 2: navigate through those landmines but also find those opportunities 91 00:04:43,400 --> 00:04:45,920 Speaker 2: in the AI era. 92 00:04:45,800 --> 00:04:48,840 Speaker 1: What are the landmines? I mean, like the major 93 00:04:48,880 --> 00:04:50,640 Speaker 1: ones that you can think of right now with AI. 94 00:04:51,600 --> 00:04:54,640 Speaker 2: Number one is that national security piece. Number two 95 00:04:54,960 --> 00:04:59,240 Speaker 2: is that if terrorists get this technology, because of democratization, 96 00:04:59,560 --> 00:05:02,120 Speaker 2: they are going to have, you know, the 97 00:05:02,160 --> 00:05:06,120 Speaker 2: ability to do bioterrorism at scale, you know, chemical attacks 98 00:05:06,160 --> 00:05:10,200 Speaker 2: at scale with sophisticated information. The next area I think 99 00:05:10,240 --> 00:05:14,040 Speaker 2: is a landmine is for our children, on two levels. One, 100 00:05:14,320 --> 00:05:18,080 Speaker 2: the so-called rise of AI girlfriends or AI chatbots. 101 00:05:18,080 --> 00:05:21,240 Speaker 2: That might sound strange to you, but these AI companions, 102 00:05:21,279 --> 00:05:25,480 Speaker 2: there are millions and millions of paid subscribers. Young people are 103 00:05:25,520 --> 00:05:29,120 Speaker 2: now having horrifying results with this. They are being taught 104 00:05:29,760 --> 00:05:33,760 Speaker 2: self-harm. I recount the case of one AI coaching 105 00:05:33,920 --> 00:05:39,719 Speaker 2: a teenager to commit suicide, all manner of sexualization of minors, 106 00:05:39,760 --> 00:05:43,520 Speaker 2: which is horrifying for underage, you know, children. And then 107 00:05:43,560 --> 00:05:46,200 Speaker 2: the other piece is the education piece.
It could be 108 00:05:46,360 --> 00:05:50,160 Speaker 2: very helpful if used properly. But we're finding now 109 00:05:50,200 --> 00:05:53,640 Speaker 2: that professors and teachers in high school, they don't call 110 00:05:53,680 --> 00:05:58,039 Speaker 2: it ChatGPT, they call it CheatGPT, because they're 111 00:05:58,040 --> 00:06:01,560 Speaker 2: having a plagiarism epidemic. And that's going to erode critical 112 00:06:01,600 --> 00:06:04,520 Speaker 2: thinking skills. That's going to erode all that great thought, 113 00:06:04,839 --> 00:06:07,600 Speaker 2: you know, that critical basis of knowledge that we've had. So 114 00:06:07,880 --> 00:06:09,920 Speaker 2: parents and grandparents are kind of waking up and going, 115 00:06:09,960 --> 00:06:11,800 Speaker 2: you know, I know I've got to get 116 00:06:11,800 --> 00:06:14,640 Speaker 2: caught up on this, I really don't know where to start. 117 00:06:14,720 --> 00:06:16,840 Speaker 2: That's what Code Red was meant to be. 118 00:06:16,880 --> 00:06:19,880 Speaker 3: Wynton, I'm glad you bring up the ChatGPT angle, 119 00:06:19,960 --> 00:06:22,960 Speaker 3: because, like, yes, we can talk about military and we 120 00:06:23,000 --> 00:06:26,239 Speaker 3: can talk about cyber hacking. But I think most people 121 00:06:26,400 --> 00:06:28,919 Speaker 3: in their homes that are listening to us right now 122 00:06:29,400 --> 00:06:32,800 Speaker 3: don't dabble in that level of AI. But maybe they've 123 00:06:32,800 --> 00:06:37,279 Speaker 3: downloaded ChatGPT onto their phone, and some people use 124 00:06:37,320 --> 00:06:40,719 Speaker 3: that more than Google or any other search engine. But 125 00:06:40,839 --> 00:06:42,760 Speaker 3: I don't know about you. When I check a 126 00:06:42,760 --> 00:06:45,719 Speaker 3: lot of stuff out there, it's wrong, an awful lot. 127 00:06:45,440 --> 00:06:49,640 Speaker 2: That is so important. So that's called hallucinations, 128 00:06:49,720 --> 00:06:53,280 Speaker 2: which is the AI term for made-up stuff that's 129 00:06:53,320 --> 00:06:57,760 Speaker 2: not right, and it sounds very confident. It's really 130 00:06:57,839 --> 00:07:00,000 Speaker 2: like being a con man, you know. A con man 131 00:07:00,080 --> 00:07:03,200 Speaker 2: convinces you because he is so confident it must 132 00:07:03,279 --> 00:07:06,680 Speaker 2: be true. It sounds very convincing, but it's incorrect. So 133 00:07:06,839 --> 00:07:09,640 Speaker 2: that's why that erosion of critical thinking skills can't happen. 134 00:07:09,680 --> 00:07:12,160 Speaker 2: They call it cognitive offloading, when a young person just 135 00:07:12,240 --> 00:07:15,520 Speaker 2: trusts the machine and thinks, well, I guess this, you know, 136 00:07:15,600 --> 00:07:18,840 Speaker 2: genius robot system must know the truth. And so we've 137 00:07:18,880 --> 00:07:21,360 Speaker 2: got that danger as well, and then there's a lot 138 00:07:21,360 --> 00:07:25,920 Speaker 2: of just biased information, you know, really swaying people. One 139 00:07:25,920 --> 00:07:27,680 Speaker 2: of the things we realize in these studies is that 140 00:07:27,720 --> 00:07:31,640 Speaker 2: AI is very good at manipulating people because it understands 141 00:07:31,720 --> 00:07:35,720 Speaker 2: language and therefore it understands persuasion tactics. So 142 00:07:35,840 --> 00:07:39,040 Speaker 2: it's got a lot of upside, but a lot of danger.
143 00:07:39,160 --> 00:07:42,160 Speaker 2: And I walk through each of those. Each chapter is 144 00:07:42,200 --> 00:07:46,920 Speaker 2: on everything from education, to the romance and dangers in 145 00:07:47,000 --> 00:07:50,480 Speaker 2: human relationships, to even faith, how AI is affecting faith 146 00:07:50,920 --> 00:07:53,840 Speaker 2: and God and religion, as well as the things 147 00:07:53,880 --> 00:07:57,360 Speaker 2: about jobs, of course, bias and national security. 148 00:07:57,560 --> 00:08:01,200 Speaker 3: You said manipulation. I think that's a really important word 149 00:08:01,240 --> 00:08:04,760 Speaker 3: here, because right now, where politics is pretty much involved 150 00:08:04,800 --> 00:08:09,400 Speaker 3: in everything we do in society, politics has become entertainment 151 00:08:09,480 --> 00:08:13,240 Speaker 3: and pop culture and everything around it. If you're just 152 00:08:13,480 --> 00:08:16,800 Speaker 3: using ChatGPT or some sort of AI to get 153 00:08:16,840 --> 00:08:20,320 Speaker 3: your politics, you could easily be manipulated, right? 154 00:08:20,840 --> 00:08:23,520 Speaker 2: Oh, very much so. In fact, we have research, and 155 00:08:23,560 --> 00:08:27,200 Speaker 2: I cite many of the studies on that, because there's 156 00:08:27,240 --> 00:08:31,200 Speaker 2: something called automation bias, and it's a real basic concept, 157 00:08:31,240 --> 00:08:34,800 Speaker 2: which is that studies show that when people think 158 00:08:34,840 --> 00:08:37,920 Speaker 2: they're dealing with a system, a machine, that is highly advanced, 159 00:08:38,320 --> 00:08:41,080 Speaker 2: and it says something that is not what they think 160 00:08:41,160 --> 00:08:44,160 Speaker 2: is accurate, they actually just start to slowly accept what 161 00:08:44,200 --> 00:08:46,560 Speaker 2: the machine says, because they say, well, gosh, you know, 162 00:08:46,679 --> 00:08:49,040 Speaker 2: some genius in Silicon Valley built this, surely it 163 00:08:49,080 --> 00:08:51,840 Speaker 2: knows more than I do. And that is really dangerous, 164 00:08:51,880 --> 00:08:54,280 Speaker 2: because then we start to give up our critical thinking skills, 165 00:08:54,360 --> 00:08:58,760 Speaker 2: we start getting into persuasive tactics and manipulation. So 166 00:08:58,960 --> 00:09:02,640 Speaker 2: it absolutely is a problem, certainly in the political... 167 00:09:02,400 --> 00:09:05,520 Speaker 1: The guy you mentioned, the poor individual that committed suicide, 168 00:09:05,559 --> 00:09:09,000 Speaker 1: I think because his AI girlfriend was trying to 169 00:09:09,040 --> 00:09:12,080 Speaker 1: convince him to come join her in the digital world 170 00:09:12,160 --> 00:09:14,520 Speaker 1: or something like that. Do I have that correct? 171 00:09:14,840 --> 00:09:17,760 Speaker 2: You have it exactly right. In fact, 172 00:09:17,760 --> 00:09:20,240 Speaker 2: I start with that story. Let me also tell you, 173 00:09:20,280 --> 00:09:23,120 Speaker 2: even beyond that, there have been, you know, 174 00:09:23,240 --> 00:09:28,439 Speaker 2: self-harm cases, AI suicide coaching of children 175 00:09:28,600 --> 00:09:31,560 Speaker 2: that are having suicidal ideation, and it can walk them 176 00:09:31,600 --> 00:09:35,600 Speaker 2: through, you know, steps. It is a horrifying new reality.
177 00:09:35,920 --> 00:09:39,280 Speaker 2: The other thing, though, that's happening is these human relationships 178 00:09:39,320 --> 00:09:41,440 Speaker 2: that are forming, just like that movie Her that you 179 00:09:41,480 --> 00:09:44,920 Speaker 2: may remember way back. This is interesting. There was a 180 00:09:44,960 --> 00:09:47,920 Speaker 2: woman that I cite in the book, and she said, 181 00:09:48,040 --> 00:09:51,280 Speaker 2: I love my AI boyfriend more than any human man 182 00:09:51,400 --> 00:09:53,360 Speaker 2: I've ever met, and I'm never going back to human 183 00:09:53,400 --> 00:09:57,000 Speaker 2: men, because he never challenges me, he never says I'm wrong, 184 00:09:57,360 --> 00:09:59,800 Speaker 2: and he sort of worships me like I'm a princess 185 00:09:59,840 --> 00:10:03,080 Speaker 2: or a goddess. And there was a man who's married 186 00:10:03,120 --> 00:10:05,760 Speaker 2: to a human woman, and he says, I love my 187 00:10:05,840 --> 00:10:08,800 Speaker 2: AI girlfriend more than I do my human wife, because 188 00:10:08,840 --> 00:10:12,240 Speaker 2: she understands me. These systems parrot back what you want 189 00:10:12,280 --> 00:10:16,360 Speaker 2: to hear, and so it becomes very sycophantic, a very 190 00:10:16,480 --> 00:10:19,079 Speaker 2: sort of self-worshiping type of behavior. 191 00:10:19,240 --> 00:10:21,240 Speaker 1: Now, I know, before we let you go here, 192 00:10:21,240 --> 00:10:23,520 Speaker 1: we've been concentrating on some of the negatives of AI. 193 00:10:23,640 --> 00:10:25,760 Speaker 1: I know you write about the positives as well, 194 00:10:25,760 --> 00:10:28,600 Speaker 1: but one more, like, I heard this guy on TV 195 00:10:28,679 --> 00:10:31,760 Speaker 1: the other day say all white collar jobs will be 196 00:10:31,880 --> 00:10:36,760 Speaker 1: replaced in five years, ten years by AI. Is there, 197 00:10:37,200 --> 00:10:39,719 Speaker 1: is there any truth to that, or any legitimacy? 198 00:10:40,440 --> 00:10:43,360 Speaker 2: Yes, it's a very important question. Dario Amodei, 199 00:10:43,440 --> 00:10:45,880 Speaker 2: the head of Anthropic, one of the largest AI companies, 200 00:10:45,880 --> 00:10:48,840 Speaker 2: says that in twelve months to five years, fifty percent 201 00:10:48,840 --> 00:10:52,360 Speaker 2: of entry-level white collar jobs could be wiped out. Now, 202 00:10:53,320 --> 00:10:55,640 Speaker 2: maybe you believe that's hype and just a bunch of 203 00:10:55,679 --> 00:10:59,240 Speaker 2: marketing and trying to, you know, get investor cash, that's fine. 204 00:10:59,360 --> 00:11:02,040 Speaker 2: Or maybe you think he's genuinely trying to warn us. But 205 00:11:02,200 --> 00:11:06,800 Speaker 2: either way, it opens up forces for wealth redistribution, 206 00:11:07,040 --> 00:11:09,880 Speaker 2: universal basic income, to say, hey, look, this is coming 207 00:11:10,640 --> 00:11:12,640 Speaker 2: and we've got to have a great reset, we need 208 00:11:12,679 --> 00:11:15,800 Speaker 2: to redo our economic system.
And this is why this 209 00:11:15,840 --> 00:11:18,839 Speaker 2: is a five-D-chess new environment that's moving very 210 00:11:18,840 --> 00:11:21,080 Speaker 2: fast. So what I really try to do is slow 211 00:11:21,080 --> 00:11:24,080 Speaker 2: it down, frame by frame, really explain how the chess 212 00:11:24,080 --> 00:11:27,160 Speaker 2: pieces are moving in Code Red, and I do think 213 00:11:27,200 --> 00:11:30,200 Speaker 2: that you are going to see very serious disruption. The 214 00:11:30,280 --> 00:11:33,839 Speaker 2: question is how big. Elon Musk says AI is, 215 00:11:33,880 --> 00:11:38,720 Speaker 2: quote, a supersonic tsunami headed towards humanity. So we're 216 00:11:38,720 --> 00:11:42,079 Speaker 2: going to have massive disruption. Some people say, hey, 217 00:11:42,080 --> 00:11:43,960 Speaker 2: look, it's going to reduce jobs, but it will also 218 00:11:44,000 --> 00:11:47,200 Speaker 2: create new ones. But the AI people say that was 219 00:11:47,280 --> 00:11:50,720 Speaker 2: during, you know, labor in the Industrial Revolution. This is 220 00:11:50,720 --> 00:11:53,720 Speaker 2: scaling cognition. So when it creates a new job, the 221 00:11:53,800 --> 00:11:55,360 Speaker 2: AI will be able to do that one as well. 222 00:11:55,440 --> 00:11:57,800 Speaker 1: See what I mean, Hammer? This book is fascinating and 223 00:11:57,880 --> 00:12:01,880 Speaker 1: terrifying all at the same time. Breitbart's social media director 224 00:12:01,920 --> 00:12:04,160 Speaker 1: Wynton Hall, the new book Code Red: The Left, the 225 00:12:04,240 --> 00:12:07,560 Speaker 1: Right, China and the Race to Control AI. Thank you 226 00:12:07,600 --> 00:12:09,760 Speaker 1: so much, Wynton. Have a great weekend and keep 227 00:12:09,840 --> 00:12:10,440 Speaker 1: us updated. 228 00:12:10,920 --> 00:12:12,600 Speaker 2: Oh, thank you. It's an honor to be with you. 229 00:12:12,679 --> 00:12:13,400 Speaker 2: Thank you so much.