Speaker 1: Welcome back to It's a Numbers Game with Ryan Girdusky. So, if you turned on cable news or read a newspaper in the last few weeks, nearly every story surrounding this White House has had something to do with Elon Musk and DOGE, the Department of Government Efficiency. It's important to remember that DOGE is not actually a government department, so it's not like the Department of Education or the Interior Department. Most people don't realize it, but it's actually scheduled to end on July fourth, twenty twenty six. Now, let's examine what DOGE was created for and what it's done. DOGE was created by an executive order on January twentieth, twenty twenty five, by President Trump. The executive order states that DOGE's purpose is to, quote, implement the President's DOGE agenda by modernizing federal technology and software to maximize governmental efficiency and productivity. It later goes on to say that DOGE will improve the IT systems, work with agency heads to promote interoperability between agency networks and systems, ensure data integrity, and facilitate responsible data collection and synchronization.
Speaker 1: Nowhere in the executive order does DOGE talk about being used as a force to balance the budget, or cancel contracts, or lay off federal workers. It's very clearly a conversation about modernizing and protecting data to run more efficiently, which is a noble goal; the government should try to do that. Yet most of the conversation surrounding DOGE is about saving money and stopping inefficient programs. So let's get into the data, let's get into the numbers. The US government is currently thirty-six point four trillion dollars in debt, which is a number most people can't even conceive of: thirty-six trillion dollars in debt. It runs a two trillion dollar deficit every single year. It's not because we don't collect money; any taxpayer will tell you we collect money. The federal government's revenue from taxes is over five trillion dollars a year, but we spend seven trillion. Now, according to the DOGE Clock, which is a feature on the US Debt Clock website, DOGE's estimated savings for the government so far, from what they've found, is about eighty-one billion dollars in waste, fraud, and abuse.
Speaker 1: Some analysts say it's closer to thirty-eight billion, but whichever number it is, between thirty-eight and eighty-one billion, it's a lot of money. Pretending like it's not, which some Democrats are doing, is very stupid, because American taxpayers feel like their money is being wasted on DEI consultants and duplicate programs, or just straight up being stolen and given to corrupt oligarchs overseas. But that's your data. Those are the hard numbers: thirty-six trillion in debt, five trillion taken in through taxes, seven trillion spent, and two trillion in deficits, with between thirty-eight and eighty-one billion dollars in savings from DOGE. Defenders of these programs, such as USAID, which has been very heavily scrutinized by DOGE, have said that these programs help promote foreign policy and soft power. That is Washington, DC speak for "you're too stupid to complain about how your tax dollars work." So kudos to anyone looking for bad spending and wasteful and duplicate programs. This is really, really important work. However, it's not enough to balance our budget.
Speaker 1: It's not even close; it's currently less than half of one percent of the federal deficit. This is far below the two trillion dollar number that Elon Musk originally said he hoped DOGE could find and cut from the government. As a millennial who vaguely remembers the last time we balanced the budget, in two thousand and one, this is frustrating. Bill Clinton actually said in two thousand that he expected to pay down the entire US debt by twenty twelve, because we were running surpluses in the late nineties and early two thousands. And then of course we had the War on Terror, the war in Iraq, the Bush tax cuts, Obamacare, Cash for Clunkers, so on and so forth, administration after administration, and we're in the situation we're in now. DOGE hit controversy even before it began, first with the exit of co-chairman Vivek Ramaswamy. He left before the administration started, after he posted a series of tweets saying that Americans were lazy and unsuccessful because they let their kids idolize athletes and have sleepovers.
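The budget figures quoted in this episode can be sanity-checked with a few lines of arithmetic. The dollar amounts below are the ones cited on the show; the percentages are simply each savings estimate divided by the quoted totals, so readers can judge the scale for themselves:

```python
# Budget figures as quoted in this episode (approximate, in US dollars).
debt = 36.4e12          # national debt
revenue = 5.0e12        # annual federal tax revenue
spending = 7.0e12       # annual federal spending
savings_low = 38e9      # low estimate of DOGE savings
savings_high = 81e9     # high estimate of DOGE savings

# Spending minus revenue reproduces the quoted two trillion dollar deficit.
deficit = spending - revenue

def pct(part, whole):
    """Express `part` as a percentage of `whole`."""
    return 100 * part / whole

print(f"Savings vs. annual deficit: {pct(savings_low, deficit):.2f}% to {pct(savings_high, deficit):.2f}%")
print(f"Savings vs. annual spending: {pct(savings_low, spending):.2f}% to {pct(savings_high, spending):.2f}%")
print(f"Savings vs. total debt: {pct(savings_low, debt):.3f}% to {pct(savings_high, debt):.3f}%")
```

Whichever denominator you choose, the quoted savings cover only a small fraction of the two trillion dollar annual gap.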
Speaker 1: Sources close to me have told me that DOGE was looking for him to get out way before these tweets came out. He wasn't what you'd consider an extremely active participant. You know when you're a kid and your little brother wants to play a video game that you're playing, but he's terrible, so you give him a controller that's not plugged into the game and tell him that he's playing? That was allegedly the relationship between Elon and Vivek. That's what my sources have told me. Since starting in January, though, DOGE's staff has gained access to many federal departments, taking data, including sensitive data, and offering buyouts to federal workers so they could retire early. They've closed excess office space, or are looking to close excess office space, and canceled contracts that they thought were wasteful. There's a lot of noise around DOGE. The Democrats don't have a singular message, and that's been a real problem for the Democratic Party, if you've paid attention to their messaging the last few years.
Speaker 1: But they're very angry about the accessing of data. The DOGE staffers are very young, and some of them have written very spicy takes on social media, and they've raised concerns that accessing this federal data and canceling these contracts is illegal. I want to focus specifically on the data aspect, because a lot of the other stuff is just political noise. It's not actually relevant that one of the people, who's nineteen, wrote something spicy on Twitter seven months ago; that doesn't actually matter. But the accessing of federal data could matter. Musk and the Trump administration said that they had, quote, read-only access to all the federal data, but reporters at Wired said that at least one DOGE staffer had the ability to change the code of the Treasury Department's payment system. Democrats have said that the staffers were attempting to cut off payments to programs that were against President Trump's agenda without an act of Congress.
Speaker 1: A federal judge in New York has blocked DOGE's access to the sensitive data in the Treasury Department, while a different federal judge in Washington has declined to block DOGE's access to data in the Labor Department. That latter judge did express concerns that staffers had not been properly trained in handling sensitive data. The Washington Post reported, quote, DOGE's associates have been feeding vast troves of government records and databases into artificial intelligence tools, looking for unwanted federal programs and trying to determine which human work can be replaced with AI, machine learning tools, or even robots. "Everything that can be machine automated will be machine automated," said one person speaking to the Post. That raises a question in my head; this is what I'm concerned about. Whose AI company is replacing the workers? Because the government doesn't own an AI system. That doesn't exist; there's no federal AI system. It contracts out to other AI companies, and tech companies have invested billions and billions of dollars into smarter, faster, and possibly one day self-learning AI.
Speaker 1: That's the goal of a lot of tech billionaires: to have self-learning AI. The US has no comprehensive federal laws governing AI. There are none. While there are some guidelines in the federal agencies, there's no giant oversight as to what AI should or could be doing. That could be a good thing if you're an AI investor, and a bad thing if you're concerned with certain issues around AI, whether it's replacing workers or how it could possibly be interfering in our lives. That being said, if we're going to replace our federal workforce with artificial intelligence, if this is the goal of DOGE, whose AI company are we using? Where are we bidding this out? Because it's not coming from within the government; we're not building our own. Is it Elon Musk's? He recently put in a ninety-seven point four billion dollar bid to buy the assets of the nonprofit that controls OpenAI. According to The New York Times, he's been very interested in building an AI company. This is something worth asking as we plunge headfirst into our new world.
Speaker 1: What I think could be a real problem when it comes to DOGE is that Democrats and Republicans are really talking past each other at this moment. Republicans are rightfully concerned about spending, because we are spending way too much: way too much deficit, way too much debt. They're completely and totally right, and going through bloated federal agencies to take out the waste is a noble cause. And while some Democrats are trying to score political points against Donald Trump, who's enjoying the most favorable approval ratings of his entire career, many are right to have concerns about this data. But they're looking at it from the question of DOGE holding a Social Security number and income records, and that's not the real concern. What I'm worried about is the future of the government brought to us by our tech overlords. Whose AI company is going to be running our federal agencies? Because it's not one that we've built; it's not one that comes from within the government. It's one that is contracted from outside the government.
Speaker 1: And if the people having this access to these, quote unquote, possibly read-only files, or if they're changing code, are the ones getting access for their companies, that is extremely, extremely problematic. Hey, we'll be right back after this.

Speaker 1: I am beyond excited to have my guest this week, someone who can talk a lot about DOGE. My guest, Congresswoman Marjorie Taylor Greene from the great state of Georgia, has represented the fourteenth district since twenty twenty one. Representative Greene is the chairwoman of the Delivering on Government Efficiency Subcommittee. Congresswoman, thank you for being here.

Speaker 2: Happy to be on.

Speaker 1: So my first question is very straightforward: how does your caucus work with DOGE?

Speaker 2: Well, I'm on the DOGE subcommittee, which is under Oversight, so it's not a caucus.

Speaker 1: Right, sorry, that's what I meant to say.

Speaker 2: Yeah, yeah, caucuses are more like clubs, right. Sorry about that.
Speaker 2: Yeah. So this is a committee, and what we will be doing is the important work in Congress to make DOGE actually effective from the legislative branch of the government. And I think this is an incredibly important thing to do, because if DOGE is only functioning in the executive branch, that means we're going to have every single Obama- and Biden-appointed judge coming out and blocking the important work that they're trying to do. We're going to have, you know, Democrats filing all kinds of lawsuits trying to stop Elon Musk and DOGE, and any future Democrat administration can virtually go through and undo everything that's been done with their executive orders.

Speaker 1: So you're trying to get some of this stuff done in the legislature, where there will be bills to kind of reduce the spending as well, which I think...

Speaker 2: Yes. We've got to pass bills, get them into law; we have got to put them in our funding appropriations. We've got to do everything we can.
Speaker 2: Congress has got to get on board with DOGE because the American people voted for it. President Trump campaigned on it the last few months of his campaign; it was one of the biggest pieces. This is what the American people want, and Republicans need to make sure we deliver.

Speaker 1: So I think that a lot of Republicans, when it comes to DOGE, are very interested in the ways our tax dollars are being used. You have things like one hundred million dollars for DEI training for teachers, and nine million dollars in a contract for, quote, Central American Gender Assessment Services; I don't know what that actually means. What has been your opinion on the findings so far?

Speaker 2: I think they're outrageous.
Speaker 2: I share the same outrage that the American people do. You know, think about it like this, Ryan: here we are, coming up on April fifteenth, and many small business owners, many Americans, you know, single moms, all kinds of people right now are pulling together their financial paperwork to get ready to file their taxes and once again pay the federal government a bunch of money that they have worked so hard to earn. And at the same time that that part of the year is approaching, that deadline to pay taxes to the federal government, here are people learning about ridiculous expenditures of their hard-earned tax dollars, and we're thirty-six trillion dollars in debt. The American people should not only be outraged, they should be furious. It's too much to even comprehend.

Speaker 1: Now, Elon Musk originally said he was hoping to cut two trillion dollars using DOGE. I think Congress does need to get involved in order to hit that amount.
Speaker 1: You know, in terms of huge spending cuts, have any of your fellow members sat there and said that they want to institute these kinds of spending cuts to make government more efficient?

Speaker 2: Yeah, actually, it's what we're arguing about. That's what's taking us so long with our reconciliation budget: we are trying very hard to figure out how to make cuts, and it's not going to happen overnight. This is something that's really going to have to be dialed in, because you have to understand, not only do we have to cut spending, we also have to pass President Trump's tax cuts again. We have to put them back into law because they expire this year. But we're also adding more tax cuts on top of that: no tax on tips, no tax on Social Security, no tax on overtime. And President Trump is supporting SALT. So these are more tax cuts that are going to take place on top of spending cuts, and we just have to make sure that we've got the numbers right to be able to deliver on those promises right now.
Speaker 1: Democrats have used Elon Musk as a punching bag in the last couple of weeks, specifically around concerns over federal data, and some Republican colleagues of yours have sent letters to constituents saying that they're concerned. What is your opinion, and what is the opinion of the subcommittee, when it comes to oversight of handling sensitive data?

Speaker 2: Well, you know, it's really ridiculous that the Democrats have decided to basically make the bad guy out of the man that is getting involved in trying to save America from our debt; we're basically thirty-six trillion dollars in debt. It's important to talk about the compounding interest on our debt. This year alone, the compounding interest is going to be bigger than our military budget, which is close to nine hundred billion dollars. So for the Democrats to decide that they're going to make him the bad guy, and they're going to go out there and protest in front of the Department of Education and the Department of Commerce, and they're going out there and saying, we're going to defend the unelected bureaucrats, we're defending the government...
Speaker 2: ...we want to keep spending this money, Elon Musk is bad, he is evil. I think that is music to my ears, because I really want to win the midterms, Ryan, and I think the Democrats are definitely going down the wrong path by making a big fuss over cutting spending.

Speaker 1: Yeah, I mean, the image of Chuck Schumer and Maxine Waters raising their canes in the air...

Speaker 2: Not exactly. And think about it, on top of that: you bring up the two of them, they've been in Congress probably longer than me and you have been alive, and they're responsible for the debt. So it's just laughable. They're just providing us with never-ending campaign ads that will definitely be beneficial.
Speaker 1: I wonder, though, if there are Democrats on the subcommittee who do worry about things like, you know, Musk still runs his businesses and he's technically a government employee, and whether there's any kind of conflict of interest around that. Is that something the subcommittee would be interested in picking up, or at least reviewing, and saying, here's some oversight, here's some guardrails? Because I think the concern over guardrails has really become, at least in the media, a very, very big concern for some people.

Speaker 2: Oh, I think guardrails are important. That's why on the Oversight Committee we investigated the Biden crime family, right? There were no guardrails in place at all. We can look at many Democrats and their connections and how they make money. I mean, we're looking at USAID, for example, which has funded so many NGOs, and has basically rode alongside the CIA with regime change in foreign wars.
Speaker 2: They have funded many organizations that George Soros is literally aligned with and running, and we know that money makes its way back into Democrat campaign coffers. I don't have any concerns about Elon Musk. You know, they're talking about, oh, Elon Musk has people's Social Security numbers. Well, the richest man in the world is really not going to go out and set up some account with your grandma's Social Security number. He just doesn't need to do that.

Speaker 1: No, I agree. I think the worry over Social Security numbers is kind of a low-IQ argument. It's a very dumbed-down argument to scare people, because it doesn't really make any sense. And I also think that we need to reframe what titles are. If you're an NGO, but all your money comes from the government, you're not really an NGO.

Speaker 2: Well, you're not even a business, just a federally funded program. It's a handout. Basically, it's kind of like welfare for somebody that has three degrees, a master's degree, a PhD.
316 00:17:59,160 --> 00:18:03,240 Speaker 2: So they are literally the product of Marxism 317 00:18:03,800 --> 00:18:07,440 Speaker 2: and Communism in universities, and then they go out and 318 00:18:07,520 --> 00:18:09,640 Speaker 2: tell mom and dad, or mom 319 00:18:09,680 --> 00:18:12,040 Speaker 2: and mom or dad and dad, that they got a job. 320 00:18:12,720 --> 00:18:16,159 Speaker 2: And it's really a federally funded handout. So we just 321 00:18:16,240 --> 00:18:17,919 Speaker 2: as a people need to look at it and go, 322 00:18:17,960 --> 00:18:20,920 Speaker 2: we're thirty six trillion dollars in debt. What can we 323 00:18:20,960 --> 00:18:24,000 Speaker 2: afford and what can we not afford? Yeah, we can't 324 00:18:24,040 --> 00:18:27,360 Speaker 2: afford these federal handouts for these so-called 325 00:18:27,359 --> 00:18:29,720 Speaker 1: NGOs. You're listening to It's a Numbers Game with 326 00:18:29,840 --> 00:18:31,600 Speaker 1: Ryan Girdusky. We'll be right back. 327 00:18:34,960 --> 00:18:35,880 Speaker 1: There was a 328 00:18:35,960 --> 00:18:39,760 Speaker 1: story in the Washington Post about the fact that they 329 00:18:39,760 --> 00:18:43,280 Speaker 1: are looking, or possibly people in DOGE are looking, to change 330 00:18:43,680 --> 00:18:47,960 Speaker 1: the cost and way we do government work by replacing 331 00:18:48,000 --> 00:18:52,600 Speaker 1: some human workers with AI. What is your opinion of 332 00:18:53,160 --> 00:18:56,440 Speaker 1: replacing federal workers with AI? Do you have 333 00:18:56,480 --> 00:18:58,960 Speaker 1: one, or is that something the subcommittee could look 334 00:18:59,000 --> 00:19:03,000 Speaker 1: at as a possible, you know, way to change up 335 00:19:03,320 --> 00:19:06,640 Speaker 1: and reduce spending? Obviously AI is much cheaper than 336 00:19:06,760 --> 00:19:07,360 Speaker 1: people are.
337 00:19:08,280 --> 00:19:11,439 Speaker 2: Yeah, it definitely is. I support President Trump and the 338 00:19:11,440 --> 00:19:15,200 Speaker 2: federal buyouts for federal workers, and I think that has 339 00:19:15,240 --> 00:19:18,159 Speaker 2: been a wise decision. We have to reduce the federal 340 00:19:18,200 --> 00:19:22,520 Speaker 2: workforce; it's the largest employer in the 341 00:19:22,640 --> 00:19:27,760 Speaker 2: entire country. I think AI can be incredibly helpful. It's 342 00:19:27,800 --> 00:19:31,959 Speaker 2: definitely being used in the private sector, everything from 343 00:19:32,040 --> 00:19:36,360 Speaker 2: replacing, say, for example, employees at fast food restaurants 344 00:19:36,440 --> 00:19:39,520 Speaker 2: all the way to being able to replace construction 345 00:19:39,640 --> 00:19:45,240 Speaker 2: workers by printing homes, literally building homes. So I think 346 00:19:45,280 --> 00:19:49,000 Speaker 2: AI is definitely a solution that we'll be looking at, 347 00:19:49,400 --> 00:19:52,640 Speaker 2: and, you know, I don't 348 00:19:52,680 --> 00:19:56,200 Speaker 2: always nail myself down on a broad issue; it just depends on how 349 00:19:56,200 --> 00:19:57,080 Speaker 2: to solve the problem. 350 00:19:57,240 --> 00:19:58,880 Speaker 1: Yeah, no, I agree with you, and I don't want 351 00:19:58,880 --> 00:20:02,520 Speaker 1: to give you a hypothetical, because I think that's 352 00:20:02,520 --> 00:20:04,560 Speaker 1: just when reporters use gotcha questions, so I 353 00:20:04,560 --> 00:20:07,040 Speaker 1: won't give you a hypothetical.
I just think that it's a 354 00:20:07,119 --> 00:20:10,040 Speaker 1: good question, because the federal government doesn't own an AI company, 355 00:20:10,080 --> 00:20:13,560 Speaker 1: so everything will be contracted out, and, you know, 356 00:20:13,600 --> 00:20:15,399 Speaker 1: when it comes to all of our federal information, all 357 00:20:15,400 --> 00:20:19,200 Speaker 1: of our federal data, I think that having some oversight, 358 00:20:19,320 --> 00:20:22,320 Speaker 1: which is what you're on, over these private 359 00:20:22,320 --> 00:20:26,480 Speaker 1: companies and AI basically running parts of our federal government 360 00:20:26,560 --> 00:20:30,239 Speaker 1: is just worthy of doing. What is your goal? If the whole 361 00:20:30,280 --> 00:20:33,120 Speaker 1: world had to listen to 362 00:20:33,119 --> 00:20:35,000 Speaker 1: Marjorie Taylor Greene, or the whole 363 00:20:35,560 --> 00:20:38,600 Speaker 1: subcommittee, or the whole Congress, what would be your 364 00:20:38,680 --> 00:20:40,920 Speaker 1: goal for DOGE to accomplish by the time it ends 365 00:20:40,920 --> 00:20:42,560 Speaker 1: on July fourth, twenty twenty six? 366 00:20:43,359 --> 00:20:46,119 Speaker 2: I think the most important thing DOGE can do is 367 00:20:46,520 --> 00:20:50,880 Speaker 2: effectively reduce the size of the federal government and reduce 368 00:20:51,000 --> 00:20:57,040 Speaker 2: the spending. It's just so overgrown. It is incredibly massive, 369 00:20:57,600 --> 00:21:00,720 Speaker 2: and it really doesn't serve the American people. We also 370 00:21:00,800 --> 00:21:04,920 Speaker 2: have to streamline it, especially in the payment processing centers. 371 00:21:05,000 --> 00:21:07,120 Speaker 2: This is where all the checks get sent out from 372 00:21:07,160 --> 00:21:12,440 Speaker 2: various departments.
For example, in my hearing on Wednesday on Medicaid 373 00:21:12,480 --> 00:21:17,920 Speaker 2: and Medicare, we investigated improper payments, and we looked 374 00:21:17,960 --> 00:21:21,640 Speaker 2: deeply at how hundreds of billions of dollars can 375 00:21:21,680 --> 00:21:25,359 Speaker 2: be saved literally by stopping checks from going to dead people, 376 00:21:25,760 --> 00:21:31,080 Speaker 2: going to foreign countries, and to criminal rings that have used Americans' 377 00:21:31,320 --> 00:21:35,840 Speaker 2: IDs and their stolen data to create accounts, 378 00:21:35,840 --> 00:21:39,560 Speaker 2: and they're literally stealing our money and it's being sent overseas. 379 00:21:40,000 --> 00:21:44,760 Speaker 2: This can all be stopped literally by algorithms in our 380 00:21:44,800 --> 00:21:49,200 Speaker 2: payment processing departments. And, you know, those are the things 381 00:21:49,200 --> 00:21:52,240 Speaker 2: we've got to do. It's modernization. These are the things 382 00:21:52,280 --> 00:21:55,520 Speaker 2: that you do in your private business that make your 383 00:21:55,560 --> 00:21:59,960 Speaker 2: business run more efficiently, which increases your revenue and increases 384 00:22:00,119 --> 00:22:03,359 Speaker 2: your profit. And so my goal with DOGE would be 385 00:22:03,560 --> 00:22:07,320 Speaker 2: to transform the federal government to be run like a 386 00:22:07,440 --> 00:22:11,000 Speaker 2: very successful business and serve its customer, which is the 387 00:22:11,040 --> 00:22:15,040 Speaker 2: American people, versus a failing business that's on the verge 388 00:22:15,040 --> 00:22:17,320 Speaker 2: of bankruptcy, which is what we are right now. 389 00:22:17,359 --> 00:22:20,639 Speaker 1: Right, yeah, and I've said this 390 00:22:20,720 --> 00:22:23,320 Speaker 1: at the beginning of the show.
In two thousand, we 391 00:22:23,320 --> 00:22:25,800 Speaker 1: were expected to pay off our entire national debt by 392 00:22:25,840 --> 00:22:29,880 Speaker 1: twenty twelve, and obviously that didn't happen; much 393 00:22:29,880 --> 00:22:33,280 Speaker 1: of it happened in reverse. So I think anything to remedy that 394 00:22:33,320 --> 00:22:36,520 Speaker 1: situation at this point would help. And, you know, 395 00:22:36,760 --> 00:22:38,679 Speaker 1: a lot of Democrats in the media are pushing the 396 00:22:38,720 --> 00:22:43,400 Speaker 1: idea that ultimately any kind of reduction in government spending 397 00:22:43,480 --> 00:22:45,560 Speaker 1: is going to come with an enormous amount of pain. 398 00:22:46,320 --> 00:22:49,399 Speaker 1: Yet there is over fifty billion dollars in Medicaid fraud 399 00:22:49,440 --> 00:22:51,439 Speaker 1: that they just discovered, per the Wall Street Journal. 400 00:22:51,720 --> 00:22:55,359 Speaker 1: There's all these, you know, NGOs, as we've talked about. 401 00:22:56,400 --> 00:22:59,160 Speaker 1: Is there a way to reduce the size and spending 402 00:22:59,200 --> 00:23:02,879 Speaker 1: of government without middle America, or middle-income people, 403 00:23:02,960 --> 00:23:08,439 Speaker 1: or lower-income people and seniors, being so heavily 404 00:23:08,480 --> 00:23:10,880 Speaker 1: impacted that they feel the pain? Because we've talked about 405 00:23:10,920 --> 00:23:14,159 Speaker 1: no tax on Social Security, no tax on tips, 406 00:23:14,400 --> 00:23:18,080 Speaker 1: tax cuts that help the middle class. You know, what 407 00:23:18,119 --> 00:23:20,680 Speaker 1: would you say to the progressives who say any 408 00:23:20,680 --> 00:23:23,840 Speaker 1: cut in spending is going to eventually hurt working-class people? 409 00:23:24,440 --> 00:23:28,159 Speaker 2: That's just an absolute lie.
Cutting spending is going to 410 00:23:28,200 --> 00:23:32,840 Speaker 2: help middle-class and low-income Americans because it's going 411 00:23:32,880 --> 00:23:36,880 Speaker 2: to lower inflation. And if we lower inflation, that 412 00:23:36,920 --> 00:23:40,560 Speaker 2: means grocery costs are going to go down, electricity costs 413 00:23:40,600 --> 00:23:43,240 Speaker 2: are going to go down, the price at the gas 414 00:23:43,280 --> 00:23:45,199 Speaker 2: pump is going to go down, and then we're going 415 00:23:45,240 --> 00:23:48,480 Speaker 2: to be helping the most vulnerable Americans. No tax on 416 00:23:48,560 --> 00:23:52,520 Speaker 2: Social Security is actually the most common-sense thing that 417 00:23:52,560 --> 00:23:55,520 Speaker 2: we can do for our grandparents. They should never be 418 00:23:55,600 --> 00:23:58,800 Speaker 2: paying taxes on their Social Security checks anyway. This is 419 00:23:58,880 --> 00:24:02,040 Speaker 2: money that's already been taxed. This is money that 420 00:24:02,080 --> 00:24:05,000 Speaker 2: should be paid right back to them, and they shouldn't have to 421 00:24:05,000 --> 00:24:08,840 Speaker 2: pay Uncle Sam again, who clearly isn't good at handling 422 00:24:08,880 --> 00:24:12,720 Speaker 2: their hard-earned money. We're going to be helping those 423 00:24:12,760 --> 00:24:18,000 Speaker 2: single moms and people that are really struggling by cutting 424 00:24:18,080 --> 00:24:22,480 Speaker 2: the deficit and growing the economy, which is another very important 425 00:24:22,520 --> 00:24:28,320 Speaker 2: piece of this, by passing President Trump's Tax 426 00:24:28,400 --> 00:24:31,359 Speaker 2: Cuts and Jobs Act and making sure that we pass 427 00:24:31,359 --> 00:24:33,640 Speaker 2: his new tax priorities as well. 428 00:24:33,720 --> 00:24:36,040 Speaker 1: Well, Congressman, I want to thank you so much for being 429 00:24:36,040 --> 00:24:39,240 Speaker 1: on here.
I really hope that the subcommittee takes a 430 00:24:39,240 --> 00:24:41,680 Speaker 1: look, especially as we move towards AI, at any 431 00:24:41,760 --> 00:24:44,280 Speaker 1: kinds of conflicts of interest, or which kind of companies 432 00:24:44,320 --> 00:24:47,280 Speaker 1: get it, because the nation doesn't own an AI system; 433 00:24:47,320 --> 00:24:49,199 Speaker 1: we're going to contract it out. But I think that 434 00:24:49,480 --> 00:24:52,600 Speaker 1: the work that the DOGE committee is doing to reduce government 435 00:24:52,600 --> 00:24:56,080 Speaker 1: spending is of the utmost necessity, because we are going 436 00:24:56,119 --> 00:24:59,880 Speaker 1: to be in a default very, very soon. 437 00:25:01,119 --> 00:25:03,040 Speaker 1: Thank you so much for being here. I really, really 438 00:25:03,080 --> 00:25:05,680 Speaker 1: appreciate your time. Where can people go to read what 439 00:25:05,680 --> 00:25:08,240 Speaker 1: you're doing, what you're working on, on DOGE? 440 00:25:08,840 --> 00:25:11,240 Speaker 2: Oh, yeah, thank you so much, Ryan. They can go 441 00:25:11,400 --> 00:25:16,399 Speaker 2: to my government website, which is greene dot house dot gov; 442 00:25:16,600 --> 00:25:19,119 Speaker 2: Greene has an E on the end. They can subscribe to 443 00:25:19,160 --> 00:25:22,320 Speaker 2: my newsletter; that's something that they'll get monthly. They can 444 00:25:22,359 --> 00:25:26,080 Speaker 2: also follow my social media, at 445 00:25:26,119 --> 00:25:32,280 Speaker 2: Rep MTG, and also our newly formed DOGE committee social 446 00:25:32,320 --> 00:25:35,359 Speaker 2: media as well. So those are all the places they 447 00:25:35,400 --> 00:25:38,400 Speaker 2: can follow along, and I hope they do. 448 00:25:38,440 --> 00:25:40,440 Speaker 1: Well, you're going to be doing very great work and I'm looking 449 00:25:40,480 --> 00:25:41,320 Speaker 1: forward to it.
450 00:25:41,320 --> 00:25:43,360 Speaker 1: I can't wait to see what you guys expose as 451 00:25:43,359 --> 00:25:46,000 Speaker 1: far as bad government spending, and some of the fireworks 452 00:25:46,000 --> 00:25:48,919 Speaker 1: in those subcommittee hearings. Thank you so, so much. I 453 00:25:48,960 --> 00:25:52,840 Speaker 1: appreciate you being on. Thanks, Ryan. Thank you, Congresswoman Marjorie 454 00:25:52,920 --> 00:25:55,320 Speaker 1: Taylor Greene, for being on the show. That was a great episode. 455 00:25:55,359 --> 00:25:59,840 Speaker 1: Please like and subscribe and listen every week on the iHeartRadio app, 456 00:26:00,000 --> 00:26:06,040 Speaker 1: Apple Podcasts, or wherever you get your podcasts.