1 00:00:05,000 --> 00:00:10,000 Speaker 1: Gaslighting isn't disagreement. It's psychological warfare. When you can't beat the truth, 2 00:00:10,119 --> 00:00:13,600 Speaker 1: you try to make people doubt their own sanity. And in 3 00:00:13,680 --> 00:00:16,200 Speaker 1: regards to that, it is now 4 00:00:16,280 --> 00:00:18,600 Speaker 1: time for What's Your Problem? 5 00:00:18,760 --> 00:00:21,280 Speaker 2: This is our brand new thing that we're doing today. Casey, 6 00:00:21,360 --> 00:00:23,840 Speaker 2: you've got the problem, so I'll say to you, what's 7 00:00:23,880 --> 00:00:24,640 Speaker 2: your problem? 8 00:00:24,720 --> 00:00:26,680 Speaker 1: My problem is that I feel like I have been 9 00:00:26,760 --> 00:00:31,800 Speaker 1: medically gaslit, okay, and I don't like it. For months, 10 00:00:31,840 --> 00:00:33,880 Speaker 1: I had been going to the doctor. I had been 11 00:00:33,920 --> 00:00:38,360 Speaker 1: telling him these are my issues, and I was told, nah, 12 00:00:38,600 --> 00:00:44,000 Speaker 1: that's just anxiety. Guess what. Spoiler alert: it wasn't anxiety. 13 00:00:44,680 --> 00:00:48,040 Speaker 1: Being medically gaslit is so frustrating. Another thing that I was 14 00:00:48,080 --> 00:00:50,080 Speaker 1: talking to the doctor about, because I'm of a certain 15 00:00:50,120 --> 00:00:53,600 Speaker 1: age and a certain persuasion, is that I was interested in 16 00:00:53,680 --> 00:00:58,280 Speaker 1: learning more about HRT, hormone replacement therapy, and I was told, 17 00:01:00,080 --> 00:01:03,720 Speaker 1: you don't want to go there, that's bad stuff. But 18 00:01:04,600 --> 00:01:06,800 Speaker 1: just the other day, and I know he's tough to 19 00:01:06,880 --> 00:01:10,319 Speaker 1: listen to, but he's got a message, Robert Kennedy Junior 20 00:01:10,880 --> 00:01:15,600 Speaker 1: says that they are removing the black box warning on hormone 21 00:01:15,600 --> 00:01:17,040 Speaker 1: replacement therapy for women. 
22 00:01:17,160 --> 00:01:20,160 Speaker 3: Check this out. For more than two decades, the American 23 00:01:20,200 --> 00:01:25,200 Speaker 3: medical establishment turned its back on women. Millions of women 24 00:01:25,280 --> 00:01:27,720 Speaker 3: were told to fear the very therapy that could have 25 00:01:27,760 --> 00:01:32,720 Speaker 3: given them strength, peace, and dignity through one of life's 26 00:01:32,760 --> 00:01:38,760 Speaker 3: most difficult transitions, menopause. That ends today. The FDA is 27 00:01:38,800 --> 00:01:44,119 Speaker 3: initiating the removal of the broad black box warnings from 28 00:01:44,200 --> 00:01:50,160 Speaker 3: hormone replacement therapy products for menopause. We're challenging outdated thinking 29 00:01:50,800 --> 00:01:57,240 Speaker 3: and recommitting to evidence-based medicine that empowers rather than restricts. 30 00:01:58,320 --> 00:02:03,760 Speaker 3: Prescribed responsibly and started early, hormone replacement therapy transforms the 31 00:02:03,800 --> 00:02:04,600 Speaker 3: lives of women. 32 00:02:05,080 --> 00:02:08,880 Speaker 1: So the FDA is removing black box warnings on hormone 33 00:02:08,880 --> 00:02:12,280 Speaker 1: replacement therapy for menopause. He was critical of twenty years 34 00:02:12,280 --> 00:02:15,240 Speaker 1: of medical advice that scared women away from it. He 35 00:02:15,320 --> 00:02:17,640 Speaker 1: claims that it robbed them of their strength, peace, and 36 00:02:17,680 --> 00:02:21,639 Speaker 1: dignity during menopause, and that if started early and used responsibly, 37 00:02:22,160 --> 00:02:26,000 Speaker 1: it can cut cardiovascular mortality risk by up to fifty percent, 38 00:02:26,280 --> 00:02:31,240 Speaker 1: Alzheimer's by thirty percent, bone fractures by between fifty and sixty percent, 39 00:02:31,560 --> 00:02:34,000 Speaker 1: and may extend the life of a woman by up to 40 00:02:34,160 --> 00:02:38,919 Speaker 1: ten years. 
And he said this is all evidence-based medicine. 41 00:02:39,040 --> 00:02:41,400 Speaker 2: Yeah. And I think this has been talked about in 42 00:02:41,400 --> 00:02:43,480 Speaker 2: the medical community for a long time now, and RFK 43 00:02:43,639 --> 00:02:45,880 Speaker 2: Junior just mentioned it there. For the last twenty years, 44 00:02:45,919 --> 00:02:48,640 Speaker 2: this has really been discouraged, and there really hasn't been 45 00:02:48,760 --> 00:02:51,040 Speaker 2: enough good science to back it up. And I think 46 00:02:51,040 --> 00:02:53,000 Speaker 2: what he's saying is, look, we're going to take another 47 00:02:53,000 --> 00:02:55,040 Speaker 2: look at this. Back to what we always keep talking 48 00:02:55,080 --> 00:02:58,760 Speaker 2: about with the Trump administration: I think Trump saw the federal 49 00:02:58,800 --> 00:03:01,519 Speaker 2: government as a big group of people and a big 50 00:03:01,639 --> 00:03:03,359 Speaker 2: entity that was like, eh, this is just the way 51 00:03:03,360 --> 00:03:05,320 Speaker 2: we always do it. We do it this way because 52 00:03:05,320 --> 00:03:06,760 Speaker 2: that's the way we always do it. And he was 53 00:03:06,800 --> 00:03:08,320 Speaker 2: going to go in, and he picked people as his 54 00:03:08,400 --> 00:03:11,720 Speaker 2: cabinet secretaries who were just going to question everything. And 55 00:03:11,760 --> 00:03:13,320 Speaker 2: are they going to get it right one hundred percent 56 00:03:13,360 --> 00:03:15,560 Speaker 2: of the time? Of course not. But we're never going 57 00:03:15,639 --> 00:03:17,480 Speaker 2: to get any of it right if we don't start 58 00:03:17,520 --> 00:03:20,360 Speaker 2: asking the questions. And what you've gone through, Casey, with 59 00:03:20,440 --> 00:03:22,960 Speaker 2: your medical issues the last several months, I know has 60 00:03:22,960 --> 00:03:25,160 Speaker 2: been really frustrating to you. 
And, you know, we 61 00:03:25,200 --> 00:03:27,519 Speaker 2: talk about frustrations in our medical system. We talk about 62 00:03:27,560 --> 00:03:29,480 Speaker 2: the high cost of health insurance. We talk about the 63 00:03:29,639 --> 00:03:32,040 Speaker 2: high cost of health procedures. But the other aspect of 64 00:03:32,080 --> 00:03:34,839 Speaker 2: it is when you've got problems, you're not feeling well, 65 00:03:34,880 --> 00:03:38,000 Speaker 2: you're sick, and it's just very difficult to diagnose the 66 00:03:38,000 --> 00:03:38,880 Speaker 2: problem that's going on. 67 00:03:39,120 --> 00:03:41,240 Speaker 1: Yeah. And nothing is worse than being told, oh, this 68 00:03:41,400 --> 00:03:44,640 Speaker 1: is just anxiety, and then you find out, no, it 69 00:03:44,840 --> 00:03:48,840 Speaker 1: wasn't anxiety. That is a frustration level that I 70 00:03:49,000 --> 00:03:51,680 Speaker 1: just can't even, I can't say the words on the radio. 71 00:03:51,880 --> 00:03:53,960 Speaker 2: Well, now talk to me about this, and I know 72 00:03:54,000 --> 00:03:56,600 Speaker 2: the answer, but how much did you use ChatGPT 73 00:03:56,880 --> 00:03:59,520 Speaker 2: when you were trying to diagnose what kind of medical problems 74 00:03:59,560 --> 00:04:03,400 Speaker 2: you were having? All the time. No, but that's cool, because 75 00:04:03,120 --> 00:04:06,080 Speaker 1: when you're being told, nah, there's nothing wrong, you're like, no, 76 00:04:06,400 --> 00:04:10,280 Speaker 1: there is, what is it? And you're trying to go 77 00:04:10,520 --> 00:04:14,120 Speaker 1: to any other resource you can find, because you're just 78 00:04:14,240 --> 00:04:15,520 Speaker 1: being blown off. 79 00:04:15,640 --> 00:04:18,839 Speaker 2: Well, I know. 
Here's why I mention that: because I 80 00:04:18,839 --> 00:04:21,760 Speaker 2: think this is a great example where AI has really 81 00:04:21,800 --> 00:04:24,480 Speaker 2: benefited you in this process of trying to find out what's 82 00:04:24,520 --> 00:04:27,359 Speaker 2: going on with you medically, a tool that you used. 83 00:04:27,440 --> 00:04:29,919 Speaker 2: And I bring it up because there's this story in 84 00:04:29,960 --> 00:04:31,800 Speaker 2: the news, and a number of articles about this, that 85 00:04:31,920 --> 00:04:34,719 Speaker 2: in New York State right now, they're considering a bill 86 00:04:35,040 --> 00:04:38,440 Speaker 2: that would ban AI chatbots from giving either legal or 87 00:04:38,480 --> 00:04:42,279 Speaker 2: medical advice. That's like the worst thing you want to do, 88 00:04:42,400 --> 00:04:45,600 Speaker 2: by the way. That is going to hurt lower-income 89 00:04:45,640 --> 00:04:50,719 Speaker 2: and poor people disproportionately. If I'm lower income, my ability 90 00:04:50,760 --> 00:04:53,400 Speaker 2: to talk to a lawyer or go seek medical care 91 00:04:53,560 --> 00:04:56,839 Speaker 2: is hampered dramatically by the outrageous costs in both of 92 00:04:56,880 --> 00:05:00,560 Speaker 2: those industries. The idea 93 00:05:00,600 --> 00:05:03,680 Speaker 2: that you would ban chatbots from giving legal and medical 94 00:05:03,720 --> 00:05:08,200 Speaker 2: advice disproportionately hurts low-income people. They finally have 95 00:05:08,279 --> 00:05:10,240 Speaker 2: a tool that may be able to give them some 96 00:05:10,279 --> 00:05:11,640 Speaker 2: help with all of this, and the state of New 97 00:05:11,720 --> 00:05:13,080 Speaker 2: York is trying to take it away from them. 98 00:05:13,240 --> 00:05:15,520 Speaker 1: Now, I'm not promoting HRT. I'm not saying it's good 99 00:05:15,600 --> 00:05:17,400 Speaker 1: or bad. 
I just think it's a good thing that 100 00:05:17,720 --> 00:05:19,880 Speaker 1: you've got Kennedy saying, you know what, no, let's look 101 00:05:19,920 --> 00:05:23,440 Speaker 1: at this a little further. And also, obviously, go to 102 00:05:23,480 --> 00:05:26,640 Speaker 1: your doctor. Don't always just trust ChatGPT. But 103 00:05:26,960 --> 00:05:29,440 Speaker 1: you have this other thing that's coming out. It's called 104 00:05:29,720 --> 00:05:33,120 Speaker 1: the Human Body Atlas, and it's an interactive 105 00:05:33,200 --> 00:05:36,680 Speaker 1: 3D model of the human body, and it lets 106 00:05:36,839 --> 00:05:41,640 Speaker 1: users explore anatomy, like bones, muscles, organs, nerves, and systems. 107 00:05:41,760 --> 00:05:44,840 Speaker 1: So this is something that's going to be used by students, 108 00:05:44,880 --> 00:05:49,120 Speaker 1: medical professionals, teachers, and even just everyday people. Think 109 00:05:49,160 --> 00:05:54,640 Speaker 1: of this as Google Maps for your body. This is 110 00:05:54,960 --> 00:05:58,400 Speaker 1: crazy technology that's coming, where you're going to be able 111 00:05:58,440 --> 00:06:01,240 Speaker 1: to point your phone at your body and it can 112 00:06:01,279 --> 00:06:04,720 Speaker 1: get down to the cellular level and tell you what's 113 00:06:04,760 --> 00:06:05,240 Speaker 1: going on. 114 00:06:05,760 --> 00:06:07,279 Speaker 2: Do you think this is going to get rid of, 115 00:06:07,320 --> 00:06:11,200 Speaker 2: like, dissecting frogs in sixth grade biology class? You won't 116 00:06:11,200 --> 00:06:13,159 Speaker 2: have to do that anymore. You don't have to do 117 00:06:13,200 --> 00:06:14,880 Speaker 2: that anymore. Did you end up doing that? Did you 118 00:06:14,920 --> 00:06:17,200 Speaker 2: do the frog dissection, or the fetal pig, 119 00:06:17,400 --> 00:06:20,240 Speaker 2: or those sorts of things? 
And half the people thought 120 00:06:20,279 --> 00:06:22,039 Speaker 2: it was the most disgusting thing they ever saw in 121 00:06:22,080 --> 00:06:23,680 Speaker 2: their lives, and the other half of the people are like, 122 00:06:23,720 --> 00:06:24,400 Speaker 2: this is cool. 123 00:06:24,920 --> 00:06:28,560 Speaker 1: Yeah. I think this is amazing, that you're just going 124 00:06:28,640 --> 00:06:30,520 Speaker 1: to be able to point your phone at your body 125 00:06:30,880 --> 00:06:34,640 Speaker 1: and eventually, somewhat like a CT scan or an X-ray, 126 00:06:34,760 --> 00:06:37,200 Speaker 1: it'll get down to the cellular level and tell 127 00:06:37,240 --> 00:06:43,279 Speaker 1: you what's going on. They're already using these in student 128 00:06:43,560 --> 00:06:48,880 Speaker 1: situations to help students learn. When will it be broadly available? 129 00:06:49,400 --> 00:06:51,440 Speaker 1: Probably not for a little bit. But, you know, AI 130 00:06:51,560 --> 00:06:55,160 Speaker 1: is moving fast, so it always could be here 131 00:06:55,160 --> 00:06:57,200 Speaker 1: a lot quicker than you know. I heard 132 00:06:57,240 --> 00:06:59,200 Speaker 1: a politician the other day, I think it was Yang, 133 00:06:59,520 --> 00:07:03,880 Speaker 1: who said that they should start taxing AI use 134 00:07:04,560 --> 00:07:07,080 Speaker 1: instead of labor. Tax AI. 135 00:07:07,680 --> 00:07:09,840 Speaker 2: Oh, so no more income tax and it's just an 136 00:07:09,920 --> 00:07:12,280 Speaker 2: AI tax? Correct. I can fully get behind it. 137 00:07:12,680 --> 00:07:15,119 Speaker 1: Anytime you want to start using AI, there's a tax 138 00:07:15,160 --> 00:07:16,120 Speaker 1: to be paid for it. 139 00:07:16,200 --> 00:07:17,960 Speaker 2: I've always thought that would 140 00:07:17,720 --> 00:07:19,800 Speaker 1: change all of the, you know, we hear over and 141 00:07:19,880 --> 00:07:23,200 Speaker 1: over how AI is taking jobs. You tax AI. 
People have got 142 00:07:23,240 --> 00:07:25,280 Speaker 1: to pay for it to use it, and that will 143 00:07:25,320 --> 00:07:26,720 Speaker 1: keep the labor market in place. 144 00:07:26,840 --> 00:07:29,120 Speaker 2: Hey, if it gets rid of the income tax, I 145 00:07:29,160 --> 00:07:31,480 Speaker 2: don't care if I just heard about this thirty seconds ago, 146 00:07:31,520 --> 00:07:34,000 Speaker 2: I fully support it. 147 00:07:34,160 --> 00:07:36,360 Speaker 1: You're listening to ninety-three WIBC