Speaker 1: There is finally a governor that has decided to stand up to big tech. With Washington totally gone to the socialists, there is going to be no help there on many, many, many issues. And what governors are starting to understand, what they're starting to figure out, is: you're pretty powerful. You actually have the right to stand up for your citizens, to protect your citizens, and to fight back instead of just saying, oh, well, Washington's doing its thing. And now we're seeing this change in a major way. We're seeing this change in a very, very significant way. Governor DeSantis is the first one to really go all in. Governor DeSantis has decided that his citizens deserve to have someone stand up for them when it comes to the big tech attacks. Governor DeSantis has decided he's not going to just let Washington and big tech in Silicon Valley silence the citizens in his state. So far, it's pretty darned brilliant what he's doing. It's actually extremely brilliant what he's decided to do. He's keeping it simple, but he's doing it.
Speaker 1: And every other conservative governor in the country, forget conservative, every other governor in the country that believes in freedom of speech, believes in freedom of expression, should be paying attention to what Governor DeSantis is doing. Now, of course, the media immediately attacked Governor DeSantis on this, right? I mean, it was like an instant attack on Governor DeSantis, saying, oh, well, he's just doing this, playing this up for the media. He's just doing this because he must be trying to run for president in twenty twenty-four, four years from now. By the way, if Governor DeSantis did decide to run for president, I'd be all about it. I'd want to listen to more of what he has to say, because the guy seems to be listening to the people and making decisions based on common sense, whether it be the COVID-19 lockdowns, you know, taking tons of heat for not locking down his state. And yes, they've had fewer people dying in Florida than they did in New York, where they've had extreme lockdowns. Those are things that I like about him.
Speaker 1: But what he said in this announcement is that we're not just going to let these big tech tyrants decide what they're going to do. We're not going to let them manipulate our lives and silence us. Not in my state, and my state's pretty important. It's a pretty big state. It's got a lot of citizens, and if you want to operate here, then you better operate under the rules of freedom. I want you to hear this announcement, as Governor DeSantis at the Florida State House announced it yesterday, and he laid out very clearly what he believes. As he said, his message is loud and clear: when it comes to elections in Florida, big tech should stay out of it. Here is Governor DeSantis in his own words, and every other governor in America should pay attention to what he's doing. Floridians should have the privacy of their data and personal information protected, their ability to access and participate in online platforms protected, and their ability to participate in elections free from interference from big tech protected.
Speaker 1: What began as a group of upstart technology companies from the West Coast has since transformed into an industry of monopoly communications platforms that monitor, influence, and control the flow of information in our country and among our citizens, and they do that to an extent hitherto unimaginable. At the turn of the twenty-first century, online technology presented tools to liberate Americans from reliance on distrusted legacy media outlets. As social media proliferated over the past decade, citizens could directly connect with large numbers of people and could cut out corporate media outlets entirely. Over the years, however, these platforms have changed from neutral platforms that provided Americans with the freedom to speak to enforcers of preferred narratives. Consequently, these platforms have played an increasingly decisive role in elections and have negatively impacted Americans who dissent from orthodoxies favored by the big tech cartel.
Speaker 1: We have seen the power of their censorship over individuals and organizations, including what I believe is clear viewpoint discrimination. And as these companies have grown and their influence has expanded, big tech has come to look more like Big Brother with each passing day. But this is twenty twenty-one, not nineteen eighty-four, and this is real life, not George Orwell's fiction. These companies exert monopoly power over a centrally important forum of public discourse and the access to information that Floridians rely on. It used to be that consumers were trusted to make their own decisions about what information to consume, about which leaders to, quote, follow, about what news to watch. Now those decisions are increasingly made by nameless, faceless boards of censors. They even have a name, euphemistically called content moderators. And we're told that these are private companies, and that those who disagree with their decisions to regulate content, and even suppress content, can simply choose other services.
Speaker 1: Well, when two point eight million Americans chose to download the application Parler and share information with friends, family, and colleagues, what was the result of that? Canceled by Amazon, Google, and Apple. What about the eighty-eight million Americans who chose to, quote, follow President Donald Trump? Sorry, content moderators on Twitter pulled the plug, as did a different set of, quote, moderators at Facebook. This is the case even though leaders like Ayatollah Khamenei have been permitted to use these platforms to do things like call for the destruction of Israel and the elimination of Jews. The core issue here is this: are consumers going to have the choice to consume the information they choose, or are oligarchs in Silicon Valley going to make those choices for us? No group of people should exercise such power, especially not tech billionaires in Northern California. If I had to choose, I'd rather be governed by the first fifty names in the Tallahassee phone book than the CEOs of big tech companies. You don't like Parler? Then don't read it.
Speaker 1: Let's not have those choices made for us, or before long we will have nothing more than someone else's choices imposed upon us by a bunch of monopolies whose core business is advertising. These behaviors are concerning to me, as I know they are concerning to many Floridians. It's high time that we step up to the plate to ensure the protection of the people and their rights. And I'm committed to addressing what may be one of the most pervasive threats to American self-government in the twenty-first century, because I believe in individual rights, privacy rights, and property rights; because I trust Floridians to choose which content to consume and which to ignore; and because I want to preserve Florida's rich, diverse public discourse and not allow corporate-owned, narrative-approved outlets to dominate our voices. Now, fortunately, I'm not alone in this fight, and I'm glad to work with the Speaker of the House and other legislative leaders as we take on these issues on behalf of the people of Florida.
Speaker 1: As we work together, we're united under core beliefs in the sacredness of one's voice, one's privacy, and the protection they deserve in our system of government. Privacy rights are really property rights. And just like big tech can't rummage through your dresser drawer, they also aren't entitled to track your every move. You notice what DeSantis is saying right now, and this is why so many governors need to pay attention to DeSantis. He's not talking about conservative or liberal. He's talking about your actual privacy rights. He's talking about the way these companies look at everything you do and then sell your information to the highest bidder. And that is not okay, because this is how they make billions. It's how they control every ad you see on your platforms. This is what makes them so powerful; this is what makes them like oligarchs. And he's right: these companies can't come in your house and rummage through your mail, rummage through your dresser drawer, go into your closet and see what you're buying, so they can then tell people how to market to you.
Speaker 1: So why can they do it on your computer all day long and your phone all day long? No one listening right now should think of this as a Republican versus Democrat thing. This should be about privacy, all of our privacy, against these big tech tyrants that have way too much power and control over our lives and the information that they basically steal from you and sell to the highest bidder without you getting any say. Governor DeSantis is doing what other governors need to do. He's protecting your privacy. He's taking away the financial power of these big tech companies to rummage through your life and sell your information to the highest bidder without you really even knowing what's going on. What Governor DeSantis is saying to these big tech companies is: you think you can come after us politically? Forget that. We're going to talk about you, not politically. We're going to talk about what you're doing with our information, and we're gonna hold you accountable for what you're doing with our information.
Speaker 1: We're not going to let you just sell it to anybody you want to and think that we're just going to look the other way any longer. You guys are out of control. We're going to rein you in. And if you want to have a customer base in Florida, one of the most highly populated states in America, then you're gonna have to play by a very different set of rules. You're gonna have to play by a set of rules that protect all the people in Florida, not just some. Now, everything that we just talked about so far, it is really not political, right? This isn't, we're gonna come after you because of politics. No, no, we're gonna stop what you're doing with our information from our cell phones and from our computers. Governor DeSantis said just a second ago, as I was playing this for you, as he put it: you don't let these people come into your room and rummage through your drawers, right? Like, you can't do that. So why can you do it on my phone, on my computer?
Speaker 1: ...core beliefs and the sacredness of one's voice, one's privacy, and the protection they deserve in our system of government. Privacy rights are really property rights. And just like big tech can't rummage through your dresser drawers, they also aren't entitled to track your every move. You know, someone once summed it up to me this way: when I invite you into my house and say have a seat, I don't mean you can leave with my couch. Our Founding Fathers were deliberate in the enshrinement of our rights, and the Constitution ensured that we the people were guaranteed protection against those wishing to violate our rights. Ironically, the early founders were most concerned with the tyranny of government in deciding these rights. But today the big tech oligarchy has in many ways become a clear and more present danger to the restriction of the right to free speech than the government itself. And certainly, if you go back and look at the monopolies at the turn of the twentieth century, these current big tech monopolies are exerting power far, far more pervasive than Standard Oil ever did.
Speaker 1: Now, these issues are so important because of big tech's pervasiveness and near-limitless reach in our society, with billions of monthly users and the vastness of information exchanged. Not only do these companies control the flow of information, they are selling it as well. This is how the business works: they take consumer data and sell it for advertising, specifically more than two hundred billion dollars' worth of advertising in a given year. And that's not really innovation; that's just a different form of Madison Avenue. Now, since its inception, big tech has experienced rapid and extraordinary growth. Its path to expansion bore a willingness to engage in a host of savvy practices to advance profit while compromising the protection of consumers. Not only do Floridians share with these platforms their lives, thoughts, hopes, and stories, but also some of their most intimate personal information. But what most folks don't realize is that all of these companies are taking that information, regardless of its sensitivity, and selling it to whoever is willing to pay the highest price.
Speaker 1: They've even created complex markets and exchanges for the sale of Floridians' information, all the while claiming to never sell user information. Florida is not going to tolerate that. Florida consumers deserve protection for their privacy. With the help of our legislative partners, we're going to stand together in support of Floridians and put a stop to big tech's practice of preying on consumers. Your privacy is important. And again, this is where other governors need to pay attention. Other governors need to understand what Governor DeSantis is saying: we are going to stand up for you, the citizens. We're going to stand up for you, our constituents. These companies have been lying to you for years, saying that they don't sell your information to the highest bidder. We now know for a fact that's exactly what they do every day on the web. They sell everything, virtually everything that you've ever sent out into the world or searched on their search engines. They're even reading your emails. A lot of people don't realize this.
Speaker 1: If you send a link in Facebook Messenger for a product, or if you send a link in an email to a product, you will then be spammed, and a lot of you have probably noticed this, with that exact product or products related to it in your advertising when you start searching the web after that, or when you're on social media sites like Facebook. When you hit that button to forward a product, or to forward an article about a product, or forward an article about a resort, they then go and say, hey, this person is looking to go on a vacation to Mexico. You should advertise your resort to them. How valuable is that information? When you do something as simple as searching for a flight, then what happens? All the airlines start coming after your business for that exact flight. If you search a hotel in a certain city, every ad afterwards is for hotels in that city. Same with rental cars, same with restaurants, same with office supplies. The list goes on and on and on. And what Governor DeSantis is saying, first and foremost, is: not in our state any longer.
Speaker 1: You guys are way too invasive. You've lied to the American consumer. You've lied to us, to the people, saying, we never sell your information, when that is exactly what you do, and you do it billions of times a day. The game's over in our state. You can't do this now. They may be able to do it in other states, and there are going to be other states. You're going to know who's really in favor of protecting your freedom and who's being controlled by big tech by which states do not act. Governor DeSantis is the first, and there should be dozens and dozens of states that follow. There may not be. And this is why we need to be looking at this and asking real questions about this. We need to be asking tough questions of every governor: why are you not following suit? Why are you not organizing and doing the same thing? Why are you not advocating for our protections? These are the questions that need to be asked. These are the important questions that need to be discussed. Now, this isn't all that Governor DeSantis did. This was part one of his plan.
Speaker 1: Part two is an extremely important aspect of his plan. The other part of this plan that he's talking about, and again, we all need to be taking notes on this, because this is how we are going to beat big tech, if we're going to survive them, if we're going to stand up to them. The second part of his plan is this. Now, this network of Silicon Valley CEOs wields extraordinary power, to the point of holistically controlling the flow of vast swaths of information in our country. In a matter of hours, a business can be dismantled, a community of friends and colleagues canceled, and even a sitting president of the United States silenced. By their own admissions, social media companies view themselves as a new public square and are happy to market themselves as platforms of global, regional, and local connectivity. Make no mistake, they are nothing more than advertising conglomerates, and I'm not interested in handing over the keys to the public square to a bunch of companies whose economic interests are not aligned with the public interest.
Speaker 1: When it comes to the rightful criticism for their editing and manipulation of the public square, big tech executives flee to shelter themselves from accountability, claiming to be anything but a public forum. And they have the chutzpah, the nerve, to insist on broad liability protection: heads they win, tails we lose. And worse yet, a faceless and nameless group of tech employees at these companies now wields tremendous power to censor speech and enforce their viewpoint on political discourse upon the general public. If George Orwell had thought of it, he would have loved the term content moderation. The consequences of big tech censorship are felt far and wide. Take, for example, big tech's approach to censoring criticism of pseudoscientific lockdowns during the coronavirus pandemic. By the way, with Governor DeSantis and what he's doing now, you notice the first thing he went for when explaining his big tech move is to show the example of COVID-19. Right? You must agree with our science. You must agree with what we tell you.
Speaker 1: We will immediately label it misinformation or misleading, or silence you or shut you down, if you write anything, put anything up, or debate anything about COVID that we believe is not correct. And Governor DeSantis in Florida was a huge victim of this propaganda, of this silencing. They were a massive victim of this because the left attacked Governor DeSantis, and the reason why they attacked him is because he was saying no to what New York and these other liberals were doing, saying, not in my state are we going to do this; we're not going to do it this way. One of the examples that he is talking about is COVID-19. On COVID-19, he was attacked by big tech.
Speaker 1: He understands how insane big tech has gotten, how out of control they've gotten, because big tech allowed his state to be just assaulted online because he refused to bow to the liberals, those who said you've got to shut down everything, the economy, the schools, all of it. All the things they did in New York that didn't work, all the things they did in New Jersey that didn't work, all the things they did in Baltimore that didn't work, all the things they did in Minneapolis that didn't work, all the things they did in San Francisco that didn't work, in L.A. that didn't work, in California as a state that didn't work, which is why they're recalling their governor now in California. You know, New York is a great example. Governor Cuomo had time to write a book about how great his response to COVID was, when in fact they underreported the number of dead in nursing homes to cover up how bad their decision making was. And when talking about those deaths, what did he say? Whether a person died in a nursing home or a hospital, who cares, they died.
Speaker 1: That's what the response from Governor Cuomo was. His exact words. So Governor DeSantis dealt with big tech. Remember, you couldn't even write the word hydroxychloroquine without being censored on Twitter and Facebook. You couldn't ask questions about COVID-19 vaccines or the effectiveness of wearing a mask without someone shutting you down and silencing you. You couldn't even do that. Not allowed to ask questions, not allowed to ask any basic questions. Nope, you just have to go with what we tell you you can and cannot say. And Governor DeSantis was ridiculed and mocked, and the media said that people were dying because of the decisions he made, which wasn't true. That he was putting people at risk, that people were going to die because of him, which again was not true, wasn't even close to being accurate, wasn't even close to being true. Here is more of Governor DeSantis talking about how we're going to rein in big tech and this censorship when it comes to things like COVID-19, which you should be able to ask questions about.
Speaker 1: You should be able to ask a question or talk about hydroxychloroquine without being silenced forever. And worse yet, a faceless and nameless group of tech employees at these companies now wield tremendous power to censor speech and enforce their viewpoint on political discourse upon the general public. If George Orwell had thought of it, he would have loved the term "content moderation." Consequences of big tech censorship are felt far and wide. Take, for example, big tech's approach to censoring criticism of pseudoscientific lockdowns during the coronavirus pandemic. These lockdowns were almost universally rejected in pre-COVID pandemic preparedness plans. Lockdowns at the time of the pandemic were favored by the quote "narrative," and so, in the name of quote "science," articles and posts warning against lockdowns were taken down and censored. The result has been the destruction of millions of lives across America, as well as increased deaths from suicide, substance abuse, and despair, without any corresponding benefit in COVID mortality. Shouldn't such monumental policy questions have received a full, open, and robust debate?
Speaker 1: Social media platforms have become among the most powerful mechanisms for a private citizen to make his or her voice heard. It's incumbent upon us to ensure those voices are not capriciously and vindictively targeted. And the worst part: they change the rules constantly based on whatever they deem to be politically correct at any given point in time. These rules and standards are often changed without the knowledge of their users, moving the goalposts on Floridians and others who use these open forums for discourse and as a source for information. When a social media company applies these standards unequally to users, this is discrimination, pure and simple. Can you imagine tolerating this kind of behavior in banking, or in healthcare, or in other industries? So today we announce, during this legislative session, that we will seek to do the following: ensure that Floridians are safeguarded against these practices from technology companies by requiring proper notice and disclosure of changes to the standards, and full disclosure of any actions taken against a user for violating the standards.
Speaker 1: We'll also seek to prevent these platforms from rapidly changing these standards and applying them unequally against users. We'll also require that users be provided the option to opt out of the various algorithms these platforms use to steer content, or in many cases suppress content, from the view of other users. But these provisions are of no use without enforcement, and we will provide recourse for Floridians, both by enabling a user to bring a cause of action against the technology company for violating these requirements of Florida law, and by empowering the Attorney General to bring action against the technology company for violations of these requirements under Florida's Unfair and Deceptive Trade Practices Act. We've also seen the breadth of big tech's influence on campaigns and elections. While there wasn't a state in the Union that ran a better election than Florida last year, we still saw on a national scale how articles, candidates, and content had the thumbprints of big tech executives all over them.
Speaker 1: You can look no further than the last several months of the election, as coordinated, calculated efforts were undertaken to advance an increasingly evident political agenda of the big tech companies. The problem is these companies are playing a significant role in the advancement of issues and candidates, but do so without reporting many of their efforts for what they are: political contributions. He's right. He's absolutely right. That's why he's doing what he's doing. That's why Governor DeSantis is having to do these things. You look at the meat and potatoes of what is in this bill, and I want to just give you a quick recap of it. Okay. He says big tech has come to look more like Big Brother with each passing day, accusing the tech companies of holding monopoly power over a centrally important part of your life: your private information. He then said nameless and faceless boards of censors are violating the free speech rights of Floridians, and big tech is not entitled to track your every move. He's right, he said.
Speaker 1: Our founding fathers were deliberate in the enshrinement of our rights in the Constitution, to ensure that we the people were guaranteed protections against those who endeavor to violate our rights. And that's exactly what big tech is doing right now, he said. Ironically, our early founders were most concerned about the tyranny of the government in deciding these rights, but today big tech oligarchs have in many ways become a clearer and more present danger to the right of free speech than the government itself. And he's also right. Silicon Valley CEOs have extraordinary power over the control of ideas, the flow of information in our country. In a matter of hours, a business can be dismantled, a community of friends and colleagues can be canceled, and even the sitting President of the United States of America can be silenced, by their own admission. He says social media companies view themselves as platforms of global, regional, and local connectivity. Make no mistake: they are nothing more than advertising conglomerates.
Speaker 1: He said, I'm not interested in handing over the keys to the public square to a bunch of companies whose economic interests are not aligned with the public interest. So this new legislative proposal that they are going to attempt to pass to crack down on unfair practices by big tech companies includes the following. One: requiring social media platforms to give proper notice and disclosure of changes to their content standards or terms of service, and to provide full disclosure of any action taken against a user for violating their standards. Number two: he wants to prevent social media platforms from rapidly changing these standards and applying them unequally against users based on their political ideals. Three: provide users the option to opt out of the various algorithms these platforms use to steer content, or suppress content, from the view of others. In other words, you can opt out, saying: no, no, you're not going to give me what you want me to see. I'm going to see, organically, what I've chosen to follow and like and look at. You're not going to send me a message that you believe I need to be in favor of.
Speaker 1: Four: provide users the ability to bring lawsuits against tech companies, and empower the Florida Attorney General to bring actions against a tech company for violating these requirements under Florida's Unfair and Deceptive Trade Practices Act. You know, this is all about these algorithms. Quote: "If a tech company uses algorithms to suppress or prioritize the access of any content related to a political cause or candidate on the ballot, the company will face daily fines." That's how you rein these guys in: you stick it to them in their pocketbooks. And they have nothing, by the way, to worry about if they're not breaking the rules. If they're not breaking the rules, if they're not deliberately trying to influence elections, if they're not deliberately trying to silence conservative viewpoints, then big tech should have no issue going along with this, because it means they are being honest and authentic and organic. But if they are, then here's your sign. If they are, here's your sign.
Speaker 1: By the way, if you want to keep up with us outside of big tech, make sure you, A, listen to our podcast every day, and B, text us right now. We're making sure that we can keep up with you outside of this crazy world of big tech. So if they decide to cancel us, all you have to do is send us a text message to our phone number. Our phone number is easy: five five four three three. So all you've got to do is send us a text message to our phone number, like you'd send a normal text message. Our phone number is five five four three three, and then you're going to send the word "Ben." Just like you would say "What do you want for dinner?", just type the word "Ben" and hit send. That's all you have to do. So do it right now so we can always keep up with you. All right, see you back here tomorrow on our podcast, and you can go backwards and listen to other podcasts as well.