1 00:00:08,320 --> 00:00:08,840 Speaker 1: Joining us. 2 00:00:08,840 --> 00:00:12,879 Speaker 2: In the WIBC studios, we have former FBI special agent 3 00:00:13,600 --> 00:00:18,360 Speaker 2: Corey Grass. He spent twenty years in counterintelligence and counterterrorism. 4 00:00:18,720 --> 00:00:22,560 Speaker 2: It's Casey and Jim you're listening to on ninety three WIBC. 5 00:00:22,160 --> 00:00:22,640 Speaker 1: So Corey. 6 00:00:22,720 --> 00:00:27,000 Speaker 2: Yesterday, Jim and I discussed this article, spyware once used 7 00:00:27,000 --> 00:00:31,880 Speaker 2: by governments is now spreading to cyber criminals, and I thought, well, 8 00:00:32,159 --> 00:00:33,560 Speaker 2: we've got, we've got a guy. 9 00:00:33,720 --> 00:00:34,639 Speaker 3: Love it when we've got a guy. 10 00:00:34,720 --> 00:00:36,520 Speaker 1: We've got a guy, and here you are. 11 00:00:36,640 --> 00:00:39,879 Speaker 2: So how big of a problem is it when government 12 00:00:39,920 --> 00:00:43,960 Speaker 2: grade spyware tools leak into the hands of cyber criminals 13 00:00:44,320 --> 00:00:46,160 Speaker 2: or just regular criminals? 14 00:00:46,600 --> 00:00:48,080 Speaker 4: Well, first, I might say, thanks for having me. I'm 15 00:00:48,080 --> 00:00:50,400 Speaker 4: glad that when you think of criminals, you think of me. I'm 16 00:00:50,400 --> 00:00:53,320 Speaker 4: not sure what that says about me, but I appreciate it. 17 00:00:53,440 --> 00:00:56,920 Speaker 4: That type of technology is so prevalent these days. It's crazy. 18 00:00:57,560 --> 00:01:00,000 Speaker 4: Like you said, it was previously used at a high level 19 00:01:00,120 --> 00:01:02,960 Speaker 4: by higher powers that had access to it, and now 20 00:01:03,000 --> 00:01:05,200 Speaker 4: that it's worked its way down, I think it's kind 21 00:01:05,200 --> 00:01:06,600 Speaker 4: of foolish for any of us in the United States
22 00:01:06,640 --> 00:01:08,679 Speaker 4: to think our stuff is safe in any capacity, because 23 00:01:08,680 --> 00:01:11,840 Speaker 4: there are so many people, in so many ways, trying to 24 00:01:11,880 --> 00:01:14,080 Speaker 4: steal your information. 25 00:01:14,200 --> 00:01:16,800 Speaker 2: Let's talk about some of the ways. Okay, some of 26 00:01:16,800 --> 00:01:19,880 Speaker 2: the things that you're skeptical of, or you've seen, or 27 00:01:19,920 --> 00:01:21,040 Speaker 2: people need to be aware of. 28 00:01:21,840 --> 00:01:24,240 Speaker 4: I mentioned to you guys previously that QR codes are a 29 00:01:24,240 --> 00:01:27,119 Speaker 4: great way for people to steal your information, as simple 30 00:01:27,160 --> 00:01:29,880 Speaker 4: as City of Indianapolis parking signs downtown. You go to 31 00:01:29,920 --> 00:01:32,040 Speaker 4: pay your parking meter, now everything's automated, and they want you 32 00:01:32,040 --> 00:01:34,240 Speaker 4: to click on the QR code and send your money that way. 33 00:01:34,280 --> 00:01:36,520 Speaker 4: Well, it costs me a dollar apiece for a sticker 34 00:01:36,840 --> 00:01:38,360 Speaker 4: to put on those signs, and then I 35 00:01:38,400 --> 00:01:40,720 Speaker 4: send you to my website, or my corrupt way of 36 00:01:40,760 --> 00:01:42,920 Speaker 4: doing business, and now I have all your personal information. 37 00:01:43,080 --> 00:01:44,480 Speaker 4: As simple as that, a simple sticker. 38 00:01:44,560 --> 00:01:48,200 Speaker 3: Okay. So what you're saying is, and you're right, those 39 00:01:48,280 --> 00:01:51,400 Speaker 3: QR codes are everywhere, especially with city parking.
So you 40 00:01:51,440 --> 00:01:52,880 Speaker 3: go there and you park, and then you've got to 41 00:01:52,920 --> 00:01:54,560 Speaker 3: go to the little sign and you've got to 42 00:01:54,600 --> 00:01:56,080 Speaker 3: scan it with your phone, and then you give them 43 00:01:56,080 --> 00:01:58,120 Speaker 3: all your credit card information and your license plate, and 44 00:01:58,160 --> 00:02:00,400 Speaker 3: that's how they charge you for your parking. And what 45 00:02:00,440 --> 00:02:03,120 Speaker 3: you're saying is somebody could easily just print out a 46 00:02:03,160 --> 00:02:05,960 Speaker 3: sticker of their own QR code that's the same size 47 00:02:06,000 --> 00:02:08,480 Speaker 3: as the QR code on the sign, and just slap 48 00:02:08,560 --> 00:02:10,520 Speaker 3: that sticker on there, so that when they 49 00:02:10,639 --> 00:02:13,840 Speaker 3: scan my QR code, the criminal's code that I've put on there, 50 00:02:14,000 --> 00:02:16,160 Speaker 3: it's going to take them to this website that's going 51 00:02:16,200 --> 00:02:19,880 Speaker 3: to download some malicious software onto their phone or whatever else. 52 00:02:20,080 --> 00:02:22,320 Speaker 4: Correct, and particularly if you're good enough to have 53 00:02:22,320 --> 00:02:24,200 Speaker 4: a website that it goes to. I mean, how many times 54 00:02:24,200 --> 00:02:26,240 Speaker 4: do you click on a code, a link in your emails, 55 00:02:26,280 --> 00:02:27,880 Speaker 4: a text you get, you click on it and it 56 00:02:27,919 --> 00:02:29,760 Speaker 4: goes and says file not available, you know, some sort 57 00:02:29,760 --> 00:02:31,480 Speaker 4: of tech issue.
But if you do it right and 58 00:02:31,520 --> 00:02:34,480 Speaker 4: you create a site that will collect that information, even better. 59 00:02:34,639 --> 00:02:36,400 Speaker 4: Would you notice the difference if it was a different 60 00:02:36,440 --> 00:02:40,320 Speaker 4: parking company name instead of City of Indianapolis? It's 61 00:02:40,320 --> 00:02:42,280 Speaker 4: funny, we discussed the last couple of days, all of us 62 00:02:42,320 --> 00:02:44,239 Speaker 4: did, what we're going to talk about today. Yesterday in 63 00:02:44,280 --> 00:02:47,120 Speaker 4: the mail, I got a letter from an unnamed company, 64 00:02:47,360 --> 00:02:49,680 Speaker 4: from, let me see what it's called, their secure 65 00:02:49,720 --> 00:02:52,760 Speaker 4: processing center. Said my information got stolen from an 66 00:02:52,840 --> 00:02:57,000 Speaker 4: unnamed vendor. It possibly included all of or some of 67 00:02:57,080 --> 00:03:00,280 Speaker 4: my first name, last name, address, email address, social security 68 00:03:00,320 --> 00:03:03,120 Speaker 4: number, date of birth. They sent me an email, this 69 00:03:03,160 --> 00:03:06,079 Speaker 4: company who lost my information, a secure processing company lost 70 00:03:06,120 --> 00:03:09,120 Speaker 4: my information, potentially. They sent me an email address and 71 00:03:09,160 --> 00:03:11,120 Speaker 4: a secret code to put in, an access code, and 72 00:03:11,160 --> 00:03:13,160 Speaker 4: they want me to click on another link and submit 73 00:03:13,200 --> 00:03:15,440 Speaker 4: that information. I don't trust that company. They've already lost 74 00:03:15,440 --> 00:03:16,919 Speaker 4: my stuff and they're asking me to send it to 75 00:03:16,960 --> 00:03:17,360 Speaker 4: them again. 76 00:03:17,520 --> 00:03:19,760 Speaker 2: So do you think that it's a scammer actually wanting 77 00:03:19,760 --> 00:03:20,960 Speaker 2: you to enter real information?
78 00:03:21,080 --> 00:03:23,919 Speaker 4: Probably legitimate, but this legitimate company has failed. 79 00:03:24,200 --> 00:03:24,720 Speaker 1: They can't do it. 80 00:03:24,720 --> 00:03:26,359 Speaker 4: They couldn't secure it the first time, so they want 81 00:03:26,360 --> 00:03:27,880 Speaker 4: me to send my information to them again. I'm just 82 00:03:27,960 --> 00:03:28,520 Speaker 4: very skeptical. 83 00:03:28,639 --> 00:03:31,639 Speaker 3: Well, and they're asking you to do the types of things, 84 00:03:31,720 --> 00:03:34,480 Speaker 3: let's assume this company is legitimate, what they're asking you 85 00:03:34,560 --> 00:03:37,200 Speaker 3: to do are the exact type of things that a 86 00:03:37,240 --> 00:03:40,720 Speaker 3: scammer would ask you to do. Go to this website 87 00:03:40,800 --> 00:03:43,000 Speaker 3: that you don't know from a hole in the wall 88 00:03:43,200 --> 00:03:45,880 Speaker 3: and put in this special secure code, and if it 89 00:03:45,920 --> 00:03:48,360 Speaker 3: turned out that it was not legitimate, then all of 90 00:03:48,400 --> 00:03:50,520 Speaker 3: a sudden, they could have access. Part of what we 91 00:03:50,560 --> 00:03:53,920 Speaker 3: talked about too is that, you know, it used 92 00:03:53,960 --> 00:03:56,720 Speaker 3: to be that if you had an iPhone, you were, 93 00:03:57,000 --> 00:04:00,760 Speaker 3: you know, safe from scams and malware and viruses and 94 00:04:00,800 --> 00:04:03,200 Speaker 3: trojan horses and all of those sorts of problems.
But 95 00:04:03,680 --> 00:04:06,400 Speaker 3: in these specific situations that we're talking about, a lot 96 00:04:06,400 --> 00:04:08,760 Speaker 3: of these things you would do with your iPhone, like 97 00:04:08,800 --> 00:04:11,800 Speaker 3: scanning a QR code at a public parking garage, 98 00:04:11,840 --> 00:04:14,440 Speaker 3: you know, the days of the 99 00:04:14,440 --> 00:04:16,760 Speaker 3: iPhone being completely secure are long gone. 100 00:04:17,120 --> 00:04:21,080 Speaker 4: I thought of two things there. Yeah, technology use has 101 00:04:21,160 --> 00:04:23,640 Speaker 4: changed so much, and anything that's being used for business 102 00:04:23,640 --> 00:04:26,080 Speaker 4: in a very positive way to grow business, you know, 103 00:04:26,160 --> 00:04:28,840 Speaker 4: to continue transacting business, the criminals will find a way 104 00:04:28,880 --> 00:04:31,440 Speaker 4: to exploit that. And that's kind of always been the game 105 00:04:31,440 --> 00:04:32,720 Speaker 4: of law enforcement, you're trying to play catch 106 00:04:32,760 --> 00:04:34,359 Speaker 4: up with where the bad guys have gotten ahead of you. 107 00:04:34,400 --> 00:04:37,200 Speaker 4: You mentioned iPhones being secure. I think it was 108 00:04:37,480 --> 00:04:39,599 Speaker 4: still during my time as an FBI agent, but there 109 00:04:39,640 --> 00:04:42,600 Speaker 4: was a terrorist attack in San Bernardino, California. There 110 00:04:42,600 --> 00:04:45,120 Speaker 4: were two phones recovered that the FBI could not get into, 111 00:04:45,160 --> 00:04:47,880 Speaker 4: and Apple refused to help. They said they could not help, 112 00:04:47,920 --> 00:04:50,120 Speaker 4: they didn't have the technology. They wanted their consumers to 113 00:04:50,120 --> 00:04:53,039 Speaker 4: know their phones were so secure, your information was safe, so they 114 00:04:53,080 --> 00:04:55,680 Speaker 4: couldn't help you.
And then I recall somehow, some way, 115 00:04:55,720 --> 00:04:57,720 Speaker 4: the FBI was able to access those phones, and they 116 00:04:57,760 --> 00:04:59,240 Speaker 4: called the director back in to testify in front 117 00:04:59,279 --> 00:05:01,960 Speaker 4: of Congress, and then Apple was demanding to know 118 00:05:02,000 --> 00:05:04,560 Speaker 4: how we got into those phones, and the director had 119 00:05:04,600 --> 00:05:08,080 Speaker 4: some funny comments that another private company had developed 120 00:05:08,080 --> 00:05:10,640 Speaker 4: the software. He would not divulge the price, but 121 00:05:10,680 --> 00:05:13,039 Speaker 4: I believe he mentioned something along the lines of 122 00:05:13,040 --> 00:05:15,000 Speaker 4: ten times his salary that the company had charged 123 00:05:15,040 --> 00:05:15,320 Speaker 4: to do that for us. 124 00:05:15,400 --> 00:05:18,520 Speaker 2: But you just mentioned a private company developed that software 125 00:05:18,560 --> 00:05:21,040 Speaker 2: to break into the iPhone, and if a private company 126 00:05:21,040 --> 00:05:22,680 Speaker 2: can do it, then a criminal can do it. 127 00:05:23,279 --> 00:05:25,640 Speaker 4: And again, I'm no expert in technology, and I'm not 128 00:05:25,680 --> 00:05:28,400 Speaker 4: an expert in all things FBI. But to bring the 129 00:05:28,440 --> 00:05:31,560 Speaker 4: whole picture into focus, it was pretty widely accepted that it was 130 00:05:31,560 --> 00:05:33,679 Speaker 4: a foreign company as well that had developed that software. 131 00:05:33,760 --> 00:05:34,920 Speaker 1: Corey Grass is joining us. 132 00:05:34,920 --> 00:05:37,600 Speaker 2: He's a former FBI special agent, spent twenty years in 133 00:05:37,640 --> 00:05:41,799 Speaker 2: counterintelligence and counterterrorism.
You had mentioned, we've been talking 134 00:05:41,800 --> 00:05:44,720 Speaker 2: a lot about airports, a lot of people waiting in 135 00:05:44,760 --> 00:05:47,360 Speaker 2: long lines at airports, and there's something that you said 136 00:05:47,400 --> 00:05:49,839 Speaker 2: you will never do when you're at an airport. 137 00:05:49,680 --> 00:05:51,880 Speaker 4: I'm not sure if I'm just paranoid and skeptical, but I'd 138 00:05:51,920 --> 00:05:54,400 Speaker 4: read previously, I'd heard stories, that if you plug your 139 00:05:54,400 --> 00:05:57,080 Speaker 4: phone into a USB charger, which these days may be 140 00:05:57,080 --> 00:05:59,280 Speaker 4: a little different, I guess, it could access your 141 00:05:59,360 --> 00:06:02,440 Speaker 4: information that way as well. So I don't recall ever plugging 142 00:06:02,440 --> 00:06:04,680 Speaker 4: my phone into an unknown charging base that I don't 143 00:06:04,680 --> 00:06:06,760 Speaker 4: have control over or know who owns or has access to. 144 00:06:06,920 --> 00:06:08,640 Speaker 3: And see, that's something I wouldn't think of, because when 145 00:06:08,680 --> 00:06:10,800 Speaker 3: we talked about this before, when we were off the 146 00:06:10,839 --> 00:06:13,520 Speaker 3: air, I thought about the traditional credit card skimmers, and 147 00:06:13,560 --> 00:06:15,680 Speaker 3: you've heard those stories for a long time, either at 148 00:06:15,680 --> 00:06:18,400 Speaker 3: the ATM or at the gas station, and the criminals 149 00:06:18,400 --> 00:06:20,279 Speaker 3: put on some sort of device and it looks just like 150 00:06:20,360 --> 00:06:22,599 Speaker 3: it's, you know, the gas pump, but you put 151 00:06:22,600 --> 00:06:25,560 Speaker 3: your credit card in and they're stealing your information.
But now 152 00:06:26,680 --> 00:06:30,040 Speaker 3: that could potentially apply to just USB charging ports at 153 00:06:30,040 --> 00:06:32,359 Speaker 3: a public location. So this is not only at the airport, 154 00:06:32,400 --> 00:06:35,039 Speaker 3: it's any place that you might go to charge your phone. 155 00:06:35,600 --> 00:06:38,240 Speaker 3: It could be something similar to 156 00:06:38,279 --> 00:06:40,240 Speaker 3: what we've seen with these credit card skimmers, where they 157 00:06:40,279 --> 00:06:42,320 Speaker 3: just install something in front of it, or it looks 158 00:06:42,360 --> 00:06:45,920 Speaker 3: like an attachment, but to the average consumer it wouldn't 159 00:06:45,960 --> 00:06:47,839 Speaker 3: raise alarm bells at all. And then you plug 160 00:06:47,880 --> 00:06:50,320 Speaker 3: your phone into that USB port, and 161 00:06:50,320 --> 00:06:52,360 Speaker 3: all of a sudden they've got access to your information. 162 00:06:52,720 --> 00:06:54,520 Speaker 4: It's a new way of doing business for criminals. It's 163 00:06:54,680 --> 00:06:57,479 Speaker 4: more subtle. Even with the naked eye, I can sometimes 164 00:06:57,480 --> 00:06:59,240 Speaker 4: see those skimmers look weird on a gas pump. Something 165 00:06:59,279 --> 00:07:01,080 Speaker 4: will catch your attention, something falling off that doesn't look right, 166 00:07:01,160 --> 00:07:04,640 Speaker 4: something's abnormal. Technology isn't that clear for 167 00:07:04,680 --> 00:07:07,280 Speaker 4: the plain eye to see, and it's on a whole 168 00:07:07,320 --> 00:07:09,120 Speaker 4: new level now, their way to get access. It used 169 00:07:09,160 --> 00:07:10,680 Speaker 4: to be that, as a victim, you had 170 00:07:10,720 --> 00:07:12,880 Speaker 4: to take a proactive step to fall for their trick.
171 00:07:12,920 --> 00:07:16,160 Speaker 4: Click on a link, you know, some sort of 172 00:07:16,360 --> 00:07:17,400 Speaker 4: something that was proactive. 173 00:07:17,400 --> 00:07:18,840 Speaker 3: You had to do it. 174 00:07:18,960 --> 00:07:21,240 Speaker 4: It sure seems to be now that they can access your 175 00:07:21,240 --> 00:07:24,160 Speaker 4: information without you doing something to help them; they can 176 00:07:24,200 --> 00:07:26,880 Speaker 4: just do it themselves. The article we talked about yesterday, 177 00:07:26,880 --> 00:07:29,440 Speaker 4: I loved, real fast. They seem to think, in this 178 00:07:29,600 --> 00:07:33,520 Speaker 4: MSN article, that these bad guys or other entities can 179 00:07:33,640 --> 00:07:35,600 Speaker 4: access your information while you're clicking on a series of 180 00:07:35,640 --> 00:07:37,559 Speaker 4: different websites they could direct you to, and things like that. 181 00:07:37,720 --> 00:07:40,120 Speaker 4: So again, it seems very normal for a victim to just 182 00:07:40,200 --> 00:07:42,560 Speaker 4: click the different websites you're being led to, especially if 183 00:07:42,560 --> 00:07:44,760 Speaker 4: they entice you to go certain ways with certain topics, 184 00:07:45,080 --> 00:07:47,080 Speaker 4: and then once you do that combination, they've got 185 00:07:47,080 --> 00:07:47,520 Speaker 4: your stuff. 186 00:07:47,800 --> 00:07:50,160 Speaker 2: How does a tool like that, that was once 187 00:07:50,320 --> 00:07:53,640 Speaker 2: used by state or intelligence services, end up in the 188 00:07:53,680 --> 00:07:57,120 Speaker 2: hands of criminals? Is it internal leaks, is it black 189 00:07:57,160 --> 00:07:59,920 Speaker 2: market sales, or is it just, like, reverse engineering? 190 00:08:01,080 --> 00:08:02,800 Speaker 4: Probably a little bit of everything. But the weakest link 191 00:08:02,840 --> 00:08:05,120 Speaker 4: is always going to be human beings.
When I did investigations, 192 00:08:05,160 --> 00:08:08,120 Speaker 4: you always start with that: who worked at the company, who 193 00:08:08,200 --> 00:08:10,880 Speaker 4: left under bad terms, who suddenly had affluence they didn't 194 00:08:10,880 --> 00:08:13,240 Speaker 4: have before. Again, an espionage case is very similar to that. 195 00:08:13,800 --> 00:08:16,480 Speaker 4: So usually it's going to be the human being 196 00:08:16,480 --> 00:08:18,440 Speaker 4: the weak link, either selling it or giving it away out 197 00:08:18,480 --> 00:08:21,680 Speaker 4: of anger or spite or whatever, but also just accidentally, 198 00:08:21,720 --> 00:08:23,840 Speaker 4: not knowing that they didn't protect it through their coding 199 00:08:23,920 --> 00:08:25,600 Speaker 4: or whatever it was. That's going to be the biggest 200 00:08:25,600 --> 00:08:26,440 Speaker 4: issue, probably. 201 00:08:26,200 --> 00:08:27,640 Speaker 3: You know, I think that's really interesting, what you talk 202 00:08:27,680 --> 00:08:30,120 Speaker 3: about with the human being being the weakest link, 203 00:08:30,160 --> 00:08:32,800 Speaker 3: because we saw these stories where, you know, big corporations 204 00:08:32,800 --> 00:08:35,559 Speaker 3: spent billions and billions of dollars building up this incredible 205 00:08:35,600 --> 00:08:40,120 Speaker 3: IT infrastructure and IT security associated with it. And now 206 00:08:40,480 --> 00:08:43,200 Speaker 3: what is becoming one of the most successful ways to 207 00:08:43,200 --> 00:08:47,400 Speaker 3: break into a company's IT structure is a phishing scam. Okay, great, 208 00:08:47,400 --> 00:08:50,000 Speaker 3: you've got all this wonderful IT security keeping all the 209 00:08:50,000 --> 00:08:51,800 Speaker 3: bad guys out.
But if I can just send an 210 00:08:51,800 --> 00:08:55,240 Speaker 3: employee an email that has a link on it that 211 00:08:55,320 --> 00:08:57,319 Speaker 3: they click on, then all of a sudden, I've got 212 00:08:57,320 --> 00:09:00,400 Speaker 3: the same access to that internal IT system that that employee 213 00:09:00,400 --> 00:09:00,680 Speaker 3: had. 214 00:09:01,200 --> 00:09:03,880 Speaker 4: And there are so many ways to gain your information, and 215 00:09:03,880 --> 00:09:05,960 Speaker 4: so many ways you can secure it. I mean, you 216 00:09:05,960 --> 00:09:07,360 Speaker 4: took me back to the FBI training. Now, this 217 00:09:07,400 --> 00:09:09,120 Speaker 4: is some of the most basic online training we'd take 218 00:09:09,360 --> 00:09:12,040 Speaker 4: as reminders every year: wearing your name badge. When 219 00:09:12,040 --> 00:09:13,839 Speaker 4: I worked in the State House up here, I always 220 00:09:13,840 --> 00:09:16,400 Speaker 4: took my name badge off when I left, because it's not 221 00:09:16,440 --> 00:09:17,960 Speaker 4: important for people on the street to know who I am, 222 00:09:18,000 --> 00:09:20,040 Speaker 4: where I work, my name. And again, same thing with 223 00:09:20,040 --> 00:09:22,560 Speaker 4: the big corporations up in Indianapolis, you know, Lilly, Cummins, 224 00:09:22,960 --> 00:09:26,199 Speaker 4: these people. We don't want foreign companies or foreign agencies 225 00:09:26,240 --> 00:09:28,280 Speaker 4: to be able to target you based on where 226 00:09:28,320 --> 00:09:30,080 Speaker 4: you work. All it takes, again, is a very 227 00:09:30,120 --> 00:09:32,440 Speaker 4: simple scam. A lot of times they see you come out, 228 00:09:32,640 --> 00:09:34,800 Speaker 4: pick up your patterns, pick up if you're married or not, 229 00:09:34,800 --> 00:09:36,720 Speaker 4: do you have kids, are you a drinker, do you 230 00:09:36,760 --> 00:09:38,839 Speaker 4: live in downtown Indy, do you live somewhere else?
They 231 00:09:38,840 --> 00:09:40,520 Speaker 4: follow you, they pick up those patterns, and then they 232 00:09:40,559 --> 00:09:42,720 Speaker 4: find a way to social engineer it so that they will 233 00:09:42,840 --> 00:09:44,240 Speaker 4: connect with you and get what they want. 234 00:09:44,960 --> 00:09:46,000 Speaker 1: How is it? 235 00:09:46,000 --> 00:09:50,280 Speaker 2: What's a realistic way, can 236 00:09:50,320 --> 00:09:52,960 Speaker 2: you tell if your computer or your phone has been compromised? Like, 237 00:09:53,000 --> 00:09:55,200 Speaker 2: can you tell if somebody's spying on you? Is there 238 00:09:55,240 --> 00:09:56,079 Speaker 2: a way to pick that up? 239 00:09:56,200 --> 00:09:59,120 Speaker 4: I believe there's not. In my experience, I don't think there's anything. 240 00:09:58,960 --> 00:10:00,760 Speaker 2: So it could be happening to you and 241 00:10:00,800 --> 00:10:01,400 Speaker 2: you don't even know. 242 00:10:01,480 --> 00:10:02,959 Speaker 4: I would get that occasionally when I was still in 243 00:10:02,960 --> 00:10:04,559 Speaker 4: the FBI. People would call me and say they think 244 00:10:04,559 --> 00:10:06,440 Speaker 4: their phones were tapped because they could hear the clicking 245 00:10:06,480 --> 00:10:06,920 Speaker 4: or hear whatever. 246 00:10:07,000 --> 00:10:07,760 Speaker 3: It doesn't work that way. 247 00:10:07,800 --> 00:10:09,360 Speaker 4: I would always tell them, even if your phone 248 00:10:09,440 --> 00:10:12,840 Speaker 4: was being listened to by an agency, there are 249 00:10:12,880 --> 00:10:15,160 Speaker 4: no signs; it gives off no emissions or anything. 250 00:10:16,240 --> 00:10:18,240 Speaker 2: So there's no way to defend it; you just don't know.
251 00:10:18,640 --> 00:10:20,240 Speaker 4: I mean, I think probably the first sign is you 252 00:10:20,280 --> 00:10:22,520 Speaker 4: realize either your credit or something is wrong, or 253 00:10:22,520 --> 00:10:24,320 Speaker 4: maybe your computer is slowing down more, or 254 00:10:24,320 --> 00:10:26,600 Speaker 4: things like that, your cell phone's not working right. Possibly, 255 00:10:26,679 --> 00:10:28,920 Speaker 4: but again, that happens so often anyway, software updates and 256 00:10:28,960 --> 00:10:29,439 Speaker 4: things like that. 257 00:10:29,480 --> 00:10:32,000 Speaker 2: Well, and I have to imagine that, like, the spyware 258 00:10:32,080 --> 00:10:35,480 Speaker 2: on computers has gotten so much more advanced over 259 00:10:35,520 --> 00:10:37,640 Speaker 2: the years. It's just gotten better and better. 260 00:10:37,720 --> 00:10:41,000 Speaker 4: Yeah, again, it's a race of who can get 261 00:10:41,080 --> 00:10:43,640 Speaker 4: farther ahead. They develop something to get your information. Law 262 00:10:43,679 --> 00:10:46,080 Speaker 4: enforcement'll catch up to it eventually. Another country will develop 263 00:10:46,160 --> 00:10:48,200 Speaker 4: a program, they want to target a certain thing. We 264 00:10:48,280 --> 00:10:50,120 Speaker 4: hear about it or figure it out eventually and react to that. 265 00:10:50,160 --> 00:10:51,720 Speaker 4: It's always reacting. It's hard to get ahead of that. 266 00:10:52,280 --> 00:10:54,280 Speaker 3: We talked again off the air about a story that 267 00:10:54,320 --> 00:10:56,199 Speaker 3: I want you to tell again too, because with my job I used 268 00:10:56,200 --> 00:10:57,480 Speaker 3: to go to a 269 00:10:57,520 --> 00:10:59,480 Speaker 3: lot of trade shows, used to fly all over the country 270 00:10:59,600 --> 00:11:01,080 Speaker 3: going to a lot of trade shows.
You meet a lot 271 00:11:01,120 --> 00:11:03,560 Speaker 3: of strangers that give you their business card. Talk to 272 00:11:03,600 --> 00:11:07,320 Speaker 3: me about trade shows and business cards and how there are weaknesses 273 00:11:06,760 --> 00:11:10,280 Speaker 4: there. They know, again, that they need you to click 274 00:11:10,320 --> 00:11:12,960 Speaker 4: on something or access their information to get what they 275 00:11:12,960 --> 00:11:16,200 Speaker 4: want out of yours. So, QR codes on a business 276 00:11:16,200 --> 00:11:17,959 Speaker 4: card. Again, I work in real estate now, I'm a 277 00:11:17,960 --> 00:11:21,280 Speaker 4: realtor full time. There are QR codes on my open house flyers, there are 278 00:11:21,320 --> 00:11:23,560 Speaker 4: QR codes on the links on your website for people 279 00:11:23,559 --> 00:11:25,240 Speaker 4: to click on, there are QR codes on your business card. 280 00:11:25,360 --> 00:11:27,360 Speaker 4: I have a QR code on my phone that's 281 00:11:27,400 --> 00:11:28,839 Speaker 4: my business card, so I can show my phone, 282 00:11:28,880 --> 00:11:31,560 Speaker 4: they take a picture of it, and off it goes. An
283 00:11:31,640 --> 00:11:35,840 Speaker 4: illegitimate business, or even a foreign agency conducting business under 284 00:11:35,880 --> 00:11:38,520 Speaker 4: the cover of darkness, can use those same tricks to 285 00:11:38,520 --> 00:11:41,319 Speaker 4: get you to click on their site, with either no business name, 286 00:11:41,360 --> 00:11:43,960 Speaker 4: a confusing business name, or some logo, and you 287 00:11:44,080 --> 00:11:46,080 Speaker 4: go back to your office, wherever that might be, whether 288 00:11:46,080 --> 00:11:47,840 Speaker 4: it's the government or private industry, and you click 289 00:11:47,840 --> 00:11:50,040 Speaker 4: on that QR code trying to go to their site 290 00:11:50,040 --> 00:11:51,720 Speaker 4: to see if they can help you with your technology 291 00:11:51,720 --> 00:11:54,440 Speaker 4: for your company, and then they infiltrate your 292 00:11:54,480 --> 00:11:55,920 Speaker 4: company's private stuff right through that. 293 00:11:55,960 --> 00:11:57,400 Speaker 2: We only have a couple of minutes left here with 294 00:11:57,440 --> 00:12:01,760 Speaker 2: Corey Grass, former FBI special agent. A couple last questions. How 295 00:12:01,840 --> 00:12:05,160 Speaker 2: big of a risk or a trend do these cyber 296 00:12:05,480 --> 00:12:09,920 Speaker 2: criminals pose to not only personal security but United States 297 00:12:10,080 --> 00:12:11,040 Speaker 2: national security? 298 00:12:11,679 --> 00:12:12,199 Speaker 3: It's huge. 299 00:12:12,200 --> 00:12:15,720 Speaker 4: I mean, it's a constant barrage. I worked counterintelligence 300 00:12:15,720 --> 00:12:18,480 Speaker 4: for many years, and there are multiple countries where it is 301 00:12:18,679 --> 00:12:22,240 Speaker 4: constant; they will try and steal anything.
302 00:12:22,280 --> 00:12:24,880 Speaker 4: And I would deal with universities that sometimes have different 303 00:12:24,920 --> 00:12:27,640 Speaker 4: backgrounds and political opinions on how it works. I lived 304 00:12:27,679 --> 00:12:29,679 Speaker 4: it one way, they lived it a different way, with different perspectives, 305 00:12:29,880 --> 00:12:33,080 Speaker 4: but they are relentless in their pursuit. I mean, 306 00:12:33,400 --> 00:12:35,160 Speaker 4: the FBI in Indiana had a seed corn case 307 00:12:35,240 --> 00:12:37,160 Speaker 4: years ago when I was here, and I forget what 308 00:12:37,160 --> 00:12:37,680 Speaker 4: the value was. 309 00:12:37,720 --> 00:12:38,640 Speaker 3: It started with a b. 310 00:12:38,440 --> 00:12:41,719 Speaker 4: Billions, a billion dollar loss. A seed corn company in 311 00:12:41,760 --> 00:12:44,240 Speaker 4: Indy had developed a product that a Chinese national had 312 00:12:44,280 --> 00:12:46,720 Speaker 4: stolen and sent back to their country. And the implications 313 00:12:46,720 --> 00:12:48,920 Speaker 4: from that are insane, because not only did this company 314 00:12:48,920 --> 00:12:50,800 Speaker 4: spend R and D money for years to develop this 315 00:12:50,840 --> 00:12:53,439 Speaker 4: product to help feed Americans, but also to sell to other 316 00:12:53,480 --> 00:12:56,800 Speaker 4: countries. When they stole that technology, not only did 317 00:12:56,880 --> 00:12:59,319 Speaker 4: our farmers lose out on selling that product, we lost 318 00:12:59,400 --> 00:13:02,080 Speaker 4: all that research time and money, and then that country 319 00:13:02,080 --> 00:13:03,920 Speaker 4: can sell the product to other people and steal 320 00:13:03,920 --> 00:13:04,360 Speaker 4: our profits that way. 321 00:13:04,360 --> 00:13:06,600 Speaker 2: Going back to the seed corn. 322 00:13:06,640 --> 00:13:08,520 Speaker 4: People think seed corn is just seed corn.
Yeah, it's 323 00:13:08,559 --> 00:13:11,160 Speaker 4: seed corn, unless it builds up another country's military 324 00:13:11,160 --> 00:13:14,160 Speaker 4: with that money, unless it hurts our tax base and our military. 325 00:13:14,559 --> 00:13:16,880 Speaker 4: It's far reaching, and that seed corn is one little example. 326 00:13:17,000 --> 00:13:19,680 Speaker 4: It's everything. There are a lot of examples, too, under 327 00:13:19,720 --> 00:13:23,400 Speaker 4: counterintelligence, where they noticed a trend of a certain 328 00:13:23,440 --> 00:13:25,160 Speaker 4: company buying up, I'm going to make up an example, like 329 00:13:25,160 --> 00:13:27,920 Speaker 4: rubber band factories. It sounds like, rubber bands, who cares? 330 00:13:28,400 --> 00:13:29,840 Speaker 4: But what they do is they buy up the entire 331 00:13:29,880 --> 00:13:31,640 Speaker 4: market and then jack the prices up on it and 332 00:13:31,679 --> 00:13:33,520 Speaker 4: take over everything they want. That way 333 00:13:33,520 --> 00:13:35,400 Speaker 4: they can elevate the whole 334 00:13:35,400 --> 00:13:37,160 Speaker 4: market to what they want and then shut it down; 335 00:13:37,160 --> 00:13:38,880 Speaker 4: they're just the one provider of that product. 336 00:13:39,040 --> 00:13:41,120 Speaker 1: I wanted to ask you before you go. 337 00:13:41,400 --> 00:13:45,319 Speaker 2: You served in the FBI during the time of Robert 338 00:13:45,400 --> 00:13:47,760 Speaker 2: Mueller, and he had just passed away at the age 339 00:13:47,760 --> 00:13:53,960 Speaker 2: of eighty one, and the director changed while you were there, 340 00:13:54,040 --> 00:13:58,160 Speaker 2: so it was partly Mueller and partly Wray. Did you 341 00:13:58,240 --> 00:14:00,640 Speaker 2: notice a, did you ever meet Mueller? 342 00:14:00,720 --> 00:14:02,280 Speaker 4: I did not. That was early in my career, but.
343 00:14:02,280 --> 00:14:06,360 Speaker 2: You met Wray often. Yeah, did you notice a difference 344 00:14:06,520 --> 00:14:09,560 Speaker 2: in the administration or the way things were run in 345 00:14:09,600 --> 00:14:11,319 Speaker 2: the FBI under the different leadership? 346 00:14:12,040 --> 00:14:14,160 Speaker 4: I was always a special agent, that was my goal 347 00:14:14,200 --> 00:14:15,960 Speaker 4: all along. I never put in to be promoted, but 348 00:14:16,120 --> 00:14:18,760 Speaker 4: I did some specialty things. I was a hostage negotiator. 349 00:14:19,840 --> 00:14:22,360 Speaker 4: I was the Agents Association rep for Indiana for many 350 00:14:22,440 --> 00:14:24,320 Speaker 4: years and on the national board for several years at 351 00:14:24,320 --> 00:14:28,720 Speaker 4: the end. So you didn't really notice. As the street 352 00:14:28,760 --> 00:14:30,560 Speaker 4: level cop, as I call it, you know, when you get 353 00:14:30,560 --> 00:14:33,640 Speaker 4: a new chief, it doesn't affect morale, maybe a little bit, 354 00:14:33,640 --> 00:14:35,960 Speaker 4: but it doesn't really affect the day to day activities. 355 00:14:36,000 --> 00:14:37,920 Speaker 4: They'll change the priorities and say we're going to 356 00:14:37,920 --> 00:14:40,080 Speaker 4: be more focused on whatever it is, counterterrorism and 357 00:14:40,120 --> 00:14:42,120 Speaker 4: counterintelligence, or go back to violent crime if there's a 358 00:14:42,120 --> 00:14:44,480 Speaker 4: surge in that. But typically, once you get down to 359 00:14:44,520 --> 00:14:46,120 Speaker 4: the street level where people are working the cases, 360 00:14:46,080 --> 00:14:47,760 Speaker 4: it doesn't affect your world all that much. 361 00:14:47,840 --> 00:14:48,920 Speaker 2: You just went in and did the job. 
362 00:14:49,280 --> 00:14:51,280 Speaker 4: I do remember people talking about him, obviously the respect 363 00:14:51,280 --> 00:14:53,520 Speaker 4: that people had for him based on his military 364 00:14:53,560 --> 00:14:56,480 Speaker 4: service prior. But also, I got hired just after nine 365 00:14:56,480 --> 00:14:58,120 Speaker 4: eleven. He got hired before I did, but he 366 00:14:58,160 --> 00:15:00,200 Speaker 4: took over during that weird transition time where we were 367 00:15:00,200 --> 00:15:02,000 Speaker 4: trying to split the FBI in half for an MI5 368 00:15:02,080 --> 00:15:04,120 Speaker 4: MI6 type model, and he was able to 369 00:15:04,160 --> 00:15:07,800 Speaker 4: keep it together as one intelligence and criminal organization at the same time. 370 00:15:07,840 --> 00:15:10,320 Speaker 4: That was my memory of what 371 00:15:10,360 --> 00:15:11,400 Speaker 4: he did during our time. 372 00:15:11,680 --> 00:15:14,240 Speaker 2: Well, we appreciate you coming in. You're gonna be here next 373 00:15:14,040 --> 00:15:15,000 Speaker 4: week, April Fools' Day. 374 00:15:15,040 --> 00:15:19,120 Speaker 2: Wouldn't miss it. Corey Grass, former FBI special agent. 375 00:15:19,160 --> 00:15:20,000 Speaker 1: We appreciate you 376 00:15:19,960 --> 00:15:21,000 Speaker 2: and everything that you've done. 377 00:15:21,160 --> 00:15:22,000 Speaker 1: Before we let you go, 378 00:15:22,080 --> 00:15:24,000 Speaker 2: we have to tell everybody where they can find you, 379 00:15:24,080 --> 00:15:25,200 Speaker 2: more information about you. 380 00:15:25,480 --> 00:15:27,360 Speaker 4: The easiest way to find me is on Facebook. I'm just 381 00:15:27,440 --> 00:15:30,640 Speaker 4: Corey Grass. C-O-R-Y Grass, like grass you mow. Or 
I 382 00:15:30,640 --> 00:15:32,560 Speaker 4: do have a partnership with the company I work with 383 00:15:32,600 --> 00:15:34,400 Speaker 4: that does consulting for active shooter and a lot 384 00:15:34,440 --> 00:15:37,680 Speaker 4: of security issues. It's Apex Defend dot 385 00:15:37,440 --> 00:15:41,080 Speaker 2: com. Apex Defend dot com. Corey Grass. All right, thank you. 386 00:15:41,080 --> 00:15:43,120 Speaker 2: You're listening to ninety three WIBC. 387 00:15:45,040 --> 00:15:49,040 Speaker 3: What is your problem? I got a lot of problems 388 00:15:49,080 --> 00:15:51,200 Speaker 3: with your people. We have a problem. 389 00:15:51,280 --> 00:15:52,120 Speaker 4: What's your problem? 390 00:15:52,240 --> 00:15:55,640 Speaker 3: I've got ninety nine problems on ninety three WIBC. 391 00:15:56,280 --> 00:16:00,600 Speaker 2: All right, here we go, some group therapy. And today, Jim, 392 00:16:01,080 --> 00:16:02,040 Speaker 2: what's your problem? 393 00:16:02,160 --> 00:16:08,040 Speaker 3: Here's the problem. Major League Baseball is expanding their replay 394 00:16:08,120 --> 00:16:11,800 Speaker 3: system in the form of what they are calling the Automatic 395 00:16:12,160 --> 00:16:19,440 Speaker 3: Ball Strike Challenge System, or the ABS Challenge System. Robot umpires. Robot umpires. 396 00:16:19,480 --> 00:16:21,600 Speaker 3: That's exactly right. And look, I don't have a problem 397 00:16:21,600 --> 00:16:24,160 Speaker 3: with technology in sports, especially a sport 398 00:16:24,280 --> 00:16:27,800 Speaker 3: where tradition is such an integral part of the sport, 399 00:16:27,920 --> 00:16:30,400 Speaker 3: like it is with Major League Baseball. Here's the problem 400 00:16:30,440 --> 00:16:33,040 Speaker 3: I have with it. I've seen this work in person. 
401 00:16:33,120 --> 00:16:36,320 Speaker 3: I went to an Indianapolis Indians game last year, the 402 00:16:36,320 --> 00:16:38,760 Speaker 3: minor league baseball team here in town, and Minor League 403 00:16:38,760 --> 00:16:41,000 Speaker 3: Baseball's had this for a few years, and you can 404 00:16:41,120 --> 00:16:44,360 Speaker 3: challenge balls and strikes. And the problem with it is 405 00:16:45,320 --> 00:16:49,240 Speaker 3: it slows the game down. Because recently, the big change 406 00:16:49,280 --> 00:16:51,840 Speaker 3: in Major League Baseball has been the pitch clock, and 407 00:16:51,880 --> 00:16:54,080 Speaker 3: a lot of people were against this, but man, the 408 00:16:54,160 --> 00:16:57,600 Speaker 3: pitch clock has improved the pace of a baseball game 409 00:16:57,720 --> 00:17:01,320 Speaker 3: dramatically and it's just made the game so much more 410 00:17:01,400 --> 00:17:03,880 Speaker 3: enjoyable to watch. And by the way, baseball ratings have 411 00:17:03,960 --> 00:17:06,880 Speaker 3: been through the roof since they put the pitch clock 412 00:17:06,920 --> 00:17:08,520 Speaker 3: in there, and a lot of people were against it, 413 00:17:08,920 --> 00:17:11,800 Speaker 3: and it's been proven to be very popular, very successful, 414 00:17:11,800 --> 00:17:14,320 Speaker 3: because now we've got this quick pace of game. Pitchers 415 00:17:14,359 --> 00:17:16,320 Speaker 3: can't sit on the pitching mound forever and, you know, 416 00:17:16,359 --> 00:17:18,119 Speaker 3: take their time like they used to. They've got to 417 00:17:18,160 --> 00:17:21,639 Speaker 3: move it forward. Well, now we've got an additional replay 418 00:17:21,680 --> 00:17:26,040 Speaker 3: option on balls and strikes that is going to counteract 419 00:17:26,280 --> 00:17:28,320 Speaker 3: what Major League Baseball just did a couple of years 420 00:17:28,320 --> 00:17:30,520 Speaker 3: ago with the pitch clock. 
So we sped up the game, 421 00:17:30,600 --> 00:17:32,600 Speaker 3: and now we're going to slow it down with more replays. 422 00:17:32,680 --> 00:17:35,920 Speaker 2: So you've got batters, pitchers, and catchers who can immediately 423 00:17:36,000 --> 00:17:38,880 Speaker 2: challenge a call by tapping their helmet or cap within 424 00:17:38,920 --> 00:17:39,879 Speaker 2: a few seconds. 425 00:17:40,080 --> 00:17:41,159 Speaker 1: The experts are 426 00:17:41,080 --> 00:17:45,520 Speaker 2: saying that purists are gonna hate it for about two weeks, 427 00:17:46,040 --> 00:17:49,040 Speaker 2: but then they'll notice that a bad call could cost 428 00:17:49,080 --> 00:17:52,400 Speaker 2: someone a perfect game. And if you have the robot 429 00:17:52,600 --> 00:17:57,320 Speaker 2: umpires checking everything with their high speed cameras, much like 430 00:17:57,480 --> 00:18:01,360 Speaker 2: Hawk-Eye, then they're going to change their mind, because 431 00:18:01,480 --> 00:18:03,800 Speaker 2: their team may end up winning the game. 432 00:18:04,320 --> 00:18:06,879 Speaker 3: I understand, and this has always been the challenge with replay, 433 00:18:06,920 --> 00:18:09,120 Speaker 3: whether it's in the NBA or the NFL or Major 434 00:18:09,200 --> 00:18:12,760 Speaker 3: League Baseball or whatever: it can dramatically 435 00:18:12,840 --> 00:18:15,679 Speaker 3: slow down the pace of the game. And that was 436 00:18:15,720 --> 00:18:18,800 Speaker 3: a big problem with baseball, and baseball fixed the problem 437 00:18:18,920 --> 00:18:20,960 Speaker 3: by putting in the pitch clock, and now we're just 438 00:18:21,040 --> 00:18:23,760 Speaker 3: going to throw all of those gains away by putting 439 00:18:23,800 --> 00:18:26,720 Speaker 3: in this ABS system where you can challenge balls and strikes. 
440 00:18:26,720 --> 00:18:29,080 Speaker 3: My biggest problem is that, look, I'm a technology guy, 441 00:18:29,320 --> 00:18:32,240 Speaker 3: and I'm not a big sentimentalist, traditionalist kind 442 00:18:32,240 --> 00:18:33,960 Speaker 3: of guy where I'm like, oh, they should never change 443 00:18:33,960 --> 00:18:35,760 Speaker 3: the game. I'm fine if you want to make changes 444 00:18:35,800 --> 00:18:37,600 Speaker 3: to the game, but you just sped the game up, 445 00:18:37,760 --> 00:18:39,119 Speaker 3: and now you're going to go back and slow the 446 00:18:39,400 --> 00:18:40,200 Speaker 3: game back down again. 447 00:18:40,240 --> 00:18:42,080 Speaker 2: But the goal is to cut down on the blown calls, 448 00:18:42,480 --> 00:18:45,800 Speaker 2: especially on some really close pitches in key moments of games. 449 00:18:46,119 --> 00:18:50,800 Speaker 3: I understand that. But look, sports and referees and umpires and officials, 450 00:18:50,960 --> 00:18:54,800 Speaker 3: it's always going to have a certain level of subjectivity 451 00:18:54,840 --> 00:18:57,840 Speaker 3: to it. There's always going to be officials that make 452 00:18:57,880 --> 00:19:00,320 Speaker 3: the wrong call. That's just part of the game. 453 00:19:00,600 --> 00:19:02,840 Speaker 3: The fact that it's slowing it down after we just 454 00:19:02,880 --> 00:19:04,879 Speaker 3: sped it up, and we just got here. 455 00:19:04,720 --> 00:19:07,120 Speaker 2: You're not upset about the fact that now you can't 456 00:19:07,119 --> 00:19:08,160 Speaker 2: blame it on an umpire? 457 00:19:08,480 --> 00:19:11,359 Speaker 3: I've always hated that. Look, if your team loses and 458 00:19:11,359 --> 00:19:13,119 Speaker 3: you spend the whole next day talking about how the 459 00:19:13,119 --> 00:19:16,439 Speaker 3: refs suck, you know what, that's a terrible argument. 460 00:19:16,560 --> 00:19:17,320 Speaker 1: Find your way to win. 
461 00:19:17,440 --> 00:19:19,760 Speaker 3: Yeah, exactly, figure out a way to win. Every team's 462 00:19:19,760 --> 00:19:22,680 Speaker 3: got challenges and has to overcome obstacles. Sometimes you've got 463 00:19:22,680 --> 00:19:25,840 Speaker 3: to overcome the obstacle, and that's the ref. You're listening to 464 00:19:25,920 --> 00:19:37,840 Speaker 2: ninety three WIBC. Good morning. Americans can smell when something 465 00:19:37,880 --> 00:19:41,320 Speaker 2: doesn't have a source. There's a new poll out, a 466 00:19:41,400 --> 00:19:43,960 Speaker 2: new national poll, and it said that very few Americans 467 00:19:44,080 --> 00:19:49,520 Speaker 2: use AI chatbots as their first choice for breaking news. Well, 468 00:19:49,560 --> 00:19:54,520 Speaker 2: no kidding. This shouldn't surprise anybody. AI summarizes, it doesn't report, 469 00:19:54,760 --> 00:19:57,120 Speaker 2: and until a robot can actually knock on a door 470 00:19:57,160 --> 00:19:59,879 Speaker 2: and get hung up on, treat it like a tool, 471 00:20:00,359 --> 00:20:05,840 Speaker 2: not a journalist. AI is not journalism. AI summarizes. And 472 00:20:05,880 --> 00:20:10,000 Speaker 2: people who go to AI chatbots as their first choice 473 00:20:10,000 --> 00:20:13,400 Speaker 2: for breaking news are pretty slim in numbers. 474 00:20:13,480 --> 00:20:15,440 Speaker 3: Yeah, and this was done by a very reputable polling 475 00:20:15,520 --> 00:20:18,320 Speaker 3: and research firm. Pew Research did this poll, and 476 00:20:18,440 --> 00:20:21,800 Speaker 3: only one percent of adults say they use AI chatbots 477 00:20:21,800 --> 00:20:24,880 Speaker 3: first for breaking news. I'm surprised, and so who are 478 00:20:24,880 --> 00:20:27,520 Speaker 3: those one percent wackos out there that go to AI 479 00:20:27,640 --> 00:20:30,320 Speaker 3: chatbots for breaking news first? 
I mean, just look at what 480 00:20:30,359 --> 00:20:33,480 Speaker 3: we've witnessed since the Iran war started. When 481 00:20:33,800 --> 00:20:37,160 Speaker 3: the, you know, military operation first started, all you had 482 00:20:37,160 --> 00:20:39,600 Speaker 3: to do was go to Twitter and it was nothing 483 00:20:39,680 --> 00:20:45,160 Speaker 3: but fake AI-generated video trying to depict itself 484 00:20:45,359 --> 00:20:49,040 Speaker 3: as real-life events when it was clearly fake. I 485 00:20:49,160 --> 00:20:51,560 Speaker 3: just don't see how they trust it. Yeah, I mean, look, we're 486 00:20:51,600 --> 00:20:54,480 Speaker 3: just getting started down the road in this whole AI thing. 487 00:20:54,800 --> 00:20:57,840 Speaker 3: The idea that there's any trust associated with it is 488 00:20:58,840 --> 00:21:01,040 Speaker 3: something I didn't need a poll for, but I'm glad to 489 00:21:01,080 --> 00:21:03,040 Speaker 3: see that only one percent of the public goes to 490 00:21:03,080 --> 00:21:03,480 Speaker 3: it first. 491 00:21:03,720 --> 00:21:05,440 Speaker 1: But where do people go for their news? 492 00:21:05,440 --> 00:21:09,479 Speaker 2: Thirty six percent choose their preferred news organization, like a 493 00:21:09,520 --> 00:21:13,040 Speaker 2: specific TV station or newspaper. Twenty eight percent go to search 494 00:21:13,119 --> 00:21:17,520 Speaker 2: engines like Google or Bing. You have nineteen percent using 495 00:21:17,600 --> 00:21:20,800 Speaker 2: social media to get breaking news. That's the number that 496 00:21:20,880 --> 00:21:24,439 Speaker 2: surprises me the most, because I always feel like X, 497 00:21:24,600 --> 00:21:27,800 Speaker 2: or Twitter, is so much quicker on breaking news than 498 00:21:27,840 --> 00:21:31,920 Speaker 2: any other news source, especially, you know, your local news 499 00:21:31,960 --> 00:21:33,560 Speaker 2: or even national news, for that matter. 
500 00:21:33,760 --> 00:21:38,320 Speaker 3: Absolutely. Twitter has always been essentially a real time news feed, 501 00:21:39,200 --> 00:21:42,159 Speaker 3: and when breaking news happens, like 502 00:21:42,240 --> 00:21:45,040 Speaker 3: when the war in Iran started, I mean, Twitter usage 503 00:21:45,080 --> 00:21:47,720 Speaker 3: went through the roof, because that's where everybody goes. I 504 00:21:47,760 --> 00:21:51,280 Speaker 3: think where people may be confusing this, and the 505 00:21:51,280 --> 00:21:54,240 Speaker 3: problem here, is there's so much misinformation on social media and 506 00:21:54,280 --> 00:21:54,880 Speaker 2: That's so bad. 507 00:21:54,880 --> 00:21:58,520 Speaker 3: there are so many phony AI videos and photos, and people 508 00:21:58,600 --> 00:22:02,679 Speaker 3: just post stuff that they know is intentionally false, because 509 00:22:02,680 --> 00:22:05,840 Speaker 3: it generates engagement for them. And because Twitter, for certain people, 510 00:22:05,880 --> 00:22:08,520 Speaker 3: can pay you for your engagement, they can actually make 511 00:22:08,600 --> 00:22:11,680 Speaker 3: money off of posting obviously fake information. So I think 512 00:22:11,840 --> 00:22:14,320 Speaker 3: that's the problem with social media. If you're smart enough 513 00:22:14,720 --> 00:22:18,000 Speaker 3: and kind of have the wherewithal to be able to 514 00:22:18,119 --> 00:22:22,119 Speaker 3: recognize the misinformation and the fake news from the real news, 515 00:22:22,240 --> 00:22:24,840 Speaker 3: then yeah, Twitter and social media are the perfect place 516 00:22:24,840 --> 00:22:25,600 Speaker 3: for breaking news. 517 00:22:25,640 --> 00:22:27,879 Speaker 2: I had no idea that this was even a job, 518 00:22:27,960 --> 00:22:29,400 Speaker 2: but apparently it is. 519 00:22:29,520 --> 00:22:30,360 Speaker 1: Did you know that the 520 00:22:30,400 --> 00:22:33,720 Speaker 2: United States has a brand ambassador? 
521 00:22:33,960 --> 00:22:35,040 Speaker 3: I think it's new, isn't it? 522 00:22:35,240 --> 00:22:38,280 Speaker 2: Well, the Biden administration had one. Oh well, the Biden 523 00:22:38,320 --> 00:22:43,320 Speaker 2: administration's brand ambassador liked Hunter Biden's artwork. The new guy, 524 00:22:43,640 --> 00:22:48,280 Speaker 2: under Donald Trump, he's a conservative influencer and author, and 525 00:22:48,520 --> 00:22:53,240 Speaker 2: apparently he likes Hooters. Yes, I'm talking about the restaurant. 526 00:22:53,280 --> 00:22:57,320 Speaker 2: They're calling him an alpha male, so he's a dude's dude. 527 00:22:57,880 --> 00:23:02,800 Speaker 2: He likes hot wings over Biden's art. His name 528 00:23:02,880 --> 00:23:07,240 Speaker 2: is Nick Adams, and he's an Australian American who was naturalized 529 00:23:07,240 --> 00:23:10,439 Speaker 2: in twenty twenty-one, and he became known as a Trump 530 00:23:10,520 --> 00:23:15,359 Speaker 2: aligned conservative commentator and writer. He was previously nominated to 531 00:23:15,359 --> 00:23:20,440 Speaker 2: be US Ambassador to Malaysia. The nomination fell apart, but now 532 00:23:20,480 --> 00:23:25,120 Speaker 2: he's in this new position as America's brand ambassador. He's 533 00:23:25,160 --> 00:23:30,720 Speaker 2: in charge of getting people excited about American tourism, exceptionalism, 534 00:23:31,160 --> 00:23:34,919 Speaker 2: and also US values. He's, I guess, going to be 535 00:23:34,960 --> 00:23:40,159 Speaker 2: playing a key role as part of the Special Presidential Envoy 536 00:23:40,240 --> 00:23:43,680 Speaker 2: for the World Cup events and also the two hundred 537 00:23:43,680 --> 00:23:45,600 Speaker 2: and fiftieth anniversary. 
538 00:23:45,760 --> 00:23:48,040 Speaker 3: Well, what is so ironic about what you just said 539 00:23:48,080 --> 00:23:51,280 Speaker 3: is that this guy, Nick Adams, he was nominated to 540 00:23:51,359 --> 00:23:55,720 Speaker 3: be the actual US ambassador to Malaysia. That didn't work out, 541 00:23:55,880 --> 00:23:58,160 Speaker 3: so now he's going to move to being the brand 542 00:23:58,200 --> 00:24:01,720 Speaker 3: ambassador of the United States. This follows a trend 543 00:24:01,760 --> 00:24:04,360 Speaker 3: of what you see major brands and major corporations doing: 544 00:24:04,400 --> 00:24:07,040 Speaker 3: they hire these brand ambassadors, and what that 545 00:24:07,359 --> 00:24:10,760 Speaker 3: essentially means is you're a spokesperson. You're a spokesperson, and mostly 546 00:24:10,840 --> 00:24:14,760 Speaker 3: what you're doing is cheerleading for that corporate entity or 547 00:24:14,760 --> 00:24:17,840 Speaker 3: that brand, or in this specific instance, the United States 548 00:24:17,880 --> 00:24:20,520 Speaker 3: of America. You're cheerleading for the United States of America, 549 00:24:20,600 --> 00:24:22,960 Speaker 3: mostly on social media, and just talking about how great 550 00:24:23,000 --> 00:24:25,800 Speaker 3: they are and how awesome they are. It's 551 00:24:26,200 --> 00:24:29,679 Speaker 3: as opposed to putting together a commercial, which corporations and 552 00:24:29,720 --> 00:24:32,480 Speaker 3: brands in the United States do as well. This is 553 00:24:32,560 --> 00:24:35,040 Speaker 3: kind of like putting a face to all of that, 554 00:24:35,200 --> 00:24:36,920 Speaker 3: and they essentially are just a cheerleader. 
555 00:24:37,320 --> 00:24:40,720 Speaker 2: Okay. So from Hooters to something a little bit more 556 00:24:40,760 --> 00:24:46,840 Speaker 2: pricey. Ruth's Chris, the steakhouse chain, has implemented a new 557 00:24:46,960 --> 00:24:50,240 Speaker 2: dress code, and they're saying that they will ban hats 558 00:24:50,400 --> 00:24:54,560 Speaker 2: and gym wear. Chili's restaurant fired back, saying that their only 559 00:24:54,640 --> 00:24:57,720 Speaker 2: dress code is that you have to be dressed. 560 00:24:59,119 --> 00:25:00,720 Speaker 5: I'm not fining anyone if they wear their 561 00:25:00,720 --> 00:25:03,200 Speaker 5: pajamas on the airplane, right? That's not what we're doing. 562 00:25:03,240 --> 00:25:07,000 Speaker 1: We're just asking people to maybe dress a little better. 563 00:25:07,600 --> 00:25:11,680 Speaker 2: So that was the Transportation Secretary, Sean Duffy, reminding people 564 00:25:11,720 --> 00:25:14,960 Speaker 2: not to wear jammies when they fly. But we have 565 00:25:15,040 --> 00:25:18,639 Speaker 2: to ask: okay, Ruth's Chris, you're paying a lot of 566 00:25:18,680 --> 00:25:22,440 Speaker 2: money to go there and get a really delicious, delicious steak. 567 00:25:22,480 --> 00:25:23,920 Speaker 3: Fantastic, absolutely good. 568 00:25:24,000 --> 00:25:27,160 Speaker 2: But they're saying, hey, no hats, no gym wear. They 569 00:25:27,160 --> 00:25:30,359 Speaker 2: can implement any dress code they want, but it's the 570 00:25:30,440 --> 00:25:32,359 Speaker 2: public who's really going to decide at the end of 571 00:25:32,400 --> 00:25:32,760 Speaker 2: the day. 572 00:25:33,200 --> 00:25:35,560 Speaker 3: Yeah, this is a business decision for Ruth's Chris, and 573 00:25:35,560 --> 00:25:38,040 Speaker 3: we've seen this through our culture, 574 00:25:38,080 --> 00:25:42,040 Speaker 3: that people are more casual in their dress than they 575 00:25:42,080 --> 00:25:44,679 Speaker 3: ever have been before. 
And we see this too. You know, 576 00:25:44,960 --> 00:25:46,119 Speaker 3: it used to be when I was a kid and 577 00:25:46,119 --> 00:25:48,239 Speaker 3: you would go to church, that you would dress up 578 00:25:48,280 --> 00:25:50,080 Speaker 3: and you would wear, you know, your quote 579 00:25:50,119 --> 00:25:52,760 Speaker 3: Sunday best. You can go into plenty of churches now 580 00:25:52,840 --> 00:25:55,120 Speaker 3: and there's a lot of people that are far from 581 00:25:55,280 --> 00:25:58,040 Speaker 3: dressed in their Sunday best. And this happens everywhere we go. 582 00:25:58,240 --> 00:25:59,960 Speaker 3: We always talk about this. You see people, you know, 583 00:26:00,320 --> 00:26:02,600 Speaker 3: shopping at the grocery store in their jammy pants and 584 00:26:02,640 --> 00:26:06,119 Speaker 3: their slippers, and, okay, everybody's free to do that, 585 00:26:06,280 --> 00:26:09,120 Speaker 3: but Ruth's Chris, as its own private company, is free 586 00:26:09,160 --> 00:26:12,439 Speaker 3: to restrict that. And Ruth's Chris is, let's be honest, 587 00:26:12,560 --> 00:26:14,800 Speaker 3: an expensive restaurant. It's a fancy, 588 00:26:15,119 --> 00:26:18,760 Speaker 3: expensive restaurant. So I think from their perspective, 589 00:26:19,119 --> 00:26:21,520 Speaker 3: they want to make sure that the clientele coming 590 00:26:21,560 --> 00:26:26,760 Speaker 3: into their restaurant reflects that brand image of more expensive, 591 00:26:27,040 --> 00:26:30,800 Speaker 3: upper class, put together, special occasion sort of thing. 592 00:26:31,080 --> 00:26:33,160 Speaker 2: So a lot of people are saying that they should 593 00:26:33,160 --> 00:26:35,920 Speaker 2: be spending more time on picking a name that's easier 594 00:26:35,960 --> 00:26:39,480 Speaker 2: to pronounce than on what people wear when they come into the restaurant. 
595 00:26:39,760 --> 00:26:43,080 Speaker 2: But you're right, it's a private business. It's their rules. 596 00:26:43,359 --> 00:26:47,800 Speaker 2: End of story. Move along. But Chili's is coming back, trying 597 00:26:47,840 --> 00:26:52,080 Speaker 2: to inject themselves into the conversation, saying that as long 598 00:26:52,160 --> 00:26:56,280 Speaker 2: as you're dressed, we're okay with it. Chili's and Ruth's 599 00:26:56,359 --> 00:26:58,440 Speaker 2: Chris: not on the same level. 600 00:26:58,600 --> 00:27:01,040 Speaker 3: No, it's smart of Chili's to piggyback off of this, 601 00:27:01,200 --> 00:27:04,280 Speaker 3: because the Ruth's Chris dress code story got a 602 00:27:04,280 --> 00:27:06,359 Speaker 3: lot of traction in the media, and so Chili's, you know, 603 00:27:06,560 --> 00:27:09,400 Speaker 3: piggybacked off of that to get some engagement of their own. 604 00:27:09,560 --> 00:27:12,800 Speaker 3: I don't think anybody's confusing Chili's for Ruth's Chris. Nobody's, 605 00:27:13,000 --> 00:27:15,840 Speaker 3: you know, confusing the three for ten ninety nine deal 606 00:27:16,280 --> 00:27:19,639 Speaker 3: at Chili's for a, you know, a porterhouse steak 607 00:27:19,720 --> 00:27:21,320 Speaker 3: at Ruth's Chris, which is going to cost you a 608 00:27:21,320 --> 00:27:23,160 Speaker 3: whole lot of money. It will be interesting to see 609 00:27:23,160 --> 00:27:24,600 Speaker 3: how this goes, because this is either going to go 610 00:27:24,640 --> 00:27:27,159 Speaker 3: really well for Ruth's Chris or it's going to go 611 00:27:27,240 --> 00:27:29,960 Speaker 3: really poorly for Ruth's Chris. And they may lose a 612 00:27:29,960 --> 00:27:32,240 Speaker 3: lot of customers because of this, and it'll be interesting 613 00:27:32,280 --> 00:27:35,000 Speaker 3: to see how long they keep this dress code in place. 
614 00:27:35,280 --> 00:27:38,440 Speaker 2: NASA has unveiled a new plan to build a permanent 615 00:27:38,520 --> 00:27:43,000 Speaker 2: lunar base near the Moon's South Pole. The project is part 616 00:27:43,040 --> 00:27:45,879 Speaker 2: of an update to the Artemis program. We were just 617 00:27:46,000 --> 00:27:50,199 Speaker 2: talking about how we were planning on putting somebody on 618 00:27:50,240 --> 00:27:52,639 Speaker 2: the Moon, how in April they were going to go 619 00:27:52,720 --> 00:27:56,000 Speaker 2: up and they were going to orbit around, and that was 620 00:27:56,040 --> 00:27:58,159 Speaker 2: part of a broader plan to put people on the 621 00:27:58,200 --> 00:28:02,600 Speaker 2: Moon within the next two years. Now they changed the 622 00:28:02,640 --> 00:28:06,600 Speaker 2: plan just this week, and they're wanting to build a 623 00:28:06,640 --> 00:28:09,879 Speaker 2: permanent lunar base on the Moon, and the plan is 624 00:28:10,080 --> 00:28:13,760 Speaker 2: expected to cost about twenty billion dollars and take around 625 00:28:13,880 --> 00:28:17,600 Speaker 2: seven years to build. It's meant to be a lasting 626 00:28:17,640 --> 00:28:20,960 Speaker 2: outpost on the Moon's surface. To what end? What is 627 00:28:21,000 --> 00:28:21,960 Speaker 2: the purpose of that? 628 00:28:21,960 --> 00:28:24,200 Speaker 3: That was my big question in all of this, because 629 00:28:24,359 --> 00:28:26,280 Speaker 3: I came across the story and it was reported in 630 00:28:26,320 --> 00:28:28,560 Speaker 3: a ton of news outlets, and I was like, okay, okay, 631 00:28:28,760 --> 00:28:29,800 Speaker 3: why are we doing this? 632 00:28:29,840 --> 00:28:32,560 Speaker 2: What do they know that we don't know? Because we're 633 00:28:32,560 --> 00:28:34,800 Speaker 2: going to be spending billions on a moon base while 634 00:28:34,840 --> 00:28:38,960 Speaker 2: everyday infrastructure here at home needs work. 
It's like Washington's 635 00:28:39,040 --> 00:28:41,280 Speaker 2: priorities are upside down. Like, if you want to talk 636 00:28:41,280 --> 00:28:45,200 Speaker 2: about moon craters, drive anywhere in Indianapolis and you'll get 637 00:28:45,200 --> 00:28:46,600 Speaker 2: the exact same feeling. 638 00:28:46,680 --> 00:28:49,120 Speaker 3: Yeah, with our roads and the current conditions that they're in. 639 00:28:49,160 --> 00:28:51,920 Speaker 3: And look, the first space race, you know, 640 00:28:52,160 --> 00:28:55,200 Speaker 3: John Kennedy in the sixties and the Russians, we knew 641 00:28:55,200 --> 00:28:57,000 Speaker 3: what the goal was for that. We had to beat 642 00:28:57,040 --> 00:28:59,440 Speaker 3: the Russians to the Moon. Not only was it to 643 00:28:59,520 --> 00:29:03,840 Speaker 3: advance technologically and to keep them from, you know, getting there first, 644 00:29:03,840 --> 00:29:06,680 Speaker 3: but also we had this national pride, and we were 645 00:29:06,680 --> 00:29:08,560 Speaker 3: in the Cold War with the Soviets. I get all 646 00:29:08,560 --> 00:29:10,600 Speaker 3: of that, and we spent billions of dollars doing that. 647 00:29:10,960 --> 00:29:14,040 Speaker 3: And after that, in the seventies, NASA realized pretty quickly, 648 00:29:14,120 --> 00:29:17,520 Speaker 3: hey man, these Moon missions that we're doing, they're pretty 649 00:29:17,560 --> 00:29:20,480 Speaker 3: darn expensive and we're really not getting much out of them. 650 00:29:20,560 --> 00:29:22,640 Speaker 3: So now here we are fifty years later, and we 651 00:29:22,680 --> 00:29:25,200 Speaker 3: want to build a base on the Moon, and NASA 652 00:29:25,280 --> 00:29:27,960 Speaker 3: really didn't do a great job of explaining why we 653 00:29:28,000 --> 00:29:30,480 Speaker 3: need to spend thirty billion dollars to do this. Look, 654 00:29:30,520 --> 00:29:33,160 Speaker 3: and there may be really good reasons. 
Maybe we mine 655 00:29:33,240 --> 00:29:35,360 Speaker 3: Moon rocks and send them back to the US. Maybe 656 00:29:35,360 --> 00:29:38,840 Speaker 3: there's rare earth metals there. Maybe this is preparation for 657 00:29:39,000 --> 00:29:41,400 Speaker 3: us to build, you know, we start with the Moon 658 00:29:41,760 --> 00:29:43,760 Speaker 3: and then maybe the next one is we build a 659 00:29:43,800 --> 00:29:47,440 Speaker 3: base on Mars or some other planet. Okay, fine. But NASA, 660 00:29:47,520 --> 00:29:49,720 Speaker 3: come on, man, you can't just tell me you're spending 661 00:29:49,760 --> 00:29:52,040 Speaker 3: thirty billion dollars to build a Moon base and not 662 00:29:52,120 --> 00:29:54,640 Speaker 3: give me a what's-in-it-for-me. What's in 663 00:29:54,680 --> 00:29:55,080 Speaker 3: it for me? 664 00:29:55,240 --> 00:29:59,400 Speaker 2: NASA officials say the base would support more frequent astronaut missions, 665 00:29:59,600 --> 00:30:04,240 Speaker 2: research and scientific work, and also infrastructure for a long term 666 00:30:04,400 --> 00:30:07,320 Speaker 2: lunar presence. And they're going to the South Pole. That 667 00:30:07,440 --> 00:30:11,160 Speaker 2: was chosen because it likely has water ice and better 668 00:30:11,320 --> 00:30:14,840 Speaker 2: sunlight for power. So maybe this is all part of 669 00:30:15,400 --> 00:30:19,520 Speaker 2: Elon Musk's push to put data centers in space. You've 670 00:30:19,640 --> 00:30:20,480 Speaker 2: got to start 671 00:30:20,240 --> 00:30:22,840 Speaker 1: somewhere, so perhaps we're starting on the Moon. 672 00:30:22,920 --> 00:30:26,040 Speaker 2: And I've also heard that China is interested in getting 673 00:30:26,040 --> 00:30:28,680 Speaker 2: to the Moon, so now it's a race. It used 674 00:30:28,720 --> 00:30:31,280 Speaker 2: to be the Russians; now, once again, it's China. 
675 00:30:31,440 --> 00:30:35,040 Speaker 3: India wants to also put a manned Moon mission as 676 00:30:35,080 --> 00:30:37,480 Speaker 3: part of their space program as well. But you're exactly right. Hey, 677 00:30:37,480 --> 00:30:40,480 Speaker 3: if this is part of, you know, data centers in space, 678 00:30:40,600 --> 00:30:43,040 Speaker 3: and that's going to help us figure that out, wonderful. Look, 679 00:30:43,120 --> 00:30:45,360 Speaker 3: I'd rather have the data centers in space than in 680 00:30:45,400 --> 00:30:48,280 Speaker 3: my backyard, so I'm all for that. But that's on you, NASA. 681 00:30:48,280 --> 00:30:49,880 Speaker 3: You got to give me something that's in it for 682 00:30:49,920 --> 00:30:50,720 Speaker 3: me on this, man. 683 00:30:50,680 --> 00:30:52,400 Speaker 2: Yeah, and it also could be part of a whole 684 00:30:52,760 --> 00:30:57,600 Speaker 2: branding situation where it is seen that the US is maintaining 685 00:30:57,720 --> 00:31:01,480 Speaker 2: leadership in space exploration, and since we have not been 686 00:31:01,520 --> 00:31:05,000 Speaker 2: to the Moon in many moons, it is possibly time 687 00:31:05,000 --> 00:31:05,520 Speaker 2: to go back. 688 00:31:05,600 --> 00:31:05,800 Speaker 1: Now. 689 00:31:05,880 --> 00:31:07,760 Speaker 2: I'm embarrassed to say it. Just the other day we 690 00:31:07,760 --> 00:31:10,000 Speaker 2: were talking about favorite sports teams and I said that 691 00:31:10,640 --> 00:31:15,480 Speaker 2: the Baltimore Ravens were my favorite NFL team, and that is 692 00:31:15,560 --> 00:31:18,560 Speaker 2: strictly an emotional bias, because you know, our daughter lives 693 00:31:18,560 --> 00:31:20,360 Speaker 2: in Baltimore, so I'm just trying to make. 694 00:31:20,240 --> 00:31:21,160 Speaker 1: The connection there. 695 00:31:21,760 --> 00:31:26,000 Speaker 2: But now, after hearing this, I might have to 696 00:31:26,080 --> 00:31:27,560 Speaker 2: rescind my statement.
697 00:31:28,120 --> 00:31:33,160 Speaker 5: Menstrual hygiene products means appropriately sized tampons. What are appropriately 698 00:31:33,400 --> 00:31:37,760 Speaker 5: sized tampons? I've never heard of such a thing. What 699 00:31:37,920 --> 00:31:39,360 Speaker 5: do you consider appropriate? 700 00:31:43,040 --> 00:31:45,400 Speaker 3: It just means that tampons are offered. 701 00:31:45,680 --> 00:31:47,320 Speaker 4: There's no specific size. 702 00:31:48,400 --> 00:31:51,600 Speaker 5: Well, apparently there's four different sizes. So which one would 703 00:31:51,600 --> 00:31:52,480 Speaker 5: you like them to use? 704 00:31:54,560 --> 00:31:57,040 Speaker 4: Just a regular sized tampon in the bathroom. 705 00:31:57,640 --> 00:32:00,680 Speaker 5: Okay, maybe it should say that and not an appropriately 706 00:32:00,920 --> 00:32:04,840 Speaker 5: sized tampon. Now, is this going in all bathrooms, men's 707 00:32:05,080 --> 00:32:05,800 Speaker 5: and women's? 708 00:32:05,920 --> 00:32:06,240 Speaker 3: Yes. 709 00:32:07,200 --> 00:32:11,520 Speaker 5: How about the Ravens Stadium? 710 00:32:11,600 --> 00:32:18,160 Speaker 4: If it is a state owned building, then yes, it 711 00:32:18,160 --> 00:32:20,880 Speaker 4: would go in. It's a public building. 712 00:32:21,200 --> 00:32:22,880 Speaker 5: Oriole Park? 713 00:32:24,440 --> 00:32:26,760 Speaker 4: Well, if it applies to the Ravens Stadium, it would 714 00:32:26,760 --> 00:32:28,160 Speaker 4: also apply to Orioles Park. 715 00:32:28,240 --> 00:32:32,520 Speaker 2: I guess Pimlico. So Maryland is seriously debating putting tampons 716 00:32:32,560 --> 00:32:36,600 Speaker 2: in the men's bathrooms at the Baltimore Ravens and the Orioles stadiums. 717 00:32:36,760 --> 00:32:41,000 Speaker 2: And you've got this Democrat delegate insisting that the tampons 718 00:32:41,080 --> 00:32:46,560 Speaker 2: must be appropriately sized.
There is no appropriate size for 719 00:32:46,680 --> 00:32:51,560 Speaker 2: tampons in men's bathrooms at any NFL stadium or anywhere. 720 00:32:51,600 --> 00:32:54,720 Speaker 2: I can't even believe that that guy said that out loud. 721 00:32:55,400 --> 00:32:59,360 Speaker 2: You know, Ravens fans, especially the dudes, they just want 722 00:32:59,360 --> 00:33:00,120 Speaker 2: to watch football. 723 00:33:00,080 --> 00:33:01,760 Speaker 1: All right, why are you doing that? 724 00:33:01,960 --> 00:33:04,240 Speaker 3: When did Tim Walz become the governor of Maryland? 725 00:33:04,320 --> 00:33:07,320 Speaker 2: They didn't know what... No, the whole thing 726 00:33:07,400 --> 00:33:11,120 Speaker 2: is ridiculous. And so now I'm rethinking about announcing that 727 00:33:11,160 --> 00:33:14,240 Speaker 2: the Baltimore Ravens were my favorite NFL team. 728 00:33:14,320 --> 00:33:17,280 Speaker 3: Okay, so this is part of a state of Maryland plan. 729 00:33:17,440 --> 00:33:19,360 Speaker 3: Like, I'm not trying to justify it here. This doesn't 730 00:33:19,360 --> 00:33:22,240 Speaker 3: make it any better. But it's more than just Camden 731 00:33:22,320 --> 00:33:25,920 Speaker 3: Yards and the Ravens Stadium that would be getting tampons 732 00:33:26,000 --> 00:33:27,000 Speaker 3: in men's rooms either. 733 00:33:27,000 --> 00:33:29,320 Speaker 2: They're saying it's all public buildings across the state. 734 00:33:29,440 --> 00:33:32,640 Speaker 3: It's all public buildings across the state, even the rest areas. 735 00:33:32,880 --> 00:33:35,920 Speaker 3: So anywhere that you go and there's a public restroom, 736 00:33:36,040 --> 00:33:39,000 Speaker 3: and this is being debated, it's not law yet, correct? 737 00:33:39,080 --> 00:33:43,320 Speaker 3: Correct, in the state of Maryland, that they would require tampons.
738 00:33:42,760 --> 00:33:45,920 Speaker 2: In men's rooms. So the next time we do another cross 739 00:33:45,960 --> 00:33:48,600 Speaker 2: country trek to go see our daughter in Baltimore and 740 00:33:48,640 --> 00:33:51,560 Speaker 2: we have to stop at a rest area along the way, 741 00:33:52,400 --> 00:33:54,800 Speaker 2: I want a report from you. Let me know if 742 00:33:54,840 --> 00:33:56,920 Speaker 2: there's tampons in the men's restroom. 743 00:33:57,000 --> 00:33:59,280 Speaker 3: Do you think we could, like, you know, most of 744 00:33:59,320 --> 00:34:02,840 Speaker 3: these state governments now have special hotlines to report waste, 745 00:34:02,880 --> 00:34:05,520 Speaker 3: fraud and abuse. I think that would probably be my first 746 00:34:05,560 --> 00:34:09,040 Speaker 3: phone call. I'd like to report some waste, fraud and abuse. Yes, sir, 747 00:34:09,320 --> 00:34:11,399 Speaker 3: where are you seeing this waste, fraud and abuse? Yeah, I'm 748 00:34:11,400 --> 00:34:13,720 Speaker 3: at the rest area on I seventy and you've got 749 00:34:13,760 --> 00:34:18,000 Speaker 3: tampons in the men's room. That is waste. That is 750 00:34:18,040 --> 00:34:19,520 Speaker 3: the epitome of waste. 751 00:34:19,680 --> 00:34:26,600 Speaker 1: You're listening to ninety three WIBC. We have been spending 752 00:34:26,640 --> 00:34:27,120 Speaker 1: a lot. 753 00:34:26,920 --> 00:34:30,000 Speaker 2: Of time over the past few days talking about TSA. 754 00:34:30,320 --> 00:34:33,760 Speaker 2: This is a TSA story, but it's a little different.
755 00:34:33,880 --> 00:34:39,440 Speaker 2: It's the most shocking confiscated items that TSA has found 756 00:34:39,640 --> 00:34:44,000 Speaker 2: over the past year, the bizarre things that people try 757 00:34:44,040 --> 00:34:50,360 Speaker 2: to get through TSA, and they range from wildlife to weapons, 758 00:34:51,200 --> 00:34:53,560 Speaker 2: and if you're trying to get a weapon through TSA, 759 00:34:53,920 --> 00:34:57,640 Speaker 2: clearly you can't read, because the signs are posted everywhere. 760 00:34:57,680 --> 00:35:00,200 Speaker 2: You can't even take water through 761 00:35:00,280 --> 00:35:04,000 Speaker 2: TSA, or more than four ounces of shampoo, 762 00:35:04,040 --> 00:35:05,799 Speaker 2: and you're going to try and get a weapon through. Yeah. 763 00:35:06,080 --> 00:35:08,160 Speaker 3: I think this has a lot more to do with 764 00:35:08,400 --> 00:35:12,400 Speaker 3: just laziness and forgetfulness than it does malice, probably. I 765 00:35:12,440 --> 00:35:15,240 Speaker 3: think everybody knows, you know, you can't take a knife 766 00:35:15,239 --> 00:35:17,560 Speaker 3: over a certain size. You certainly can't take a gun through 767 00:35:17,600 --> 00:35:20,040 Speaker 3: your carry on baggage, and other weapons through, 768 00:35:20,440 --> 00:35:22,520 Speaker 3: you know, most of your carry on stuff that you 769 00:35:22,560 --> 00:35:25,200 Speaker 3: take and get checked in through TSA. And my guess 770 00:35:25,320 --> 00:35:28,200 Speaker 3: is the vast majority of people just forget that 771 00:35:28,239 --> 00:35:28,960 Speaker 3: they had it there. 772 00:35:29,080 --> 00:35:29,799 Speaker 1: No, they're not.
773 00:35:29,960 --> 00:35:33,279 Speaker 2: They're people that were up to nefarious activity, because some 774 00:35:33,320 --> 00:35:37,520 Speaker 2: of the weapons were concealed in ordinary objects, for example, 775 00:35:38,000 --> 00:35:42,600 Speaker 2: knives hidden in knee braces or in a children's car seat. 776 00:35:43,040 --> 00:35:47,480 Speaker 2: You also had bullets that were packed inside containers like 777 00:35:47,520 --> 00:35:51,600 Speaker 2: a drink or like a Nesquik box, like a 778 00:35:51,680 --> 00:35:56,320 Speaker 2: juice box. They were intentionally put there and intentionally hidden there. 779 00:35:56,719 --> 00:36:01,040 Speaker 2: Some other items that TSA has found hidden in luggage: 780 00:36:01,640 --> 00:36:08,000 Speaker 2: fake pipe bombs, also a turtle hidden in someone's pants. 781 00:36:09,400 --> 00:36:10,960 Speaker 2: You're trying to smuggle a turtle? 782 00:36:11,080 --> 00:36:12,839 Speaker 3: I can see how that would be an accident. You're 783 00:36:12,880 --> 00:36:15,880 Speaker 3: telling me that he intentionally tried to smuggle a turtle? 784 00:36:16,040 --> 00:36:18,279 Speaker 3: Who hasn't walked out the door of their house and 785 00:36:18,360 --> 00:36:22,319 Speaker 3: forgot about their turtle in their pants? But I mean, 786 00:36:22,360 --> 00:36:24,239 Speaker 3: for some of these, and I guess maybe I'm too 787 00:36:24,280 --> 00:36:28,080 Speaker 3: trusting, because I tend to go through, well, I'm always 788 00:36:28,080 --> 00:36:30,160 Speaker 3: worried and making sure, okay, do I have all of 789 00:36:30,200 --> 00:36:32,920 Speaker 3: my, you know, toiletries in a one quart clear 790 00:36:33,080 --> 00:36:36,240 Speaker 3: bag, and are they all less than three point four ounces? 791 00:36:36,239 --> 00:36:38,680 Speaker 3: And I'm always, is my laptop out in a separate bin? 792 00:36:38,719 --> 00:36:40,000 Speaker 3: What do I do with my jacket?
Do I wear 793 00:36:40,080 --> 00:36:41,440 Speaker 3: my jacket? Do I put it in the bin? And 794 00:36:41,440 --> 00:36:43,919 Speaker 3: so I'm always worried about the rules and regulations, and 795 00:36:43,920 --> 00:36:45,759 Speaker 3: that's when I forget. Oh, I forgot I had that 796 00:36:45,800 --> 00:36:48,560 Speaker 3: bottle of water in my backpack, and that goes through 797 00:36:48,680 --> 00:36:51,280 Speaker 3: and then my bag gets pulled out for special screening 798 00:36:51,600 --> 00:36:54,000 Speaker 3: and I get the dirty look from the TSA guy like, 799 00:36:54,040 --> 00:36:56,239 Speaker 3: come on, dude, you didn't know that you couldn't take water 800 00:36:56,320 --> 00:36:58,080 Speaker 3: through the screening? Read. 801 00:36:58,120 --> 00:37:01,400 Speaker 2: I want to read this quote from the article. In 802 00:37:01,520 --> 00:37:08,960 Speaker 2: multiple cases, TSA found turtles. In multiple, multiple cases, TSA 803 00:37:09,120 --> 00:37:13,080 Speaker 2: found turtles hidden in people's pants or bras, which is 804 00:37:13,280 --> 00:37:17,799 Speaker 2: unsafe and violates rules for transporting animals. 805 00:37:17,920 --> 00:37:21,520 Speaker 3: It just violates human decency. Quit putting your pet turtle 806 00:37:21,600 --> 00:37:24,279 Speaker 3: down your pants, dude. Think of the... think of the 807 00:37:24,400 --> 00:37:27,880 Speaker 3: poor turtle. Okay, he's got to walk through security in 808 00:37:27,920 --> 00:37:28,560 Speaker 3: your pants. 809 00:37:28,800 --> 00:37:32,200 Speaker 1: I forgot I've got a turtle in my pants. Uh huh. 810 00:37:32,200 --> 00:37:34,440 Speaker 3: But I do love, every now and 811 00:37:34,480 --> 00:37:37,160 Speaker 3: then, you will see, like, the big bin of things 812 00:37:37,200 --> 00:37:39,320 Speaker 3: that they confiscate. Now most of the time it's like 813 00:37:39,440 --> 00:37:42,120 Speaker 3: big shampoo bottles and a bottle of water.
But can 814 00:37:42,200 --> 00:37:44,080 Speaker 3: you imagine going through that and looking off to the 815 00:37:44,120 --> 00:37:46,239 Speaker 3: side and seeing the turtle sitting on top of a 816 00:37:46,280 --> 00:37:48,960 Speaker 3: shampoo bottle as part of the confiscated items? 817 00:37:49,040 --> 00:37:51,040 Speaker 2: I tried to go through airport security once and I 818 00:37:51,080 --> 00:37:53,520 Speaker 2: forgot that I had pepper spray in my purse. 819 00:37:53,560 --> 00:37:56,080 Speaker 3: See, there's a great example. That would be... 820 00:37:55,840 --> 00:37:58,080 Speaker 2: When I walked right up to the line and I was like, I 821 00:37:58,360 --> 00:38:00,600 Speaker 2: think I was reaching for my ID or something, and 822 00:38:00,640 --> 00:38:03,399 Speaker 2: I saw it and I was like, oh, I can't do this. 823 00:38:03,800 --> 00:38:06,319 Speaker 2: And I looked up at the TSA officer and I said, 824 00:38:06,680 --> 00:38:08,920 Speaker 2: I can't take pepper spray through, can I? And he 825 00:38:09,080 --> 00:38:11,640 Speaker 2: was like, no, but you can throw it away right here. 826 00:38:12,160 --> 00:38:14,640 Speaker 2: And at that time, you know, the wait 827 00:38:14,719 --> 00:38:16,920 Speaker 2: time was not four or five hours long. So I 828 00:38:16,960 --> 00:38:19,200 Speaker 2: walked it all the way back out to my car 829 00:38:19,440 --> 00:38:21,840 Speaker 2: and then came back in, because that was expensive and 830 00:38:21,880 --> 00:38:23,360 Speaker 2: I didn't want to just throw it away. 831 00:38:23,600 --> 00:38:27,120 Speaker 3: Yeah, and see, that to me, that's what 832 00:38:27,280 --> 00:38:31,040 Speaker 3: ninety nine percent of these is. But obviously a turtle 833 00:38:31,040 --> 00:38:34,520 Speaker 3: in your pants and concealing knives in knee braces and 834 00:38:34,960 --> 00:38:37,600 Speaker 3: bullets in, you know, juice boxes.
I 835 00:38:37,640 --> 00:38:39,680 Speaker 3: guess there's a lot more people trying to sneak stuff 836 00:38:39,719 --> 00:38:40,520 Speaker 3: through than I thought. 837 00:38:41,080 --> 00:38:43,320 Speaker 1: Someone trying to smuggle a live animal. 838 00:38:43,360 --> 00:38:46,400 Speaker 2: Through airport security, and we're worried about ice speed. 839 00:38:46,600 --> 00:38:50,400 Speaker 3: Everybody. Everybody knows you put your turtle in your shoe, 840 00:38:50,640 --> 00:38:52,359 Speaker 3: that's how you get it through. You don't put them 841 00:38:52,360 --> 00:38:55,120 Speaker 3: in your pants. That's how you get a turtle through TSA. 842 00:38:55,200 --> 00:38:58,240 Speaker 2: Huh. Thank you, Jim, thank you, Kevin, thank you for listening. 843 00:38:58,560 --> 00:39:00,440 Speaker 2: This is ninety three WIBC.