Speaker 1: Hey there, folks. It is September the second, and the hat snatcher has apologized. Is that enough? Apparently, to a lot of people that's not enough, and they think he should suffer much, much more. And with that, welcome to this episode of Amy and TJ. Robes, this story really, really took off because it looks bad. It looks really bad. There's a tennis superstar handing a hat to a kid, a fan, a little fan, and a grown man snatches it. It looked awful.

Speaker 2: The reality is, this was caught on a live broadcast, so there was nothing that was...

Speaker 3: Obviously.

Speaker 2: Neither one of them knew they were on camera, so what happened was real. It happened in real time, and you can slow it down and watch it happen. The tennis star Kamil Majchrzak was getting inundated with fans. Everyone was very excited, and so he was handing a bunch of stuff out. But you can clearly see he is handing the hat off his head to this seemingly nine- or ten-year-old kid, and you can just see him excitedly reaching for it as a grown-ass man literally comes in from the side, swoops in, and just takes the hat. And then you can see the kid, whose jaw literally drops and says... wait, I forget. Yes, I do.

Speaker 3: I have his actual quote.

Speaker 1: I can tell you what this kid actually said to the man who just took it. And the man, it turns out, is forty...

Speaker 3: Forty-two years old.

Speaker 1: We see the kid turn to the grown man and almost exclaim, like, what are you doing?

Speaker 3: Yeah, the kid says, "What are you doing?"

Speaker 2: That's exactly what... Brock was the little boy's name, and that's exactly what he said: "What are you doing?"

Speaker 1: So imagine this happened. This is live, again, no editing. This happened last Thursday at the US Open. We know it's going on right now, but this is a big, big, big part of the US Open. And this is the thing, Robes, I love most about the US Open that first week: you have so much access to the side courts. You can get up close and personal with a lot of the athletes. You see them have these gaggles, almost, with these kids who have these big tennis balls or a hat or something, trying to get an autograph, and these tennis stars take the time to do it. Now, I have been to the US Open and seen Serena Williams signing stuff, walking right by her as she's doing it. I was as excited as anything. You know what I did? No. That's for the kids, no matter how excited you are. So start with that with this guy. Why is he even competing with a child in the audience?

Speaker 3: It looks terrible.

Speaker 2: And then we find out who this guy is. Because of course, what happens is this video goes up, people take note of it, they start slowing it down frame by frame, they find out who this guy is, and it turns out his name is Piotr Szczerek.

Speaker 3: He is a...

Speaker 2: Polish millionaire CEO.
Speaker 2: He owns a paving company, and he was excited that a fellow Polish tennis star had won. He was swept up in the excitement. His family was with him, his wife was with him, and he says his two sons were with him as well. But God, even as a father, I feel like that makes it even worse.

Speaker 3: Maybe if he was a single guy, he wasn't thinking about it.

Speaker 2: He's a father, and he still stole from someone else's child.

Speaker 1: This is where I will step in, and I think some parents will too. The things we have done and tried to go get for our...

Speaker 2: Kids. At the expense of someone else's child?

Speaker 1: We're not saying that. But to see him in there clamoring for something for his kid, maybe I could buy that part of it. And again, folks, we are not in any way editorializing this video. It is unmistakable.

Speaker 2: Yeah, if you haven't seen it, please go watch it.

Speaker 1: Yes. This is not an interpretation of the video. What happened happened. Again, he's there, and so we have the answer for why he's there.
Speaker 1: The video now goes everywhere. And again, Robes, after he snatches the hat, I don't even think he acknowledges the boy next to him. Oh, he turns and hands the hat to, I believe, his wife.

Speaker 3: I thought he stuffed it in a bag.

Speaker 1: Yeah, the woman was holding the bag.

Speaker 3: He put it in a bag. It immediately goes in.

Speaker 2: The kid is on the video literally looking at him like, what have you done?

Speaker 3: Like, why can't I get my hat back?

Speaker 2: And the man completely ignores the kid. And maybe he didn't see him. I have a hard time believing that, but fine. He doesn't even acknowledge the kid, to the point where it's almost apparent he knows the kid is upset and he doesn't want to turn around and look at him. He doesn't want to acknowledge the pain he's just caused, so he keeps his back turned to the kid the entire time.

Speaker 1: And the kid is adorable. What makes it worse, the kid is adorable. Like, he is the perfect tennis fan. He's like ten years old. I almost forgot his name... his name was Brock. That's perfect.
Speaker 1: But he had this long, shaggy, kind of blonde hair, right? Yes, he was just perfect. And he reacted the way all of us did, like, what are you doing?

Speaker 2: It was so genuine. He was like, did that actually just happen?

Speaker 3: That's what I thought.

Speaker 2: And you could see that was the energy. Everything that kid was putting out was like, what the hell?

Speaker 3: How did this man just do that?

Speaker 1: Okay. So to all of this: this is the drama, this is the situation, this is the video that so many have seen. And Robes, it actually got worse. He goes viral with the video, people figure out who he is, and then he goes even more viral after he, and I said that in air quotes, "puts out" a statement in response to how viral he has gone.

Speaker 2: And I actually want to be fully transparent here, because when we get ready in the morning for Morning Run, we each take different stories.
Speaker 2: This was my story. Obviously I watched the video, but then I was watching news reports, and that's where I saw that this millionaire CEO, Piotr Szczerek, put out a statement. And the news anchors were just aghast. They were editorializing in a way that most news anchors don't, just saying, hey, I'm having a hard time reading this, but guys, take a listen to what this man put out in a statement. And it was everywhere, and everyone was reacting to this statement that was allegedly written by the man who stole the hat. It reads: "Yes, I took it. Yes, I did it quickly. But as I've always said, life is first come, first served. I understand that some people might not like it, but please, let's not make a global scandal out of the hat."

Speaker 3: "It's just a hat. If you were faster, you would have it." Which is just the most awful thing.

Speaker 2: And then it goes on to basically say, if you keep complaining about me and my company (because he's recognizing, or at least acknowledging, in this statement that people have now found out who he works for, that he's the CEO of this company), basically: I will pursue legal action against you if you continue to say terrible things about me. And then the news anchor was saying, hey, this was all in Polish, it's been translated, this is the best translation, but my goodness, how could somebody say this after what they've done? People were just outraged. So then that created a whole other heap of hate. And I had put this in Morning Run, and you're like, babe, it doesn't look like he actually said that statement. I was like, what? So it goes to show, and I just want to point this out: it's always whatever is reported first, which is the salacious, horrific thing he was reported to have said, that's everywhere. Everywhere. I did not see anywhere that had taken it down or had said, we put this up in error; it turns out he didn't make this statement.
Speaker 2: We actually don't know where this statement came from, none of that. And I actually had to do a deep dive to find anyone who even briefly acknowledged that there was potentially a false statement put out, or that they didn't know where this statement came from.

Speaker 3: The retraction is never there. The amount of press you get for the wrong...

Speaker 2: Information is ginormous, and then there's almost no attention given at all to the false nature of that statement. So it turns out he didn't actually put that out there. And when you read it, it's ridiculous.

Speaker 1: As soon as you read it. And this has happened a couple of times. Something like this happened recently, I can't remember the incident, where everybody reacted to a statement that went around that was just fake. It's false, but it gets going. And so much negativity, so much hate has gone this guy's way, and so much of it that we have read is specifically based on comments they think he made that he actually didn't make. Please, everybody, take a beat. Please, before you condemn somebody, tell someone that they need to lose their job, lose their way of life, that their company needs to fold. He made a mistake. We're not defending that. But Robes, this is another prime example of how dangerous this is. And you're damn right, this is very personal to us, because a lie was told, it got going, and there's nothing you can do to stop it, and it cost us careers that we had. I don't know what it might end up costing this guy, but we have seen all kinds of hate going his way, and they specifically say it's based on what he said in that statement, that statement that he actually didn't make. What are we supposed to do?

Speaker 2: It's gutting, because you can obviously still feel upset by what you saw, by this man's actions, and absolutely say what he did was wrong. But then think about this false statement that was spread as if he said it, and the whole other round of hate that came his way because of it.

Speaker 3: Think about that.

Speaker 2: Imagine: once someone says you said something, it doesn't matter if you said it or not. And to your point, TJ, it is personal. Once people say a picture represents something that it doesn't, you can't untell that story. And it's so difficult when, every day, people will bring back up something that actually was never true, and you can never fully combat it. So yes, on that level, I feel for him. Yes, he did something wrong. Yes, he needs to apologize, which we'll get into, and he did. And yes, these people are still humans who have an opportunity to do better, be better, grow from this. Can we give people second chances? Can we give people a beat to recognize where they made a mistake and then do better? It's disheartening to me how much we always want a good guy and a bad guy. We can't ever accept the fact that maybe someone has made a bad choice but isn't a bad person. We don't ever give people that... I guess... I don't know.

Speaker 3: Like, just that grace.

Speaker 2: There's no other word for it other than grace.

Speaker 1: Because the thing is, we always judge other people by their worst actions, and we judge ourselves by our best intentions. That's how we keep that balance. And again, I've learned this through therapists and all kinds of people we've talked to over the years about projecting. Why are we attacking this guy for a mistake? We can heap everything on him because we see what he screwed up, we see the mistake he made. Well, how bad of a mistake have you made this week, or last week, in your family, that we all don't know about? He made an awful mistake. Robes and I are not defending at all what this dude did, but I'm also not defending the response to it either. So this man, yes, he did put out a statement, kind of a lengthy statement as well, and we'll let you hear word for word what he said and how he apologized unequivocally for what happened. But we also want to continue to ask that question: what punishment does this man deserve, and what right do you have to decide it?

Speaker 2: Welcome back to this edition of Amy and TJ, where we are talking about the hat grab that went round the world: Piotr Szczerek taking the hat that was clearly intended for the ten-year-old boy next to him. He has dealt with not only the backlash from that action, which he certainly needed to and has taken responsibility for, but he is also dealing with backlash for a statement he never even made, where someone jumped in, in Polish, to make it seem more legit, saying something outrageous that he never actually said. So if you're interested in hearing what Szczerek actually said, what his true statement was, we have it here for you, and this is legit. This came from him directly, not from someone posing as him.

Speaker 1: And we'll piece this together, because it's kind of long. He starts by saying: "Due to the situation that occurred during Kamil Majchrzak's match at the US Open, I would like to sincerely apologize to the injured boy, his family, all the fans, and the player himself. I made a great mistake."
Speaker 1: "In the midst of emotion, amidst the crowd's celebration after his victory, I was convinced the tennis player was offering his cap to me, for my sons, who had asked for an autograph earlier. A mistaken belief made me instinctively reach out. Today I know I did something that looked like deliberately taking a souvenir from a child. It wasn't my intention, but it doesn't change the fact that I hurt the boy and disappointed the fans." Do you buy that? So far, so good?

Speaker 2: Yeah, I think that was a really good apology. I think that's a really good apology. I have to say, look, it's embarrassing, it's humiliating to admit that you let your own excitement, and maybe it was over what you wanted for your children, overtake logic, overtake just having any sort of situational awareness. How many times do I find myself, do all of us find ourselves, not being situationally aware? We know what we want, we know what we need, we know what maybe our children want, and we're not thinking about other people's kids. We're not thinking about the people around us.
Speaker 2: And all of us have done that to some degree in a crowd at some...

Speaker 1: Point. And sometimes it just comes off as inconsiderate. Maybe you step off in front of somebody when you're getting off an elevator, maybe you don't give a little...

Speaker 2: Crowded streets, whatever.

Speaker 1: Little things happen, and we are all unaware, and we all make mistakes. He made his in front of cameras, and he made it with the worst possible victim, right? But it was still a mistake.

Speaker 2: And beyond that, I do think that we all make mistakes, and so far I really appreciate this apology. Unfortunately, the statement that was attributed to him earlier was so egregious, I just worry that that's the only one people are going to read. That's the only one people are going to see. People aren't going to actually hear what he really did have to say. Because sadly, when you do take a beat and you don't immediately respond, it turns out people will respond for you, and you can't undo it. That's what's so scary about the world we live in now.

Speaker 1: But why can't we all just assume... I don't know. You know what, Robes, I should probably give people a break. Why wouldn't they just believe what they see? There's a statement just running around, babe.

Speaker 3: I believed it, babe.

Speaker 2: Not only did I see it, but then I was like, let me go actually investigate. And I started pulling up news reports, and they were all repeating this false statement.

Speaker 3: So yeah, I fell for it. I fell for it.

Speaker 2: So his statement, his actual statement, goes on to say: "The cap was given to the boy, and an apology to the family." Meaning, he returned the cap. "I hope I have at least partially repaired the harm. I also want to make it clear that neither I, my wife, nor my sons commented on this situation in any way on social media or on any other portal. We did not use the services of any law firm in this regard. All alleged statements appearing online are not our own."

Speaker 1: Now, would that trigger anyone to take something down? No. Who's going to read that? Who's going to see this one?

Speaker 3: No.

Speaker 1: Because it's not sexy, right? It still doesn't make a cool headline. It's not fun and funny. You can't attack a guy for this. What do you attack him for? Yes, he messed up. We've said it, right?

Speaker 3: You're not defending what he did. He absolutely made a mistake.

Speaker 1: He made a mistake. But that's where we end up: having an indictment of a person's character based on one moment that they have. So you make a mistake, that means you are evil?

Speaker 3: You are your mistake.

Speaker 1: Yes. I don't know. None of us...

Speaker 2: None of us want to be our worst mistake. We don't want to be remembered for our worst mistake, and we don't want to never be able to move past our worst mistake. So if you don't want that for yourself, my hope is that we can all extend that to him. He goes on to say: "My wife and I have been involved in helping children and young athletes for years, but this situation has shown me that one moment of inattention can destroy years of work and support. It is painful, but a lesson in humility is needed."

Speaker 1: Wow.

Speaker 2: "Therefore, I will be even more actively involved in initiatives supporting children and young people, and in actions against violence and hate. I believe that only through actions can I rebuild the lost trust. Once again, I apologize to everyone I have disappointed. Please understand: out of concern for my family, I have decided to disable commenting on this post."

Speaker 3: That was actually really smart.

Speaker 2: "Yours sincerely, Piotr Szczerek."

Speaker 1: That's a... that's an apology. Well done.

Speaker 3: I agree. Is it not? Yes.

Speaker 1: So, I mean, he's taken full responsibility for it. But this, I mean, this should be a template for apologies. One moment of inattention can destroy years of work and support. One moment.
And he's right, and 362 00:19:24,440 --> 00:19:28,199 Speaker 1: that sucks. But how many CEOs, how many things have 363 00:19:28,240 --> 00:19:32,200 Speaker 1: we seen where companies are hurting? I guess Tesla is 364 00:19:32,200 --> 00:19:34,840 Speaker 1: a high profile situation for all kinds of reasons, but 365 00:19:35,480 --> 00:19:39,040 Speaker 1: just any moment that the CEO appears to be doing 366 00:19:39,040 --> 00:19:42,359 Speaker 1: something wrong, the whole company can. 367 00:19:42,640 --> 00:19:46,160 Speaker 2: Yeah. And speaking of his company, on Google, 368 00:19:46,640 --> 00:19:50,440 Speaker 2: they've found out the company 369 00:19:50,520 --> 00:19:54,439 Speaker 2: he is CEO of has now plummeted to 370 00:19:54,640 --> 00:19:57,280 Speaker 2: just one point two stars. So you know, you can 371 00:19:57,600 --> 00:20:00,560 Speaker 2: rate a company, it's six stars to 372 00:20:00,640 --> 00:20:03,760 Speaker 2: one star on Google reviews. His company is now at 373 00:20:03,840 --> 00:20:07,440 Speaker 2: one point two stars. One of the reviews, I'll read 374 00:20:07,480 --> 00:20:09,240 Speaker 2: it for you just to give you an idea of 375 00:20:09,280 --> 00:20:13,000 Speaker 2: what people are leaving in the review column: The CEO 376 00:20:13,119 --> 00:20:17,560 Speaker 2: of this company doesn't have any morals, stealing from children 377 00:20:17,640 --> 00:20:21,720 Speaker 2: and then defending himself in public. Don't ever do business 378 00:20:21,760 --> 00:20:22,879 Speaker 2: with this company. 379 00:20:24,000 --> 00:20:30,439 Speaker 1: Based on a falsehood, she is telling the world, don't 380 00:20:30,800 --> 00:20:34,919 Speaker 1: do business with this guy. She's trying to shut him down. 381 00:20:36,160 --> 00:20:41,280 Speaker 1: I don't know what to say. I just caution, folks, 382 00:20:41,440 --> 00:20:43,360 Speaker 1: I caution you.
Even if you hate what he did, 383 00:20:43,400 --> 00:20:46,919 Speaker 1: which I hate what he did, he apologized for it. 384 00:20:46,920 --> 00:20:49,800 Speaker 1: And why do we start with he must be evil? 385 00:20:49,840 --> 00:20:51,359 Speaker 1: Why don't we start with, okay, this was a 386 00:20:51,400 --> 00:20:54,000 Speaker 1: good guy who made a mistake, versus he's a terrible guy, 387 00:20:54,040 --> 00:20:55,119 Speaker 1: and he just confirmed it for me. 388 00:20:55,240 --> 00:20:57,720 Speaker 2: We all have moments of bad judgment, yes. And if 389 00:20:57,800 --> 00:20:58,560 Speaker 2: you're honest with 390 00:20:58,560 --> 00:21:00,280 Speaker 3: yourself, you know you have. 391 00:21:00,440 --> 00:21:02,520 Speaker 2: And there have been so many of those moments where 392 00:21:02,520 --> 00:21:04,840 Speaker 2: you are thanking God that no one else saw, that 393 00:21:04,960 --> 00:21:07,399 Speaker 2: no one else knew, that no one else was rolling 394 00:21:07,440 --> 00:21:10,119 Speaker 2: on you, that no one else was recording you 395 00:21:10,400 --> 00:21:13,880 Speaker 3: with what you did or said or whatever. 396 00:21:13,960 --> 00:21:16,480 Speaker 2: But we've all been there, not to that extent. And 397 00:21:16,520 --> 00:21:19,280 Speaker 2: I'm not saying that that is what we like. I 398 00:21:19,359 --> 00:21:22,359 Speaker 2: get it, and we all feel better about ourselves thinking, 399 00:21:22,560 --> 00:21:24,679 Speaker 2: I would never do that. And see, that's what this 400 00:21:24,800 --> 00:21:27,520 Speaker 2: all comes down to. We watch that video and we 401 00:21:27,560 --> 00:21:30,600 Speaker 2: say, I would never do that, he must be an asshole, 402 00:21:30,840 --> 00:21:33,840 Speaker 2: and then we feel better about ourselves. It really does 403 00:21:33,960 --> 00:21:34,879 Speaker 2: come back to that.
404 00:21:35,880 --> 00:21:38,240 Speaker 1: And again, this is not something we have come up 405 00:21:38,280 --> 00:21:43,320 Speaker 1: with on our own. These are things we have learned through lessons 406 00:21:43,320 --> 00:21:46,680 Speaker 1: we have been taught through life, through experience, and through, 407 00:21:46,760 --> 00:21:50,399 Speaker 1: yes, therapy. And you know, throughout our lives 408 00:21:50,440 --> 00:21:53,720 Speaker 1: we have had access to a bunch of thought leaders and 409 00:21:53,760 --> 00:21:55,359 Speaker 1: a bunch of people who are trying to help people 410 00:21:55,359 --> 00:21:57,040 Speaker 1: get through these moments, and that's one of the things 411 00:21:57,080 --> 00:21:59,760 Speaker 1: they absolutely do talk to us about a lot. So 412 00:22:00,400 --> 00:22:02,320 Speaker 1: we're all working on it. We're trying to do better. 413 00:22:02,400 --> 00:22:06,680 Speaker 1: But this is not an episode to defend 414 00:22:06,840 --> 00:22:12,560 Speaker 1: the actions of Piotr Szczerek; this is one to 415 00:22:12,640 --> 00:22:17,600 Speaker 1: make sure we are not defending, as well, the actions 416 00:22:17,680 --> 00:22:20,520 Speaker 1: of so many people and how they've reacted to something. 417 00:22:20,640 --> 00:22:22,720 Speaker 2: And I do think it's just another lesson in how 418 00:22:22,760 --> 00:22:25,800 Speaker 2: we view and how we receive news, whether it's through TikTok 419 00:22:26,080 --> 00:22:29,840 Speaker 2: or accredited news stations, who get it wrong sometimes, who 420 00:22:29,880 --> 00:22:32,320 Speaker 2: fall for things. I fell for it this morning as well.
421 00:22:32,680 --> 00:22:36,640 Speaker 2: But just to view what you read, to view what 422 00:22:36,720 --> 00:22:40,400 Speaker 2: you hear with a little touch of skepticism, just keep 423 00:22:40,440 --> 00:22:42,679 Speaker 2: a space open for the fact that you might not 424 00:22:42,840 --> 00:22:46,399 Speaker 2: have all the facts, and then just allow yourself that 425 00:22:46,520 --> 00:22:50,359 Speaker 2: space for consideration. And I think that's just a big 426 00:22:50,400 --> 00:22:53,040 Speaker 2: part of how hopefully we can all do better in 427 00:22:53,880 --> 00:22:58,240 Speaker 2: getting information and then processing that information. So it was 428 00:22:58,280 --> 00:23:00,840 Speaker 2: a tough thing to watch. I feel like there are so 429 00:23:00,920 --> 00:23:04,480 Speaker 2: many lessons in it. And look, I hope that, you know. 430 00:23:04,560 --> 00:23:07,440 Speaker 2: The little boy was great. He got celebrated by his 431 00:23:07,480 --> 00:23:09,960 Speaker 2: tennis star. He got a new hat, he got a 432 00:23:09,960 --> 00:23:12,280 Speaker 2: bunch of extra merch, he got a big photo op. 433 00:23:12,400 --> 00:23:15,119 Speaker 2: It actually created a really special moment; this terrible 434 00:23:15,160 --> 00:23:18,400 Speaker 2: thing that happened actually created a really special moment for him. 435 00:23:18,880 --> 00:23:21,200 Speaker 2: And that's all online as well. So look, it kind 436 00:23:21,200 --> 00:23:25,159 Speaker 2: of feels like there was a happy ending, so to speak, 437 00:23:25,240 --> 00:23:28,840 Speaker 2: and if everyone can grow and learn from it, it wasn't 438 00:23:28,680 --> 00:23:29,160 Speaker 3: a bad thing. 439 00:23:30,480 --> 00:23:32,119 Speaker 1: You know, that kid is having a good time at school. 440 00:23:32,800 --> 00:23:34,720 Speaker 3: I love it. Yeah, he's the hero of the day. 441 00:23:34,800 --> 00:23:36,600 Speaker 1: He's having a good time. All right.
Well, we always 442 00:23:36,600 --> 00:23:39,440 Speaker 1: appreciate you hanging with us. For my dear partner, Amy 443 00:23:39,480 --> 00:23:41,240 Speaker 1: Robach, I'm T.J. Holmes. We'll see y'all soon.