1 00:00:00,200 --> 00:00:02,680 Speaker 1: This portion of The Joe Show podcast is powered by 2 00:00:02,720 --> 00:00:06,280 Speaker 1: Farah and Farah, Tampa Accident Attorneys. All right, we're live, 3 00:00:06,480 --> 00:00:09,119 Speaker 1: Tampa Bay's number one and only hit music channel, ninety 4 00:00:09,160 --> 00:00:13,720 Speaker 1: three three FLZ. How many people in here 5 00:00:13,800 --> 00:00:17,040 Speaker 1: actively use AI? I don't need Jed to raise his hand. 6 00:00:17,079 --> 00:00:18,680 Speaker 1: I know he does. I do. You do? 7 00:00:18,960 --> 00:00:19,240 Speaker 2: Yeah? 8 00:00:19,480 --> 00:00:21,720 Speaker 3: I don't at all. None. 9 00:00:22,400 --> 00:00:25,759 Speaker 1: You don't ever use it to be like, hey, give 10 00:00:25,760 --> 00:00:26,520 Speaker 1: me an itinerary? 11 00:00:29,320 --> 00:00:30,400 Speaker 4: Does Google Gemini count? 12 00:00:30,520 --> 00:00:30,800 Speaker 5: Yes? 13 00:00:30,840 --> 00:00:37,120 Speaker 2: Okay, then yes, it counts. Yeah, because it planned my trip. Oh, Pensacola, 14 00:00:37,159 --> 00:00:38,720 Speaker 2: I said, I'm going to Pensacola this weekend. 15 00:00:38,760 --> 00:00:40,199 Speaker 4: Gave it the dates. Give me some things to do. 16 00:00:40,640 --> 00:00:43,159 Speaker 1: See, I do, like, how many calories is a New 17 00:00:43,240 --> 00:00:48,519 Speaker 1: York strip steak if I use, like, no butter? Estimate 18 00:00:48,600 --> 00:00:51,040 Speaker 1: my calories of my meal. 19 00:00:51,280 --> 00:00:52,200 Speaker 3: I did that. 20 00:00:52,200 --> 00:00:52,880 Speaker 4: That is so fun. 21 00:00:53,720 --> 00:00:58,440 Speaker 1: I'll do a Chipotle bowl calorie estimate. I'll do windshield 22 00:00:58,480 --> 00:01:03,760 Speaker 1: fluid issue, trending taper fade cuts. 23 00:01:06,120 --> 00:01:09,560 Speaker 2: My last one was using Publix chicken and Sprouts Farmers 24 00:01:09,920 --> 00:01:10,600 Speaker 2: Market green beans.
25 00:01:10,800 --> 00:01:12,280 Speaker 4: Make this into seven meals. 26 00:01:12,959 --> 00:01:13,320 Speaker 3: It did. 27 00:01:13,600 --> 00:01:14,640 Speaker 4: It's seven meals. 28 00:01:14,920 --> 00:01:18,160 Speaker 3: So check this out. Well, yeah, they'll do it. They'll 29 00:01:18,160 --> 00:01:20,360 Speaker 3: do it, Ashley. It's the best, it is. 30 00:01:22,760 --> 00:01:27,640 Speaker 1: Give me funny pictures for the term "type S." It 31 00:01:27,680 --> 00:01:28,280 Speaker 1: didn't give me... 32 00:01:29,720 --> 00:01:32,479 Speaker 3: Sometimes it's good, sometimes it's not. Well, check this out. 33 00:01:32,880 --> 00:01:41,080 Speaker 1: So sixty-nine... sixty-nine percent of Americans openly admit, hey, 34 00:01:41,160 --> 00:01:44,559 Speaker 1: I use AI for something in my life. 35 00:01:44,959 --> 00:01:46,759 Speaker 3: It's got to be much more than sixty-nine. 36 00:01:46,840 --> 00:01:49,520 Speaker 2: Okay, but that's... a lot of people are probably, like Kitty, 37 00:01:49,560 --> 00:01:51,640 Speaker 2: they don't know they're using it because... 38 00:01:51,400 --> 00:01:53,800 Speaker 6: you're googling it and it's just, like, pulling from all 39 00:01:53,800 --> 00:01:55,440 Speaker 6: these different websites. 40 00:01:55,000 --> 00:01:56,360 Speaker 3: Or because of this. 41 00:01:57,800 --> 00:02:01,120 Speaker 1: Of those sixty-nine percent of people, more than ninety 42 00:02:01,160 --> 00:02:03,560 Speaker 1: percent of them said that in some way they are 43 00:02:04,120 --> 00:02:05,520 Speaker 1: ashamed of using 44 00:02:05,320 --> 00:02:09,080 Speaker 5: it. To a certain extent, I would say yes, for sure, 45 00:02:09,160 --> 00:02:12,120 Speaker 5: especially when you hear the stuff about how much water 46 00:02:12,400 --> 00:02:14,840 Speaker 5: AI uses and things like that. 47 00:02:15,120 --> 00:02:18,480 Speaker 4: And real, yes, really, a lot of environmental...
48 00:02:19,200 --> 00:02:22,040 Speaker 5: There are a lot of places in Ohio in particular, 49 00:02:22,080 --> 00:02:24,120 Speaker 5: because they're by fresh water sources and they have to 50 00:02:24,200 --> 00:02:27,040 Speaker 5: use the water to cool down all of the... the... 51 00:02:27,040 --> 00:02:30,800 Speaker 1: AI is mad because it's living in Ohio. Who the hell would 52 00:02:30,800 --> 00:02:33,560 Speaker 1: selectively pick that? So that's why AI is going to 53 00:02:33,600 --> 00:02:35,560 Speaker 1: take over the world. They got to get out of Ohio, 54 00:02:36,400 --> 00:02:37,120 Speaker 1: move to Florida. 55 00:02:37,560 --> 00:02:39,480 Speaker 3: People are embarrassed. They're ashamed of it. 56 00:02:39,520 --> 00:02:41,679 Speaker 1: They feel like they're taking jobs away from people, which, 57 00:02:41,680 --> 00:02:43,600 Speaker 1: I don't know who you're taking a job away from 58 00:02:43,639 --> 00:02:45,840 Speaker 1: when I'm asking it how many calories is in my 59 00:02:45,919 --> 00:02:49,919 Speaker 1: New York strip. A nutritionist? That's what people... Oh, yes, 60 00:02:50,520 --> 00:02:53,040 Speaker 1: because one day I'll totally be able to afford a 61 00:02:53,120 --> 00:02:54,000 Speaker 1: nutritionist. 62 00:02:54,040 --> 00:02:57,560 Speaker 2: It comes with your insurance. With your HSA you can 63 00:02:57,600 --> 00:02:59,760 Speaker 2: pay for it. I learned that, you know, shout out 64 00:02:59,800 --> 00:03:00,400 Speaker 2: to TikTok. 65 00:03:01,280 --> 00:03:08,160 Speaker 1: I don't have enough. But do you ever... are you 66 00:03:08,240 --> 00:03:09,960 Speaker 1: ever using it and you feel bad? 67 00:03:10,520 --> 00:03:13,480 Speaker 2: This morning, when we were talking about just, like, graphics, 68 00:03:13,720 --> 00:03:16,240 Speaker 2: I've been noticing a lot of people talking about graphic 69 00:03:16,240 --> 00:03:19,720 Speaker 2: designers being out of work and graphic designers looking for work.
70 00:03:20,480 --> 00:03:23,120 Speaker 2: So that is where I'm just like, I feel so 71 00:03:23,280 --> 00:03:26,040 Speaker 2: bad for taking away someone's creativity, them not being able 72 00:03:26,040 --> 00:03:26,400 Speaker 2: to do that. 73 00:03:26,480 --> 00:03:26,560 Speaker 7: Well... 74 00:03:26,680 --> 00:03:28,600 Speaker 1: Good thing is, we don't use AI to make 75 00:03:28,639 --> 00:03:34,720 Speaker 1: graphics for us... in the last twelve hours. But... 76 00:03:35,000 --> 00:03:36,480 Speaker 3: My brother does get very mad. 77 00:03:36,920 --> 00:03:39,840 Speaker 1: I'll just be honest. Of course we use it. We 78 00:03:39,920 --> 00:03:43,240 Speaker 1: don't have capabilities to make anything. I don't know what 79 00:03:43,280 --> 00:03:46,480 Speaker 1: to tell you, and we got to get our stuff done. 80 00:03:46,680 --> 00:03:48,760 Speaker 1: That's what I always tell my brother. I go, well, Jacob, 81 00:03:48,800 --> 00:03:50,600 Speaker 1: do you want me to just, like, not do my job? 82 00:03:50,640 --> 00:03:52,839 Speaker 1: Because, Jacob, do you want to do it? 83 00:03:51,680 --> 00:03:51,880 Speaker 2: Work? 84 00:03:52,840 --> 00:03:53,640 Speaker 4: It didn't, come on. 85 00:03:54,120 --> 00:03:56,400 Speaker 1: I look around and all these shows have thirty-seven 86 00:03:56,400 --> 00:03:57,400 Speaker 1: people on staff. 87 00:03:57,480 --> 00:03:58,120 Speaker 3: We don't. 88 00:03:58,240 --> 00:04:01,280 Speaker 1: So I don't know what to do. And he'll be like, 89 00:04:01,320 --> 00:04:03,200 Speaker 1: you're taking a job away from a graphic designer. I go, 90 00:04:04,920 --> 00:04:07,080 Speaker 1: do you think we... what do you mean? We don't 91 00:04:07,120 --> 00:04:08,520 Speaker 1: have the job to begin with. 92 00:04:09,520 --> 00:04:11,760 Speaker 3: Who is this? Is this AI on the phone? 93 00:04:13,440 --> 00:04:13,600 Speaker 2: Now? 94 00:04:13,640 --> 00:04:14,280 Speaker 8: This is Abby.
95 00:04:14,720 --> 00:04:16,559 Speaker 3: Abby? How often are you using AI? 96 00:04:18,080 --> 00:04:21,200 Speaker 8: I use it for two different things a lot. One, 97 00:04:21,480 --> 00:04:24,120 Speaker 8: I just had a baby back in January, and I 98 00:04:24,200 --> 00:04:27,560 Speaker 8: swear it's taken away so much anxiety, asking, you know, is 99 00:04:27,800 --> 00:04:28,479 Speaker 8: this normal? 100 00:04:28,680 --> 00:04:28,760 Speaker 1: Is... 101 00:04:28,920 --> 00:04:29,120 Speaker 2: You know? 102 00:04:29,240 --> 00:04:29,920 Speaker 8: Things like that? 103 00:04:30,000 --> 00:04:31,080 Speaker 9: And so I think it's helped with that. 104 00:04:31,640 --> 00:04:33,800 Speaker 8: But I'm a little guilty to say, I'm a teacher, 105 00:04:34,640 --> 00:04:38,960 Speaker 8: and sometimes, when I'm in a pinch for a lesson plan, I'll 106 00:04:38,720 --> 00:04:39,800 Speaker 7: say exactly what I need. 107 00:04:39,839 --> 00:04:41,680 Speaker 9: It'll give you a whole outline of 108 00:04:41,640 --> 00:04:44,640 Speaker 6: a lesson plan. I mean, the tool is there 109 00:04:44,680 --> 00:04:46,360 Speaker 6: to use and Abby needs it. 110 00:04:46,480 --> 00:04:50,760 Speaker 3: So has it ever given you wrong information for your lesson plan? 111 00:04:50,880 --> 00:04:50,960 Speaker 6: Like... 112 00:04:51,000 --> 00:04:52,720 Speaker 3: Have you ever caught it as you're going? 113 00:04:52,800 --> 00:04:55,359 Speaker 1: Because I'm assuming you, like, will go over it before 114 00:04:55,400 --> 00:04:59,520 Speaker 1: you start, you know, acting. Do you ever... do you 115 00:04:59,520 --> 00:05:01,760 Speaker 1: ever, like, catch it and you go, well, wait, hold on 116 00:05:01,800 --> 00:05:03,200 Speaker 1: one second, that's not real? 117 00:05:04,440 --> 00:05:07,479 Speaker 8: Yeah, and sometimes it's not real.
I also work with 118 00:05:07,560 --> 00:05:12,919 Speaker 8: like special education, and I'll try it and I'm like, 119 00:05:14,000 --> 00:05:15,640 Speaker 8: there is no way in the 120 00:05:15,520 --> 00:05:18,880 Speaker 9: world that these kids can do this or 121 00:05:18,839 --> 00:05:21,880 Speaker 8: handle this or understand this, and it'll have to just, 122 00:05:22,040 --> 00:05:23,880 Speaker 8: you know, be switched up real quick. 123 00:05:24,880 --> 00:05:28,040 Speaker 3: Are there times where you are grading... 124 00:05:28,160 --> 00:05:31,120 Speaker 1: What... what class do you teach? 125 00:05:31,160 --> 00:05:36,480 Speaker 8: I teach kids with emotional and behavioral disabilities. 126 00:05:36,920 --> 00:05:39,760 Speaker 3: How old are they? 127 00:05:39,960 --> 00:05:42,160 Speaker 8: I usually do kindergarten 128 00:05:41,640 --> 00:05:44,800 Speaker 1: through second grade, so they're not using AI yet. But 129 00:05:45,200 --> 00:05:47,600 Speaker 1: do you talk to your other teacher friends that have 130 00:05:47,680 --> 00:05:51,320 Speaker 1: older kids in class, and, like, are they noticing that, 131 00:05:51,440 --> 00:05:53,200 Speaker 3: like, everyone's just cheating with AI? 132 00:05:54,600 --> 00:05:58,240 Speaker 8: So it's actually sad. My husband's a college professor and 133 00:05:58,920 --> 00:06:03,839 Speaker 8: he will find his students using AI for essays and 134 00:06:04,120 --> 00:06:06,120 Speaker 8: projects and 135 00:06:06,000 --> 00:06:06,560 Speaker 9: things like that. 136 00:06:06,680 --> 00:06:08,760 Speaker 8: He had had a student the other day who had 137 00:06:08,800 --> 00:06:12,440 Speaker 8: Meta glasses on during a test, and he had to 138 00:06:12,440 --> 00:06:13,360 Speaker 8: do this whole 139 00:06:13,120 --> 00:06:15,640 Speaker 7: thing with, you know, making sure that they didn't have 140 00:06:15,720 --> 00:06:18,720 Speaker 7: those on.
And so it's definitely, as kids get older, 141 00:06:18,760 --> 00:06:20,080 Speaker 7: it's definitely a problem. 142 00:06:20,520 --> 00:06:22,560 Speaker 1: Like, I wonder if they'll ever have a point where, 143 00:06:22,600 --> 00:06:26,000 Speaker 1: like, in schools, they'll have the capability to, like, they 144 00:06:26,000 --> 00:06:31,560 Speaker 1: click a button and it blocks all information coming into devices. 145 00:06:32,000 --> 00:06:34,360 Speaker 7: Well, when they do, like, testing and stuff, they have 146 00:06:34,480 --> 00:06:36,600 Speaker 7: to go on to, like, a secured browser and whatnot 147 00:06:36,640 --> 00:06:38,800 Speaker 7: to make sure, like, they're not being able 148 00:06:38,560 --> 00:06:40,760 Speaker 3: to use... A secured browser? What are you talking about? 149 00:06:43,800 --> 00:06:47,400 Speaker 8: Like a testing browser, and, like, you know, like you can't do 150 00:06:47,360 --> 00:06:50,120 Speaker 7: anything but the test. And, you know, it's... you know, 151 00:06:50,360 --> 00:06:53,480 Speaker 7: it can be very useful, but it can also be 152 00:06:53,600 --> 00:06:57,039 Speaker 7: very dangerous and give a lot of misinformation, or, you know, 153 00:06:58,120 --> 00:07:00,000 Speaker 7: you just have to be able to use it the right way, 154 00:07:00,279 --> 00:07:00,800 Speaker 7: you know? 155 00:07:01,320 --> 00:07:03,600 Speaker 1: So is that... is that a prompt that you can 156 00:07:03,640 --> 00:07:06,039 Speaker 1: ask AI, to be like, hey, I'm doing my homework, 157 00:07:06,080 --> 00:07:06,960 Speaker 3: can you make this... 158 00:07:07,200 --> 00:07:07,440 Speaker 2: uh... 159 00:07:07,480 --> 00:07:09,039 Speaker 4: so I don't get caught? Can you make it... 160 00:07:09,080 --> 00:07:09,120 Speaker 1: so... 161 00:07:09,560 --> 00:07:12,200 Speaker 3: ...don't. Thank you so much for calling in. 162 00:07:14,600 --> 00:07:14,840 Speaker 6: Again. 163 00:07:15,120 --> 00:07:15,680 Speaker 8: Very funny.
164 00:07:15,760 --> 00:07:17,400 Speaker 7: It's... it's my husband. 165 00:07:17,200 --> 00:07:20,000 Speaker 9: He will, like, put, like, invisible things 166 00:07:20,960 --> 00:07:22,800 Speaker 8: in, and, like, so what he's put in, they put into AI, 167 00:07:23,000 --> 00:07:24,560 Speaker 8: like, it'll throw it in there, so they don't read 168 00:07:24,640 --> 00:07:25,120 Speaker 8: through it. 169 00:07:25,960 --> 00:07:26,440 Speaker 4: Oh, they know. 170 00:07:26,600 --> 00:07:29,240 Speaker 3: Yeah, thank you so much for calling in, love. 171 00:07:29,400 --> 00:07:30,360 Speaker 4: That's interesting. 172 00:07:30,600 --> 00:07:33,160 Speaker 3: Thank you. Who's this? 173 00:07:35,320 --> 00:07:35,520 Speaker 2: Hi? 174 00:07:35,720 --> 00:07:39,600 Speaker 9: This is Amy. I'm a college professor. I'm calling about 175 00:07:39,840 --> 00:07:43,920 Speaker 9: the AI thing. I just heard the last caller, so 176 00:07:44,040 --> 00:07:47,400 Speaker 9: I can continue that conversation if you want, or you 177 00:07:47,440 --> 00:07:49,880 Speaker 9: guys can move on to, like, a different... 178 00:07:52,000 --> 00:07:52,480 Speaker 4: I love this. 179 00:07:53,240 --> 00:07:58,920 Speaker 3: Are you familiar? Jed here. Yeah? So as a 180 00:07:58,960 --> 00:08:02,200 Speaker 3: college professor, how many students have you caught with AI? 181 00:08:04,280 --> 00:08:07,680 Speaker 9: So what I end up doing is, any assignment that 182 00:08:07,960 --> 00:08:11,000 Speaker 9: I do, I put into AI to see what it could 183 00:08:11,160 --> 00:08:15,080 Speaker 9: possibly output, so I know what to look for, and 184 00:08:15,120 --> 00:08:17,920 Speaker 9: then, when I get their assignments back, I 185 00:08:18,040 --> 00:08:22,440 Speaker 9: run everything through AI, and it's probably about, like, ten 186 00:08:22,480 --> 00:08:23,440 Speaker 9: to twenty percent. 187 00:08:23,960 --> 00:08:24,280 Speaker 3: Damn.
188 00:08:24,400 --> 00:08:26,240 Speaker 4: And what, do you call them out? Do you fail them? 189 00:08:26,280 --> 00:08:26,360 Speaker 6: Like... 190 00:08:26,400 --> 00:08:27,400 Speaker 4: What's the next step? 191 00:08:28,760 --> 00:08:29,120 Speaker 1: Yeah? 192 00:08:29,200 --> 00:08:32,439 Speaker 9: I call them out, because it's university policy that if 193 00:08:32,440 --> 00:08:35,800 Speaker 9: you get caught using AI, it counts as plagiarism, which 194 00:08:35,880 --> 00:08:38,560 Speaker 9: could be grounds for expulsion. 195 00:08:38,800 --> 00:08:40,040 Speaker 4: The goal is to not get caught. 196 00:08:40,120 --> 00:08:41,200 Speaker 3: Do you expel them? 197 00:08:41,200 --> 00:08:41,280 Speaker 1: Like... 198 00:08:41,320 --> 00:08:42,520 Speaker 3: Do you look to expel them? 199 00:08:42,600 --> 00:08:44,680 Speaker 1: Or are you more so pulling them to the side 200 00:08:44,720 --> 00:08:47,840 Speaker 1: and being like, hey, you effed up on this, don't 201 00:08:47,840 --> 00:08:48,480 Speaker 1: do it again? 202 00:08:48,800 --> 00:08:50,240 Speaker 3: Or I will have to say something. 203 00:08:52,000 --> 00:08:54,760 Speaker 9: Yeah. So I'll comment back, because everything is like 204 00:08:54,760 --> 00:08:58,200 Speaker 9: an online type submission, so I'll comment back like, hey, 205 00:08:58,320 --> 00:09:01,280 Speaker 9: just to let you know, this was flagged for 206 00:09:01,920 --> 00:09:06,640 Speaker 9: seventy-three percent use of AI. So I'm going to 207 00:09:06,720 --> 00:09:10,280 Speaker 9: give you a chance to redo it, and if it 208 00:09:10,360 --> 00:09:12,240 Speaker 9: comes up again, then I'm going to have to fail 209 00:09:12,280 --> 00:09:16,280 Speaker 9: you and potentially report it to your advisor. But, you know, 210 00:09:16,360 --> 00:09:18,800 Speaker 9: I'll give you a chance to redo it.
211 00:09:19,120 --> 00:09:21,000 Speaker 6: Amy, I have a question. Is there, like, a certain 212 00:09:21,080 --> 00:09:23,520 Speaker 6: word that you'll see that you're like, oh, this is 213 00:09:23,559 --> 00:09:25,719 Speaker 6: definitely using ChatGPT? Because there was a story the 214 00:09:25,800 --> 00:09:29,280 Speaker 6: other week and a professor was saying, if anyone has "moreover"... 215 00:09:30,679 --> 00:09:33,240 Speaker 6: "moreover" in their essays... Do 216 00:09:33,200 --> 00:09:33,640 Speaker 7: you see that? 217 00:09:33,840 --> 00:09:35,280 Speaker 6: Is there, like, a word and you're like, you're not 218 00:09:35,400 --> 00:09:35,880 Speaker 6: using that? 219 00:09:37,720 --> 00:09:39,800 Speaker 9: So the ones that I look for: if there are any 220 00:09:40,000 --> 00:09:44,080 Speaker 9: random words that are bolded, that usually means that it's 221 00:09:44,160 --> 00:09:47,720 Speaker 9: ChatGPT, because if you put in any, like, keywords, 222 00:09:47,760 --> 00:09:49,920 Speaker 9: when the output comes out, it will be bolded. 223 00:09:50,160 --> 00:09:50,760 Speaker 3: How about... 224 00:09:52,679 --> 00:09:54,560 Speaker 9: Yeah, the dash, that's it. Get out of my head. 225 00:09:54,600 --> 00:09:55,360 Speaker 6: That was exactly it. 226 00:09:56,360 --> 00:09:56,600 Speaker 9: Great. 227 00:09:56,880 --> 00:10:00,439 Speaker 4: Friends actually already use the dash, though... the dash, AI, 228 00:10:00,679 --> 00:10:01,240 Speaker 4: all the time. 229 00:10:01,600 --> 00:10:04,040 Speaker 3: That's... yeah, that's what my fiancée looks for in her stuff. 230 00:10:04,200 --> 00:10:05,280 Speaker 3: You have to look for the dash. 231 00:10:05,520 --> 00:10:08,040 Speaker 4: And I know people who used the dash from 232 00:10:08,280 --> 00:10:08,880 Speaker 4: before AI. 233 00:10:09,960 --> 00:10:11,440 Speaker 3: Well, so it's called...
234 00:10:13,240 --> 00:10:18,000 Speaker 9: Okay, it's called an em dash, like E-M dash. And in 235 00:10:18,200 --> 00:10:19,959 Speaker 9: order to actually put it in, you have to, like, 236 00:10:20,080 --> 00:10:23,000 Speaker 9: push another key rather than just the normal dash. And, like, no, 237 00:10:23,400 --> 00:10:25,679 Speaker 9: no student's going to be going... no one has gone 238 00:10:26,040 --> 00:10:28,199 Speaker 9: for that. Like, ain't nobody, like, going to put that in. 239 00:10:28,559 --> 00:10:29,040 Speaker 9: Just whatever. 240 00:10:31,360 --> 00:10:35,520 Speaker 1: Well, then I know exactly... like, okay, yeah, this was... 241 00:10:35,880 --> 00:10:38,679 Speaker 1: this was a... well, thank you so much for 242 00:10:38,800 --> 00:10:41,480 Speaker 1: filling us in. And actually, all college students who listen 243 00:10:41,480 --> 00:10:47,199 Speaker 1: to us just said thank you. We love you, 244 00:10:47,600 --> 00:10:49,280 Speaker 1: we'll talk to you soon, and thank you for being 245 00:10:49,360 --> 00:10:50,880 Speaker 1: a teacher and doing what you do. You mean a 246 00:10:50,960 --> 00:10:51,840 Speaker 1: lot to the community. 247 00:10:52,000 --> 00:10:53,240 Speaker 3: We love our teachers. 248 00:10:53,480 --> 00:10:56,120 Speaker 1: We do. This portion of The Joe Show podcast is 249 00:10:56,200 --> 00:10:59,240 Speaker 1: powered by Farah and Farah, Tampa Accident Attorneys.