Speaker 1: Out of most of their streaming success.

Speaker 2: Shows Clay and Buck, today at noon on fifty-five.

Speaker 1: KRC. Eight oh five here at fifty-five KRC, the talk station. Happy Wednesday. Sadly, no Judge Napolitano, but at the bottom of the hour we're gonna hear from Scott Wartman from the Cincinnati Enquirer about the council race. In the meantime, EmpowerUAmerica dot org is where you find all the Empower U seminar series. They're all wonderful, all very informative, most notably ChatGPT. It's a brand new phenomenon for most of us out in the world, that artificial intelligence doing the work for us. Michael Mercer is doing the seminar, taking place tomorrow night. You can log in from home, or I think you can show up, yes, at the Empower U seminar classroom, three hundred Great Oaks Drive, where you'll hear Michael talk about ChatGPT. By way of background, he's obviously very well informed on this topic. He is the president of Screen Education, which addresses issues at the intersection of digital technology and human well-being, including smartphone addiction, news media bias, and artificial intelligence, through research, seminars, and consulting. We'll get to that. He, like most of us my age, started out with World Book encyclopedias. He spent a decade as an editor and publisher of college textbooks and another decade in market research before heading on over to president of Screen Education. He's got multiple degrees, and again, the seminar is tomorrow night, beginning at seven p.m. Welcome to the program, Michael. It's a real pleasure to have you on today.

Speaker 2: Well, thank you, Brian, I appreciate it.

Speaker 1: And an interesting element of your seminar. Now, part of me wants to say, well, if you're using ChatGPT, you're not doing the work and you're not learning anything. It's like you sit there in front of it with your mouth open, a little bit of drool coming out of the corner of your mouth.
You let it do all the work for you, and then you just regurgitate it back to whatever source you're seeking the information for. But you have a different viewpoint on this. You can use it to learn, right?

Speaker 2: So that's the point of my talk, is really how to use ChatGPT to teach yourself anything. That's really the point where I'm coming from with this. As you sort of alluded to, I've been sort of an autodidact, I've taught myself things, all my life. And during COVID I actually spent about thirty-five hours studying mRNA vaccines and how they work. I really needed to understand them for personal reasons, and so that was a huge self-directed learning project. And I was sort of shocked, because I would talk to other people, friends and acquaintances, about these vaccines, and they knew nothing about it. And I could see that they had no interest in learning about it. You know, they almost felt like they couldn't understand it if they tried. So I realized, like, wow, you know, maybe a lot of people don't teach themselves things. And then when I discovered chatbots, I'll tell you, as I mentioned, I spent about thirty-five hours teaching myself about mRNA vaccines; if I'd had a chatbot, I think I could have cut it to fifteen, you know. Yeah. Well, you know what...

Speaker 1: I have to interject this, because we went back and forth with my son. My son is now thirty-one. He was in the computer engineering department at Ohio State University, and I think that speaks volumes to his intellect, because it's tough to get in there. So he spends about a year and a half and then he drops out. And what are you doing? You're not paying for it, just get the damn degree. I don't want to sit in front of a computer doing coding, and besides, Mom and Dad, I can teach it myself. And we're like, well... Well, he ended up doing that. He got certificates for computer security. He knows how to code.
He did it all himself. He taught himself, so it can be done. And I make this point regularly, because he convinced us that he was right. Why spend all the extra money and resources when all of this material is readily available? Why would you spend one hundred thousand dollars to get an art degree when there's countless books on art? You can self-direct and learn all you want to know about your favorite hobby without having some teacher tell you how to do it. I think people are paying college tuition for the structure, the forced obligation to learn. Right?

Speaker 2: Right, right. So what I'm going to do in the talk, it's broken into three parts. The first part, and I feel this is critical for people who want to teach themselves things, right, the way that your son did, the first part, I'm going to walk through the learning process. So I've broken the learning process into six steps: you set a goal, you gather information, you vet the information, then you have to do the hard work to understand that information at a deep level, then you structure it, and then you integrate it into your current knowledge base. Right? So I'm going to walk through this self-directed learning process so people understand the process and how it works at each stage. Then I'm going to give an overview of chatbots, and specifically I'm going to use ChatGPT because it's so popular, right? But I want to show people, you know, this is what a chatbot is, this is the interface, these are the features that are critical and really helpful in self-directed learning. But I think another important thing is to teach them how these, you know, so-called large language models work, because I think if you can understand how large language models work, you can be more effective in using chatbots to learn, you know. And then the third part, I'm going to synthesize the two.
Then I'm going to close off by walking through each of the six steps of learning again and tying off, you know, specific ways you use the chatbot to optimize your learning at each stage. Okay? So I think it's really going to be helpful, to sort of codify this process for people, if they do want to jump in and try to teach themselves things, you know.

Speaker 1: Right. If you have a motivation to teach yourself something, this is an ideal class to do it, because it, it being the information on whatever topic you want, is out there in the world. So the key element that I want to focus on, just here for at least a moment, Michael, is vetting. You've got vetting in here, and of course vetting suggests to me you need to understand how reliable whatever given source is, where the source material came from, so you know if it's some sort of, like, learned treatise or it's just made up whole cloth. Because I keep hearing, I mean almost daily, and I'm sure you do as well, about idiot lawyers, for example, who let ChatGPT or some other AI program create case law, and they don't even bother going back to the original law books, or going to LexisNexis or Westlaw, to find out if it's real case law, and lo and behold, it's not. It makes stuff up. How do you know? How can you vet what you're reading and know whether it's even real or not, Michael?

Speaker 2: Well, that's a good point. You know, you're getting at, in terms of vetting, you're right, the information literacy, judging whether something is accurate and correct. Right. So I guess that's part of the vetting. And one way you do this is, I would say, first of all, don't just trust. You can't blindly trust the chatbot. You can't. So you, you know, you've got to go out and cross-reference things.
You can ask it to give you sources, and then you can link to those sources and look at the source and see if, in your judgment, it's accurate. Right. So information literacy, and judging the validity and truthfulness of information, is part of the vetting. The other part is vetting information in terms of determining what, given your goal... right, since you're the teacher, you're setting the goal, and you have an internal motivation, right? You want to learn what you're studying for internal reasons, right? It's not like you're taking a course and the goal is given to you, right? So you want to use your intrinsic goals as the guidepost for judging whether or not the information you're finding, that could be relevant to the goal, is actually relevant. So you're going to find there are things like, okay, it's tangential, I don't really need to understand that at a deep level given my goal, right? So that's the other part of vetting. So it's sort of, you know, both of those things, right? Judging the relevance but also the validity of the information, and it all goes to, like, primary sources and checking multiple sources to make sure it's, it's...

Speaker 1: Truthful. And having a fundamental knowledge of what a primary source is also is a helpful thing to have going into the process, isn't it?

Speaker 2: Well, that's true, that's true. Yeah. But even that, you know, there's a lot of judgment that comes into that, because you could have a primary source, but it could be biased. Like, you know, for example...

Speaker 1: Like a scientific consensus.

Speaker 2: Exactly. I was going to say a research study. Going back to COVID, right, you could have a research study on the vaccines or ivermectin or whatever it is, right, and it could be biased science, right? The people doing the study could have been biased, they could have changed data. So you have to bring some judgment, and, you know, there is some guesswork in estimating. It's not a perfect process, but you can do pretty well, I think, you know.
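For readers who want to try that vetting step at a keyboard, here is a minimal sketch of asking a chatbot to name the sources behind a claim so you have something concrete to cross-reference. It assumes the OpenAI Python SDK with an API key in the environment; the model name and the sample claim are placeholders, and any citations that come back still have to be checked against the real documents, since chatbots can invent plausible-looking references.

```python
# Minimal sketch of the vetting step: ask for the sources behind a claim,
# then go read them yourself. Assumes the OpenAI Python SDK (pip install openai)
# and an OPENAI_API_KEY in the environment; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

claim = "mRNA vaccines do not alter human DNA."  # sample claim, invented for the example

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": (
            "Explain the evidence behind this claim, and list the primary "
            "sources (authors, journals, years) I should read to verify it "
            f"myself: {claim}"
        ),
    }],
)

print(response.choices[0].message.content)
# Treat the citations as leads, not proof: models can fabricate references,
# so each one has to be located and read before it counts as vetting.
```

The same pattern works for asking it to argue the strongest opposing interpretation, which is another quick cross-check on a single source.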
Speaker 1: So, for example, if I'm interested in, like, whether carbon dioxide is bad for people or something, you're going to have research studies that would suggest it is some sort of greenhouse gas, and it'll make that conclusion based upon some other scientific consensus. But then again, you can ask it to provide an alternative viewpoint on that, can you not? So it would go to other sources?

Speaker 2: Yeah, perfect. Yeah, perfect point. Yes, you can. And see, that's the great thing about the chatbots, right? I basically look at these chatbots as expert-level tutors on every subject, right, that are available twenty-four-seven. You can access them twenty-four-seven, and they can speak to you at any level of detail you want. Right. So what you're doing, through this iterative process, is I chat with these chatbots like I'm chatting with a teacher or a person, right. And what it's doing, with every iteration of your conversation, is continuously refining its understanding of your perspective, until it gets to a point of what I call cognitive mirroring, or achieving cognitive alignment with you, right?

Speaker 1: Is that you getting an answer you ultimately want? Like an expert: if I hire an expert as a lawyer, I'm pretty darn certain I can get that expert to reach a conclusion that's beneficial to my client. Can you do that with ChatGPT or other AI resources?

Speaker 2: You can, the more it understands your goal. So one thing I'm going to lay out in the talk is how you accelerate through the process of having it aligned with you cognitively, right?
Sure, you can. So you can almost do, like, a brain dump at the beginning of the project, so it fully understands what you want to do. And this is the thing, Brian. Once it knows, if it knows you're working on a project... say, I used ChatGPT to learn about large language models and chatbots for this talk I'm going to give, right? And once it knew that, it's like, okay, he's working on this presentation. He wants to understand them, but he wants to understand them at a level at which he can explain them to a layperson, right? And he himself is not, like, a tech expert, right, so he has to understand it at that level. It takes what you're supplying and says, here's the information, here's how you can present it to your audience at a conceptual level so the average person can understand it, and if you want, I can create a slide for you, you know, to insert in your talk. That's what it starts doing. And so, you know, they can be extremely helpful once they really, fully understand what you're doing and what your goal is and how you're looking at the project. You know, it's amazing.

Speaker 1: Yeah, I get that, and I can see it. For example, again going back to my days as a litigation attorney: sometimes you have an extraordinarily complex case, you know, the facts and circumstances and law and elements of it that you really get used to, and you're accustomed to talking amongst yourselves with your other lawyers, where everybody gets what you're talking about. But you've got to be able to explain it to a jury who has no idea where you're coming from. You've got to boil it down, easily understandable, a story to tell which connects all the dots, so it makes sense to people. That sounds to me like what you're accomplishing here.
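As a rough illustration of that brain dump idea, here is a small sketch of seeding a conversation with your goal, audience, and background up front, and then keeping the whole exchange in one thread so each answer is shaped by everything said before. It again assumes the OpenAI Python SDK; the goal text, the model name, and the helper function are invented for the example.

```python
# Sketch of the "brain dump" at the start of a self-directed learning project:
# state the goal, the audience, and your own level once, then resend the full
# conversation history on every turn so the answers stay aligned with that goal.
# Assumes the OpenAI Python SDK; model name and goal text are placeholders.
from openai import OpenAI

client = OpenAI()

history = [{
    "role": "system",
    "content": (
        "I am preparing a talk that explains large language models to a lay "
        "audience. I am not a tech expert. Explain everything at a conceptual "
        "level I could repeat on stage."
    ),
}]

def ask(question: str) -> str:
    """Send a question along with the entire prior conversation."""
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=history,      # the whole history goes back with every turn
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(ask("What is a large language model?"))
print(ask("Give me an everyday analogy I could use in the talk."))
```

In this bare API sketch, resending the accumulated history is what keeps the model aligned: each call is stateless, so the "cognitive alignment" lives entirely in the context you keep handing back.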
Speaker 2: That's right. So after the vetting process, once you decide, okay, these are the five things I need to learn to understand this, then the next step in my learning process is actually understanding it. You know, you won't fully understand it just because you've vetted it; you still have to go deep into it and understand it. That's when these chatbots can be incredible, because you can just keep going back and forth and keep asking it questions about every minute thing that you don't understand, and it'll explain it to you at the level you want to understand it, or need to understand it. It'll clarify the micro elements that you're not getting. It'll give you metaphors that help you understand. I want to give you an example. So these large language models, it's really complex stuff. So I learned a lot more than the average person, but I'm by no means close to, you know, being a quote tech expert on this, right? But I was struggling with this for hours, okay, to understand large language models. And at some point the chat gave me an analogy that just crystallized the whole thing for me, right? It was trying to explain: well, a large language model isn't thinking, it's actually just using probability to give you the next likely word in its response. So it's a probability machine, right? And it gave me this metaphor, it said, it's like autocomplete on steroids. Right? So that's literally what it's doing. It's like on your phone, right, when you're typing a text message to someone: you're starting to type the next word, and it gives you three options to pick one. It's calculating the most likely next word you're going to use given the context of what you're typing, right? Well, that's what a chatbot is literally doing, but it's infinitely more complex. The calculation is infinitely more complex. But it's really just a machine. It doesn't, you know, it's not thinking. It's not a person.
You know, it's easy to anthropomorphize these things, right? You really feel like it knows you. But that crystallized it for me: like, okay, it's really just a machine. It's a probability-generating machine.

Speaker 1: Really. And that's where the human element is always going to be critical in going through this exact process that you're going to be talking about tomorrow night at seven p.m. So yes, it is, and it won't end your job. AI: Michael Mercer, how to teach yourself anything using ChatGPT. Log in, get registered before you do. EmpowerUAmerica dot org. Register; seven p.m. is log-in time, or the time to be at three hundred Great Oaks Drive for the live discussion. It's going to be fascinating, Michael. I appreciate you doing this and helping to educate us, and obviously giving us some really positive uses for artificial intelligence, when I think most of us are scared that it's going to end our careers, Michael.

Speaker 2: Yeah, well, thanks. I hope it helps people, I really do.

Speaker 1: So that's what you're all about, Michael, and I appreciate you doing that. Get some help tomorrow night, seven p.m. Thanks, Michael, have a wonderful day, and good luck with the seminar. And folks, when you get in touch with Galaxy Concrete Coatings, because you're going to
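And for anyone who wants a concrete feel for the "autocomplete on steroids" description above, here is a deliberately tiny Python sketch: it just counts which word follows which in a short sample sentence and then predicts the most frequent follower. A real large language model conditions on vastly more context with billions of learned parameters, but the core idea, picking a probable next word, is the one Mercer describes.

```python
# Toy "autocomplete": count which word follows which, then always suggest the
# most frequent follower. A crude stand-in for the next-word-probability idea
# behind large language models (the sample text is invented for the example).
from collections import Counter, defaultdict

text = ("you can teach yourself anything if you can ask good questions "
        "and you can check the answers against good sources")

words = text.split()
following = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    following[current][nxt] += 1        # how often `nxt` appears after `current`

def suggest(word: str) -> str:
    """Return the word most often seen right after `word` in the sample text."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else "(no guess)"

print(suggest("you"))        # -> "can" (follows "you" three times)
print(suggest("yourself"))   # -> "anything"
```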