Speaker 1: Hi everyone, this is Lee Klaskow, and we're Talking Transports. Welcome to the Bloomberg Intelligence Talking Transports podcast. I'm your host, Lee Klaskow, senior freight transportation and logistics analyst at Bloomberg Intelligence, Bloomberg's in-house research arm of almost five hundred analysts and strategists around the globe. Before diving in, a little public service announcement: your support is instrumental to keep bringing great guests onto the podcast like the one we have today. So we need your support. Please, if you enjoy the podcast, share it, like it, and leave a comment. Also, if you have any ideas for future episodes or just want to talk transports, please hit me up on the Bloomberg terminal, on LinkedIn, or on Twitter at LogisticsLee.

Speaker 1: Now onto our episode. We're delighted to have with us today Yossi Sheffi. He's the director of the Center for Transportation and Logistics at MIT. He's an expert in systems optimization, risk analysis, and supply chain management. Doctor Sheffi has authored a number of textbooks on these topics. His ninth and latest book is The Magic Conveyor Belt: Supply Chains, AI, and the Future of Work. He consults with leading enterprises and founded or co-founded five successful companies. Yossi, or Doctor Sheffi, welcome to the podcast. Can I call you Yossi? Is that all right?

Speaker 2: That's the only name I answer to, so it's fine.

Speaker 1: Okay, fantastic. So, a little background about the MIT logistics program: can you just talk a little bit about what you guys do over there?

Speaker 2: The center itself is now fifty-two years old. It started, as I said, fifty-two years ago, mostly around public transportation. But a little over thirty years ago, thirty-three years ago actually, when I took it over, I moved it to logistics, supply chain management, and freight transportation, basically, rather than transit and urban planning.
Even though my first book was on urban transportation networks, how you analyze urban transportation networks, my next eight books were on supply chain, logistics, and related issues. The center itself is an interdepartmental center. It's housed in the School of Engineering, because we are engineers, but it employs faculty from across MIT. That's the nature of an interdepartmental center. At MIT, the departments, if you think about it, are silos; they're vertical. Interdepartmental centers go horizontally, across the departments and the schools, so we have people from the School of Management, from the School of Engineering, from the School of Urban Planning, the School of Science, and so forth. We have several programs. In one sense, we are the most complex unit, or the most interesting unit, at MIT, in that we have everything that one can have. We have our education program: we offer a master's program in supply chain management and a PhD program, only graduate programs. We have a very extensive research program, many research programs. We have an extensive industrial partnership program, an industry partnership program where we work with industry; in fact, we can talk more about that later. And we have five international centers that we set up, in Colombia, Spain, Luxembourg, China, Malaysia, and now the UK. We just launched the UK one. These all started as a copy of our center, but we run it all as one big network. We also have very large online programs, and we can talk more about those depending on what you're interested in.

Speaker 1: Sure. Just for context, about how many kids graduate in the undergraduate class each year?

Speaker 2: Around eighty master's students, eighty students from our program, and there are another one hundred and fifty from the programs around the globe. Those are master's students. Plus we have about four or five PhD students every year studying here.
So we have about, I think, twelve or fourteen PhD students in our program right now, and it takes them about four years, so about three or four graduate every year.

Speaker 1: And how many undergraduates?

Speaker 2: We don't have undergraduate programs.

Speaker 1: You don't have undergraduates? Okay, well, that would make sense. So it's just a graduate program, then.

Speaker 2: Undergraduate programs are managed only by departments, not by centers. The other centers at MIT may not be as big as ours, but centers manage only graduate programs, because they involve people from all kinds of fields. In fact, we believe in this: we want people to study, to have a profession, before they come to study engineering systems, industrial engineering, economics, some subjects that give them some management, some, you know, ability to do some analytics, and some breadth. So that's basically our main input to the program. And we have eighty. We have an unbelievable number of applications, so it's very hard to get in, but we take eighty. They are divided forty and forty. Forty are here for the master's program, here in the residential program: they come, they start in August, and they finish in May or sometimes the beginning of June, so it's about a ten-month program. And we have what we call the blended program, where people take half the program online. It takes them about eighteen months to three years. And then we take the cream of the crop of this. And when I say cream of the crop, there are one point two million learners in our online program, so we have a lot to choose from. We take the best students that we have, and they come to MIT for only five months, only for one semester. Those are the other forty students.
And then it's more complex, because we have so many students online that there are dozens and dozens of universities who accept our online program and give those students one semester, or sometimes two semesters, to finish and get a full master's degree, because if you just take the online program, you get a certificate from MIT, but not the full master's degree.

Speaker 1: Right, okay. And so what are your students, or what are you, focusing on right now? Obviously there's a lot to talk about, you know, whether it's technology, the economy. So, you know, what are you guys focusing on right now?

Speaker 2: Okay. The educational program does not change that quickly, because it's focused on fundamentals: focused on the ability to design a network, to optimize inventory, to work on advanced procurement, distribution, transportation. These issues don't change much, because they're based on principles, and those don't change. However, students also have to do a thesis, or, you know, a final project, and those topics change from year to year, because they have to work with industry. We insist on students working with real data, with real companies, on real problems. So in that sense there's no pure theory here. They have to work on real problems with real data, and understand that the data is never clean and never, you know, the way you want it; you have to clean it, you have to work at it, and you have to use it. So, coming back to your question: when our students finish, the thesis topics are usually the more up-to-date topics. My guess is that next year we'll see a lot of theses about tariffs. The last two years, every other thesis was about AI. Before that it was blockchain, or, you know, whatever the companies are interested in, because they are matched with the students to do a project.
So, about the companies: in August, we have all our partners present areas that they want to investigate. They want to employ a student for ten months. Now, I should say these are limited engagements, because, especially when new companies join our group, they think they'll strike gold within ten months: you know, a student will solve problems that McKinsey was not able to solve in five years and with five million dollars. It's not quite that. So we go through a process of understanding the scope, what can be done and what cannot be done. Then the student works, an advisor works with them, and somebody at the company is responsible for it. So in that sense, these projects are more up to date. We also have lots of outside presentations and lectures about ongoing research, ongoing issues with industry. These are more up to date. But the basic program is based on principles. It does not change quickly from year to year.

Speaker 1: You mentioned AI. So, in your view, you know, you mentioned a couple of other things like blockchain, which really kind of fizzled out. You know, it seems like AI is truly one of the transformational things that could impact not only transportation, but obviously the broader economy and other industries. Can you talk about, you know, where you're seeing AI in supply chains, and where do you think it could go?

Speaker 2: Okay. Where do I think it can go? That's, you know, hard to predict. The technology is still evolving so quickly, and the capabilities are changing so fast, that it's hard to imagine all the changes. However, we already see cases where it is very helpful.
For example, in procurement, we see companies implementing machine learning and generative AI, making sure that the whole process, from generating, you know, requests for information, to requests for proposal, to requests for quote, to contracting, to signing contracts, to then following what's going on, is done semi-automatically at this point, but it's going to become more and more automated using these advanced AI methods. We see, of course, you know, the things that you don't even think about. When you talk to your favorite customer service representative, it used to say press one to get here, press two to get there, press three to get here. Now you just talk, and, you know, the chatbot talks back to you. So it's done in natural language. Lots of the communication between people and computers is now done with language. There's a lot of work going on in network design that tries to do it, you know, with more AI; this is not yet widespread. There is a lot of work on infusing more and more robots with more and more capabilities. So if you think about a warehouse where robots are running around, you have to worry about safety, because there are people there, and you also worry about the robots themselves. Think about it: each robot running around the warehouse has a space of, let's say, a meter, a meter and a half, around it, such that if anything enters this space, the robot stops. Now we have projects, basically a lot of AI in this, lots of sensors and AI in combination, that try to shrink that space to, let's say, a lot less, maybe a foot. This means that the robots can be a lot more efficient; more of them can run around and not run into each other or run into people. Some of these technologies are the same as those used in autonomous vehicles. And it's a combination; it's not just AI.
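To see why shrinking the stop zone matters so much, here is a minimal back-of-the-envelope sketch (not from the interview; the floor area and robot body size are assumed numbers purely for illustration). If each robot effectively reserves a disc of floor whose radius is its body plus the stop distance, the number of robots that can share a floor grows with the inverse square of that radius.

```python
import math

FREE_FLOOR_M2 = 2000.0  # assumed shared open floor space (illustrative)

def max_robots(stop_radius_m: float, robot_radius_m: float = 0.4) -> int:
    """Crude upper bound: each robot reserves a disc of (body + stop) radius."""
    reserved_m2 = math.pi * (robot_radius_m + stop_radius_m) ** 2
    return int(FREE_FLOOR_M2 // reserved_m2)

print(max_robots(1.5))  # ~1.5 m stop zone, as in the example above -> 176
print(max_robots(0.3))  # ~one foot -> 1299, roughly seven times as many
```

Under this crude packing bound, cutting the stop distance from a meter and a half to about a foot lets roughly seven times as many robots share the same floor, which is the efficiency gain being described.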
It's a combination of sensors and lidars and radars, and some smart software to process all of this and reach a conclusion. What we see coming down the pike is agentic AI, where there'll be a piece of software that will go out and perform tasks on an individual level. Even today, you can ask some of the leading AI providers to design a trip to Venice, and, just like a travel agent, they'll come up with the flights and hotels and suggestions of where to go. I haven't used it myself, so I don't know how good they are, but it's an example of what's coming. Clearly, people are using it. Take a law firm: people can send an agent, and it's being done, to collect all the precedent cases for some lawsuit, summarize them, and present them. Now, the problem is it sometimes hallucinates and comes up with nonsense. But this problem is being, you know, addressed better and better. It's not perfect yet, by any stretch of the imagination. We see it even, you know, when we use Google Maps: it's becoming better and better. Translation is becoming better and better because of the use of AI. So we see it happening in all parts of life, and certainly in supply chain management.

Speaker 1: And do you guys at the center use AI?

Speaker 2: Yes, we do. In fact, we encourage our students to use AI. We had to come to terms with the fact that you give an assignment or a case study, and people use ChatGPT in order to give you an answer. This changed how we teach and how we work. We separated out and put a lot more emphasis on the assessment part. So, for example, assume there's a case study. You send students home to do the case study, and you don't know if they came up with their answer through ChatGPT or some other tool, or if they actually did the work. What you do is, in the class, you don't talk about it.
You just tell them: okay, I'm going to point at you, and you have to tell me what's going on with the case. Just cold-calling. The fear it induces makes sure that they do the work. So we had to adapt ourselves. For example, in the teaching, we don't want them not to use AI, because they'll have it everywhere at work. We want them to learn how to use it, and we want them to get better at it. But of course we tell them that when you submit, and ChatGPT submits nonsense, it's you who gets the F, not OpenAI. So anyway, honestly, it's still a work in progress; we're still learning to work with it. There are now, for example, pieces of software such that, when the students submit something, you can analyze it and see what percentage of it was written by AI. Turns out you can do it. Amazon, for example: I did several of my books with Amazon, and if you submit a text for a book to Amazon, they'll analyze it, and if more than fifty percent is written by AI, they're not going to publish it. Kindle is not going to publish it. So it starts to be an arms race, you know, stuff and anti-stuff, and we'll see.

Speaker 1: So, in your career in supply chains and transportation, do you view AI as the most transformative change or technology that you've seen?

Speaker 2: Well, being old means that it's hard to make this statement. I was already alive when containers came along. You know, containers changed everything, changed the whole industry. Then the Internet changed things. It's hard to say. You know, the Internet came along and changed how we work, how we do stuff, how we communicate. I think that AI is a change on that scale. It will change how people work; it will change relationships. There's an interesting article, in either the Wall Street Journal or The Economist, about how this is the last crop of CEOs who still manage only humans.
They'll have to manage humans working with, you know, digital agents. And actually, there are HR programs that are starting to think about how we manage agents. I'm not sure HR professionals are ready for it, because they are taught to show empathy and to work on the regulations for human resources. But it just goes to show you how people are starting to think about it. This is still the future, really; we're not there yet. In fact, interestingly, I'm now working on a book; it's not even close to being done, because it's a moving target. It's about what is actually being used in supply chain management, because when everything is said and done, you know, a lot more is said than done. People are trying to do some testing, some experiments here and there. There are some people who are starting to put it to work at scale, in commercial use. So I'm trying to find out where these nuggets are, talking to a lot of companies, trying to find what is being done. Mostly what I find, of course, is ChatGPT and just talk; the tech is everywhere. What's interesting is the stuff that works, so I'll give you some of the things that I've found, the stuff that works. Let me take a step back. You know why AI software was so successful at chess and games, at playing chess, at playing Go, which is much more complicated? Because these are systems with rules. The objective is clear, you can repeat it, and it's the same thing every time. The world is not like this. We are very far from being able to ask ChatGPT, or any of the other programs, to design us a global network that will respond to risks and be efficient and, you know, make sure that we do not burn the planet, worry about sustainability and risk, and, you know, be low-cost and efficient and fast. It can't be done. We're not there, because in these systems there are so many objectives, and sometimes they contradict each other.
The world is changing all the time. So this is what you might call an open system, versus chess, which is a closed system. But for contained problems: let's say you go to your favorite, you know, window at a drive-through to buy coffee. In many of these places, you're not talking to a person when you order; you're talking to ChatGPT or to a chatbot. And it works. Why does it work? Why does it work? Because you are not going to go to the window and ask questions like, is there a God? You would ask questions like, can I have almond milk with my coffee? Which is easy to train on; the questions are the same. You do machine learning on all the data that you have, and it works. It works pretty well. So when the problem is contained on several dimensions, we're there; we can do it. The hope is to work on problems that are not contained, that are really big and ugly and changing all the time. But we're not there yet.

Speaker 1: Gotcha. And, you know, you mentioned another technology earlier. When you were talking about robots, you mentioned it's the same technology, or similar technology, for safety with autonomous vehicles. What's your take on autonomous vehicles on the roadway? Do you think that future is close, or do you think it's a lot further out than maybe those that are pretty bullish on autonomous vehicles believe, whether it's trucks or cars?

Speaker 2: I would say that, to some extent, the future is here. You go to San Francisco or to Phoenix, you can hail an autonomous taxi. And in the South you can have autonomous trucks, mostly still running in training, but some of them are running in limited commercial use, very limited, and only in the South, where the roads are open, where you don't have snow, where you don't have the cow paths that define Boston roads. It, you know, works there.
There are, however, issues with this. The issues come with every system that involves humans and robots, humans and autonomy. There's a question of public acceptance. Let me give you an extreme example. Today, a modern aircraft, a seven eighty-seven or an A three fifty, is actually a drone; it can fly by itself. But not too many people would fly in an aluminum tube at thirty-five thousand feet across the Atlantic with nobody up front. Yet it's possible; the technology is there. You know, these are really drones. I mean, you see what drones do autonomously: they bomb, they go back, they collect intelligence. It can be done, but the public acceptance is not there. Even though, by the way, people are working on it: we used to have five people in the cockpit of an airliner, and now it's two, and there's a lot of work on getting it to one, with the other one being an AI helper, and at the end of the day it will become autonomous. So the question is, how do people feel about a large, weird-looking truck running at a hundred miles per hour next to them on the highway with nobody up front? And we will have accidents, and who knows how it will play out; people may say, no, this is too dangerous, we cannot have it. It may come anyway, because there'll be impacts on jobs. You know, truck driver jobs: in thirty-some states, truck driver is the number one job, I mean the number one job category. So that's lots of trucking jobs potentially going away. We'll see. Look at what happens in the ports, which are very hard to automate because of the unions. And if you go to ports, as you must know, you go to Dubai or Singapore or Rotterdam, and then you go to an American port, you get depressed. It is what it is. I mean, I understand, but there's a lot of pushback on automation. It also takes a longer time than people think, always a longer time than people think.
Let me give you one example. You know, we used to have telephone exchange operators. In eighteen ninety, AT&T came up with an automatic exchange based just on numbers. You didn't have to call Susan at the exchange and ask her to find out where Mister Klaskow is having lunch today so we can talk to him. It was all automated. Now, honestly, I think we lost some level of service, but we gained in efficiency. Turns out it took a long time: by nineteen fifty we still had three hundred and fifty thousand exchange operators. Only by nineteen eighty, which is nine decades later, had these people disappeared. So it takes time until technology catches on. To be sure, it's now moving a lot faster, and the reason is that most of today's technology starts with consumers. They pick it up because it's useful, it's interesting, it's cool, and then it goes to companies.

Speaker 1: You mentioned acceptance of autonomous vehicles. I was in a Waymo cab in Southern California a couple of months ago, and it was definitely cool; my anxiety was definitely higher than in a normal taxi. But yeah, I can see people, you know, not being very comfortable going into a plane without at least two people in the cockpit. And so, you know, another thing I'm assuming you guys are looking at, and we're in extremely early innings and we don't really know how this is going to play out, is all these tariffs that are coming out of the US. I mean, are supply chains going to be redrawn? Or is it just going to be inflationary, and we're all just going to deal with higher prices, and things are still going to be, you know, sneakers are still going to be manufactured in Vietnam, and, you know, those jobs aren't obviously coming to the US? So how do you see this playing out from a supply chain standpoint?
Speaker 2: Let me start more generally. Okay: during the decades of globalization, most of the population around the world benefited, with a much higher standard of living. There was a small minority that actually suffered, the small minority of people who lost manufacturing jobs in the industrial heartland of the United States, for example; ghost towns that lost employment, which in some sense led to the fentanyl crisis later on. So we had most people, you know, enjoying it, and a few people suffering. With the current tariffs, we are turning that upside down: most people are going to suffer, and a few people will benefit. The people who benefit, for example, are domestic manufacturers, who will have less competition and will be able to raise their prices and increase their margins, and they will keep, by the way, lobbying to continue this. So every economic dislocation creates winners and losers, and the winners will keep making it hard to dial it down. Now, let's talk about what happens to supply chains. In some sense, I'm less worried: after COVID and Ukraine and the Middle East and all this, in many ways it's just another day at the office. I was just with a group of fifty supply chain officers, last week, when it was announced, and basically the main response was that nobody was panicking. Some people immediately said, okay, even with fifty percent tariffs, and now China is at sixty percent or something, when you add everything up, it still doesn't mean we can move production to the United States. The unfortunate thing, which the administration does not seem to pay attention to, and which people and the media don't pay much attention to either, is the fact that it takes three years to build a manufacturing plant. You cannot just cut a deal. And then, by the way, we don't have the workers. You know, when TSMC built a plant in the Phoenix area, they said that making the chips in the US will cost fifty percent more.
Why? Because we don't have enough engineering talent, and they have to train people. Every time they bring engineers on, they have to train them. So it's not them; our biggest failure is the education system. The US education system has failed miserably. You know, MIT is not a good example: out of three hundred million people, whether at MIT or Stanford or Harvard, we get the top students, even in STEM, science, technology, engineering, mathematics. But the average level in the US is going down significantly. And to me, this is the biggest fear, and, by the way, one of the main reasons that we have these, you know, deficits in the balance of payments, because otherwise we would have attracted more companies. But companies know that the talent here is lacking. And the administration, rather than investing mightily in education, I mean, they try to root out DEI and other things from education. Okay, fine, but make sure that you focus on education, on real education. I'm not sure they're doing this. I mean, there's lip service, but they're not doing it, because we are graduating kids who can hardly read and write, forget being able to do real engineering. The number of people who can do advanced engineering is way too small, and it's not because of the universities; it's because of the input that comes into the universities. It's not enough. Anyway, that's a long answer to a short question.

Speaker 1: Yeah, I'm sure we could probably talk a long time about it. Do you see any industries or manufacturing processes that are easy to bring back to the United States? You know, we mentioned chips, we mentioned sneakers. You know, I can't see, you know, people wanting to do that sneaker manufacturing, and I don't think Americans are going to pay five hundred dollars for a pair of Nikes.
Speaker 2: The problem is that with American labor rates and American, you know, environmental rules and all kinds of regulations, the prices will be very high. So the question for a sneaker company is: do we pay sixty percent more and sell the sneakers for, say, three hundred dollars? Or do we bring it to the United States, wait three years, build a plant, not find the workers for it, and then, when we finally automate it and get it to work with a minimum number of workers, it's five hundred dollars a pair and we can't sell them anyway? So it's not clear. Look, there is a real problem with, you know, the automotive industry. They're being hammered, because many of the parts cross from Canada and Mexico to the United States several times, and each time there will be a tariff on them. So it's getting ridiculous. If we want to do something, we should adopt a value-added tax, just like most of the world has a value-added tax. It is the least disruptive tax, but the US never went to it. It's much more efficient than the sales tax that we charge here. It also has so many other pluses: it's very hard, you know, to do stuff under the table, because every stage of production must report it and then get the money back. That's the whole way it works along the supply chain.

Speaker 1: So can you, I guess, explain that a little bit? Why is the value-added tax the better tax?

Speaker 2: It's more efficient. So let's say you have a twenty percent tax; most of Europe has a twenty percent tax. And let's take a very simple case: one supplier and one manufacturer. The supplier charges the twenty percent tax. It comes to the manufacturer. The manufacturer now has value added.
He pays the twenty percent, but only on the value-added part; for the part that comes from the supplier, he can get the money back. He applies to get the money back, so everything has to be reported upfront in order to get the money back. Now, finally, the one who pays the full freight, so to speak, is the consumer. The consumer cannot get anything back. So the consumer pays the twenty percent, but along the way, companies paid just for the value that they actually added, and anything the supplier paid, the buyer can charge back. So it kind of moves along the supply chain, with people charging and getting the money back, charging and getting part of the money back. It requires significant, you know, management, but hey, we have a big IRS, so it can be done. It requires software and all this, but it's a much more efficient way of taxation than, for example, the sales tax, which doesn't have any of these qualities. By the way, the administration makes a mistake by looking at the value-added tax as if it's distorting the market. First of all, the Europeans cannot get rid of it, because it funds most of the government; it's a huge part of government funding in every country there. I think only three or four countries in the world, and the US is one of them, do not have a value-added tax. Every economist will tell you it's the most efficient way to tax, but we don't do it. So, anyway, the administration took the value-added tax into account, thinking that it's like a tariff. It's not, absolutely not. But anyway, the administration did a very simple calculation. Let me add something, though, just for the record, so to speak. It is clear that some of what Trump is saying, that we were being taken advantage of, is correct. It's absolutely correct.
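As a concrete illustration of the VAT mechanics just described, here is a minimal sketch with made-up prices; the twenty percent rate follows the European example above, and the supplier/manufacturer amounts are assumptions purely for arithmetic.

```python
VAT_RATE = 0.20  # the European-style rate used in the example above

def vat_stage(sale_price_net: float, input_vat_paid: float):
    """One firm in the chain: charge VAT on the sale, reclaim VAT on inputs."""
    vat_charged = sale_price_net * VAT_RATE       # collected from the buyer
    net_remitted = vat_charged - input_vat_paid   # input VAT comes back
    return vat_charged, net_remitted

# Supplier sells parts for 100 (net of tax); it bought no taxed inputs.
supplier_vat, supplier_remits = vat_stage(100.0, 0.0)

# Manufacturer buys those parts, adds 150 of value, sells for 250 (net).
maker_vat, maker_remits = vat_stage(250.0, input_vat_paid=supplier_vat)

print(supplier_remits)                 # 20.0 -> tax on the supplier's value added
print(maker_remits)                    # 30.0 -> tax on only the 150 of added value
print(supplier_remits + maker_remits)  # 50.0 -> 20% of the final 250 price,
                                       #         borne entirely by the consumer
```

Each firm remits tax only on the value it adds, and because every reclaim requires the previous stage's reported invoice, under-the-table sales break the chain, which is the self-enforcing property described above.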
And, by the way, every administration, the last, I don't know, five or six administrations, complained about it. Right? Trump is trying to do something about it. Now, the way he works on everything: look at how he treats universities. First, take away a few hundred million dollars, and then let's talk; so you certainly get people's attention. It's just like what he's doing now: he said that already fifty countries have come back to renegotiate. Okay, we'll see. We'll see how that goes. I hope they'll negotiate. What can also happen is that in a few months, when the full brunt of these tariffs hits consumers, the outcry will reach Congress, will reach the Senate, and we may have, you know, congressional action: they'll simply outlaw some tariffs, which is again a bad way of doing it, without analysis. It should be the executive branch doing this, but it should be done with a lot of analysis: which items should be taxed, what the tariffs should be. It should be a lot more sophisticated than just a blanket X percent on everything and the simplistic calculation that the administration did. But, again, you know, Trump said he was going to do it. In some sense, he was elected based on what he said. He never hid the fact that he's in favor of tariffs. I mean, he never hid it; he was talking about it on the campaign trail, and people voted him in. So it's hard to complain about it, because we got what we voted for. When I say we, I mean the majority of the population got what they voted for. So here we are.

Speaker 1: So let's shift gears a little bit. You're not only in academia; you've also been involved with a number of companies. Looking at your bio, you founded or co-founded five companies that were acquired by other companies. So that's a pretty darn good track record.
Can 583 00:41:18,800 --> 00:41:21,480 Speaker 1: you talk about, kind of, the lessons learned 584 00:41:21,560 --> 00:41:28,080 Speaker 1: from founding companies in supply chain, and, you know, 585 00:41:28,160 --> 00:41:30,520 Speaker 1: maybe speak to those experiences a little bit? 586 00:41:31,120 --> 00:41:37,480 Speaker 2: Sure. So some companies were software companies and some were not. 587 00:41:38,480 --> 00:41:41,879 Speaker 2: Maybe one of the more interesting companies was called Logicorp. 588 00:41:42,239 --> 00:41:47,040 Speaker 2: I was consulting at the time to Rockwell 589 00:41:47,080 --> 00:41:51,959 Speaker 2: International and befriended the vice president of logistics there. 590 00:41:52,400 --> 00:41:57,399 Speaker 2: Rockwell was selling truck parts, everything under 591 00:41:57,440 --> 00:42:02,000 Speaker 2: the truck: brakes, axles, what have you. At that time, 592 00:42:02,080 --> 00:42:07,560 Speaker 2: deregulation of the transportation industry came into being, 593 00:42:07,800 --> 00:42:11,120 Speaker 2: and we just looked at each other and said, okay, 594 00:42:12,200 --> 00:42:16,799 Speaker 2: carriers can now serve every route without getting the 595 00:42:16,840 --> 00:42:19,640 Speaker 2: government to approve it. So let's do what's called a 596 00:42:19,719 --> 00:42:25,160 Speaker 2: market test, an auction. We did it; we got the right to 597 00:42:25,160 --> 00:42:27,600 Speaker 2: do it under Rockwell, we ran the auction, 598 00:42:28,360 --> 00:42:33,560 Speaker 2: and the results were astounding: prices became half 599 00:42:33,600 --> 00:42:38,400 Speaker 2: the price, or, you know, forty percent better, and with 600 00:42:38,520 --> 00:42:43,040 Speaker 2: the same carriers as before. And so we looked 601 00:42:43,080 --> 00:42:45,960 Speaker 2: at it and said, okay, there's a business here. So 602 00:42:46,000 --> 00:42:53,120 Speaker 2: we convinced Rockwell to start a separate unit, and then 603 00:42:53,320 --> 00:42:58,000 Speaker 2: basically we bought it from Rockwell. 604 00:42:58,160 --> 00:43:02,160 Speaker 2: We ran it three years under Rockwell, 605 00:43:02,200 --> 00:43:04,880 Speaker 2: where it didn't grow much, because the carriers were afraid 606 00:43:04,880 --> 00:43:08,080 Speaker 2: that we were biasing these auctions, that the 607 00:43:08,160 --> 00:43:11,040 Speaker 2: results were biased toward Rockwell and 608 00:43:11,239 --> 00:43:13,919 Speaker 2: Rockwell's customers, and we said we must get it out. 609 00:43:14,400 --> 00:43:19,359 Speaker 2: We found an angel investor. We took it out, and 610 00:43:19,440 --> 00:43:22,440 Speaker 2: it grew: it was about forty million when we 611 00:43:22,520 --> 00:43:24,200 Speaker 2: took it out of Rockwell, and it grew within three 612 00:43:24,280 --> 00:43:28,879 Speaker 2: years to six hundred million. It grew like weeds. 613 00:43:29,360 --> 00:43:35,000 Speaker 2: We were an example of outsourcing. We had just logistics engineers; 614 00:43:35,120 --> 00:43:38,680 Speaker 2: every other corporate function was outsourced. We 615 00:43:38,800 --> 00:43:42,520 Speaker 2: told people they could outsource traffic management, 616 00:43:42,520 --> 00:43:47,040 Speaker 2: transportation management, to us.
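The market test described above is, in essence, a reverse auction: carriers bid on lanes and the shipper awards each lane to the best bid. Here is a minimal sketch of that idea; the lanes, carriers, and rates are all hypothetical, and real procurement events layer on service requirements, capacity limits, and package bids:

```python
# Minimal sketch of a transportation reverse auction (illustrative only).
# Each carrier submits a per-lane rate; the shipper awards each lane to the
# lowest bidder and compares the result against the incumbent rate.

incumbent = {"Chicago-Dallas": 1800.0, "Dallas-Atlanta": 1500.0}  # hypothetical rates

bids = {  # carrier -> {lane: rate}, all numbers hypothetical
    "Carrier A": {"Chicago-Dallas": 1600.0, "Dallas-Atlanta": 1550.0},
    "Carrier B": {"Chicago-Dallas": 1750.0, "Dallas-Atlanta": 1200.0},
    "Carrier C": {"Chicago-Dallas": 950.0},  # carriers bid only on lanes they want
}

for lane, old_rate in incumbent.items():
    # Collect every bid on this lane and award it to the cheapest carrier.
    lane_bids = [(rates[lane], carrier) for carrier, rates in bids.items()
                 if lane in rates]
    best_rate, winner = min(lane_bids)
    savings = (old_rate - best_rate) / old_rate
    print(f"{lane}: {winner} at {best_rate:.0f} ({savings:.0%} below incumbent)")
```

With these made-up numbers, the awards come in roughly 20 to 47 percent below the incumbent rates, the same order of savings described in the anecdote: once carriers could bid freely on any lane, competition did the work.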
617 00:43:47,120 --> 00:43:49,600 Speaker 2: To give you an idea, there were two quarters when we said we cannot take 618 00:43:49,640 --> 00:43:54,040 Speaker 2: any new business, and we still doubled, just from existing business. 619 00:43:55,560 --> 00:43:56,440 Speaker 1: That's a good problem to have. 620 00:43:56,960 --> 00:43:59,759 Speaker 2: Well, the wheels were coming off the wagon; it was 621 00:43:59,800 --> 00:44:03,200 Speaker 2: very hard. This was before the internet, so it was not 622 00:44:03,360 --> 00:44:05,919 Speaker 2: easy to manage a business like this, I mean, the 623 00:44:05,960 --> 00:44:10,319 Speaker 2: paper and stuff. Eventually we sold it to Ryder. 624 00:44:10,840 --> 00:44:15,000 Speaker 2: It's now a three billion, you know, dollars annually part 625 00:44:15,040 --> 00:44:19,279 Speaker 2: of Ryder. You know, the chairman of Ryder called me 626 00:44:19,480 --> 00:44:22,000 Speaker 2: when he retired. He said, you know, we did one 627 00:44:22,080 --> 00:44:24,960 Speaker 2: hundred and fifty-six acquisitions; this was the best one 628 00:44:24,960 --> 00:44:29,840 Speaker 2: that we ever did. So it was, and I 629 00:44:29,840 --> 00:44:32,160 Speaker 2: got on the Ryder board and stuff like this. 630 00:44:32,239 --> 00:44:36,480 Speaker 2: So this was one company. We had a 631 00:44:39,760 --> 00:44:43,480 Speaker 2: software company that started by doing software for 632 00:44:43,560 --> 00:44:49,480 Speaker 2: routing, scheduling, and assignment of drivers in TL, truckload, 633 00:44:49,560 --> 00:44:54,319 Speaker 2: load matching, and load planning for LTL, and 634 00:44:54,360 --> 00:44:58,560 Speaker 2: then we grew it to do procurement, basically 635 00:44:59,160 --> 00:45:06,360 Speaker 2: for shippers. And this company was sold to Manhattan Associates. 636 00:45:07,200 --> 00:45:12,520 Speaker 2: There were lots of stories, you know, along the way, 637 00:45:12,640 --> 00:45:17,040 Speaker 2: but I'll tell you: I always stayed at 638 00:45:17,120 --> 00:45:19,280 Speaker 2: MIT. What I did is start a company, 639 00:45:19,360 --> 00:45:21,680 Speaker 2: spend a year at the company, and then just become 640 00:45:21,800 --> 00:45:26,840 Speaker 2: chairman, just, you know, find a management team. Because 641 00:45:26,880 --> 00:45:29,719 Speaker 2: I could not stay at MIT and be 642 00:45:29,840 --> 00:45:32,920 Speaker 2: a line officer; I understand it, I stand 643 00:45:32,960 --> 00:45:34,440 Speaker 2: in front of the class, and you don't want to get 644 00:45:34,440 --> 00:45:37,160 Speaker 2: a phone call suddenly and have to run to the company. Okay, 645 00:45:37,200 --> 00:45:40,960 Speaker 2: that's an MIT rule, and that's reasonable. So 646 00:45:41,080 --> 00:45:45,359 Speaker 2: I worked by the rule, and either took a leave of absence or a sabbatical, 647 00:45:45,640 --> 00:45:52,240 Speaker 2: whatever it was, to start a company. 648 00:45:53,160 --> 00:45:58,640 Speaker 2: They were all acquired, just opportunistically, all acquired 649 00:45:58,920 --> 00:46:02,520 Speaker 2: by other companies. But what did you ask? What did 650 00:46:02,960 --> 00:46:06,400 Speaker 2: I learn? I learned that it made me an amazingly better teacher. 651 00:46:07,840 --> 00:46:12,239 Speaker 2: I talked about my own experience, and there's nothing like it, you know.
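For flavor, the load matching named above can be framed as a classic assignment problem: given a cost for pairing each driver with each load (deadhead miles, hours of service, and so on), find the pairing with minimum total cost. Here is a minimal sketch, with made-up costs and SciPy's Hungarian-algorithm solver standing in for whatever that early software actually did:

```python
# Minimal sketch of driver-to-load matching as an assignment problem.
# cost[i][j] is a made-up cost (e.g., deadhead miles) of giving load j to
# driver i; linear_sum_assignment finds the minimum-total-cost pairing.
import numpy as np
from scipy.optimize import linear_sum_assignment

drivers = ["D1", "D2", "D3"]
loads = ["L1", "L2", "L3"]
cost = np.array([
    [120.0,  40.0, 310.0],
    [ 35.0, 200.0,  90.0],
    [250.0,  60.0,  25.0],
])

rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
for i, j in zip(rows, cols):
    print(f"{drivers[i]} -> {loads[j]} (deadhead cost {cost[i, j]:.0f})")
print(f"Total cost: {cost[rows, cols].sum():.0f}")
```

On these numbers the optimal pairing is D1 to L2, D2 to L1, D3 to L3, for a total cost of 100; production systems add time windows, driver hours, and equipment constraints on top of this core model.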
652 00:46:12,280 --> 00:46:15,440 Speaker 2: I worked directly with companies, with a trucking company, with shippers, 653 00:46:15,480 --> 00:46:20,719 Speaker 2: with carriers. We developed one of the first TMSs, transportation 654 00:46:20,840 --> 00:46:26,279 Speaker 2: management systems. So it was one thing to 655 00:46:26,400 --> 00:46:30,160 Speaker 2: talk to students about stuff that I had experienced, 656 00:46:31,040 --> 00:46:34,040 Speaker 2: not about what they read in a book or 657 00:46:34,080 --> 00:46:37,440 Speaker 2: what they do in a case study. I developed case studies 658 00:46:37,440 --> 00:46:41,040 Speaker 2: where I know everything about them, because I 659 00:46:41,120 --> 00:46:45,800 Speaker 2: lived it. So I'm also, of course, consulting. Now, the 660 00:46:45,880 --> 00:46:49,760 Speaker 2: last company was years ago. I'm consulting to lots 661 00:46:49,760 --> 00:46:56,320 Speaker 2: of budding entrepreneurs, because after that I joined some venture 662 00:46:56,360 --> 00:47:01,000 Speaker 2: capital boards, though not anymore. But a lot of people 663 00:47:01,080 --> 00:47:03,200 Speaker 2: want to start companies, they want to study this. 664 00:47:03,360 --> 00:47:06,880 Speaker 2: I'm the informal consultant to those people 665 00:47:06,960 --> 00:47:08,080 Speaker 2: trying to start companies. 666 00:47:09,360 --> 00:47:14,000 Speaker 1: Well, that must be extremely rewarding. You know, we're 667 00:47:14,040 --> 00:47:16,719 Speaker 1: coming up towards the end of our time, and I 668 00:47:16,880 --> 00:47:19,520 Speaker 1: always like to ask this, though maybe I'll phrase 669 00:47:19,520 --> 00:47:23,080 Speaker 1: the question a little differently. Given that you're at 670 00:47:23,200 --> 00:47:26,400 Speaker 1: MIT, do you have like a required reading for 671 00:47:26,480 --> 00:47:28,719 Speaker 1: all your students? It might not 672 00:47:28,760 --> 00:47:31,840 Speaker 1: be a textbook, but something that, you know, would 673 00:47:31,840 --> 00:47:35,640 Speaker 1: prepare them for a career in supply chain. 674 00:47:37,280 --> 00:47:42,200 Speaker 2: You know, there are some textbooks in supply chain. 675 00:47:43,760 --> 00:47:48,359 Speaker 2: My own books are not textbooks; they are business books. 676 00:47:48,840 --> 00:47:56,240 Speaker 2: And look, for example, my book on 677 00:47:56,719 --> 00:48:01,640 Speaker 2: sustainability in supply chains, it's called Balancing Green: 678 00:48:02,440 --> 00:48:06,200 Speaker 2: When to Embrace Sustainability in a Business (and When Not 679 00:48:06,280 --> 00:48:12,880 Speaker 2: To), is used by several schools as a textbook, including ours. 680 00:48:12,880 --> 00:48:16,319 Speaker 2: Now, we don't have a class on sustainability issues, so 681 00:48:16,320 --> 00:48:19,759 Speaker 2: we usually don't assign books; we recommend books, 682 00:48:20,120 --> 00:48:27,600 Speaker 2: books on specific subjects. On AI, we don't have 683 00:48:27,640 --> 00:48:32,880 Speaker 2: a book, because we just talk about the 684 00:48:32,960 --> 00:48:40,680 Speaker 2: latest, and every semester the lectures change and 685 00:48:40,680 --> 00:48:45,440 Speaker 2: the work, the homework, changes with the students.
So for example, 686 00:48:45,440 --> 00:48:47,840 Speaker 2: our students now have to know Python, which is the 687 00:48:47,920 --> 00:48:52,759 Speaker 2: language, you know, the computer language used mostly for AI, 688 00:48:53,360 --> 00:48:57,600 Speaker 2: and every year they have to do more, or 689 00:48:57,680 --> 00:49:03,360 Speaker 2: something different, because the abilities are different, you know, the 690 00:49:03,440 --> 00:49:12,640 Speaker 2: new software coming, the latest version of GPT, of, you know, Copilot, Gemini, 691 00:49:12,800 --> 00:49:16,120 Speaker 2: whatever it is, it's getting better. It can do more, 692 00:49:16,960 --> 00:49:19,160 Speaker 2: and so you work with it. 693 00:49:19,960 --> 00:49:21,880 Speaker 1: Well, Yosi, I really want to thank you for 694 00:49:21,960 --> 00:49:25,600 Speaker 1: your time. This was a really great conversation, and thanks 695 00:49:25,640 --> 00:49:26,480 Speaker 1: for your insights. 696 00:49:26,840 --> 00:49:30,200 Speaker 2: All right, thank you very much for having me. Appreciate it. 697 00:49:29,480 --> 00:49:32,680 Speaker 1: It's my pleasure, and I hope to have you back again 698 00:49:32,800 --> 00:49:35,359 Speaker 1: sometime. And I want to thank you for tuning in. 699 00:49:35,440 --> 00:49:38,120 Speaker 1: If you liked the episode, please subscribe and leave a review. 700 00:49:38,520 --> 00:49:40,880 Speaker 1: We've lined up a number of great guests for the podcast, 701 00:49:40,880 --> 00:49:45,799 Speaker 1: so please check back to hear conversations with C-suite executives, shippers, regulators, 702 00:49:45,840 --> 00:49:48,880 Speaker 1: and decision makers within the freight markets. Also, if you 703 00:49:48,880 --> 00:49:51,000 Speaker 1: have any ideas for future episodes, just hit me 704 00:49:51,080 --> 00:49:53,879 Speaker 1: up on the Bloomberg terminal or on Twitter at Logistics Lee. 705 00:49:54,160 --> 00:50:01,440 Speaker 1: Thanks so much and take care, everyone.