1 00:00:04,440 --> 00:00:08,440 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:12,080 --> 00:00:13,480 Speaker 2: Hey there, TechStuff listeners. 3 00:00:13,640 --> 00:00:17,480 Speaker 1: This is Jonathan Strickland, executive producer at iHeart Podcasts, and 4 00:00:17,680 --> 00:00:20,400 Speaker 1: what I have for you today is an episode of 5 00:00:20,440 --> 00:00:23,799 Speaker 1: a new podcast we launched earlier this year in partnership 6 00:00:23,800 --> 00:00:28,680 Speaker 1: with Intel. The show is called Technically Speaking, an Intel Podcast, 7 00:00:29,200 --> 00:00:32,840 Speaker 1: and it focuses on all things artificial intelligence. Now, y'all 8 00:00:32,840 --> 00:00:36,159 Speaker 1: have heard me talk about AI tons of times on 9 00:00:36,240 --> 00:00:38,239 Speaker 1: TechStuff, and I'm sure you've got a pretty good 10 00:00:38,280 --> 00:00:42,680 Speaker 1: handle on my general thoughts and opinions about artificial intelligence. 11 00:00:42,720 --> 00:00:44,920 Speaker 1: But that's not to say that my point of view 12 00:00:45,000 --> 00:00:48,680 Speaker 1: is the only one or, heaven knows, necessarily 13 00:00:48,720 --> 00:00:52,519 Speaker 1: the correct one, or anything like that. This show features 14 00:00:52,520 --> 00:00:58,960 Speaker 1: host Graeme Klass exploring bleeding-edge implementations of AI and 15 00:00:59,040 --> 00:01:03,959 Speaker 1: how AI is making incredible changes in the way we 16 00:01:04,040 --> 00:01:06,280 Speaker 1: do different types of work and how it can help 17 00:01:06,360 --> 00:01:10,080 Speaker 1: people in various ways. And he has conversations with pioneers 18 00:01:10,080 --> 00:01:13,800 Speaker 1: and innovators in the space. So check out this episode, 19 00:01:13,840 --> 00:01:16,600 Speaker 1: and to hear more, make sure you subscribe to Technically 20 00:01:16,600 --> 00:01:21,760 Speaker 1: Speaking, an Intel Podcast.
Wherever you get your podcasts. Enjoy. 21 00:01:22,959 --> 00:01:26,280 Speaker 3: Where do world-changing ideas get their start? At Intel, 22 00:01:26,319 --> 00:01:29,399 Speaker 3: it starts with real solutions, and real solutions start with 23 00:01:29,480 --> 00:01:34,480 Speaker 3: exceptional engineering: the quantum computing revolution, the next generation of 24 00:01:34,520 --> 00:01:38,560 Speaker 3: AI experts, the renewable energy grid, liquid-cooling data centers, 25 00:01:38,680 --> 00:01:42,760 Speaker 3: early diagnosis for cancer, water restoration, and even farmland protection. 26 00:01:43,120 --> 00:01:46,360 Speaker 3: The examples are countless, the impacts are endless, but the 27 00:01:46,440 --> 00:01:50,160 Speaker 3: foundation is always the same. It starts with Intel. Join 28 00:01:50,240 --> 00:01:53,080 Speaker 3: us in redefining what's achievable through the power of AI. 29 00:01:53,600 --> 00:02:01,400 Speaker 3: Learn more at Intel dot com slash stories. Welcome to 30 00:02:01,600 --> 00:02:06,280 Speaker 3: Lake Nona, a beautiful residential and commercial oasis where the 31 00:02:06,320 --> 00:02:10,400 Speaker 3: future has arrived. Lake Nona is a seventeen-square-mile 32 00:02:10,440 --> 00:02:14,920 Speaker 3: community in Orlando, Florida, that has established new standards of 33 00:02:15,000 --> 00:02:20,080 Speaker 3: living that integrate the latest technology into every facet of life, including, 34 00:02:20,320 --> 00:02:23,560 Speaker 3: but not limited to, the way its citizens get around. 35 00:02:24,000 --> 00:02:27,280 Speaker 3: Picture this: a person stands in the warm Florida sun 36 00:02:27,520 --> 00:02:30,400 Speaker 3: at a designated bus stop, waiting for the next shuttle 37 00:02:30,400 --> 00:02:33,839 Speaker 3: to arrive.
And here it comes, not with the roar 38 00:02:33,880 --> 00:02:36,320 Speaker 3: of an engine, but with the gentle hum of an 39 00:02:36,400 --> 00:02:40,280 Speaker 3: energy-efficient electric motor. The bus glides to a halt, 40 00:02:40,800 --> 00:02:45,280 Speaker 3: and as the doors open, something is missing. There's no 41 00:02:45,320 --> 00:02:48,400 Speaker 3: one in the driver's seat. That's because Lake Nona is 42 00:02:48,440 --> 00:02:51,359 Speaker 3: home to one of the country's largest and longest-running 43 00:02:51,360 --> 00:02:55,919 Speaker 3: single-site autonomous vehicle fleets. These energy-efficient, self-driving 44 00:02:55,960 --> 00:02:59,119 Speaker 3: buses have transformed the way residents travel in this community, 45 00:02:59,720 --> 00:03:03,320 Speaker 3: safe and easily accessible. They whisk people from place to place, 46 00:03:04,000 --> 00:03:08,800 Speaker 3: freeing hands, reducing traffic congestion, and embracing a sustainable future. 47 00:03:09,720 --> 00:03:12,560 Speaker 3: What else can a world of autonomous public transportation do? 48 00:03:13,400 --> 00:03:17,000 Speaker 3: How else may it impact the way a community operates in 49 00:03:17,040 --> 00:03:19,680 Speaker 3: this bright and sunny corner of the world? The horizon 50 00:03:19,720 --> 00:03:32,760 Speaker 3: is limitless and our journey is full of possibilities. Hey there, 51 00:03:32,919 --> 00:03:36,760 Speaker 3: I'm Graeme Klass, and this is Technically Speaking, an Intel podcast. 52 00:03:37,360 --> 00:03:40,920 Speaker 3: The show is dedicated to highlighting technology that is revolutionizing the 53 00:03:40,960 --> 00:03:44,920 Speaker 3: way we live, work, and move. In every episode, we'll 54 00:03:44,960 --> 00:03:48,200 Speaker 3: connect with innovators in areas like artificial intelligence to better 55 00:03:48,280 --> 00:03:52,480 Speaker 3: understand the human-centered technology they've developed.
Thus far, we've 56 00:03:52,520 --> 00:03:57,160 Speaker 3: explored how AI impacts society in the areas of agriculture, accessibility, 57 00:03:57,440 --> 00:04:00,240 Speaker 3: and mental health. But one of the ways technology, and 58 00:04:00,360 --> 00:04:04,920 Speaker 3: especially artificial intelligence, impacts society is through its structures. AI 59 00:04:05,080 --> 00:04:07,880 Speaker 3: is advancing the way cities are able to serve their citizens. 60 00:04:08,480 --> 00:04:10,720 Speaker 3: There's a very interesting example of this happening in a 61 00:04:10,720 --> 00:04:13,920 Speaker 3: small town in the United States. But before we go 62 00:04:13,960 --> 00:04:18,839 Speaker 3: any further, I need to introduce my guests. Joining me 63 00:04:18,920 --> 00:04:21,960 Speaker 3: now is Joe Moye, the CEO of BEEP, which is 64 00:04:22,000 --> 00:04:25,440 Speaker 3: a company that offers autonomous mobility solutions in public and 65 00:04:25,480 --> 00:04:29,000 Speaker 3: private communities across the US. His career has spanned the 66 00:04:29,080 --> 00:04:33,200 Speaker 3: technology arena, from hardware and software to IT services. He 67 00:04:33,240 --> 00:04:37,360 Speaker 3: has spearheaded groundbreaking enterprise projects from cutting-edge startups to 68 00:04:37,560 --> 00:04:42,520 Speaker 3: multi-billion-dollar enterprises. Joe's expertise in innovation, strategy, and 69 00:04:42,560 --> 00:04:45,800 Speaker 3: transformative technologies paved the way for his role at BEEP, 70 00:04:46,080 --> 00:04:49,560 Speaker 3: where he now leads a new team transforming mobility as 71 00:04:49,560 --> 00:04:52,440 Speaker 3: we know it. We are so excited to have you on, Joe. 72 00:04:52,680 --> 00:04:54,240 Speaker 2: Thank you, Graeme, glad to be here. 73 00:04:54,920 --> 00:04:57,799 Speaker 3: Also joining us is Juan Santos, the senior vice president 74 00:04:57,920 --> 00:05:01,840 Speaker 3: of Brand Experience and Innovation at Tavistock Group.
At Tavistock, 75 00:05:01,920 --> 00:05:05,120 Speaker 3: he's part of a multidisciplinary team that uses design 76 00:05:05,160 --> 00:05:08,720 Speaker 3: thinking to build places where people can thrive. Juan is 77 00:05:08,760 --> 00:05:13,960 Speaker 3: a recognized expert in design thinking, user-generated content, virtual worlds, 78 00:05:14,320 --> 00:05:16,640 Speaker 3: physical and digital, and loyalty and rewards. 79 00:05:17,160 --> 00:05:18,360 Speaker 2: Welcome to the show, Juan. 80 00:05:18,520 --> 00:05:19,200 Speaker 4: Thank you very much, Graeme. 81 00:05:19,240 --> 00:05:25,920 Speaker 3: I'll start with you, Joe. Can you just tell 82 00:05:25,960 --> 00:05:29,120 Speaker 3: us a little bit more about Beep and in particular 83 00:05:29,360 --> 00:05:33,200 Speaker 3: your personal story around why you decided to get involved 84 00:05:33,200 --> 00:05:33,799 Speaker 3: with the company? 85 00:05:34,360 --> 00:05:37,080 Speaker 2: Yeah, I'm happy to, Graeme, and thanks again for having us. 86 00:05:37,120 --> 00:05:41,800 Speaker 2: So Beep was founded on the premise that autonomous mobility 87 00:05:42,360 --> 00:05:45,760 Speaker 2: is going to be proven out in, I'll say, incremental 88 00:05:45,880 --> 00:05:49,640 Speaker 2: use cases. I know everybody has had different experiences and 89 00:05:49,800 --> 00:05:52,880 Speaker 2: or has read a little bit about what driving and 90 00:05:52,920 --> 00:05:55,200 Speaker 2: mobility is about. You know, I would tell you, if 91 00:05:55,240 --> 00:05:58,520 Speaker 2: you think of the technologies and the work that we're doing, 92 00:05:58,560 --> 00:06:02,720 Speaker 2: it's very focused on short-haul, first-mile/last-mile 93 00:06:02,800 --> 00:06:06,760 Speaker 2: type use cases in public and private communities, solving for 94 00:06:07,360 --> 00:06:11,120 Speaker 2: that micro-transit gap across many areas of our country.
95 00:06:11,839 --> 00:06:15,480 Speaker 2: Second, it's very important that it's a shared platform, so 96 00:06:15,520 --> 00:06:20,760 Speaker 2: we focus on more controlled-speed, geofenced use cases, 97 00:06:21,440 --> 00:06:24,839 Speaker 2: but in a shared mobility form factor, meaning a shuttle 98 00:06:24,880 --> 00:06:28,480 Speaker 2: that seats ten to twelve passengers and really represents 99 00:06:28,880 --> 00:06:33,240 Speaker 2: that ability to provide a good balance of, yes, personal mobility, 100 00:06:33,279 --> 00:06:37,280 Speaker 2: but also community mobility. So the business was founded by 101 00:06:37,320 --> 00:06:40,200 Speaker 2: a group of us that are also investors in the company. 102 00:06:40,600 --> 00:06:44,080 Speaker 2: We've been entrepreneurs across a couple of funds, so we're 103 00:06:44,160 --> 00:06:47,480 Speaker 2: venture capitalists as well as operators. And again, as we 104 00:06:47,560 --> 00:06:50,680 Speaker 2: looked at this key inflection point in the area of 105 00:06:50,760 --> 00:06:56,000 Speaker 2: technology specific to autonomy, we made a very calculated approach to 106 00:06:56,080 --> 00:06:59,520 Speaker 2: focusing on this micro segment of the larger market of 107 00:06:59,600 --> 00:07:06,200 Speaker 2: autonomy, around this electric, shared autonomous mobility in these 108 00:07:06,320 --> 00:07:08,520 Speaker 2: micro-transit use cases. 109 00:07:09,920 --> 00:07:13,040 Speaker 3: BEEP is a turnkey mobility solution with the goal of 110 00:07:13,040 --> 00:07:18,120 Speaker 3: providing stress-free transportation, reducing carbon emissions, and improving road safety. 111 00:07:18,760 --> 00:07:23,480 Speaker 3: Offering autonomous transportation to thousands of people, Beep's technology focuses 112 00:07:23,520 --> 00:07:27,440 Speaker 3: on community and offers localized travel solutions that reflect the 113 00:07:27,440 --> 00:07:32,440 Speaker 3: way people want to engage with their neighborhood.
Are these 114 00:07:32,600 --> 00:07:36,400 Speaker 3: vehicles going to be driverless or driver-assisted? How 115 00:07:36,480 --> 00:07:38,000 Speaker 3: is that currently being played out? 116 00:07:38,400 --> 00:07:41,600 Speaker 2: Yeah, it's a great question. We work in partnership with 117 00:07:41,640 --> 00:07:46,000 Speaker 2: the US Department of Transportation, which oversees the use of 118 00:07:46,040 --> 00:07:49,720 Speaker 2: these vehicles on our roadways today. So the vehicles are 119 00:07:49,760 --> 00:07:54,360 Speaker 2: operating fully autonomous a very high percentage of the time. But we 120 00:07:54,480 --> 00:07:59,360 Speaker 2: do have safety attendants or ambassadors on board whose responsibility 121 00:07:59,480 --> 00:08:03,280 Speaker 2: is to both educate and welcome passengers and introduce them to 122 00:08:03,320 --> 00:08:07,480 Speaker 2: the technology, help them feel comfortable with these types of services, 123 00:08:07,520 --> 00:08:11,200 Speaker 2: but also to take over manual control should that be 124 00:08:11,280 --> 00:08:14,040 Speaker 2: needed if there's an event on the roadway that requires 125 00:08:14,080 --> 00:08:18,920 Speaker 2: some level of intervention. Fast forward a couple of short years, 126 00:08:19,600 --> 00:08:23,520 Speaker 2: and those attendants are going to be virtual or remote. 127 00:08:24,000 --> 00:08:27,280 Speaker 2: So we will, in our types of services, always have 128 00:08:27,480 --> 00:08:29,920 Speaker 2: a human in the loop. It will shift from being 129 00:08:29,960 --> 00:08:34,360 Speaker 2: an onboard attendant to a virtual attendant.
And you can 130 00:08:34,400 --> 00:08:39,200 Speaker 2: only imagine, especially in the area of public transportation, if 131 00:08:39,240 --> 00:08:43,440 Speaker 2: there is some circumstance, be that a traffic jam or 132 00:08:43,480 --> 00:08:48,000 Speaker 2: a pothole on a roadway or some other eventuality, you 133 00:08:48,080 --> 00:08:50,880 Speaker 2: still have to be able to communicate with passengers on 134 00:08:51,000 --> 00:08:53,400 Speaker 2: board. If there's a reason to pull a vehicle off 135 00:08:53,480 --> 00:08:56,640 Speaker 2: to the side of the road, let people know what's going 136 00:08:56,679 --> 00:08:58,200 Speaker 2: on and what to do about it. 137 00:08:58,360 --> 00:09:02,319 Speaker 3: Okay, great. I'll bring Juan into that discussion now. Can 138 00:09:02,400 --> 00:09:04,320 Speaker 3: you just tell us a little bit about your work 139 00:09:04,360 --> 00:09:05,600 Speaker 3: at Tavistock Group? 140 00:09:06,600 --> 00:09:11,360 Speaker 4: So I lead innovation and brand experience in what 141 00:09:11,679 --> 00:09:15,760 Speaker 4: most people would traditionally think of as a development company. However, 142 00:09:15,920 --> 00:09:19,040 Speaker 4: Tavistock Development, which is the area that I focus mostly in, 143 00:09:19,480 --> 00:09:22,720 Speaker 4: is not your traditional developer. We are actually an owner 144 00:09:22,800 --> 00:09:26,120 Speaker 4: operator, and in the case of BEEP, we have a 145 00:09:26,120 --> 00:09:29,600 Speaker 4: place called Lake Nona, which is directly contiguous to the Orlando Airport. 146 00:09:29,640 --> 00:09:32,600 Speaker 4: We're proud citizens of the city of Orlando, but we 147 00:09:32,679 --> 00:09:36,120 Speaker 4: represent an advanced district in the city, and it's a 148 00:09:36,160 --> 00:09:39,960 Speaker 4: fairly large advanced district.
We're approximately seventeen square miles; to 149 00:09:40,000 --> 00:09:43,200 Speaker 4: give you a point of comparison, Manhattan's twenty-two, so 150 00:09:43,240 --> 00:09:45,800 Speaker 4: it's a fairly large swath of land. And then we 151 00:09:45,880 --> 00:09:49,760 Speaker 4: have pretty much every use case inside Lake Nona. I mean, 152 00:09:49,800 --> 00:09:54,079 Speaker 4: we have universities, high schools, people can go to preschool, 153 00:09:54,120 --> 00:09:58,280 Speaker 4: there's micro apartments, there's large homes, so it becomes this 154 00:09:58,760 --> 00:10:02,120 Speaker 4: really interesting place. It's for people to live, but also 155 00:10:02,480 --> 00:10:05,920 Speaker 4: for companies that are on the forefront of technology to 156 00:10:06,000 --> 00:10:08,720 Speaker 4: use us as a living lab. The reason BEEP is a 157 00:10:08,720 --> 00:10:13,079 Speaker 4: critical partner for Lake Nona is because we believe mobility is 158 00:10:13,200 --> 00:10:15,240 Speaker 4: one of those things that create a lot of friction 159 00:10:15,440 --> 00:10:17,719 Speaker 4: inside a community. Right? You come to a place and 160 00:10:18,120 --> 00:10:21,080 Speaker 4: parking is difficult, moving from one place to the other is difficult. 161 00:10:21,320 --> 00:10:24,120 Speaker 4: Those are really kind of the not so enjoyable, not 162 00:10:24,280 --> 00:10:28,000 Speaker 4: so great parts of being in communities that are successful.
163 00:10:28,240 --> 00:10:32,040 Speaker 4: In Lake Nona, we've tackled that friction with mobility by a 164 00:10:32,120 --> 00:10:36,600 Speaker 4: variety of things, but we've also incorporated BEEP and autonomous 165 00:10:36,600 --> 00:10:40,440 Speaker 4: shuttle operation as a critical part to provide that first 166 00:10:40,480 --> 00:10:44,720 Speaker 4: and last mile, mile and a half, inside the community 167 00:10:45,200 --> 00:10:47,800 Speaker 4: for people to traverse, and it's something that has been 168 00:10:47,880 --> 00:10:51,720 Speaker 4: running now for multiple years. We have what I believe 169 00:10:51,800 --> 00:10:55,240 Speaker 4: today is the largest and longest-running autonomous shuttle operation 170 00:10:55,800 --> 00:10:58,520 Speaker 4: in the United States in Lake Nona. It's actually so prevalent 171 00:10:58,960 --> 00:11:01,160 Speaker 4: now that we're coming close to the end of the 172 00:11:01,240 --> 00:11:04,760 Speaker 4: year, where we had a kid, you know, last Halloween 173 00:11:04,800 --> 00:11:07,800 Speaker 4: actually dressed up as one of the autonomous shuttles. So 174 00:11:07,840 --> 00:11:11,760 Speaker 4: it's something that's both an incredible service that relieves friction, 175 00:11:11,880 --> 00:11:16,120 Speaker 4: but it's become a natural part of the ecosystem that 176 00:11:16,160 --> 00:11:18,680 Speaker 4: people live with and live in in Lake Nona. 177 00:11:19,200 --> 00:11:23,920 Speaker 3: Yeah, I'm interested in how that autonomous shuttle bus started, 178 00:11:24,400 --> 00:11:27,960 Speaker 3: and was there any, I guess, pushback, or were there any 179 00:11:28,040 --> 00:11:31,079 Speaker 3: challenges with the community to try and get this sort 180 00:11:31,120 --> 00:11:32,240 Speaker 3: of thing deployed? 181 00:11:33,080 --> 00:11:36,680 Speaker 4: Actually, it was incredibly well received. It started in a 182 00:11:36,720 --> 00:11:39,720 Speaker 4: conversation with the founders of BEEP.
We were actually having 183 00:11:39,720 --> 00:11:42,760 Speaker 4: a conversation about a different topic, and the topic of 184 00:11:42,760 --> 00:11:47,640 Speaker 4: autonomous mobility came up. And after that conversation, fast forward 185 00:11:47,679 --> 00:11:51,600 Speaker 4: eleven months, and the company had been created, the vehicles 186 00:11:51,640 --> 00:11:54,079 Speaker 4: had been brought into the US. We worked with the Department 187 00:11:54,120 --> 00:11:57,520 Speaker 4: of Transportation and NHTSA to make it happen, and from 188 00:11:57,520 --> 00:12:01,280 Speaker 4: a community perspective, we actually did an outreach process where 189 00:12:01,320 --> 00:12:03,920 Speaker 4: we actually allowed critical members of the community to be 190 00:12:03,960 --> 00:12:08,240 Speaker 4: a part of understanding what the vehicles would do. For example, 191 00:12:08,320 --> 00:12:11,640 Speaker 4: we had a specific day where the Beeps were on 192 00:12:11,800 --> 00:12:15,720 Speaker 4: preview just for first responders, so we showed our police 193 00:12:15,720 --> 00:12:19,040 Speaker 4: department and the fire department how to work with the vehicles, 194 00:12:19,040 --> 00:12:21,560 Speaker 4: how to operate them, how to move them if necessary, 195 00:12:21,920 --> 00:12:24,760 Speaker 4: and when the vehicles rolled for the first time, we 196 00:12:24,800 --> 00:12:28,080 Speaker 4: had a community that was ready, so we didn't have 197 00:12:28,240 --> 00:12:32,200 Speaker 4: much pushback. Now, we had people having to adapt to 198 00:12:32,440 --> 00:12:35,360 Speaker 4: having a vehicle with no driver, right? Because even though 199 00:12:35,360 --> 00:12:38,720 Speaker 4: there's a safety attendant on board, the vehicle's operating on 200 00:12:38,760 --> 00:12:42,800 Speaker 4: its own, and it operates differently than a human-controlled vehicle.
201 00:12:43,240 --> 00:12:46,680 Speaker 4: So we had some situations where people were learning 202 00:12:46,720 --> 00:12:49,600 Speaker 4: to interact with them, but for the most part, it 203 00:12:49,679 --> 00:12:53,480 Speaker 4: was very well received. One of the hallmarks of Lake Nona 204 00:12:53,520 --> 00:12:57,440 Speaker 4: as a community is that our citizens, they think of 205 00:12:57,520 --> 00:13:01,839 Speaker 4: themselves almost like citizen scientists. They're almost asking us what's 206 00:13:01,880 --> 00:13:04,679 Speaker 4: new every week. It's like, what's the new thing to try? 207 00:13:05,120 --> 00:13:09,160 Speaker 4: They've come to expect strange things to happen, you know, 208 00:13:09,200 --> 00:13:12,080 Speaker 4: in the roads and other places in Lake Nona. So I 209 00:13:12,120 --> 00:13:16,079 Speaker 4: think it was significantly better received because of the education 210 00:13:16,200 --> 00:13:18,960 Speaker 4: that we did, because the first responders were on board, 211 00:13:19,360 --> 00:13:22,240 Speaker 4: because we gave community previews, so it was not like, suddenly, 212 00:13:22,760 --> 00:13:25,760 Speaker 4: you know, a self-driving car shows up in the middle 213 00:13:25,760 --> 00:13:26,760 Speaker 4: of the community. 214 00:13:26,400 --> 00:13:27,360 Speaker 2: Right, okay. 215 00:13:27,400 --> 00:13:29,400 Speaker 3: And in terms of, I mean, we've talked about the 216 00:13:29,440 --> 00:13:32,640 Speaker 3: autonomous side of things and the AI. Are there any 217 00:13:32,679 --> 00:13:36,400 Speaker 3: other AI techniques or technology that have been used for 218 00:13:37,160 --> 00:13:40,520 Speaker 3: general community planning and development? Are there any other tools 219 00:13:40,559 --> 00:13:42,480 Speaker 3: out there that are currently being used? 220 00:13:43,280 --> 00:13:47,360 Speaker 4: So from a Lake Nona perspective, it's pretty significant.
We actually 221 00:13:47,440 --> 00:13:52,800 Speaker 4: have a very detailed data overlay that actually shows us 222 00:13:52,840 --> 00:13:56,880 Speaker 4: how the city is behaving. Everything is private, so there 223 00:13:56,960 --> 00:14:00,960 Speaker 4: is no personally identifiable information being collected, but we collect 224 00:14:01,120 --> 00:14:04,800 Speaker 4: a wide variety of behaviors. I know how long people 225 00:14:04,880 --> 00:14:08,640 Speaker 4: wait for an Uber. I know the specific state of 226 00:14:08,880 --> 00:14:12,240 Speaker 4: parking garages; every spot is instrumented, so we know if 227 00:14:12,280 --> 00:14:15,000 Speaker 4: there's a wait for them. We know how the Beeps 228 00:14:15,000 --> 00:14:18,319 Speaker 4: are flowing inside the community, and that is fed into 229 00:14:18,360 --> 00:14:23,000 Speaker 4: a large data environment where we actually use AI-driven 230 00:14:23,040 --> 00:14:27,040 Speaker 4: tools to both predict and model the behavior of the environment. 231 00:14:27,080 --> 00:14:31,720 Speaker 4: We've done pretty sophisticated prediction on mobility using AI, but we 232 00:14:31,760 --> 00:14:35,040 Speaker 4: also use it for energy consumption. We use it to 233 00:14:35,120 --> 00:14:39,560 Speaker 4: detect unknown patterns, like, for example, the impact of having 234 00:14:39,640 --> 00:14:43,080 Speaker 4: pets in the environment and how that changes visitation. So 235 00:14:43,520 --> 00:14:47,479 Speaker 4: when you look behind the scenes at what allows Lake Nona 236 00:14:47,520 --> 00:14:51,200 Speaker 4: to operate and what allows Beep to find such a 237 00:14:51,240 --> 00:14:56,280 Speaker 4: fertile environment for testing and operating these vehicles here, there's 238 00:14:56,320 --> 00:15:01,160 Speaker 4: a significant amount of AI and data that actually powers 239 00:15:01,160 --> 00:15:01,760 Speaker 4: our community.
240 00:15:02,320 --> 00:15:05,400 Speaker 3: Yeah, that's pretty cool. Just as you were describing the amount 241 00:15:05,440 --> 00:15:07,800 Speaker 3: of data and being able to find all those stats, 242 00:15:07,800 --> 00:15:11,080 Speaker 3: it just reminded me of the SimCity series of games 243 00:15:11,080 --> 00:15:14,240 Speaker 3: that I used to play quite a bit, and using 244 00:15:14,280 --> 00:15:16,720 Speaker 3: that to make decisions to make your citizens happy. 245 00:15:17,480 --> 00:15:20,480 Speaker 4: I may have said once or twice that I get 246 00:15:20,480 --> 00:15:23,200 Speaker 4: to play SimCity with a real city, to a degree, 247 00:15:23,280 --> 00:15:24,920 Speaker 4: so I know exactly what you mean. 248 00:15:28,680 --> 00:15:40,120 Speaker 3: We'll be right back after a quick break. Where do 249 00:15:40,240 --> 00:15:43,800 Speaker 3: world-changing ideas get their start? At Intel, it starts 250 00:15:43,800 --> 00:15:47,760 Speaker 3: with real solutions, and real solutions start with exceptional engineering. 251 00:15:48,360 --> 00:15:52,400 Speaker 3: Empowering those with disabilities starts with assistive AI, and stopping 252 00:15:52,440 --> 00:15:56,680 Speaker 3: crop loss from infestation starts with thermal imaging and open technology, 253 00:15:57,240 --> 00:16:01,760 Speaker 3: while artificial intelligence that predicts depression starts with educational programs 254 00:16:01,800 --> 00:16:05,240 Speaker 3: like Intel's AI for Youth. And that's just the start: 255 00:16:06,600 --> 00:16:10,920 Speaker 3: the quantum computing revolution, the next generation of AI experts, 256 00:16:11,600 --> 00:16:17,040 Speaker 3: the renewable energy grid, liquid-cooling data centers, radiation exposure 257 00:16:17,040 --> 00:16:22,480 Speaker 3: prevention in space, water restoration, and early cancer detection.
The 258 00:16:22,520 --> 00:16:26,480 Speaker 3: examples are countless, the impacts are endless, but the foundation 259 00:16:26,640 --> 00:16:31,600 Speaker 3: is always the same. It starts with Intel. Learn more 260 00:16:31,640 --> 00:16:42,440 Speaker 3: at Intel dot com slash stories. Welcome back to 261 00:16:42,520 --> 00:16:50,400 Speaker 3: Technically Speaking, an Intel podcast. When you think about AI 262 00:16:50,680 --> 00:16:54,560 Speaker 3: in our environment, the question of oversight often comes into play. 263 00:16:55,160 --> 00:16:58,320 Speaker 3: How do these tools manage incidents in the community? What 264 00:16:58,480 --> 00:17:01,440 Speaker 3: metrics or data are used to determine when an AI 265 00:17:01,480 --> 00:17:05,080 Speaker 3: tool should engage or intervene? I often think of the 266 00:17:05,119 --> 00:17:08,720 Speaker 3: pacemaker as an example of how AI can be used 267 00:17:08,720 --> 00:17:12,399 Speaker 3: to positively impact our lives: a monitoring system that is 268 00:17:12,400 --> 00:17:15,120 Speaker 3: set up to only act when a severe change has occurred. 269 00:17:15,800 --> 00:17:18,879 Speaker 3: BEEP is creating a system with checks and balances that 270 00:17:19,000 --> 00:17:23,200 Speaker 3: can be more reliable than humans in reporting incidents. Vehicles 271 00:17:23,200 --> 00:17:27,159 Speaker 3: are constantly collecting information, inside and outside, around what they 272 00:17:27,200 --> 00:17:31,000 Speaker 3: observe and encounter that can make the community safer and 273 00:17:31,080 --> 00:17:31,680 Speaker 3: more efficient. 274 00:17:34,440 --> 00:17:36,720 Speaker 2: If you think of the in-cab environment and 275 00:17:36,760 --> 00:17:40,320 Speaker 2: you think of the scenario of not having a person 276 00:17:40,359 --> 00:17:42,840 Speaker 2: of authority on board, there is no driver, there is 277 00:17:42,880 --> 00:17:47,720 Speaker 2: no attendant.
In the future, I mean, we're developing tools 278 00:17:47,760 --> 00:17:54,480 Speaker 2: and techniques that monitor the activities of the riders to 279 00:17:54,680 --> 00:17:57,879 Speaker 2: ensure we understand if there is a health event, 280 00:17:58,119 --> 00:18:02,440 Speaker 2: you know, somebody crouches over their chair as an example, or 281 00:18:03,119 --> 00:18:07,280 Speaker 2: if there's an unfortunate situation like somebody were to present 282 00:18:07,320 --> 00:18:10,280 Speaker 2: a weapon. You have to think of all these types 283 00:18:10,320 --> 00:18:13,960 Speaker 2: of use cases, and what's critical about that is being 284 00:18:14,000 --> 00:18:19,840 Speaker 2: able to process that observation and quickly align that with 285 00:18:20,080 --> 00:18:23,320 Speaker 2: how we would get some communication into the vehicle and 286 00:18:23,600 --> 00:18:27,560 Speaker 2: or immediately dispatch support or services. You know, one of 287 00:18:27,560 --> 00:18:32,920 Speaker 2: the things that is so important about these vehicles is, 288 00:18:33,880 --> 00:18:39,840 Speaker 2: in the event of an incident, you have the perfect eyewitness.
289 00:18:40,040 --> 00:18:45,720 Speaker 2: Every time, you're videotaping what's happened in an intersection; you're 290 00:18:46,480 --> 00:18:50,760 Speaker 2: leveraging that information and data to measure exactly how did 291 00:18:50,800 --> 00:18:55,560 Speaker 2: an autonomous vehicle respond. And so an important piece of 292 00:18:56,400 --> 00:18:59,440 Speaker 2: leveraging data in the future for the work that we're 293 00:18:59,480 --> 00:19:04,479 Speaker 2: doing is going to really reinvent how we do things 294 00:19:04,720 --> 00:19:09,360 Speaker 2: like supporting police activities out there in the area of 295 00:19:09,960 --> 00:19:14,720 Speaker 2: data collection and determining fault in scenarios, but most importantly 296 00:19:14,800 --> 00:19:18,920 Speaker 2: taking that data back and improving situations that may be 297 00:19:19,320 --> 00:19:23,480 Speaker 2: hazardous, roadway conditions that result in accidents and things 298 00:19:23,480 --> 00:19:26,679 Speaker 2: of that nature. Externally, if you think of all the 299 00:19:26,880 --> 00:19:31,639 Speaker 2: data that is being collected, simple things that we're able 300 00:19:31,720 --> 00:19:34,800 Speaker 2: to determine by being out there on the roadways in 301 00:19:34,880 --> 00:19:38,960 Speaker 2: these different traffic scenarios are used to improve traffic flow, 302 00:19:39,000 --> 00:19:41,120 Speaker 2: and Juan hit on some of the things they do 303 00:19:41,880 --> 00:19:45,480 Speaker 2: in existing road infrastructure that can also be done with 304 00:19:45,560 --> 00:19:49,720 Speaker 2: the data that's collected through these vehicles.
There are scenarios 305 00:19:49,800 --> 00:19:54,000 Speaker 2: where public works departments can utilize the data, and we 306 00:19:54,040 --> 00:19:57,919 Speaker 2: can send them examples of where a tree limb is 307 00:19:57,960 --> 00:20:02,120 Speaker 2: growing out over a power line, or potholes in the road, 308 00:20:02,280 --> 00:20:06,639 Speaker 2: or other circumstances that may create a safety issue that 309 00:20:06,720 --> 00:20:10,359 Speaker 2: needs to be addressed. And so there's just an enormous 310 00:20:10,440 --> 00:20:15,080 Speaker 2: amount of observation that's going on every time we are 311 00:20:15,080 --> 00:20:18,560 Speaker 2: on a route that can serve so many important purposes, 312 00:20:19,240 --> 00:20:23,000 Speaker 2: just to proactively address things before they become problems. 313 00:20:23,600 --> 00:20:27,600 Speaker 4: I think it's pretty unique that you have now these 314 00:20:27,720 --> 00:20:32,359 Speaker 4: autonomous vehicles moving throughout communities. They carry people and provide service, 315 00:20:32,840 --> 00:20:35,800 Speaker 4: but they're also a very accurate scanner. 316 00:20:36,280 --> 00:20:36,520 Speaker 2: Right. 317 00:20:36,960 --> 00:20:40,679 Speaker 4: Autonomous vehicles have cameras, they have lidar. When you 318 00:20:40,760 --> 00:20:43,600 Speaker 4: ride the Beeps, you actually see in a display what 319 00:20:43,680 --> 00:20:47,240 Speaker 4: the vehicle is seeing, and it's like recording every minute 320 00:20:47,320 --> 00:20:50,240 Speaker 4: detail of the environment, and it's a three-D view 321 00:20:50,520 --> 00:20:52,760 Speaker 4: of the world around it.
So it's, I think, a 322 00:20:52,920 --> 00:20:56,280 Speaker 4: unique opportunity and one that we haven't fully utilized yet 323 00:20:56,840 --> 00:20:59,679 Speaker 4: of having these objects that are three D scanners that 324 00:20:59,760 --> 00:21:03,920 Speaker 4: are traversing the community thousands of times a month, and 325 00:21:04,119 --> 00:21:07,040 Speaker 4: they can provide us with an incredible amount of information. 326 00:21:07,200 --> 00:21:10,600 Speaker 4: So I think it's a unique opportunity and one where 327 00:21:10,600 --> 00:21:13,920 Speaker 4: we haven't utilized as much of the data that the 328 00:21:14,000 --> 00:21:15,480 Speaker 4: vehicles generate as we could. 329 00:21:16,840 --> 00:21:19,200 Speaker 3: But there's a lot more to Lake Nona than their 330 00:21:19,240 --> 00:21:23,360 Speaker 3: revolutionary public transportation. One that stands out to me, which 331 00:21:23,400 --> 00:21:26,600 Speaker 3: I hope more towns and cities will consider, is Wi- 332 00:21:26,600 --> 00:21:30,280 Speaker 3: Fi access for all its residents, something that's quickly becoming 333 00:21:30,320 --> 00:21:34,480 Speaker 3: an essential utility. Lake Nona is also home to the most 334 00:21:34,480 --> 00:21:38,399 Speaker 3: technologically advanced hotel in the world, the Lake Nona Wave Hotel. 335 00:21:39,040 --> 00:21:42,680 Speaker 3: Beyond the newfangled tech for residents and visitors, Lake Nona 336 00:21:42,800 --> 00:21:46,600 Speaker 3: also considers itself a living lab community where companies and 337 00:21:46,640 --> 00:21:51,240 Speaker 3: innovators can connect, collaborate, and test their prototypes and ideas 338 00:21:51,280 --> 00:21:57,240 Speaker 3: in a real world setting. And in terms of the 339 00:21:57,359 --> 00:22:01,200 Speaker 3: partnership with Intel, well, I'll start with you: what were 340 00:22:01,200 --> 00:22:05,680 Speaker 3: some of the technologies and help that Intel provided your project?
341 00:22:06,600 --> 00:22:10,919 Speaker 4: So we are primarily an Intel shop when it comes 342 00:22:10,920 --> 00:22:16,240 Speaker 4: to processing. We utilize Intel CPUs for a variety of 343 00:22:16,280 --> 00:22:19,800 Speaker 4: the data that we collect, and we're even experimenting right 344 00:22:19,840 --> 00:22:23,080 Speaker 4: now with Intel GPUs as a way to actually do 345 00:22:23,240 --> 00:22:27,440 Speaker 4: some of the heavier data processing. So it's one thing 346 00:22:27,480 --> 00:22:32,280 Speaker 4: that's always running and always behind the scenes from our perspective. Now, 347 00:22:32,800 --> 00:22:36,080 Speaker 4: we have a variety of partners, like people that actually 348 00:22:36,240 --> 00:22:40,359 Speaker 4: engage in some of the more advanced technologies that Intel 349 00:22:40,400 --> 00:22:43,919 Speaker 4: has to offer. But for our part, it's a strong 350 00:22:43,960 --> 00:22:48,439 Speaker 4: combination of tried and true, you know, CPUs, and, you know, 351 00:22:48,480 --> 00:22:53,000 Speaker 4: we're getting some pretty interesting performance results from Intel GPUs 352 00:22:53,040 --> 00:22:56,240 Speaker 4: now that make them usable for a variety of data 353 00:22:56,240 --> 00:22:59,560 Speaker 4: crunching tasks for large data sets that we find interesting. 354 00:23:00,160 --> 00:23:02,960 Speaker 3: Yeah, I just want to switch now a little bit 355 00:23:03,000 --> 00:23:05,960 Speaker 3: to the safety side of things. I've actually got a 356 00:23:06,000 --> 00:23:08,200 Speaker 3: bit of a background in mining, and I was around 357 00:23:08,359 --> 00:23:11,960 Speaker 3: with the advent of the whole autonomous mining vehicles with 358 00:23:12,000 --> 00:23:15,639 Speaker 3: those huge dump trucks being loaded and driven 359 00:23:16,359 --> 00:23:19,280 Speaker 3: without any drivers, which is a real sight to see.
360 00:23:19,920 --> 00:23:23,840 Speaker 3: Going through some of that technology, they had a very strict, 361 00:23:24,160 --> 00:23:27,560 Speaker 3: multi-layer approach to safety. There were like seven tiers, 362 00:23:28,119 --> 00:23:30,800 Speaker 3: right down to people having actual buttons they can press, 363 00:23:30,840 --> 00:23:34,480 Speaker 3: and it just shuts everything down. How have you tackled 364 00:23:34,480 --> 00:23:37,760 Speaker 3: the approach of safety, particularly in a much more open 365 00:23:37,840 --> 00:23:39,680 Speaker 3: environment than a mine site? 366 00:23:40,720 --> 00:23:44,920 Speaker 2: First, I would tell you, as you look at autonomous mobility, 367 00:23:45,320 --> 00:23:49,480 Speaker 2: safety is the primary driver of why these technologies exist. 368 00:23:49,680 --> 00:23:52,800 Speaker 2: You know, in the US, ninety four percent of all 369 00:23:53,000 --> 00:23:56,240 Speaker 2: accidents and many tens of thousands of fatalities a year 370 00:23:56,320 --> 00:24:02,000 Speaker 2: are a result of human distraction, impairment, and error, and that's 371 00:24:02,040 --> 00:24:07,280 Speaker 2: a well known fact. Obviously, taking some of the faults 372 00:24:07,320 --> 00:24:11,159 Speaker 2: of the driver out of the equation by utilizing technology 373 00:24:11,280 --> 00:24:16,679 Speaker 2: that's never distracted, never impaired, and always on is an 374 00:24:16,720 --> 00:24:21,679 Speaker 2: important aspect of this. But it's not just about achieving 375 00:24:21,720 --> 00:24:24,920 Speaker 2: an equivalent level of safety, which is a common phrase 376 00:24:25,119 --> 00:24:27,359 Speaker 2: used in the standards for how you choose to 377 00:24:27,400 --> 00:24:30,280 Speaker 2: put an autonomous vehicle on the road.
You have to 378 00:24:30,359 --> 00:24:34,119 Speaker 2: prove that it's equal to or better than the driven 379 00:24:34,200 --> 00:24:37,040 Speaker 2: vehicle in the eyes of our government, the US Department 380 00:24:37,080 --> 00:24:41,600 Speaker 2: of Transportation and NHTSA in particular. Well, if you think 381 00:24:41,680 --> 00:24:45,080 Speaker 2: of the opportunity, and Juan hit on some of the 382 00:24:45,160 --> 00:24:51,159 Speaker 2: technologies in Lake Nona, to have roadside infrastructure that is 383 00:24:51,960 --> 00:24:56,959 Speaker 2: looking down a roadway, communicating with our vehicles, and telling 384 00:24:57,040 --> 00:25:00,560 Speaker 2: us that the trajectory of a particular car at a 385 00:25:00,600 --> 00:25:03,960 Speaker 2: particular speed means it's very likely to run 386 00:25:04,000 --> 00:25:08,240 Speaker 2: that red light. So it's not just about the vehicles themselves, 387 00:25:08,320 --> 00:25:12,640 Speaker 2: it's about that entire connected infrastructure and how you use 388 00:25:12,720 --> 00:25:17,800 Speaker 2: other technologies to give you views of scenarios or predict 389 00:25:18,600 --> 00:25:22,400 Speaker 2: the event that may happen. Given the information that we're 390 00:25:22,440 --> 00:25:27,639 Speaker 2: perceiving from roadside infrastructure or intersection infrastructure, that can be 391 00:25:27,680 --> 00:25:33,480 Speaker 2: fed to these vehicles to dramatically improve safety and reduce 392 00:25:34,080 --> 00:25:36,960 Speaker 2: some of these scenarios that, candidly, a human would never 393 00:25:37,480 --> 00:25:40,760 Speaker 2: see or understand from their vantage point just behind the 394 00:25:40,760 --> 00:25:43,520 Speaker 2: wheel of a car. And so I think those things 395 00:25:43,560 --> 00:25:46,200 Speaker 2: are equally important as the great work that's going 396 00:25:46,240 --> 00:25:49,320 Speaker 2: on with the autonomous platforms themselves.
397 00:25:50,080 --> 00:25:54,520 Speaker 3: Now looking into the future, Joe. As you know, AI 398 00:25:54,600 --> 00:25:58,960 Speaker 3: is evolving very rapidly, particularly around generative AI and even 399 00:25:59,000 --> 00:26:01,960 Speaker 3: just the visual AI capabilities, with new GPUs coming out 400 00:26:01,960 --> 00:26:06,080 Speaker 3: all the time. How do you place Beep strategically so 401 00:26:06,160 --> 00:26:08,679 Speaker 3: as to take advantage of any sort of new technologies that 402 00:26:08,760 --> 00:26:12,800 Speaker 3: come out, so that you're keeping ahead of the 403 00:26:12,840 --> 00:26:16,160 Speaker 3: competition and are also able to serve your communities better? 404 00:26:16,800 --> 00:26:21,040 Speaker 2: If you look at the future of autonomous mobility, obviously 405 00:26:21,119 --> 00:26:24,600 Speaker 2: the market that we are focused on, and you think 406 00:26:24,680 --> 00:26:30,400 Speaker 2: of expanded use cases and evolving from what today 407 00:26:30,640 --> 00:26:36,760 Speaker 2: in our world are planned services, planned routes, geofenced areas, 408 00:26:37,400 --> 00:26:40,800 Speaker 2: then the broader you expand the horizons of the 409 00:26:40,960 --> 00:26:46,560 Speaker 2: types of environments that these vehicles would ultimately traverse and serve, 410 00:26:47,480 --> 00:26:50,480 Speaker 2: it's just going to be very, very critical that we 411 00:26:51,240 --> 00:26:53,960 Speaker 2: as a business stay out in front of how we 412 00:26:54,080 --> 00:26:58,360 Speaker 2: leverage AI to improve what these vehicles are able to do.
413 00:26:59,119 --> 00:27:02,720 Speaker 2: It's going to be imperative for our business model to 414 00:27:02,880 --> 00:27:08,959 Speaker 2: succeed by utilizing the technology, and the AI technologies in 415 00:27:09,040 --> 00:27:14,200 Speaker 2: particular, to be able to understand, perceive, and properly respond 416 00:27:14,240 --> 00:27:17,199 Speaker 2: to these situations that are out there, both on our 417 00:27:17,280 --> 00:27:21,000 Speaker 2: roadways and in our vehicles, so that we can provide 418 00:27:21,040 --> 00:27:26,800 Speaker 2: a safe, convenient service for expanded use cases across the country. 419 00:27:27,680 --> 00:27:28,720 Speaker 3: Did you want to add to that? 420 00:27:29,520 --> 00:27:32,760 Speaker 4: Definitely, and maybe fast forward a little bit more into 421 00:27:32,800 --> 00:27:38,400 Speaker 4: the future. Today, we use AI and we use the 422 00:27:38,400 --> 00:27:41,840 Speaker 4: tools that we have in our toolkit to make things 423 00:27:42,480 --> 00:27:47,440 Speaker 4: safe and efficient, right, and that's definitely the right order 424 00:27:47,480 --> 00:27:51,360 Speaker 4: to take. I mean, safety is the number one concern, 425 00:27:51,400 --> 00:27:54,280 Speaker 4: and then making sure that it's efficient. But then once 426 00:27:54,359 --> 00:27:58,480 Speaker 4: you tackle those, I think AI opens the opportunity for 427 00:27:58,560 --> 00:28:02,440 Speaker 4: things that are very unique.
How about the vehicle recognizing 428 00:28:03,000 --> 00:28:05,919 Speaker 4: that the persons that are there, because we're able to 429 00:28:05,960 --> 00:28:09,760 Speaker 4: look into their schedules, have an extra two minutes, 430 00:28:10,200 --> 00:28:15,800 Speaker 4: and there's a side road that could be calm, 431 00:28:15,840 --> 00:28:18,640 Speaker 4: where they could see a lake? Or what if you're 432 00:28:18,640 --> 00:28:21,920 Speaker 4: able to figure out that there's a live event going on, 433 00:28:22,480 --> 00:28:25,960 Speaker 4: and instead of having only the opportunity for you to 434 00:28:26,040 --> 00:28:31,160 Speaker 4: attend because you're there, the system automatically redirects the nonessential 435 00:28:31,320 --> 00:28:34,439 Speaker 4: traffic to a route where you can actually listen to 436 00:28:34,520 --> 00:28:38,400 Speaker 4: live music as you go in? I think the experiential 437 00:28:38,440 --> 00:28:45,160 Speaker 4: opportunities of this intersection between technical AI for efficiency and safety, 438 00:28:45,320 --> 00:28:50,480 Speaker 4: coupled with, let's call it, human understanding powered by AI, 439 00:28:51,040 --> 00:28:54,400 Speaker 4: they open these intersections that we haven't thought about. Right? 440 00:28:55,000 --> 00:28:58,520 Speaker 4: Maybe when we get the next version of your routing 441 00:28:58,840 --> 00:29:01,440 Speaker 4: on your GPS, when you pull it up on your phone, 442 00:29:01,680 --> 00:29:04,920 Speaker 4: it's not going to say avoid tolls. It may say 443 00:29:05,760 --> 00:29:10,080 Speaker 4: bring my blood pressure down, right? It may say let 444 00:29:10,080 --> 00:29:12,960 Speaker 4: me discover the place that I'm in.
That's the thing 445 00:29:13,000 --> 00:29:16,120 Speaker 4: that really excites me: sure, we'll use the tools 446 00:29:16,160 --> 00:29:20,400 Speaker 4: to make sure we tackle the technical so that we 447 00:29:20,440 --> 00:29:21,960 Speaker 4: can deliver the experiential. 448 00:29:22,600 --> 00:29:25,160 Speaker 3: Okay, finally, I like to sort of wrap it up 449 00:29:25,160 --> 00:29:28,360 Speaker 3: with some ethical-type questions. We talked a little bit 450 00:29:28,400 --> 00:29:32,479 Speaker 3: about data privacy and user privacy. You do work with 451 00:29:32,520 --> 00:29:36,480 Speaker 3: a lot of local governments and local municipalities. I'd like 452 00:29:36,520 --> 00:29:38,760 Speaker 3: to get your thoughts on how do we strike that 453 00:29:38,800 --> 00:29:40,920 Speaker 3: balance, or even if indeed there is a balance, or 454 00:29:40,920 --> 00:29:46,080 Speaker 3: should we just ensure by default that users' privacy 455 00:29:46,160 --> 00:29:46,960 Speaker 3: is sacrosanct? 456 00:29:47,760 --> 00:29:51,360 Speaker 2: First, I mean, obviously, even with the data collected, we 457 00:29:51,520 --> 00:29:55,960 Speaker 2: have to honor the PII restrictions and other things that 458 00:29:56,080 --> 00:30:00,560 Speaker 2: exist in our country and certainly respect that right to privacy. 459 00:30:01,320 --> 00:30:03,920 Speaker 2: I will tell you that a lot of the information 460 00:30:04,080 --> 00:30:09,440 Speaker 2: that's gathered is not to identify details of an individual. 461 00:30:09,800 --> 00:30:14,920 Speaker 2: It's about taking that collective body of information to predict 462 00:30:15,000 --> 00:30:20,560 Speaker 2: certain outcomes or events and identify certain behaviors that would 463 00:30:20,720 --> 00:30:25,200 Speaker 2: enable us to address the situation or perform a different service.
464 00:30:25,480 --> 00:30:30,160 Speaker 2: But it's very, very critical that we're able to capture these images 465 00:30:30,240 --> 00:30:33,920 Speaker 2: and the information that we do to ensure we're improving 466 00:30:33,960 --> 00:30:37,520 Speaker 2: the safety and performance of these types of platforms and 467 00:30:38,000 --> 00:30:42,240 Speaker 2: work within, obviously, the respected boundaries that we all have. 468 00:30:42,920 --> 00:30:46,080 Speaker 3: For our audience, can you just define PII? 469 00:30:46,080 --> 00:30:50,360 Speaker 4: Sure. It's personally identifiable information, usually a collection of things that 470 00:30:50,400 --> 00:30:53,080 Speaker 4: can allow you to identify a person, like, for example, 471 00:30:53,440 --> 00:30:58,120 Speaker 4: your name, your address, your telephone number, and in some 472 00:30:58,200 --> 00:31:02,480 Speaker 4: other cases things like your biometrics, like your face, or 473 00:31:02,720 --> 00:31:06,840 Speaker 4: other things that are uniquely attachable to you. I mean, 474 00:31:06,920 --> 00:31:10,920 Speaker 4: other environments and other users of data I think have 475 00:31:11,000 --> 00:31:15,360 Speaker 4: a much tougher situation because they have to deal with 476 00:31:16,040 --> 00:31:19,480 Speaker 4: personally identifiable data to conduct their business, because who you 477 00:31:19,560 --> 00:31:23,720 Speaker 4: are is critically important to how they deliver the service. 478 00:31:23,840 --> 00:31:28,120 Speaker 4: It's not, yet, for what we do, and by just 479 00:31:28,200 --> 00:31:31,400 Speaker 4: not collecting the data and then making sure we have 480 00:31:31,480 --> 00:31:35,560 Speaker 4: no opportunity to actually look at one individual, only collective data, 481 00:31:36,080 --> 00:31:38,640 Speaker 4: we put ourselves in a situation that we are not 482 00:31:38,760 --> 00:31:42,840 Speaker 4: infringing on people's identities or privacy.
483 00:31:43,360 --> 00:31:47,880 Speaker 3: That's good to know. Thanks, Joe and Juan, for your time today. 484 00:31:47,920 --> 00:31:50,840 Speaker 3: It was really great talking to you and I've learned 485 00:31:51,080 --> 00:31:51,520 Speaker 3: a lot. 486 00:31:51,720 --> 00:31:52,400 Speaker 4: Thank you, Graham. 487 00:31:52,680 --> 00:31:54,160 Speaker 2: Yeah, thanks very much. Enjoyed it. 488 00:31:58,400 --> 00:32:01,000 Speaker 3: I would like to thank my guests Joe and Juan 489 00:32:01,040 --> 00:32:04,160 Speaker 3: Santos for joining me on this episode of Technically Speaking, 490 00:32:04,280 --> 00:32:08,560 Speaker 3: an Intel podcast. I gained significant insights from my guests 491 00:32:08,560 --> 00:32:11,400 Speaker 3: today and I hope you found it enlightening as well. 492 00:32:11,680 --> 00:32:14,360 Speaker 3: My primary realization is that AI and technology have the 493 00:32:14,440 --> 00:32:18,480 Speaker 3: power to shape and nurture local communities. I'm always inspired 494 00:32:18,480 --> 00:32:21,880 Speaker 3: by grassroots solutions as opposed to overarching, top-down strategies. 495 00:32:22,320 --> 00:32:25,800 Speaker 3: Both Joe and Juan emphasized the criticality of data privacy 496 00:32:26,160 --> 00:32:29,960 Speaker 3: and the necessity to protect users' personal details, particularly since 497 00:32:30,000 --> 00:32:32,800 Speaker 3: they are working with local governments and agencies. On a 498 00:32:32,840 --> 00:32:35,880 Speaker 3: technical front, it's evident that Beep is adapting and evolving 499 00:32:35,920 --> 00:32:39,680 Speaker 3: in its approach to autonomous vehicles. Currently, their shuttle models 500 00:32:39,680 --> 00:32:43,320 Speaker 3: are facilitated by attendants, but the trajectory suggests that in 501 00:32:43,360 --> 00:32:47,480 Speaker 3: a few years, these shuttles might operate autonomously with minimal supervision.
502 00:32:47,920 --> 00:32:51,440 Speaker 3: Watching this transformation unfold is genuinely exciting. While it's 503 00:32:51,480 --> 00:32:54,479 Speaker 3: easy to be captivated by new technology, and I'm no exception, 504 00:32:55,040 --> 00:32:58,200 Speaker 3: it's crucial to prioritize the user experience and the tangible 505 00:32:58,200 --> 00:33:02,080 Speaker 3: benefits it brings to enriching lives. From the Roman aqueducts 506 00:33:02,280 --> 00:33:05,960 Speaker 3: to present-day innovations, it's the relentless drive and commitment 507 00:33:05,960 --> 00:33:08,680 Speaker 3: of visionaries like Joe and Juan that propel us forward. 508 00:33:09,160 --> 00:33:11,840 Speaker 3: With a touch of luck and their pioneering spirit, we 509 00:33:11,920 --> 00:33:13,920 Speaker 3: may soon pave the way for a future that would 510 00:33:13,960 --> 00:33:18,760 Speaker 3: leave even the Jetsons in awe. Please join us on Tuesday, 511 00:33:18,800 --> 00:33:21,600 Speaker 3: December twelfth, for the next episode, when we will learn 512 00:33:21,640 --> 00:33:25,360 Speaker 3: about how Intel's AI for Workforce program is making learning 513 00:33:25,400 --> 00:33:31,840 Speaker 3: AI more accessible. Technically Speaking was produced by Ruby Studios 514 00:33:31,840 --> 00:33:34,960 Speaker 3: from iHeartRadio in partnership with Intel and hosted by me, 515 00:33:35,160 --> 00:33:39,440 Speaker 3: Graham Class. Our executive producer is Moley Sosha, our EP 516 00:33:39,600 --> 00:33:42,880 Speaker 3: of Post Production is James Foster, and our supervising producer 517 00:33:43,080 --> 00:33:47,120 Speaker 3: is Nikkia Swinton. This episode was edited by Cira Spreen 518 00:33:47,480 --> 00:33:59,680 Speaker 3: and written and produced by Tiree Rush. Where do world- 519 00:33:59,720 --> 00:34:02,960 Speaker 3: changing ideas get their start? At Intel.
It starts with 520 00:34:03,080 --> 00:34:07,880 Speaker 3: real solutions, and real solutions start with exceptional engineering: the 521 00:34:07,960 --> 00:34:11,840 Speaker 3: quantum computing revolution, the next generation of AI experts, the 522 00:34:11,920 --> 00:34:16,320 Speaker 3: renewable energy grid, liquid-cooled data centers, early diagnosis for cancer, 523 00:34:16,480 --> 00:34:20,440 Speaker 3: water restoration, and even farmland protection. The examples are countless, 524 00:34:20,680 --> 00:34:23,880 Speaker 3: the impacts are endless, but the foundation is always the same. 525 00:34:24,120 --> 00:34:28,000 Speaker 3: It starts with Intel. Join us in redefining what's achievable 526 00:34:28,120 --> 00:34:31,040 Speaker 3: through the power of AI. Learn more at Intel dot 527 00:34:31,080 --> 00:34:32,400 Speaker 3: com slash Stories.