Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. This week, we're taking you on the road for a special episode recorded in Doha, Qatar. Now, before you start imagining sand dunes and pure blue coastlines, I should describe my surroundings. I'm currently sitting in a makeshift studio in a massive, airplane-hangar-like conference center called the Doha Exhibition and Convention Center. So if this episode sounds a little bit different, you'll know why. For the last couple of days, I've been attending Web Summit Qatar, an international conference where investors and entrepreneurs and thought leaders gather from around the world to talk about the future of technology. Our friends at iHeart are the official podcast partner of Web Summit, which means that I'm here, and so is a very familiar figure, Jonathan Strickland.
Speaker 3: Well, hello. I'm so pleased to be a guest on this historic, amazing podcast I've heard so much about.
Speaker 1: Jonathan, welcome back to Tech Stuff. How does it feel to be on the other side of the table?
Speaker 3: I'll be honest, it feels odd, but I'm so pleased to be able to be a part of this. I've been joking all week about, you know, just when I thought I was out, they pulled me back in.
Speaker 1: Absolutely. Well, I'm very pleased to see you. It's been almost seven years, I think, since we saw each other in person. So who knew the next time would be in Doha?
Speaker 3: And yeah, here I am. Here we are.
Speaker 1: Based on the listener emails I've been receiving, I think the listeners are going to be very, very, very happy to hear your voice today. You're already missed.
Speaker 3: Well, just for you guys: Hey there, welcome to Tech Stuff. There you go. What the tech? The tech?
Speaker 1: So nice to have you visit, and I hope, I hope this will be the first of many visits. Absolutely, and not just in Doha.
Yes, I would love to do this on Eastern time. So, there have been some fascinating speakers already at the conference, and I wanted to talk to you today, Jonathan, about some of what I've heard, hoping that you can contextualize it kind of from a bird's-eye view, having covered technology as a journalist for two decades.
Speaker 3: Sure.
Speaker 1: And attended a bunch of these conferences.
Speaker 3: Yeah, conferences in the tech world in general play an interesting role. So I've been to lots of different kinds. I've been to trade shows like CES or E3, I've been to ones that have been held by specific companies such as Intel or IBM, and Web Summit, of course, is more of a general tech conference with an Internet focus on it. And there have been some pretty phenomenal speakers. Actually, they've really impressed me.
Speaker 1: And when you come to these types of events, what do you look out for, both in terms of kind of getting a peek around the corner at what may be coming in the world of tech, but also in terms of applying some kind of critical lens to what you're hearing?
Speaker 3: That is an excellent question. So let's take CES, for example. Everyone is looking for the quirky thing that is a standout from the standard stuff you would run into. So an example would be, from a couple of years ago, the little Rabbit AI, you know, pocket-sized computer device, which ended up not doing very well when it finally came out. That's often the case, where something that's quirky gets your attention because it's different, but it turns out it's not really practical, or maybe it doesn't work as well as was presented. So that's one thing you have to look out for: yes, it's exciting in that it's different, but at the same time you have to ask the question, okay, does this actually solve a problem?
One of the speakers we had yesterday actually talked about this, about how innovation for innovation's sake is a fool's errand, and that really innovation needs to be applied to looking at real-world problems, not inventing one. So that's one thing. But I'd say another is you look to see what people are talking about, what they're excited about, and how they're talking about it, so that you can kind of get a feel for what is going to be the next push, and keep in mind that not all those pushes are going to be successes. So as an example, back in two thousand and eight when I went to my first CES, the big push back then was from companies like Panasonic and Sony, where it was all about 3D television. Yeah, now, you and I know that 3D television never really took off. People balked at the idea of having glasses that they had to wear at home, and the expense, and the lack of content. All of these things contributed to the failure of that technology. So when you're being exposed to these new ideas and people are really excited about them, you have to temper that a little bit with the reminder that not everything that's being talked about is going to manifest, or if it does, it won't manifest in the way we anticipate. But if you do that and you proceed with a kind of cautious optimism, I think great things can be achieved.
Speaker 1: Yeah. With that in mind, I mean, there are a bunch of kind of smaller booths where people are presenting on the floor, largely on laptops, I guess it's like AI applications, but then there's a main stage where there have been some really interesting speakers. So I've been to three talks so far, and I kind of want to talk about each of them briefly to get your view of how they exist in the context of the history of technology.
Speaker 3: Sure.
Speaker 1: The first one was about the global chip race and the continuing waves from DeepSeek, the second was about augmented reality's present and future, and the third was AI and journalism. A fun topic, especially for us journalists who cover technology.
Speaker 3: Yeah, especially for those of us who worked for a company that famously eliminated its editorial board in order to replace it with AI.
Speaker 1: Wait, wait, which company?
Speaker 3: HowStuffWorks.
Speaker 1: Really?
Speaker 3: Yeah, back in twenty twenty three. People have heard me talk about this on the show before, but if you're new here, I worked for HowStuffWorks dot com and I was a writer there, and in twenty twenty three the company decided to stop working with freelance writers. They had a freelance writing crew, and then they had an editorial crew in house, and the idea was that they were going to go with AI-generated articles from that point forward. The editors would have to do a full edit pass to make sure that everything was accurate and correct, and then correct anything that wasn't right, and as you know, generative AI can sometimes confabulate and make things up. The editors protested, and then they were eliminated, and everyone I knew who worked at HowStuffWorks was let go.
Speaker 1: Wow.
Speaker 3: Yeah. So I say this not so people will take up torches and pitchforks and yell or anything, but rather to explain that I am very much aware that I have a very strong bias here. Like, I cannot be objective, is what I'm trying to get at.
Speaker 1: I mean, I think if you work as a journalist and you're constantly hearing about the idea of being replaced by AI, it's not the most appealing seat to sit in, that's for sure.
Speaker 3: And we've been through similar shifts before, like the pivot to video being the famous one, right? We saw editorial departments decimated because the idea was that the written word was no longer the way to deliver, and it was going to be video only. But it didn't last very long, and by that time you had all these journalists who were out of work. So obviously, as journalists, we have a sensitivity to these things.
Speaker 1: So we'll come back to that, but let's start with chips. The talk I went to yesterday was from Andrew Feldman, who is the CEO and co-founder of Cerebras, which is an AI chip manufacturer that has big ambitions to take on Nvidia. They manufacture the world's largest AI chips, and they're hoping to IPO in the US this year to add fuel to their ambition to kind of take on Nvidia, and the company recently started offering DeepSeek running on its servers housed in the US. DeepSeek obviously being the Chinese AI model company that managed to create this high-performing reasoning model despite US export controls on Nvidia's most advanced chips, and Feldman talked about being, quote, crushed by demand ever since offering DeepSeek on those servers. But first of all, let's take a couple of steps back, because I think last year you did a great episode on Tech Stuff with the title What Are AI Chips?, and I think a brief refresher, while the world talks about them, would actually be super helpful.
Speaker 3: Oh, absolutely. So if you think about processors in general, if we're talking about your classic processors for computers, you really have three major types at this point. You have CPUs, the central processing unit, that's the bog-standard basic processor. Typically they're very good at performing high-speed arithmetic operations in sequence, right? And then you have GPUs, or graphics processing units. These we associate with things like gamers, like if you're a real gamer, you got yourself a killer GPU.
These are really good at parallel processing, where they can take multiple threads of operations and run processing on them simultaneously, which can sometimes, depending on the type of computing problem, work faster than a CPU could for specific types of processes. NPUs, or neural processing units, are the new hotness, and they're kind of like GPUs but on steroids. They're even more about parallel processing. They're optimized to run the types of processes that your typical AI operations require these days. So the laptop I have in front of me right now has an NPU in it. So there's a neural processing unit in my little laptop that will run those kinds of operations natively on the computer, that don't require you to have a cloud connection, so that you're not shipping all your data off to the cloud to get processed and then sent back to you once it's done. Obviously, that brings up questions of privacy and security. So one of the big attractive features of having NPUs is you can run those processes natively on your own devices and not have to depend upon some third party being able to access the information, especially in a world where we worry about the information being used to train future models of AI.
Speaker 1: Had you come across this company before?
Speaker 3: Cerebras? Only in Greek mythology, where the three-headed hound of Hades is guarding the gates.
Speaker 1: No.
Speaker 3: Actually, I have heard of it before, but I had not really looked into it. One of the downfalls of my era of Tech Stuff, and I often said this, is that being in the United States and being an American meant that it frequently had a very strong American perspective, and it meant that companies that were operating outside of America often got less focus on my show.
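A quick illustration of the on-device point Jonathan is making: running a model locally through an NPU-capable runtime means the data never leaves the laptop, as opposed to a round trip to a cloud API. The sketch below is not from the episode; it assumes a hypothetical model.onnx file and uses ONNX Runtime, where the specific NPU execution provider available (QNNExecutionProvider is just one example) depends on your hardware and build.

# Minimal sketch: local inference with ONNX Runtime instead of a cloud call.
# "model.onnx" is a hypothetical placeholder; provider names vary by machine.
import numpy as np
import onnxruntime as ort

available = ort.get_available_providers()  # which backends (NPU/GPU/CPU) exist here
preferred = [p for p in ["QNNExecutionProvider", "CPUExecutionProvider"] if p in available]

session = ort.InferenceSession("model.onnx", providers=preferred)

# Dummy input shaped like a typical image model; real inputs depend on the model.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {session.get_inputs()[0].name: x})
# Nothing left the machine: no data was shipped off to a third-party cloud.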
Speaker 3: So one of the things that I think is cool about Web Summit is I'm encountering companies where maybe I've heard the name, but I really didn't know much about them, and I'm starting to learn a lot more about them and have a greater appreciation for them.
Speaker 1: It was interesting that DeepSeek was such a big part of this talk, I mean, not surprising. When did this narrative of, like, chip wars or the chip race start to kind of, you know, crystallize? When was that, when you were hosting Tech Stuff?
Speaker 3: So a few years back, Nvidia was starting to get incredibly popular, not through the AI world, but because of cryptocurrency. Because for cryptocurrencies like Ethereum, you would still use parallel processing in order to attempt to mine a block first and get that reward. For things like Bitcoin, the value had gone so high that you were looking at purpose-built chips for that, we had gone beyond GPUs. But Nvidia was riding high because of that, and it meant the gamers were really upset, because the chips themselves were in very short supply, and often where you would find them, in the aftermarket, they were marked up to ridiculous prices, and they're already quite expensive. So Nvidia was already in the news then, and their stock price was soaring already because of the popularity of the chips. Once the AI industry started to really take off toward the end of twenty twenty two, that's when Nvidia really flourished, and in an incredibly savvy move, they began to reposition themselves as a company that made AI chips and not just the GPU company.
Speaker 1: Right, they sort of forced themselves into being considered a national security, or a kind of critical national, company.
Speaker 3: Yeah, like it was them or no one, that was kind of the view. And so that's when Nvidia went from being really a pretty powerful company to often battling in the top three spots for most valuable company in the world.
Speaker 1: Because of this chip race or chip wars narrative that they were able to both profit from and also drive. But also, I mean, it's true, right? I mean, that has been true.
Speaker 3: Yeah, yeah.
Speaker 1: What happened next?
Speaker 3: Yeah, it's not boasting if you can back it up, right? And you certainly could argue that there was exploitation going on, like they were exploiting the narrative, and in some part I'm sure that's true, because, as I'm certain you have noticed with the craze around AI, there's a nugget of truthfulness at the center, but there's a lot of hype around it.
Speaker 1: Yeah. Well, it was an interesting presentation yesterday. I mean, I would say taking on Nvidia is a lofty goal.
Speaker 3: Absolutely. But we've seen that happen before too, right? Like if you go back...
Speaker 1: I guess the way Nvidia disrupted Intel, which is on the floor.
Speaker 3: That's exactly what I was going to say. Like, if you go back to the nineties, you could ask someone, do you think Nvidia is going to overtake Intel? And you'd be laughed out of the building. So there is precedent.
Speaker 1: What did you... I mean, you were out of the chair when DeepSeek happened. That was like my second week as the Tech Stuff host, and I was like, oh my goodness, this is, for me, I think, probably the biggest tech story I've seen since the ChatGPT release in November twenty twenty two. I mean, just the reaction from the world, from the market, like everyone wants to know what's going on. But what did you think in the moment, and how have you digested it since then?
Speaker 3: So I had two very strong reactions when the DeepSeek information started to become mainstream news.
One was, how interesting, because we've just been through a year of the biggest fuss ever being made over TikTok, and here we have an artificial intelligence system from China, where the perceived threat of TikTok is, you could argue, minuscule compared to an AI company that is completely dependent upon devouring as much information as possible and then making use of it. Like, that should be the big national security concern, if in fact that's the real reason why you're concerned about TikTok. We could go into a crazy conversation about whether or not that's true, and I'm not going to do that here.
Speaker 1: Especially because the TikTok booth is right next to it.
Speaker 3: Well, listen, every time I go by, I give reverence and I floss a little bit. But no, the other reaction I had was that, golly, people have really been waiting for a challenger to ChatGPT. And I think that tells you that, as excited as people are about ChatGPT and OpenAI in general, the need for there to be competition, and for perhaps there to be a check on OpenAI's otherwise dominance in the space, is one that a lot of people were feeling, even if they weren't able to articulate it.
Speaker 1: Yeah, I think... I mean, the point that the Cerebras guy was making yesterday was that they run these, you know, servers with their chips that run the open source version of DeepSeek's model, but in the US, so that allays some of the... he basically said explicitly, like, don't use the DeepSeek product itself. I mean, that's kind of interesting. I think the second point you make, about people wanting a challenger to OpenAI, is very true.
I mean, emotionally, nobody likes, you know, one big bully in the playground, right? But I think when you're in a place like Doha, you also understand how the idea of a competing ecosystem of different AI models, and maybe a cheaper way to use AI, is an incredibly attractive thing in a world where, you know, the concept of American hegemony is fast receding and people want to know what's coming next. So there's this tech element of geopolitics, which is super fascinating.
Speaker 3: Yeah, and again, that's something that's probably the most valuable takeaway I'm going to have from this. I say that and it's only the second day, but the most powerful takeaway I'm going to have on this trip is that opening of my perspective.
Speaker 1: Coming up, Jonathan Strickland and I dig into the future of augmented reality. Stay with us.
Speaker 1: So there's a lot to say about chips, but I wanted to move on to another technology that you've covered closely over the years, augmented reality. I also went to a talk given by a gentleman called Qi Pan of Snap, and he came out wearing these very futuristic, RoboCop-like AR Snapchat glasses, and his view of the audience was supposed to be broadcast on the screens behind him. Unfortunately, the WiFi didn't want to cooperate, so the poor guy came out wearing these, you know, spectacles, and then...
Speaker 3: No, no first-person view? Oh, that hurts.
Speaker 1: I know, especially when you're giving it to, I mean, like a thousand people in the audience, and your big razzle-dazzle opening moment doesn't work.
Speaker 3: I don't know about you, Oz, but I get secondhand embarrassment really easily. How ridiculous did the glasses look?
Speaker 1: I mean, they looked... There was a famous Greek shipping billionaire in the sixties called Aristotle Onassis, who married Jackie Kennedy.
Yeah, and he wore these great big sunglasses that cover about half his face. These looked literally like they'd been pulled off Aristotle Onassis's face.
Speaker 3: I mean, I was at CES the year that Lady Gaga came out and promoted her Polaroid glasses, and I'm like, okay.
Speaker 1: We've seen big, big, big glasses. But anyway, Qi Pan did a valiant job. And I was thinking, probably as you are, also feeling quite a lot of empathy, that in his head his whole mind is screaming, this is a failure, blah blah, but he had to just keep going, soldiering on. And to be fair, he did, and actually it was a good talk. He kind of broke down the three phases of augmented reality at Snapchat. Phase one was the phone capturing the face and the computer vision basically knowing, those are the eyes, that's the nose, that's the mouth, so when you want a puke rainbow, that's where it should come from. The second phase was the computer vision understanding human bodies and feet and legs, so that you could basically put gloves on, you could wear a costume, where it would know where your arms are.
Speaker 3: It could map properly to the right location on the body.
Speaker 1: Exactly. And then phase three is basically capturing the world. So these spectacles are kind of looking out, and I keep calling them spectacles because that's what the product is called, they're called Spectacles. I don't normally say spectacles, I normally say glasses, just FYI. But basically, you know, the big leap forward is that rather than just understanding the human body, which is hard enough, this actually understands the whole world around you, in terms of being able to interpret what objects are, knowing what a surface is, knowing how far away things are, so that you can basically create a virtual overlay on the physical world.
And one of the really cool demos was actually somebody wearing these glasses and looking at a bunch of ingredients, and then the glasses sent the picture of the food in real time to the cloud, which batted back a gen-AI-suggested recipe.
Speaker 3: Cool.
Speaker 1: Yeah, which is fun. I mean, I can't imagine it's going to be a huge consumer use case, but it's fun.
Speaker 3: It's making me think of, I don't know if you remember this, but Google had Google Chef for a while, where you could do something similar. But obviously you're typing the things. It could not, you know, do image processing and do this, but you could type in the things you had and it would suggest different recipes for you. But it had the same foibles, I'll say, as generative AI, in that occasionally you would get something where you're like, well, that sounds inedible, but all right.
Speaker 1: I'm afraid if I just looked in my fridge, the glasses might just say, get a Febreze.
Speaker 3: Exactly, it's time for takeout.
Speaker 1: Exactly. But you mentioned Google. I think you were the proud owner of some of the very good specs.
Speaker 3: Yeah, so not only was I an owner of Google Glass...
Speaker 1: You removed the word proud. Yeah.
Speaker 3: Well, I was not going to call attention to it, but it's true. I think Google Glass was a noble effort. I think it was nowhere close to being ready to be a consumer product, which I think Google actually understood. I mean, they never really marketed it as a mainstream consumer product. But when I had them, I could see the potential, and I thought that it was an incredible potential that certainly was nowhere close to being realized yet.
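For what it's worth, the recipe demo described above boils down to a simple round trip: capture a frame, ship it to a hosted multimodal model, and get text back for display in the glasses. Here's a minimal sketch of that pattern using the OpenAI Python client purely as a stand-in; the talk didn't say which model or service Snap actually uses, and the file name, prompt, and model are placeholders.

# Minimal sketch of the capture -> cloud -> suggestion round trip described above.
# The OpenAI client and "gpt-4o" are stand-ins; Snap's actual backend wasn't specified.
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

with open("ingredients.jpg", "rb") as f:  # pretend this frame came from the glasses' camera
    frame_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "These are the ingredients in front of me. Suggest a recipe."},
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{frame_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)  # the suggested recipe, sent back down to the device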
Speaker 3: Let's say you're walking around Doha and it's the first time you've ever been here. Having the ability to see directions in front of your eyes, or at least in a way that's not going to obscure your vision, so that you can seamlessly navigate a city you've never been to, that's incredible. Or being able to look at a building and get a listing of the different businesses that are in there, so that late at night, when you're craving Nando's, you don't spend forty-five minutes walking around a mall wondering where it is. You know where it is immediately.
Speaker 1: In twenty seventeen, you did an episode of Tech Stuff which asked the question, is augmented reality ready for prime time?
Speaker 3: Yes?
Speaker 1: That was seven years ago.
Speaker 3: Yes, I did, I did ask that, and no, it's not. Well, there's a chicken-and-egg problem that's going on, and that is the classic chicken-and-egg problem of hardware versus software. Basically, the idea is that you have people who are making the hardware, and when they create something, even if it's really compelling, if there's not enough applications for that hardware, there's not enough reason for people to buy it. On the flip side, if you're a software developer and you see this really exciting technology come out, you're thinking, wow, that's really cool, but I don't want to start dedicating resources to building assets for this hardware until there's enough of a user base to justify it. We talk about this with game consoles all the time. A new game console comes out and you're like, the console's incredible, but there's nothing to play on it, so I'm going to wait. And so it's this ongoing circular issue, and it can be really hard to break out of that.
Speaker 1: I mean, I think the software side is pretty interesting, right? I mean, the advances in AI on the computer vision side have been pretty amazing.
Speaker 3: Yes, yes.
Speaker 1: And you know, if this DeepSeek story plays out and it becomes way, way, way cheaper to run AI models everywhere, the idea of real-time computer vision could potentially unlock this AR reality. What do you think needs to happen for the situation you just described, of Jonathan Strickland walking down the street in Doha wearing his glasses?
Speaker 3: I think probably the biggest issue, honestly, is the battery issue, right? Because you can only miniaturize batteries so much before you get to a point where you don't have enough juice to provide power to a sophisticated device for more than maybe an hour. And if that's all you need it for, that's fine. But I think for a lot of people, the thought of a device that they can wear for an hour and then need to recharge is quite frustrating. I think it's one of the reasons why things like active 3D glasses were a non-starter, because even though they provide an incredible experience, having to recharge your glasses every couple of hours, like, you know, if you're watching a Peter Jackson movie, you might have to stop in the middle so that you can recharge your glasses to watch the rest of the film. So I think that's really the big issue: how do you miniaturize the technology in such a way that the glasses are something that you want to wear, they look cool, but there's still enough power capacity there to provide a good experience for more than a short time? And I don't know the answer to that, because batteries are dependent upon chemistry, and we can hack technology really quite effectively, but with chemistry you start to run against fundamental laws of the universe and it starts to get a lot trickier.
Speaker 1: So, I haven't had a chance to attend any of it, but there's a whole kind of new energy track at this conference where they're talking about new battery technology.
Right, it's interesting to be here, where, you know, natural gas is coming out of the ground to the tune of trillions of dollars, and there's a lot of interest here in investing some of the proceeds in these chemistry issues.
Speaker 3: Yes, right. And it may be that there are some breakthroughs that can come through that make it more of a practical application, and I think that's what really is needed. It ends up transcending the issues of hardware and software, and it starts to get to, we've got all the technical capability here, apart from, where's the power source coming from? And I think that's going to be the big thing, honestly. Obviously Apple was trying to get there, where they wanted to have a really kind of lightweight pair of glasses that had incredible augmented reality capabilities. Instead we got something like futuristic robotic ski goggles, you know. And if you think, well, Apple being a multi-trillion-dollar company, if they're starting to hit a wall there, then we're probably at a point where it's just going to be a while before we start seeing something that people think of as, oh, this is something I want to wear on a regular basis, as opposed to, I have a specific application in mind and I don't mind wearing it for that application, whether it's, like, industrial or educational or military, of course, or gaming, something like that, where you're thinking, all right, I'm gonna wear this for forty-five minutes to an hour or whatever, and then I put it aside. Like, I think the real dream of augmented reality is you have something that you can wear pretty much all the time and activate whenever you need to.
Speaker 1: When we come back, Jonathan Strickland and I continue to unpack what we've learned at Web Summit Qatar. Stay with us.
Speaker 1: So finally, the last talk that I went to that I want to discuss with you, which you've already touched on, was from Almar Latour, the CEO of Dow Jones, which is the parent company of The Wall Street Journal, and he spoke about the future of journalism in the age of AI. Unsurprisingly, he was very focused on how IP creators and owners, such as news publishers, can hold onto the value of their work, and he mentioned there are two ways of achieving this: commercial partnerships and litigation.
Speaker 3: Yeah, yeah, I'm not surprised. So obviously this is a very tricky topic, right, with a lot of different components to it, one of which is that, as people who generate content, people who generate news content, we are very well aware that the things we're putting out, which are meant to inform, maybe entertain, maybe open up people's eyes to new perspectives on certain topics, are also being used to train AI. So one component that scares me about AI and journalism is just the idea of AI benefiting from the work of journalists while the journalists see no benefit in return. So it's essentially the copyright issue, is what it comes down to. And then the flip side is the concern, like what I saw at HowStuffWorks, where a company, and perhaps it's a company that's actually in dire straits, is really looking at, how can we reach a point where we're still able to provide the services that we're trying to provide without bleeding ourselves dry, because we're in a business that just does not have a huge return on investment, and sadly, the media often can be that. So you have these companies that have an incentive to say, well, you know, human resources, our staff,
that's a really big expense, and if we could just cut them and use this tool to do the same thing they were doing, and perhaps just have a few people left behind to massage whatever is made into something that the general public can consume, and maybe they don't even notice the difference, why don't we do that? And we've seen why you don't do that. It's because the tools that are being made are fallible, and sometimes to a point that is disturbing. I told you this, Oz, and there's an episode of Tech Stuff, I think it's called something like AI Wrote This Episode, Sort Of, and in that I had it write an episode of Tech Stuff. All I gave it was very simple instructions. I used ChatGPT and I said, write an episode of Tech Stuff about the technology of airbags. That was it, like, I didn't give it any further instruction. And as part of what it regurgitated to me, it gave me statements from three supposed experts, but none of those people existed, which means automatically you cannot trust the information. And when I asked for things, like, I followed up and I said, could you give me a source for this information? And it wouldn't be able to. So these things are going to be things that improve over time. But the problems we're seeing now concern me, because whether the technology actually gets better, so it's being more accountable, or it just gets better at obfuscating when it's making stuff up, I don't know that we'll be able to tell the difference.
Speaker 1: Yeah, I mean, Latour was talking about, basically, how do we at Dow Jones use these tools and make sure we get compensated for our work, so that we can make more money, so that we can fund more journalism.
Speaker 3: It's a very pragmatic sort of look, but it makes sense. Like, yeah, balancing art and commerce is always a struggle, no matter what outlet you're looking at.
Speaker 1: It's interesting as well, though, because he wasn't just talking in the abstract. Like, right now, Dow Jones is in partnership with OpenAI and suing Perplexity.
Speaker 3: Wow.
Speaker 1: So with OpenAI, they made this deal last year that The Wall Street Journal reported, on its parent company, could be worth more than two hundred and fifteen million dollars over the next five years in both cash and credits for OpenAI technology. But last year Dow Jones also sued Perplexity, the AI search engine, and the suit's now public, and Almar didn't want to go into details about it, obviously, despite the best efforts of Sara Fischer from Axios on stage. But the lawsuit has been reported on, and basically, you know, the headline was that Dow Jones alleges Perplexity is hallucinating fake news and attributing it to real papers, and that's illegal.
Speaker 3: Oh yeah, so very similar to what I was just saying, exactly right. Yeah, this is like someone writing a term paper and having a citation for a source that doesn't exist.
Speaker 1: It's like somebody making something up and then citing it to a professor when the professor never said it.
Speaker 3: Right, right, right. And yeah...
Speaker 1: That's what this suit is about. Absolutely, yeah. Faking references, in a way that's even more egregious, to your point about AI getting better at covering its tracks when it makes stuff up. When you're doing fake citations, that's pretty...
Speaker 3: I mean, it's terrifying, because, like, obviously this ends up becoming like a political term in some cases, but, like, the whole fake news thing, which often was being used to try and delegitimize real news sources that were just saying things you didn't like, yeah, you call it fake news so that you can dismiss it.
But 624 00:32:08,320 --> 00:32:12,840 Speaker 3: we're actually talking about actual fake news, news about 625 00:32:12,880 --> 00:32:15,800 Speaker 3: stuff that didn't happen, or at least didn't happen the 626 00:32:15,800 --> 00:32:18,320 Speaker 3: way that you're being told it did, but it's being 627 00:32:18,400 --> 00:32:20,680 Speaker 3: put down in the record as if that, in fact, 628 00:32:20,880 --> 00:32:24,080 Speaker 3: is what happened. How do you then get to the 629 00:32:24,120 --> 00:32:30,000 Speaker 3: point where you can determine reality from computational fiction? 630 00:32:30,440 --> 00:32:32,800 Speaker 1: Yeah, yeah, I mean the litigation stuff is very interesting. 631 00:32:33,160 --> 00:32:35,880 Speaker 1: Obviously, if you look back to Victorian times and 632 00:32:35,920 --> 00:32:39,360 Speaker 1: the aftermath of the First Industrial Revolution, that was when 633 00:32:39,440 --> 00:32:43,360 Speaker 1: copyright laws took effect. And so, you know, as the 634 00:32:43,400 --> 00:32:46,360 Speaker 1: New York Times lawsuit against OpenAI potentially comes 635 00:32:46,400 --> 00:32:50,720 Speaker 1: to trial, as the Dow Jones lawsuit against Perplexity potentially comes to trial, 636 00:32:51,080 --> 00:32:52,840 Speaker 1: we're going to see new precedent, and it'll be really 637 00:32:52,880 --> 00:32:55,400 Speaker 1: interesting to see how that shakes out and how that 638 00:32:55,600 --> 00:32:58,280 Speaker 1: shapes our future. I mean, the power of law to 639 00:32:58,440 --> 00:33:02,440 Speaker 1: shape reality is so huge, and copyright law has been 640 00:33:02,840 --> 00:33:05,040 Speaker 1: such a fundamental part of our society and who we are 641 00:33:05,160 --> 00:33:05,800 Speaker 1: for so long. 642 00:33:06,440 --> 00:33:10,480 Speaker 3: Yeah, absolutely fascinating. It also goes back to that 643 00:33:10,600 --> 00:33:13,520 Speaker 3: old saying. It's almost a cliche, but you know, necessity 644 00:33:13,600 --> 00:33:16,240 Speaker 3: is the mother of invention, and in this case, necessity 645 00:33:16,360 --> 00:33:19,240 Speaker 3: is the mother of the need for new laws. So 646 00:33:19,880 --> 00:33:24,400 Speaker 3: with copyright, before you get to things like mass printing, 647 00:33:24,840 --> 00:33:27,640 Speaker 3: copyright really wasn't that big of a concern, just because 648 00:33:27,640 --> 00:33:29,560 Speaker 3: it was such a pain in the butt to make 649 00:33:29,600 --> 00:33:32,000 Speaker 3: a copy of a work. But then you get to 650 00:33:32,040 --> 00:33:36,560 Speaker 3: a point where technology is capable of giving people opportunities 651 00:33:36,600 --> 00:33:39,200 Speaker 3: to do things on a much larger scale. That's where 652 00:33:39,200 --> 00:33:41,800 Speaker 3: you start to see the need for new law. And 653 00:33:41,880 --> 00:33:44,440 Speaker 3: I suspect we're going to see a lot of changes 654 00:33:44,480 --> 00:33:48,680 Speaker 3: to law in the next few years as a result 655 00:33:48,840 --> 00:33:50,960 Speaker 3: of the rise of technologies like AI. 656 00:33:51,440 --> 00:33:52,760 Speaker 2: Yeah, I also thought it was interesting. 657 00:33:52,840 --> 00:33:57,080 Speaker 1: I mean, obviously, Dow Jones is, you know, a Murdoch company, 658 00:33:57,280 --> 00:34:01,719 Speaker 1: so perhaps no surprise that they're a pugnacious bunch. 659 00:34:02,160 --> 00:34:06,120 Speaker 3: Yes, that's putting it kindly.
660 00:34:06,080 --> 00:34:08,160 Speaker 1: You know, but the point that he made was basically, when 661 00:34:08,160 --> 00:34:11,480 Speaker 1: the Internet emerged, you know, twenty five years ago ish, 662 00:34:11,520 --> 00:34:16,880 Speaker 1: as a mass adoption technology, the news organizations, the music 663 00:34:16,960 --> 00:34:22,000 Speaker 1: rights holders, you know, the film and TV industry essentially allowed 664 00:34:22,080 --> 00:34:25,560 Speaker 1: the consumer expectation to develop that content was free, yes, 665 00:34:25,880 --> 00:34:30,120 Speaker 1: and that essentially hollowed out those industries to, you know, 666 00:34:30,160 --> 00:34:32,400 Speaker 1: the extent we're seeing today, where all of them 667 00:34:32,440 --> 00:34:34,840 Speaker 1: are really struggling. So his point was, how do we 668 00:34:34,920 --> 00:34:37,759 Speaker 1: make sure, now that we're at a new technological inflection point, that 669 00:34:37,800 --> 00:34:40,600 Speaker 1: we don't roll over again? And you know, so I 670 00:34:40,600 --> 00:34:41,520 Speaker 1: thought that was pretty interesting. 671 00:34:41,680 --> 00:34:44,920 Speaker 3: You know, that's an incredibly good point. I mean, we 672 00:34:45,000 --> 00:34:48,640 Speaker 3: are where we are today in part because we've all 673 00:34:48,840 --> 00:34:53,160 Speaker 3: had the expectation that the content we want to 674 00:34:53,840 --> 00:34:56,400 Speaker 3: access at any given moment, at any point of the 675 00:34:56,480 --> 00:34:59,759 Speaker 3: day, should be easy and free to access. That's a 676 00:34:59,800 --> 00:35:02,560 Speaker 3: great thing in many ways for the individual, but 677 00:35:02,640 --> 00:35:05,520 Speaker 3: it does put an incredible burden on the entities that 678 00:35:05,560 --> 00:35:09,600 Speaker 3: are actually creating that content, whether you are a solo content 679 00:35:09,680 --> 00:35:13,799 Speaker 3: creator and you're just trying to make your passion your occupation, 680 00:35:14,280 --> 00:35:16,880 Speaker 3: or you're a big media company and you're trying to 681 00:35:16,920 --> 00:35:20,960 Speaker 3: put out really incredible content, which costs a lot of 682 00:35:20,960 --> 00:35:24,560 Speaker 3: money to do, and then you're getting tiny little pennies 683 00:35:24,560 --> 00:35:29,439 Speaker 3: in return. It's really hard in today's world to have 684 00:35:29,560 --> 00:35:32,640 Speaker 3: the same kind of media output that we would expect 685 00:35:32,640 --> 00:35:35,440 Speaker 3: from, say, the early two thousands and have it be 686 00:35:35,480 --> 00:35:38,440 Speaker 3: a profitable business. And if there is no profit, there 687 00:35:38,480 --> 00:35:40,880 Speaker 3: is no business. If there's no business, there's no content. 688 00:35:40,960 --> 00:35:44,160 Speaker 3: So ultimately it does come back down to us to say, 689 00:35:44,960 --> 00:35:49,239 Speaker 3: how do we create a world where this technology can 690 00:35:49,400 --> 00:35:55,279 Speaker 3: be practical and useful, not dangerous, and not something that 691 00:35:55,560 --> 00:35:58,560 Speaker 3: bankrupts any company that actually gets into the business of 692 00:35:58,640 --> 00:35:59,080 Speaker 3: doing it. 693 00:35:59,640 --> 00:36:01,879 Speaker 1: And that was the other thing Almar was talking about 694 00:36:01,880 --> 00:36:03,800 Speaker 1: in terms of some of the positive uses of AI. 695 00:36:04,320 --> 00:36:05,439 Speaker 2: He talked about
696 00:36:05,120 --> 00:36:08,640 Speaker 1: helping Dow Jones do better research. He talked about scaling 697 00:36:08,680 --> 00:36:12,880 Speaker 1: the output across multiple languages. Apparently the Dow Jones wire service 698 00:36:12,960 --> 00:36:17,279 Speaker 1: is now available in Korean, largely translated by AI, at 699 00:36:17,320 --> 00:36:19,919 Speaker 1: a good enough level that Korean traders are able 700 00:36:19,920 --> 00:36:22,160 Speaker 1: to trade off it, and also using AI to make 701 00:36:22,160 --> 00:36:23,400 Speaker 1: their content more interactive. 702 00:36:24,000 --> 00:36:25,879 Speaker 2: How do you feel about some of these uses of AI? 703 00:36:26,200 --> 00:36:28,680 Speaker 3: I think that's incredible. Like, I think the idea 704 00:36:28,760 --> 00:36:32,759 Speaker 3: of using AI to improve accessibility is, to me, a 705 00:36:34,040 --> 00:36:37,759 Speaker 3: no-brainer use for AI. Making something that has got 706 00:36:37,840 --> 00:36:42,920 Speaker 3: value and usefulness to someone's life accessible to them is transformational. 707 00:36:43,600 --> 00:36:47,359 Speaker 3: It doesn't do me any good. If the information I 708 00:36:47,440 --> 00:36:50,600 Speaker 3: need is in a format that I cannot access, it 709 00:36:50,680 --> 00:36:53,920 Speaker 3: might as well not exist. So I look at how 710 00:36:53,960 --> 00:36:56,680 Speaker 3: transformative the Internet has been over the last couple of decades. 711 00:36:56,719 --> 00:37:00,560 Speaker 3: I think I'm old enough that I remember when I 712 00:37:00,600 --> 00:37:04,680 Speaker 3: first saw the web, which I famously dismissed at 713 00:37:04,680 --> 00:37:07,880 Speaker 3: the time because it was so slow. But when the 714 00:37:07,920 --> 00:37:13,840 Speaker 3: web started to really take on properties that were undeniably 715 00:37:14,000 --> 00:37:17,880 Speaker 3: useful and transformative, I just took it for granted that 716 00:37:17,960 --> 00:37:19,920 Speaker 3: I was able to access all of it, because it 717 00:37:20,320 --> 00:37:22,080 Speaker 3: largely came out of the United States. Now, if I 718 00:37:22,080 --> 00:37:26,200 Speaker 3: had lived anywhere else where I didn't know English, knowing 719 00:37:26,239 --> 00:37:28,200 Speaker 3: that that existed might have been interesting to me, but 720 00:37:28,200 --> 00:37:29,920 Speaker 3: it wouldn't be practical. I wouldn't be able to do 721 00:37:29,920 --> 00:37:34,160 Speaker 3: anything with it. So using AI to do things like translate, 722 00:37:34,600 --> 00:37:37,680 Speaker 3: whether it's a podcast or a written piece of work or 723 00:37:37,719 --> 00:37:41,160 Speaker 3: a movie, that to me is one of the best uses, 724 00:37:41,200 --> 00:37:45,560 Speaker 3: the most incredible uses of AI. And obviously it's not perfect, 725 00:37:45,920 --> 00:37:49,799 Speaker 3: but it's really good. The issues we tend to run 726 00:37:49,800 --> 00:37:52,839 Speaker 3: into are things like idioms and cultural references that are 727 00:37:52,880 --> 00:37:56,279 Speaker 3: not easily translatable from one language to another. Those are 728 00:37:56,280 --> 00:37:59,160 Speaker 3: problems that are going to just remain for quite a while. 729 00:37:59,200 --> 00:38:01,000 Speaker 3: I don't know how long it's going to take us 730 00:38:01,400 --> 00:38:05,080 Speaker 3: to teach robots what these idioms mean. That's going to 731 00:38:05,080 --> 00:38:07,360 Speaker 3: be an issue that's ongoing.
But I think that's a 732 00:38:07,480 --> 00:38:08,680 Speaker 3: very noble use of AI. 733 00:38:12,480 --> 00:38:15,920 Speaker 1: Thank you so much, Jonathan. Thanks for joining today in Doha, 734 00:38:16,040 --> 00:38:17,160 Speaker 1: and I hope we'll do it again soon. 735 00:38:17,360 --> 00:38:20,080 Speaker 3: Absolutely, I'll head up to New York. It's a 736 00:38:20,200 --> 00:38:21,200 Speaker 3: much shorter plane 737 00:38:21,040 --> 00:38:22,440 Speaker 2: ride, that's for sure. 738 00:38:24,080 --> 00:38:26,600 Speaker 1: That's it for this week for Tech Stuff. I'm Oz Woloshyn. 739 00:38:27,080 --> 00:38:30,040 Speaker 1: This episode was produced by Eliza Dennis. It was executive 740 00:38:30,080 --> 00:38:33,279 Speaker 1: produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope 741 00:38:33,760 --> 00:38:36,240 Speaker 1: and Katrina Norvell for iHeart Podcasts. 742 00:38:36,680 --> 00:38:38,640 Speaker 2: Please rate, review, and reach out 743 00:38:38,640 --> 00:38:53,360 Speaker 1: to us at techstuffpodcast at gmail dot com.