Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: European technology regulations remain at the center of the trade talks between the EU and US as they move to implement their deal reached over this summer. Tomorrow, the European Commission is due to set out changes to its rules on data protection and artificial intelligence. They're meant to ease the regulatory burden and boost competitiveness. Joining me now in the Brussels radio studio to discuss is Glenn Fogel, CEO of Booking Holdings, the company that owns Booking.com, Priceline, OpenTable and more. Glenn, good morning, great to see you in Brussels. It's interesting to talk to you because you are one of those companies designated as a big platform by the EU for its regulatory purposes. We often think of perhaps the Googles or the Microsofts of this world, but you are a company that many people will have had interactions with while booking travel or restaurants, as the case may have been. The European Commission says it wants to simplify its digital rules. Is this something that you're excited about? Is it going to help you grow your company?

Speaker 1: Well, thanks for having me, Steve, and it is an interesting time indeed. And you're right, we are an incredible user of technology, and we are of course regulated by all the new AI rules that are coming into place, and that can be very complex, particularly because we are a very global company. Booking.com is one of the largest travel companies in the world, and as such we have to deal with business around the world, and different regulatory environments create different problems. And one of our problems right now is with EU rules that are more stringent and more complicated and require a tremendous amount of investment from us versus some of our competitors around the world that are not subject to the same rules. They get a competitive advantage, and that's very disadvantageous to us here in Europe trying to build a very competitive business.

Speaker 2: So from your point of view, is it a question that not enough companies are being covered by these regulations, or is it that the rules are too onerous?

Speaker 1: Well, I think there are a lot of things going on right now. First of all, the rules are complex and not well understood; there are contradictions, and nobody is sure exactly what they mean in terms of our own businesses, so right away that can be problematic. But certainly the issue of having our competitors, who are about the same size, not subject to the same rules can be very disadvantageous. So, for example, I have engineers here in Amsterdam doing tremendous, great work, but they can't get the cutting-edge technology, the stuff that's right out front, because the people who create it, the giant hyperscalers, don't want to bring it to Europe right now. They're concerned about the rules in Europe, so they're not giving it over to our engineers to do things with, play with and learn from. Well, that puts us at a big disadvantage to a giant player in the US like Expedia, one of our big competitors, or Airbnb, one of our big competitors, or the Chinese company Trip.com, a big competitor. They all can get that stuff. Our engineers may not be able to get it. And it's not because there's a rule per se; the issue is that the people who create these new technologies are concerned about liability.

Speaker 2: So companies are afraid, then, to be selling those products into Europe?

Speaker 1: Well, I think every single AI creator always has to be thinking in the back of their head: should I bring this to Europe right now or not? Is it allowed or not? Look, we all saw that when OpenAI first brought out ChatGPT. You must remember when Italy said, oh, we don't want that right now. Do you remember that? Well, that continues on and on and on, where the concern is, as these new technologies are developed, the people creating them look at the EU, they look at the European regulation, and they are concerned.

Speaker 2: Well, the interesting thing is that we spoke to YouGov recently about a survey it did across nine European countries. Overwhelmingly, the people they spoke to supported regulation of AI, even if that came at a cost to innovation. I wonder, in the conversations that you're having with EU regulators, how are you pushing your points to them, and do you feel you're being listened to?

Speaker 1: So this is a very complex problem here. We all agree we want to have safe technology; nobody disagrees with that. And then, on the other hand, we don't want to end up far behind other countries, and we don't want to end up not having great innovation that helps society too. So it's a balancing issue. The problem is the world does not have one set of rules. So how do we work here in Europe? How together do we come up with rules that will be able to match up with rules in other parts of the world? That is something that, unfortunately, we as a global company have to deal with, but we cannot change it; governments have to make these rules. But we can give advice, and we can work with our legislators, with different members of parliament, in ways to try and help them understand the difficulties that we are facing.
Speaker 2: Can you give us a concrete example of a difficulty you're facing that you're bringing up in your conversations while you're here in Brussels?

Speaker 1: Well, the one I just brought up right now: the fact that, in terms of new technologies in AI, agentic AI that's coming in, will we be allowed to use certain types of personalization or not, whereas our competitors may be able to do that and we can't? It's a very technical thing.

Speaker 2: Right. What's the opportunity cost, I suppose, if you can't do that? What does that mean for your business?

Speaker 1: Well, it ends up... we're in one of the most competitive industries in the world. Travel is by far an area where people know they get lots of different opportunities to do their travel. And if they come to us and they do not see the same benefits, the same ease of use, the same personalization that is shown by one of our competitors, we may not get that sale. And that's very... In the long run, what you end up with is companies in the European area that cannot compete. Instead of hiring more people, we end up having to let people go instead, and that'd be very detrimental for the economies overall.

Speaker 2: I mentioned Donald Trump's criticism of European tech regulation. We know it's a subject of discussion in the ongoing talks between the EU and the US. Does Donald Trump making criticisms like that help your case? Do you agree with him on some of the changes that need to be made?

Speaker 1: Well, I won't comment on what the US President has said or not. So, in more general terms, the fact is I understand completely the need for regulations, but I also understand they need to be smart regulations. They need to be done in a way that enables companies to continue to develop new technologies that will make better things for society. And that's what we're doing. Travel is a wonderful thing; people love travel. We want to continue to create a way that makes it easier. In our business, the use of technology is huge; we have an incredible number of people working on it all the time. But if the rules are such that we cannot compete, that will be detrimental to the entire European community.

Speaker 2: Given this process that's underway, simplification, you know, adapting these digital rules, are you confident that the rules will change?

Speaker 1: I'm hopeful. How about that?

Speaker 2: Okay. And what are you basing that hope on?

Speaker 1: Some people say that, you know, your strategy should not be just hope, and that's true. So what we're doing is talking with everyone we can in government to make sure they understand the complications. Look, one of the most powerful laws, really, is the law of unintended consequences. And when new laws, new rules are put into place, you really need to work with people so they understand what the consequences of those rules or laws will be going forward. And I think sometimes there isn't enough thought put into that, and I'll give an example. Right now, we are incredibly disadvantaged by the DMA and the DSA. Now, we are fully in favor of supporting them. We follow all the rules, we do everything, and it's costing us huge amounts, millions and millions and millions of euros, that we could be spending on hiring people, technology, building new things. Instead we're spending it on the regulatory framework. The problem is our competitors are not subject to these same rules, even though we are competing for the same customers, and that is problematic.

Speaker 2: Would you be hiring those people in Europe, though? Because the talent, frankly, in a lot of cases is in the US as well.

Speaker 1: So Booking.com was born in Amsterdam, a small thing. We have over seven thousand employees, and we are continuing to hire. Look, I would like to hire even more, but hiring more requires the capital to do it and the resources that you want to put in. Instead of putting it into regulations, we would put it into technology, into hiring more people. That'd be great. But look, I recognize the need to have good regulatory frameworks.

Speaker 2: I want to ask you a little bit about the technology behind the travel as well, though. We're seeing, of course, you using AI in the company more and more as well. Are you trusting AI to make your travel plans now already?

Speaker 1: You know, there's that old saying of trust, but verify. I think that's something everybody should absolutely do. Look, I use large language models a lot myself for my own travel, and we have great things at Booking.com. We have an AI Trip Planner, which is a great way to use technology to find an easier way to come up with a holiday. I also do check, though. I do check to make sure everything's exactly right, using other sources, and I think people should continue to do that until such time that we really feel everything is at a level where you don't need to do that. I think that's some way in the future.

Speaker 2: So you're still sharing travel recommendations with people rather than necessarily directing them to the technology?

Speaker 1: Well, we will, you know... Look, we'll continue to give great recommendations. One of the things about AI that really is remarkable is how it comes up with things you may not have thought of before. And look, we go back to the issue of personalization. I would love for our AI Trip Planner at Booking.com to know every single thing about me, remember everything about me, use everything about me, within all parts of our business. It's not just the booking for the hotel, but the flights, the car rental, everything that we're bringing together. Well, there's a problem, actually, in some of the regulations that makes that more difficult for us, because we have different brands that do the car rental, let's say, and that's problematic.

Speaker 2: I'm sure the European regulators would argue that it's about consent as well, from people who want to share that sort of information.

Speaker 1: Consent is great. I'm in favor of consent. Nothing wrong with that.

Speaker 2: Glenn, we'll have to leave it there. CEO of Booking Holdings, Glenn Fogel, thank you very much for joining us in Brussels.