Welcome to the Bloomberg P&L Podcast. I'm Pimm Fox, along with my co-host Lisa Abramowicz. Each day we bring you the most important, noteworthy, and useful interviews for you and your money, whether you're at the grocery store or on the trading floor. Find the Bloomberg P&L Podcast on Apple Podcasts, SoundCloud, and Bloomberg.com.

Cryptocurrency investors seem to think that the asset class is looking bubblicious and that bitcoin itself is in a bubble, but they don't see it popping anytime soon. Here to talk about this with us is a cryptocurrency investor himself, Chris Burniske, partner at Placeholder. He is also an advisor to ARK Investment Management, which does invest in bitcoin, and he joins us here in our 1130 studios. So, Chris, would you agree that bitcoin is in a bubble, and yet it won't pop anytime soon, so it doesn't matter?

There are any number of ways we could approach this question.
On one hand, we could say money is the bubble that never pops, and as a store of value, that being one of bitcoin's use cases, there is a lot of demand within the traditional capital markets for a risk-off store of value, a disaster hedge, so to speak, just as people use gold. So that's one side of it. Now, bitcoin is off from its all-time highs. There are over a thousand other crypto assets, and many of those are near all-time highs; they're not off, although today we've had a correction. Many of those, I would argue, are much more in a bubble in terms of where their prices are relative to where their actual utility is.
Chris, maybe just step back for a second, because I always like to find out the definition of these things, like a definition of bitcoin. I was trying to put it into a context that appeals to my analog brain, and I was thinking of stamps, actual physical paper stamps. There's a limited number that are issued, and if you go back in time, stamps, or even revenue scrip, were something that could be traded and could be used for multiple transactions. At each point you could even place a little snip, or cut away a little piece of paper, that would detail who actually used that stamp. Is that a good way to look at what a bitcoin is, or what a cryptocurrency is and what role it plays?

I think that stamps have some properties which relate closely to a cryptocurrency, but a cryptocurrency like bitcoin is more broad in its use cases. So I really would think of it per the economic definition of a currency: a means of exchange, a store of value, and a unit of account.
Right now, bitcoin serves as a great store of value, a mediocre-to-low-value medium of exchange, and not a great unit of account outside of the crypto space, because it's so volatile.

One aspect that has definitely been present on people's minds is this idea of regulation. The SEC came out last week and said that initial coin offerings in particular look a lot like securities; in fact, they are securities, but they're not following securities laws. How much does the SEC's apparent interest in cryptocurrencies basically cap the potential gains, just because this has been a largely unregulated space, and that will change?

I think it doesn't cap the opportunity; it actually expands the opportunity, in that we need this space to be better regulated. We need more clarity. The entrepreneurs, the investors, all of us are craving more clarity, and so regulation is just part of the maturation of this asset class.
And for me, there may be temporary jitters around the SEC coming in and enforcement actions, but longer term, I think this is a healthy development.

What's the ultimate goal? Is it for bitcoin to act as basically a gold substitute, a store of value, especially for people in countries where the currency is incredibly volatile? Is it to replace paper currencies and replace national currencies? Where are we heading with this?

Well, I'm going to pull it back to the idea of crypto assets, which is what I wrote the book on. I think of crypto assets as a new asset class that organizes human activity. And so the goal, I would argue, more than replacing fiat currencies, is replacing equities, again as a means to organize and incentivize human activity around information networks. Bitcoin is an information network that transmits currency. Ethereum is an information network that coordinates compute logic and developers. Filecoin is an information network that stores files.
So we go far beyond the idea of cryptocurrencies, which is why I call them crypto assets: cryptocurrencies, crypto commodities, crypto tokens, and the information networks that they support.

Chris, cryptocurrencies and financial technology, fintech, seem to come together, at least in various news reports, particularly when it comes to markets in Asia. I'm wondering if you could give us a little insight into what you believe the future will look like in a place that is readily accepting of cryptocurrencies. I mean, could we have a cryptocurrency, let's say, that's even backed partly by gold, for example?

I think the possibilities are endless. You brought up Asia. Asia has certainly been one of the hottest areas for this space. Something that I've been trying to get to the bottom of is how much of this is entrepreneurs and people building real utility versus how much of this is speculators, because we know that some of the Asian nations have a tendency toward, or an appetite for, speculation.
But certainly in countries like Korea or China, where digital payments and mobile payments are prolific, I think there's a high likelihood that they adopt cryptocurrencies on a mass scale earlier than we do in the US.

Just going back to what you were saying about crypto assets, I'm trying to envision what this means. Does it mean that there could be some kind of crypto tender that could be used within an industry, like a retail-specific cryptocurrency, or one specific to automobiles or something like that, and then you could invest in a sector-based type of tender? Is that where you're going with this?

I think that's one way to look at it. That would be, say, sector-specific currencies, and you could design the properties of that currency to fit that sector. There could even be loyalty points baked into it. There's all kinds of things, right? These protocols are really just software that replaces middlemen, that commoditizes middlemen and the functions of middlemen in different areas.
But I just wonder how much of a barrier it is to have regular consumers wrap their heads around a different form of tender for different sectors. It's a complicated issue. How do you get enough people on board?

Well, that's the currency aspect, right, and I don't necessarily know that we end up having sector-specific currencies rather than these more universal currencies: some, like Bitcoin, that are really good at storing value; others, like Zcash, which are anonymous; others, like Litecoin, which are a little faster, or whatever it may be. But again, to pull us back to this opportunity, it's not only currencies. Many of these are commodities, creating markets for digital commodities, just as we have markets for physical commodities. So why can't I trade cloud storage futures, or bandwidth futures, or GPU flop futures? Really, a lot of what crypto commodities, things like Filecoin or Golem, are doing is creating markets for these digital commodities.
And in terms of getting the retail or average investor, the average user, on board, I think we're very similar to the late eighties and early nineties, trying to get people on board with the Internet. Most people had no clue what it was going to be used for, or how it would be relevant to their lives, and it was made relevant to their lives by entrepreneurs and investors.

Thanks very much for being with us. Chris Burniske is a partner at Placeholder and an advisor to ARK Investment Management, and he is the author of the book Cryptoassets: The Innovative Investor's Guide to Bitcoin and Beyond.

We turn our attention now to Joshua Green, our national correspondent for Bloomberg Businessweek. He can be followed on Twitter at @JoshuaGreen. He is also the author of Devil's Bargain: Steve Bannon, Donald Trump, and the Storming of the Presidency. Josh Green, maybe you want to give us an update. Is there anything that we need to know about the ongoing feud between the Trump White House, Steve Bannon, and the Republican Party, all because of this book Fire and Fury by Michael Wolff?
Well, it doesn't seem like it's going to be resolved anytime soon. You know, Bannon put out a statement apologizing to Donald Trump Jr., whom he had described as treasonous for holding a meeting with Russian officials. There don't seem to be many indications that the White House is doing anything to smooth the waters. But I've been talking to people in Bannon's orbit who say, you know, he is determined to get himself back into Trump's good graces, and the only way to do that is to keep on apologizing and trying to ingratiate yourself to Trump, who actually does have a long history of continuing to talk to people he's fired and eventually bringing them back into the fold. So it's unlikely, but not impossible, to think that Bannon could somehow worm his way back into the president's good graces.

Josh, one thing that I have struggled to understand as this all unfolds is what Bannon's political clout really is, because a lot of people associated him with President Trump's base of supporters. Breitbart certainly was under his thumb, and Breitbart seems to be distancing itself from Bannon.
So does Bannon really represent, at this point, the base, as so many people initially believed?

Well, that's unclear, but it's also what I think drove the split between Trump and Bannon. You know, Trump obviously thinks that nobody but Trump himself deserves credit for his presidential victory. Bannon, on the other hand, thought that Trump was a bet on a set of ideas that Trump turned out not really to believe in that much: this idea of a kind of right-wing populist nationalism. So Bannon's project, ever since he left the White House last August, was trying to advance this nationalist movement beyond Trump, and he had spent a lot of time recruiting candidates to run against Republican incumbents, candidates who would work to oust Senate Majority Leader Mitch McConnell, whom Bannon thought was impeding his nationalist project. I think the problem is that Bannon's ego kind of got out ahead of where maybe it belonged.
And what we're going to find out from this latest episode, now that Trump has split with him so publicly and so decisively, is whether there really is a populist movement loyal to these ideas that will split off and go with Bannon, or whether it is really more a cult of personality that's likely to stay loyal to the president.

Josh Green, what does this do for potential legislation regarding infrastructure, welfare reform, even funding for the US-Mexico border wall?

I don't think it does a whole lot. I mean, on the margins, it might make that a little bit easier to pass. It's going to be necessary to do that in a bipartisan manner, and one of the things that Bannon did at Breitbart News was put a lot of pressure on the right wing of the Republican Party not to come together, not to make deals with Democrats. With that voice silenced or diminished, it ought, in theory, to be easier for the President and Republicans and Democrats in Congress to agree on something and move it forward.

Josh, what about Bannon's allies in the White House, or former allies?
I'm thinking in particular of Stephen Miller. Does this potentially threaten their fate in the White House? Or has Stephen Miller effectively gone against Bannon and solidified his standing with President Trump?

No. Trump made clear you're either with Bannon or you're with me, and Miller is very much with Trump. You know, he went on CNN with Jake Tapper yesterday and basically was abasing himself to Trump, behaving like a factotum. In fact, the interview was cut short, and Miller was apparently marched out of CNN by security. So he's left absolutely no doubt as to where his loyalties lie: they lie with Trump, not Bannon.

I think Jake Tapper at the end of the interview said there's one viewer you're catering to right now, and he was referring, of course, to President Trump. Thank you so much for joining us. It's wonderful to hear your perspective as, frankly, the expert on Steve Bannon, and to get some color on what we can expect going forward and why this is significant. Josh Green, national correspondent for Bloomberg Businessweek.
"Press one for information about your account. Press two to make a payment."

Robert LoCascio joins us now. He is founder and chief executive officer of LivePerson, which is based in New York and seeks to make it so that when you call companies, customer service is automated with artificial intelligence rather than having a real person on the other end of the line. Robert, your business has done tremendously well in the past year; your share price has surged. Is that the future?

Yeah, you know, I've been in the business for twenty years, working in contact centers, and I invented chat, and I've seen the change in consumers. The future is really about consumers being able to have a conversation with a human, or a bot, or something that's automated. And I think ultimately most conversations will become automated, because we want it: we don't want to be on hold, we don't want to make phone calls, and so our behavior is driving us to do things differently.
What's the price-point comparison between having a human being answer the phone call and read off of a computer screen, as opposed to having a machine listen to your voice and follow various prompts?

So, six dollars is the average cost of a phone call. There are about two hundred and eighty billion phone calls that take place every year to contact centers, so about one point two trillion dollars is spent on phone calls. An automated message, or a bot, or even an agent answering through messaging or something like chat, is about a dollar fifty. So a dollar fifty versus six dollars. And when I go back to this number, it's one point two trillion dollars spent on phone calls that we as consumers have to make and don't like; the agents who answer those calls don't like it; and the companies, imagine, nobody likes it. It's just, "hold on a second."

But I think part of the criticism of automated responders to these phone calls is their efficacy. You know, is this a problem?
You know, at what point do you end up on hold at the end of a long automated intro anyway, because your request is so nuanced that you do require a live human being?

So in the analog world of voice, when you try those automated systems, they're horrible. I mean, you think about people screaming into the phone, "No!" And then it's noisy, right? So this was just bad analog on top of bad analog, an automated analog system like voice. But when you go to something like we do, messaging, where, let's say with T-Mobile, you can message a T-Mobile rep, we can see the text, so we can automate a response back, and we can bring a human in when it's appropriate. When you have text, and you have that over a digital framework like messaging, you can do AI. But you're right: voice, analog voice, just needs to go away. I think it's evil. You know, I've said it: for me, voice calls are evil.

Wow, okay, they're evil? I'm totally with you. It's the greatest waste of our time, being on hold.
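The per-interaction economics he cites reduce to simple arithmetic. A minimal sketch: the $6 and $1.50 figures come from the interview, while the call volume and automation share below are purely hypothetical, not LivePerson data.

```python
# Per-interaction costs as quoted in the interview.
VOICE_COST_USD = 6.00       # average cost of a live contact-center phone call
AUTOMATED_COST_USD = 1.50   # average cost of a bot/messaging interaction


def annual_savings(calls_per_year: int, automated_share: float) -> float:
    """Dollars saved per year if `automated_share` of voice calls
    move to automated messaging instead."""
    shifted_calls = calls_per_year * automated_share
    return shifted_calls * (VOICE_COST_USD - AUTOMATED_COST_USD)


# Hypothetical example: shifting half of one billion annual calls
# saves 500M * $4.50 = $2.25 billion per year.
print(annual_savings(1_000_000_000, 0.5))  # 2250000000.0
```

At a $4.50 spread per call, even a modest shift to messaging compounds quickly, which is the argument he is making with the trillion-dollar figure.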
We lose years of our life sitting on hold. We all know it. It's horrible.

Okay, it's horrible. Having said that, I want to go back to sitting on a couch, an IKEA couch specifically, and I'm wondering if you could tell the story about how you launched your first business and what failure has taught you.

Yes. So I started a company out of college, and it was interactive kiosks, like touch screens, for college campuses, funded on credit cards. But it went under, and I sort of picked myself up. I was in Baltimore at the time, moved to New York City, and sublet a loft from a guy who made t-shirts, a little space in a loft. I had only a couch and a computer, and I ended up sleeping on that couch for two years and showering at a health club down the block, down in Tribeca. And that's how I got started with LivePerson.

So what was the lightbulb moment for you, for LivePerson?

The lightbulb moment for me was, I remember going onto the Internet.
And I remember, obviously, there was stuff being said about it at that time, although it was just coming; commerce especially was just starting. But I didn't see anybody on there; there was nobody there. It felt like a very lonely experience to me, that when I showed up at, like, Dell's website at the time, there was nobody there. And that drove me to think about how you bring the human element to the digital element, which was the web.

But on the failure side, I've been twenty years at this, sixteen years as a public-company CEO, and you only learn through your failures. So for me, you just can't quit. I always tell other CEOs or entrepreneurs: when you hit the bottom, you've just got to work through it, and there's another day. But you can't quit at the bottom. You've got to keep going.

What's the next iteration of automated response?

The first thing we're working on right now is creating conversational design that feels very natural. So we as humans need conversations.
We know this now. 327 00:19:27,520 --> 00:19:31,480 Speaker 1: We need conversations to transact, we need to ask questions, 328 00:19:31,560 --> 00:19:34,080 Speaker 1: and on the other side, to automate them, we need 329 00:19:34,080 --> 00:19:36,639 Speaker 1: a way to create, I call it like a 330 00:19:36,680 --> 00:19:40,159 Speaker 1: poetic experience, where a consumer can ask a question and something 331 00:19:40,160 --> 00:19:44,280 Speaker 1: comes back that's empathetic, that seems human, that doesn't 332 00:19:44,280 --> 00:19:46,639 Speaker 1: try to fake them out, because it isn't human. And 333 00:19:46,680 --> 00:19:49,520 Speaker 1: the way we design those conversations is very much, 334 00:19:49,680 --> 00:19:52,199 Speaker 1: I say, because I was an English literature major, 335 00:19:52,400 --> 00:19:54,919 Speaker 1: like designing poetry. You know, you want to look 336 00:19:54,960 --> 00:19:57,800 Speaker 1: at the best conversations from the live agents who had conversations. 337 00:19:57,840 --> 00:20:00,760 Speaker 1: You want to look at the brand, the brand expression 338 00:20:00,800 --> 00:20:03,720 Speaker 1: in the market, and bring that into our conversation. But 339 00:20:04,040 --> 00:20:06,520 Speaker 1: I'm gonna make a prediction which 340 00:20:06,520 --> 00:20:08,399 Speaker 1: people are gonna think is a little crazy, 341 00:20:08,400 --> 00:20:13,720 Speaker 1: but I actually think conversational commerce and messaging, 342 00:20:14,080 --> 00:20:17,160 Speaker 1: having conversations, will replace great parts of the Internet, will 343 00:20:17,200 --> 00:20:22,440 Speaker 1: replace the web. Websites have failed. Outside of Amazon, 344 00:20:22,880 --> 00:20:26,800 Speaker 1: most businesses don't transact through their websites. So there's 345 00:20:26,800 --> 00:20:29,119 Speaker 1: an opportunity here.
We've got to leave it there, but 346 00:20:29,400 --> 00:20:32,040 Speaker 1: we look forward to having you back in the future. Robert LoCascio. 347 00:20:32,160 --> 00:20:35,359 Speaker 1: He is the founder and the chief executive of LivePerson, 348 00:20:35,560 --> 00:20:54,160 Speaker 1: and he is indeed live, not a robot. Well, 349 00:20:54,160 --> 00:20:56,840 Speaker 1: there was a rousing speech from Oprah Winfrey at the 350 00:20:56,880 --> 00:20:59,880 Speaker 1: Golden Globes, this having to do with really the first 351 00:21:00,080 --> 00:21:03,280 Speaker 1: major awards ceremony bringing to the fore the Me 352 00:21:03,520 --> 00:21:07,119 Speaker 1: Too moment. And here to help us understand the financial 353 00:21:07,119 --> 00:21:12,119 Speaker 1: implications is Debra Katz, founding partner and employee rights attorney at Katz, 354 00:21:12,320 --> 00:21:16,360 Speaker 1: Marshall and Banks, joining us from Washington, D.C., home 355 00:21:16,359 --> 00:21:18,480 Speaker 1: to Bloomberg ninety nine one and one oh five point 356 00:21:18,560 --> 00:21:21,560 Speaker 1: seven FM HD two. Debra, thank you very much 357 00:21:21,600 --> 00:21:25,959 Speaker 1: for being with us. Can you connect the topic of 358 00:21:26,080 --> 00:21:31,359 Speaker 1: sexual harassment in the workplace with discrimination when it comes 359 00:21:31,400 --> 00:21:36,000 Speaker 1: to equal pay for equal work? Sure. First of all, 360 00:21:36,040 --> 00:21:38,359 Speaker 1: thank you for having me. This is a topic 361 00:21:38,440 --> 00:21:42,840 Speaker 1: that is very important, and all too often we 362 00:21:42,880 --> 00:21:47,200 Speaker 1: silo the topics of sexual harassment and sex-based wage inequities, 363 00:21:47,440 --> 00:21:50,879 Speaker 1: and yet they're very connected.
Sexual harassment is about abuse 364 00:21:50,920 --> 00:21:54,919 Speaker 1: of power, and often power is abused because you do 365 00:21:55,000 --> 00:21:57,919 Speaker 1: not have women in the C-suite. Only five percent 366 00:21:58,320 --> 00:22:02,720 Speaker 1: of Fortune companies have women CEOs, and because 367 00:22:02,720 --> 00:22:05,680 Speaker 1: of the great power disparity, you have an environment that's 368 00:22:05,720 --> 00:22:09,720 Speaker 1: more likely to allow sexual harassment to continue. Women who 369 00:22:09,760 --> 00:22:12,880 Speaker 1: face sexual harassment are six and a half times more 370 00:22:12,920 --> 00:22:15,960 Speaker 1: likely to leave their jobs. And when they leave their jobs, 371 00:22:15,960 --> 00:22:19,560 Speaker 1: they're not typically trading up. They're usually trying to get 372 00:22:19,560 --> 00:22:22,560 Speaker 1: out of a very bad situation, and they often have 373 00:22:22,680 --> 00:22:25,960 Speaker 1: to go to lesser-paying jobs and start all over again. 374 00:22:26,200 --> 00:22:30,879 Speaker 1: They lose equity, they lose title, and ultimately they have 375 00:22:30,920 --> 00:22:35,520 Speaker 1: a much bigger loss in economics over their entire career 376 00:22:35,840 --> 00:22:39,080 Speaker 1: as a result of the sexual harassment experience. So they're 377 00:22:39,160 --> 00:22:42,119 Speaker 1: very much tied together. Debra, can you give us a 378 00:22:42,160 --> 00:22:47,040 Speaker 1: sense of how prevalent the trend that you're describing really 379 00:22:47,160 --> 00:22:50,280 Speaker 1: is, with women leaving their jobs as a 380 00:22:50,320 --> 00:22:53,880 Speaker 1: result of sexual harassment and thus not able to sort 381 00:22:53,880 --> 00:22:58,359 Speaker 1: of fulfill their potential in their careers?
Sure. Well, 382 00:22:58,800 --> 00:23:02,200 Speaker 1: low-income workers tend to be stuck staying in their jobs 383 00:23:02,480 --> 00:23:07,320 Speaker 1: and do not have the luxury of getting 384 00:23:07,320 --> 00:23:12,200 Speaker 1: out of some very egregiously harassing situations. But I represent 385 00:23:12,400 --> 00:23:18,600 Speaker 1: women in financial services, doctors, lawyers, who are highly educated, 386 00:23:18,680 --> 00:23:21,600 Speaker 1: who are being subjected to sexual harassment, who try to 387 00:23:21,680 --> 00:23:25,159 Speaker 1: navigate the situation, who try to avoid the harasser, 388 00:23:25,640 --> 00:23:28,119 Speaker 1: and at a certain point they may just 389 00:23:28,200 --> 00:23:32,800 Speaker 1: decide that they cannot live with that situation. And sexual 390 00:23:32,840 --> 00:23:38,920 Speaker 1: harassment affects women in law firms, medical practices, financial services, 391 00:23:39,000 --> 00:23:43,159 Speaker 1: and it's extremely prevalent, and it's underreported. Many women 392 00:23:43,280 --> 00:23:46,440 Speaker 1: reasonably feel that if they come forward and report their concerns, 393 00:23:46,680 --> 00:23:49,200 Speaker 1: they're going to be persona non grata, not only at 394 00:23:49,240 --> 00:23:53,320 Speaker 1: their particular company, but within an industry. So many women 395 00:23:53,440 --> 00:23:56,400 Speaker 1: choose to just vote with their feet and they leave, 396 00:23:56,520 --> 00:23:58,879 Speaker 1: and they try to get out of a bad situation 397 00:23:58,960 --> 00:24:01,439 Speaker 1: and get somewhere else.
And I can't give you a 398 00:24:01,520 --> 00:24:07,800 Speaker 1: percentage of women who actually go to lesser jobs, 399 00:24:07,880 --> 00:24:11,760 Speaker 1: but in my experience that's not uncommon, and 400 00:24:11,880 --> 00:24:15,000 Speaker 1: they wind up landing in less lucrative fields or positions, 401 00:24:15,119 --> 00:24:18,200 Speaker 1: or they exit their chosen careers altogether. And we're losing 402 00:24:18,240 --> 00:24:21,080 Speaker 1: tremendous talent as a result of it. How much of 403 00:24:21,119 --> 00:24:26,520 Speaker 1: the blame lies with HR departments that could potentially respond 404 00:24:26,520 --> 00:24:28,720 Speaker 1: to this in a way that would keep 405 00:24:28,720 --> 00:24:32,560 Speaker 1: people there, keep women there? The spotlight is clearly on 406 00:24:32,840 --> 00:24:36,159 Speaker 1: HR departments now. But I've done this work for 407 00:24:36,240 --> 00:24:40,400 Speaker 1: over thirty years, and typically HR departments carry out 408 00:24:40,480 --> 00:24:44,159 Speaker 1: the will of management. They're very complicit, and historically we 409 00:24:44,200 --> 00:24:46,639 Speaker 1: have seen people who harass get away with it because 410 00:24:46,640 --> 00:24:51,080 Speaker 1: they're perceived as star performers or the big revenue generators, 411 00:24:51,119 --> 00:24:54,320 Speaker 1: and HR is there to help them remain in place 412 00:24:54,440 --> 00:24:57,040 Speaker 1: and manage out the women who 413 00:24:57,040 --> 00:25:00,280 Speaker 1: have asserted claims. Now we're in a moment where that 414 00:25:00,359 --> 00:25:03,640 Speaker 1: type of behavior is less tolerable, and we're 415 00:25:03,640 --> 00:25:07,800 Speaker 1: seeing some very famous and high-profile people being 416 00:25:08,480 --> 00:25:12,040 Speaker 1: terminated in very public ways.
But HR departments have been 417 00:25:12,080 --> 00:25:14,879 Speaker 1: complicit in allowing this kind of behavior to go forward 418 00:25:15,119 --> 00:25:18,879 Speaker 1: for decades and decades, and HR officials often do not 419 00:25:19,040 --> 00:25:22,480 Speaker 1: have reporting relationships that allow them to elevate the issues 420 00:25:22,840 --> 00:25:25,639 Speaker 1: without themselves having a target on their backs. So we 421 00:25:25,720 --> 00:25:28,399 Speaker 1: often represent HR people who have tried to do the 422 00:25:28,480 --> 00:25:31,000 Speaker 1: right thing but have not been given the power, or 423 00:25:31,040 --> 00:25:35,840 Speaker 1: when they made recommendations to exit real star harassers, 424 00:25:35,880 --> 00:25:40,119 Speaker 1: they themselves found their careers in jeopardy. So it's a 425 00:25:40,160 --> 00:25:44,240 Speaker 1: complicated issue. But I think we know that without 426 00:25:44,640 --> 00:25:48,720 Speaker 1: a tone at the top that says sexual harassment is 427 00:25:48,760 --> 00:25:52,360 Speaker 1: not acceptable and will not be tolerated in this company, 428 00:25:52,480 --> 00:25:56,520 Speaker 1: HR officials are really not given the power 429 00:25:56,560 --> 00:25:59,679 Speaker 1: that they need to do what they need to eradicate 430 00:26:00,000 --> 00:26:03,199 Speaker 1: sexual harassment within their companies. Debra Katz, thank you so 431 00:26:03,280 --> 00:26:05,120 Speaker 1: much for joining us. We'll have to have you back. 432 00:26:05,160 --> 00:26:08,560 Speaker 1: Debra Katz, founding partner and employee rights attorney with Katz, 433 00:26:08,680 --> 00:26:11,760 Speaker 1: Marshall and Banks, which is based in Washington, D.C. 434 00:26:15,359 --> 00:26:17,919 Speaker 1: Thanks for listening to the Bloomberg P and L podcast.
435 00:26:18,240 --> 00:26:22,160 Speaker 1: You can subscribe and listen to interviews on Apple Podcasts, SoundCloud, 436 00:26:22,280 --> 00:26:25,720 Speaker 1: or whatever podcast platform you prefer. I'm Pim Fox. I'm 437 00:26:25,760 --> 00:26:29,320 Speaker 1: on Twitter at Pim Fox. I'm on Twitter at Lisa 438 00:26:29,359 --> 00:26:32,479 Speaker 1: Abramowicz One. And beyond the podcast, you can always catch us 439 00:26:32,520 --> 00:26:34,080 worldwide on Bloomberg Radio.