Aki: Professor, I bike to work every day and I often blow through stop signs.

Ad: This episode is brought to you by Nadex, the binary options exchange. Binary options let you limit your risk and trade stock indices, commodities, forex and more from a single account. Nadex is a CFTC-regulated exchange with transparency, free market data, and fairness guaranteed. The future of trading is here now, at nadex.com. Futures, options and swaps trading involves risk and may not be appropriate for all investors.

Tori: Hi, and welcome back to Bloomberg Benchmark, a podcast about the global economy. It is Wednesday, December 23. I'm Tori Stilwell, an economics reporter with Bloomberg News in D.C., and I am joined by my co-host Aki Ito, our editor for Benchmark in San Francisco.

Aki: Hey, Tori, how's it going?

Tori: Pretty good. This weather that we're having is crazy heading into the holidays.

Aki: I know. We've been getting a lot of rain here in California, which is kind of nuts. It's like, what is this thing falling out of the sky? No one here carries umbrellas anymore. What are you doing for Christmas?

Tori: I am going home to North Carolina, so we'll see the fam.

Aki: Very nice. I'm going skiing in Tahoe, which is why I've been obsessed about this rain. If it's raining here, it's snowing in Tahoe, so that's a good thing.

Tori: Yeah. So, as we're nearing the end of 2015 and reflecting back on the news of the year, I couldn't help but realize just how much cheating went on. You know, we had the Volkswagen diesel scandal. Goldman Sachs and JPMorgan both had to fire analysts for cheating on internal training tests. There's this former trader for UBS and Citigroup: he was sentenced to over a decade of prison time after he was found guilty of rigging Libor. There was Deflategate. How could we forget Deflategate? And most recently, there's Martin Shkreli, the price-gouging pharma exec who was charged with securities fraud this month.
Aki: Right. So we wanted to use today's show to, you know, talk about the economics behind cheating. Not just those big examples that, Tori, you just gave, but also little incidents of cheating too, like when I don't follow the regular traffic laws of San Francisco when I'm on my bike. Because otherwise, you know, if cheating wasn't this prevalent, we wouldn't need meter maids, we wouldn't need judges, we wouldn't need an umpire at a recreational softball game, or the little scanner things at the doors of stores that bleep when you try to walk off with something you didn't pay for, or just generally get too close to them, in my experience.

Tori: Oh yeah, I hate those things. But it's that level of ubiquity that you just mentioned that makes it almost impossible to calculate how much cheating costs our society, which is what I first set out to do when I got to thinking about this topic. You know, how much does all this cheating cost? It is not really easy to come up with that figure. So instead we want to talk about why people cheat in the first place, from an economic standpoint, because sometimes there is a case to be made in favor of cheating. Also, why do societies discourage cheating? And how does globalization mean that it's going to get a lot harder to police cheating in our world? And for that we're bringing on Robert Stonebraker, an economics professor at Winthrop University in Rock Hill, South Carolina. Hello, Professor.

Professor Stonebraker: Hello.

Aki: Hey, thanks so much for joining us, Professor. Let's walk through the act of cheating from an economic standpoint, because I think a lot of people out there are prone to think of cheating as sometimes being this knee-jerk decision, something you do without thinking about it. But it's often a lot more rational than that, right?

Professor Stonebraker: Yes. Economists would argue that our decisions are always made on the basis of our perceived costs and benefits of those decisions.
For example, if I'm watching TV, I must have thought the benefit of watching TV exceeded the cost of watching TV. And it's the same thing for cheating. People understand that there are potential benefits of cheating, and they understand that there are potential costs of cheating. But if they think that those benefits are going to be greater than the costs, it is rational for them to cheat. So we make our choices about cheating just like we make our choices about anything else. We ask ourselves: what are the benefits, what are the costs? If we think cheating is more beneficial than costly, we cheat.

Tori: In my economics classes, we often talked about cheating in the context of game theory. You know, we talked about cartels and collusion, et cetera. Can you walk us through exactly what game theory is and how cheating would fit into that concept?

Professor Stonebraker: Sure. Game theory is just a theory designed to predict strategic behavior. So when we talk about game theory, we're not talking about playing Monopoly or Chutes and Ladders. We're talking about strategic games: how do we react to other people in strategic situations? One of the things that crops up quite a bit in game theory is the idea that it is sometimes very difficult for individuals to make cooperative agreements. For example, suppose two countries decide to try to limit a military arms race. The difficulty is that there's an incentive for both countries to cheat. If my country thinks the other country is going to cheat, then I have to cheat too, because if you're going to cheat and I don't, I'm going to lose. So if I think you're going to cheat, my best choice is also to cheat. But it turns out that if I think you're not going to cheat, it's still true that my best choice is to cheat, because if you don't cheat and I do, I win. So cheating in that situation becomes what economists would call a dominant strategy. If I think you're going to cheat, I'm going to cheat. If I think you're not going to cheat, I'm going to cheat. No matter what I think you're going to do, my best strategy is to cheat. And of course your best strategy is also to cheat, because if you think I'm going to cheat, you'll cheat, and if you think I'm not going to cheat, you can cheat and win. So we both have a dominant strategy to cheat, and we end up cheating each other, and we get into what's sometimes called a prisoner's dilemma from a game theory perspective. It's very difficult for us not to cheat in those cases.
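[Editor's note: To make the arms-race example concrete, here is a minimal Python sketch of the payoff logic the professor describes. The payoff numbers are our own invention for illustration; only their ordering matters.]

```python
# Hypothetical payoffs for the arms-race game described above.
# (my_move, your_move) -> my payoff. Numbers are invented for
# illustration; only their ranking matters.
PAYOFFS = {
    ("honor", "honor"): 3,   # we both keep the agreement
    ("honor", "cheat"): 0,   # I honor it, you cheat: I lose
    ("cheat", "honor"): 5,   # I cheat, you honor it: I win
    ("cheat", "cheat"): 1,   # we both cheat: worse than mutual trust
}

def best_response(your_move: str) -> str:
    """My payoff-maximizing move, given what I think you'll do."""
    return max(("honor", "cheat"), key=lambda mine: PAYOFFS[(mine, your_move)])

for yours in ("honor", "cheat"):
    print(f"If you {yours}, my best choice is to {best_response(yours)}")
# Both lines print "cheat": cheating is a dominant strategy, and by
# symmetry it is yours too, so we land at (cheat, cheat), the
# prisoner's dilemma, even though (honor, honor) pays us both more.
```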
Aki: Can we walk through maybe a smaller example of cheating? So, Professor, I bike to work every day and I often blow through stop signs.

Professor Stonebraker: Oh.

Aki: It's illegal, I'm pretty terrible, and I get yelled at all the time. But can you walk us through what's going on in my head in terms of the perceived costs and benefits of that decision?

Professor Stonebraker: You're asking yourself, first, what are the benefits of blowing through that stop sign? I get to work earlier, I don't have to sit and wait. And what are the costs? And the question is, is there a police officer sitting beside you?

Aki: I'm also thinking about the probability of getting caught.

Professor Stonebraker: Exactly. In other words, you've convinced yourself that, one, it's not likely that you will be caught, or, two, that even if you are caught, the penalty will be basically nothing, and you go through. So there are two aspects of the cost. One is, what's the probability that you will be caught? And two is, what will the penalty be if you are caught? And you need both of those. If I'm sure I'm not going to get caught, I don't care what the penalty is; it won't matter. And if the penalty is minimal, I don't care if I'm caught, because the penalty won't matter anyway. So you need both a probability of being caught and a penalty.
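[Editor's note: Spelled out, the professor's two ingredients combine into a simple expected-cost comparison. A minimal sketch, with made-up numbers for the time saved, the fine, and the odds of an officer being there:]

```python
# Cost-benefit of running a stop sign, per the framing above.
# All numbers are invented for illustration.
benefit = 2.00      # dollar value of getting to work a bit earlier
fine = 250.00       # penalty if caught
p_caught = 0.005    # chance a police officer is sitting right there

expected_cost = p_caught * fine  # you need BOTH ingredients
print("Cheat" if benefit > expected_cost else "Wait",
      f"(benefit {benefit:.2f} vs. expected cost {expected_cost:.2f})")
# Edge cases from the conversation: if p_caught is ~0, the size of the
# fine is irrelevant; if the fine is ~0, the probability is irrelevant.
```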
Aki: That makes a lot of sense to me, because I have not been caught yet in my four years in San Francisco, and now I'm advertising it. But maybe next time.

Tori: Let's talk about an example from the business world: Volkswagen. For the last three months or so, they've been dealing with the fallout from a pollution-cheating scandal. The company's diesel cars weren't passing the strict emission standards in the US, so they devised a cheat that eventually made its way into about eleven million vehicles. Now they're facing roughly seven point five billion dollars in diesel recall costs, not including fines and potential damages from hundreds of lawsuits. They've also lost consumer trust, their sales are falling, and they've been slow to recall the vehicles. One analyst put the total financial burden of the scandal at as much as twenty-two billion dollars. So, Professor Stonebraker, you're telling me that someone at Volkswagen thought about all of these things and was like, yeah, let's do it?

Professor Stonebraker: Well, they did. The problem was they misestimated the likelihood that they would be caught. We certainly make mistakes, and what happened at Volkswagen, I presume, is that the people making those choices were convinced that they would not be caught. And indeed it took many, many years before they were caught, and it was almost an accidental case in which they were caught. So in that case, I'm sure that the Volkswagen people made a rational choice. But our rational choices don't always turn out the way we think. All of us have made choices that we were absolutely convinced were going to be the right choices, only to find out later, whoops, I misestimated this probability, I misestimated this cost. And I'm sure that's what happened at Volkswagen.

Aki: Yeah, it happens to me when I eat pizza late at night.
Tori: Well, we're going to take a quick break for a word from our sponsor. But when we come back, we will walk through how societies can stop cheating, and the possibility that it might be getting more prevalent as we become an increasingly global society.

Ad: What do traders want? To limit risk, access every opportunity, and trade on a level playing field. Nadex binary options let you set your maximum profit and loss before the trade, so your risk is always limited. Find opportunities in multiple markets: stock indices, commodities, forex, even economic numbers and bitcoin, all from one account and platform. Nadex is a CFTC-regulated exchange with transparency, free market data, and fairness guaranteed. Innovation is what the financial industry needs, and Nadex already has it. That's why we think binary options are the future of trading, and it's here now at nadex.com. Futures, options and swaps trading involves risk and may not be appropriate for all investors.

Tori: Okay, so now we know about this economic framework behind whether we decide to cheat or not to cheat. So let's talk about why societies try to stop it and how they can do that. Professor, what's so wrong with cheating if it's the rational thing to do?

Professor Stonebraker: The problem is that what is rational for the individual making the choice is not necessarily rational for the larger group. When one person cheats, they impose costs on others. Take cheating on my taxes. Surely I would never do that. But if somebody does cheat on their taxes, that may be perfectly rational from their perspective, but they're imposing costs on all other taxpayers. Other taxpayers are either going to have to pay more, or else people are going to lose benefits from the programs that their taxes could have supported. So my decision to cheat imposes costs on other people. I might find there's a benefit to me of, say, five from cheating, but maybe I impose costs of thirty-five on other people. My benefit of five certainly doesn't offset the thirty-five in costs to others.
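[Editor's note: In the professor's made-up numbers, the private ledger and the social ledger point in opposite directions, which is the whole case for discouraging cheating. A quick sketch:]

```python
# The professor's illustrative numbers: cheating nets me 5 but costs
# everyone else 35.
private_benefit = 5
cost_to_others = 35

print("Cheater's ledger:", private_benefit)                   # +5: privately rational
print("Society's ledger:", private_benefit - cost_to_others)  # -30: socially destructive
```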
Professor Stonebraker: And that happens in all kinds of situations. If students cheat on exams, which is something I deal with on a day-to-day basis as a teacher, they impose costs on other people, the ones that don't cheat. And in fact, in education, online courses have this problem. Normally, if you have a college degree, it signals to employers that you have certain qualities. But many people look at online degrees as less valuable signals, because it's difficult to know whether or not students cheated on those online exams. In a classroom, I can monitor what's going on. You know, if people pull out a textbook or talk to a classmate, I can see that. But many times when people take online exams, it is difficult to monitor whether or not they are in fact answering the questions, whether they're even the ones answering the questions, or whether they're using other kinds of sources to help them answer the questions. And employers understand that, and employers sometimes look at online degrees as less valuable signals. In other words, if some cheat, it destroys the value of that signal for those who don't cheat.

Aki: That's a really good perspective. I mean, so we know now why societies are trying to curb cheating. But is there a point at which the cost of trying to prevent this cheating outweighs the benefits of successfully doing so?

Professor Stonebraker: Oh, sure. It's like anything else: you want to do it only if the benefit covers the cost. Look briefly, for example, at crime prevention. I could stop everybody in America from speeding, or I could stop everybody in America from blowing through a stop sign. All I have to do is station a police officer every twenty feet along every road. But the cost of doing so would be enormous compared to the benefit. And it's the same thing with cheating.
I can stop people from cheating if I spend enough time and energy on monitoring what they do. But at some point, the cost of monitoring people just exceeds the benefit. And you find that in the workplace. My guess is that you've all seen people in a workplace who don't work as hard as they possibly could.

Tori: At Bloomberg? No way, not possible.

Professor Stonebraker: But if we monitored every employee, we could do something about that. Imagine you had a supervisor hanging over your shoulder every minute of every day. We probably wouldn't goof off as much as we otherwise would. But the cost of doing that would be enormous. It would be counterproductive. So we only want to cure a problem if the benefit of curing that problem is going to exceed the cost of curing it.

Tori: You know, I kind of feel like I hear about cheating a lot more these days. And maybe it's because I work in the news industry and we talk about this a lot. But, you know, for example, you just talked about online schools. I feel like it's gotten easier to cheat thanks to the Internet. So do you think it's the case that we're becoming more immoral? Or why do you think cheating seems to be becoming more prolific these days?

Professor Stonebraker: I'm hesitant to say we're less moral than we were in the past. I think immorality has been a problem as long as there's been humanity. I think it's a matter of changing costs and benefits. I think you can identify some cases in which it is more beneficial, or less costly, to cheat today than it would have been in the distant past. First, it's a matter of whether people are involved in what economists would call a one-shot game or a repeated game. If I go back to the game theory piece for a minute: if I'm going to interact with you one time and one time only, that's a different world than if I'm going to have to interact with you over and over and over again.
If I'm sort of the proverbial traveling salesperson, you know, I walk into a town, I con the people in that town out of their money, and I leave. I get away with it. But if I'm going to try to do business there with the same people day after day, week after week, month after month, if I cheat them today, they're going to retaliate and get back at me the next time I interact with them.

Aki: Right, they won't be customers after that.

Professor Stonebraker: So the more often I have to interact with you, the less likely it is I'm going to want to cheat you this time, because if I cheat today, say, you'll retaliate and get back at me the next time. So one of the issues is: are we interacting with people more or less than we did in the past? If you think of sort of the proverbial small-town America, where everybody knows what everybody else does, the likelihood that I'm going to be caught cheating is high, and the likelihood that word of mouth will tell everybody in town that I cheated is high, and that will make it very difficult for me to interact with people in the future. That imposes a real cost of cheating on me. But suppose you look at a world in which you're interacting with different people every day. Suppose I'm living not in a small town, but in the middle of New York City. I could cheat you today and maybe never interact with you again for the rest of my life. So the more sort of transient we become, the less we interact with the same people over and over again, the more likely it is we can get away with cheating.
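[Editor's note: One way to see why repetition disciplines cheating is to simulate the traveling-salesman story. A toy Python sketch, ours rather than the professor's, with invented per-round payoffs: conning a fresh mark pays 5, an honest sale pays 3, and a town that has been burned refuses to deal, paying the cheater 0 thereafter.]

```python
HONEST, CON, BURNED = 3, 5, 0  # per-round payoffs, invented for illustration

def always_cheat_total(rounds: int, same_town: bool) -> int:
    """Total take for a salesman who cons every customer he can."""
    total, word_is_out = 0, False
    for _ in range(rounds):
        if same_town and word_is_out:
            total += BURNED       # word of mouth: nobody deals with him now
        else:
            total += CON
            word_is_out = True    # this town now knows
    return total

rounds = 10
print("Drifting town to town:", always_cheat_total(rounds, same_town=False))  # 50
print("Same town every week: ", always_cheat_total(rounds, same_town=True))   # 5
print("Honest, one town:     ", HONEST * rounds)                              # 30
# Repetition flips the ranking: conning people beats honesty only when
# you never have to face the same people twice.
```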
Aki: That's really interesting. And also, I guess, from a global standpoint and from a company standpoint, when we're thinking about businesses cheating, I feel like the globalization of commerce has got to impact the cost-benefit calculation that these companies undergo when they're deciding whether to cheat.

Professor Stonebraker: Yeah, I think that's true. One of the things that companies worry about, if I get away from the small-town analogy where word of mouth tells people who cheats and who doesn't cheat: how do you know if a firm you've never dealt with before is reliable? How do you know if a product you've never used before is a good product? How do you know if a restaurant you've never eaten in before is a good restaurant? Do you look at online reviews?

Aki: Definitely.

Professor Stonebraker: But do people cheat on online reviews? This is a problem. If we don't interact with these groups on an ongoing basis, it's difficult for us to know if these companies are trustworthy or not. The best we can do is to look at things like online reviews, but even those, we find out, are not always reliable. I think there have been studies which have shown that companies pay people to give them good online reviews, and pay people to give their competitors bad online reviews. So the more difficult it is for us to know whether people are trustworthy or not, the easier it is to cheat.

Tori: I would say, also, the stakes are higher too for these companies.

Professor Stonebraker: That was the other thing I was going to suggest. The stakes are much larger in a globalized world. If I cheat and win customers in a local market, that's nice. But today, if I cheat, I can win not just the local market, I can win a global market. And I'm talking, again, about millions, billions of extra dollars as potential benefits. The larger the market, the bigger the stakes, and the larger the market, the bigger the potential gains from cheating.

Aki: Well, this is all great fodder for me, especially as we head into the holidays, when I'm probably going to cheat on my plans to eat healthily. But thank you so much, Professor Stonebraker, for joining us. I really appreciate it, and I hope you had fun.

Professor Stonebraker: Oh, I did. Thank you. I enjoyed it.
Tori: And thanks to you all for listening to Bloomberg Benchmark. We will be back next week, and until then you can find us on the Bloomberg terminal and Bloomberg.com, as well as on iTunes, Pocket Casts, Stitcher, Google Play, et cetera. And while you're there, please take a minute to rate and review the show so more listeners can find us, and let us know what you thought of the show. You can talk to us and follow us on Twitter, Aki Ito and Tori Stilwell. See you next week.

Promo: We're proud of our new and growing suite of original podcasts, all designed to help you navigate the complexities of business, financial markets, and the global economy. In addition to Bloomberg Benchmark, which you're listening to now, don't miss Odd Lots, a deep dive into the intersection of markets, economics, and finance with Joe Weisenthal and Tracy Alloway. There's also Deal of the Week with our mergers-and-acquisitions reporter Alex Sherman, breaking down the biggest deals and giving you an inside peek into corporate boardrooms. All three shows are available on iTunes, SoundCloud, Pocket Casts for Android, Bloomberg.com, and of course the Bloomberg terminal. Check them out and subscribe today.

Ad: This episode was brought to you by Nadex. You know, any long-term investment is going to go through short-term dips and price fluctuations. Nadex binary options let you turn those short-term movements into trading opportunities. You decide your maximum profit and loss before each trade, so your risk is always limited. Trade stock indices, commodities, forex, even bitcoin and economic numbers, all from one account, on a CFTC-regulated US exchange. Instead of just watching the market's ups and downs, turn them into trading opportunities at nadex.com. It's the future of trading: nadex.com. Futures, options and swaps trading involves risk and may not be appropriate for all investors.