Bob: Welcome, welcome, welcome to the Bob Lefsetz Podcast. My guest this week is the CEO of CivicScience, John Dick. John...

John: Bob, good to be here.

Bob: I met John when we were on a panel for an Access TV show, because Mark Cuban is involved on the board of CivicScience, doing Grammy predictions, and John was in town. He sends a weekly email talking about trends in the world at large. Because ultimately... what would you call your company? I wouldn't call it a polling company.

John: I would call us a polling company. I wish there were a sexier way to describe it, but we're a new breed of polling company.

Bob: Okay. And he's always revealing fascinating insights, so I thought it'd be great to have him on the podcast. Let's start from the beginning. You're in Pittsburgh, correct? Why? Is Pittsburgh the next big thing?

John: Pittsburgh is going through a tremendous renaissance. But actually we're there...
The company is there because Carnegie Mellon University is there, and the talent and technology that power the foundation of our company came from there. And it's a great place to live, and it's a really emerging city with lots of talent, and just a smart place for us to be.

Bob: Are you from Pittsburgh, born and raised?

John: Yeah.

Bob: Did you go to Carnegie Mellon?

John: I did not.

Bob: Where did you go?

John: Rollins College in Winter Park, Florida.

Bob: Okay. So then how did you hook up with all these new technology things at Carnegie Mellon?

John: Well, I started a company in my early twenties and sold that company in my late twenties.

Bob: What was that company?

John: It was called GSP. We helped early-stage tech and biotech companies do business with the government, a very different business. And so I exited that company and had the business plan for what became CivicScience shortly after, and I knew it had some technology needs, and I found a couple of guys who knew how to solve those technology needs.

Bob: Okay. So why did you want to be in the polling business?

John: Because it's broken. And, uh, well, two reasons.
One, it was broken...

Bob: Why is it broken?

John: Because people don't participate the way they used to, because we're too busy. It's not a pleasant experience for people to participate in polling as it was originally constructed. My phone's ringing while I'm having dinner, or I have to sign up at some web page and answer surveys all day for five bucks. It's just not a user-centric experience for anyone. And so what happened is landline phone ownership started to plummet, and the people who have landline phones don't look like the ones who don't, and so the quality of the data started to suffer. And then pollsters said, well, that's fine, we'll call cell phones. But people don't answer cell phone calls from numbers they don't recognize, or at least the people who do don't look like the ones who don't. And then you had the advent of what are called survey panels, which are these websites people sign up at and answer surveys for money. And those people don't look like either of us. They have too much time on their hands.
Bob: Okay, just one other thing, generally speaking. When you've lived in a place like Los Angeles, you're constantly getting email: come participate for fifty, a hundred dollars. What are those panels worth?

John: Well, what are they worth? They certainly are powering a lot of decision making in commercial America. I don't think they should be, because the people who answer those surveys don't look like the real world. They are people who are highly motivated by that financial incentive, and so they're heavily biased. And then that financial reward inspires some of the wrong behaviors too, when you're trying to research someone as opposed to getting their honest opinion about something. So that's all been broken, and it started breaking twenty years ago. The chickens kind of came home to roost in the last election cycle. We see polling just isn't working the way that it used to.

Bob: Okay, we've got a bunch of things to get to. What is the other reason you wanted to go into this business?

John: Because I think it needs to work.
We need to figure out a way to understand what people think. There was a mindset ten or fifteen years ago that social media was somehow going to replace survey research as the way that we understood what the world thinks, and it just doesn't work that way. I'm not myself on social media; I'm a curated version of myself. We need a discreet means of understanding people's opinions. There's a reason we pull a curtain behind us on election day: because if our opinions are subject to scrutiny and peer pressure and invective, then we don't tell people what we think.

Bob: Why did everybody get it so wrong? I mean, now we have Nate Silver, presently at FiveThirtyEight, who was at The New York Times the previous election cycle. He balances polls. He got it wrong, although he will say he thought the odds of Trump being successful were higher. The New York Times got it completely wrong. What did we learn there?

John: Well, on a national level, from a popular-vote standpoint, the pollsters were pretty strong.
I mean, most of the averages said Hillary Clinton would win by three percent, and she won by roughly three. Where we broke down was at the state level, where a couple of hundred thousand votes across three or four states were able to swing the election. What we learned about it in the grand scheme of things was that there was an entire universe of people out there who were pissed off, and we didn't know about it. One of the reasons, and we study this a lot in our own data: conservatives tend to be much less outspoken on social media than their liberal counterparts are. Why? Because they believe that when they speak out on conservative issues, they are opening themselves up to attack from the left, and so they tend to keep their mouths shut. We've studied this extensively. And so there was an entire sub-layer of the US population that was pissed off, but because they weren't trumpeting it from the rooftops on social media, we didn't know about it, and we didn't know about it until November.
Bob: Okay, so you move back to Pittsburgh and you align with these people at Carnegie Mellon. What are the next steps in starting CivicScience?

John: Well, we had a lot of technology to build.

Bob: What was that technology?

John: A database, and database architecture designed to find all the stuff we can find in our data. We had to prove that the methodology we were inventing worked.

Bob: Now, the methodology you started with... when was this, 2008? Is that the same methodology you're still employing?

John: Absolutely.

Bob: So explain to me what you do with that methodology.

John: So we encounter people in polls and quizzes across the web, inside of content they're reading. It might be a poll or quiz you might see on a BuzzFeed or Facebook: what kind of wine, or what kind of Simpsons character are you? But more often it's an opinion-type poll of the day that you might find in an article you're reading on your local newspaper's site, on a site like Perez Hilton or Univision.
We're generally asking somebody a couple of questions at a time, no more than three or four, because we know that if we ask too many questions, we start to see a decline in participation, and that's no good. We give people back instant results, which we think is really important.

Bob: So I answer a question, you know, what is my favorite color? And after I click, I'll see what everybody else's favorite color is?

John: Sure, sure. We don't generally ask favorite color, but yeah, sure. And what we hear back from people is that they appreciate that we're giving them the answers right back. It's unvarnished, unfiltered truth: here's what the world thinks, and here's what people like you think, and it's not being passed to you through the lens of a media company that may have some bias.

Bob: Okay. What is incentivizing them to answer the questions to begin with?
John: Uh, it's A, that intrinsic benefit of feeling like I'm participating in something good, and B, the interestingness of the results that I get back, feeling like, okay, I just learned something about the rest of the world by virtue of my participation in it.

Bob: And do you compensate the sites where your quizzes are?

John: No.

Bob: So how do they end up being placed there?

John: It's effectively a bartering arrangement. They learn things from the information that we gather that help them grow their businesses; we help them improve their editorial functions and their ad sales.

Bob: Okay, so let's say you're on website A. You would approach them and say, I want to do this, and I'm going to share data with you. That's going to be your compensation?

John: Correct.

Bob: So how many sites are you presently on?

John: Well over a thousand at last count, pieced together to give us a diverse, representative view of the US population. So we want to make sure we have an ample number of media companies that reach Hispanics, an ample number of media companies that reach women, and so forth.
The media companies make a determination that, for the size of our poll sitting on their web page, we are delivering more value to them from that information than they would get from a banner ad sitting in that same place. And as long as we can maintain that equilibrium, we can continue to collect that data.

Bob: How do you choose which sites to place your quizzes on?

John: Well, we have a team who goes out and sells that, essentially. And at any given time, we prioritize sales to, or partnerships with, publishers who meet demographic imbalances in our sampling. So if we notice over time, hey, we're starting to get a disproportionate number of men answering our polls, we want to go and target sites that have a disproportionate audience of women.

Bob: Okay. I am someone who never answers those questions. So that begs the question: are certain people ultimately not included in the sample?

John: Yes.

Bob: What do you do about that?

John: Well, it's a fundamental flaw. I guess what we tell the marketplace,
and what we can prove in our research, is that because we've limited those barriers as much as possible, our audience looks more like the real world than any other methodology you could think of. So you're not going to be the person who signs up and answers surveys all day for five bucks, either. We are all missing some portion of the US population; we're just missing a much smaller portion of it.

Bob: And do you sit in your office and say, how do we reach those people? Or do you just tell your customers some people are unreachable?

John: Um, we're not giving up. I think we spend a ton of time trying to understand what types of questions, or what kind of interaction, will bring people who otherwise don't want to participate to participate. Our response rates, as a result, have gone up a lot. The percentage of people who see our polls on a website and answer them has skyrocketed in the last two years, because we've invested very heavily in solving that problem.

Bob: I would assume it's still below ten percent.
John: Sure, sure. But if you think about how that relates to, say, the likelihood of somebody clicking on an ad, it's a hundred times more than that, right? Because there's something engaging about it, as opposed to, oh, I'm going to click on this ad and somebody's going to try to sell me.

Bob: And how many people work at CivicScience?

John: Um, I think...

Bob: And are they all in Pittsburgh?

John: All but one person, who's in New York.

Bob: Okay. So what kind of companies buy your data?

John: Um, pretty broad. Large ones, primarily; bigger, sort of Fortune 500 types of enterprises. Consumer-facing companies. We're really strong in tech and telecom, media, banking, healthcare, restaurants, and some retail.

Bob: Okay, we were talking off camera. These are all household names?

John: The names you think they are, they are.

Bob: But let's take, for point of discussion, telecom. Okay, it is a very competitive marketplace. Everyone already has a smartphone. There are four competitors in the United States.
What would you be able to tell a telecom company?

John: Uh, there's going to be a mix of things about attitudes toward their specific company and their specific products. There's going to be a mix of things about trends within their category, so which of their competitors, maybe, are more vulnerable than others at a given point in time. And then the more valuable stuff we do tends to look at how various larger consumer trends and macro forces are affecting consumers, and how that relates to which mobile carriers they're going to use and how much of their plan they're going to use. One thing we learned that was really fascinating, pre- and post-election, was that the composition of the people who were likely to switch mobile carriers changed very dramatically from the summer of 2016 through the following summer.
Bob: And what did your research tell you?

John: Um, Democrats were more likely to switch mobile carriers after the election than they were before. Because... I switch my mobile carrier generally to get a better financial deal, right? And so it tends to correlate with my personal financial situation, or my consumer confidence. And so once Donald Trump was elected, a lot of Democrats began to have a bleaker view of the economy and said, you know what, I'd better go find a better deal for myself. And we saw a higher rate of switchers among Democrats during that period of time, as opposed to the summer before, when they tended to be more likely to be Republicans.

Bob: Now, sitting here as a layperson, one could say: also, Sprint was offering comprehensive plans with very cheap or free phones, and T-Mobile was very cheap. So one might say, really, plans were so cheap that people were incentivized for that reason, as opposed to consumer confidence.

John: Well, but still, I chase a cheaper opportunity when I have a less positive view of my personal finances.
So whether there was cause and effect there, or whether those companies were smart enough to capitalize on what they knew was coming, which was more price sensitivity in their core markets... hard to tell.

Bob: Okay. Now, many of these companies do use consumer research, but there's a school of thought that research will tell you where you've been, but it won't tell you where you're going.

John: Oh, that's a big part of our ethos: we focus on looking through the windshield.

Bob: So how do you do that?

John: Um, we look a lot at patterns and trends in this history of data that we now have. Those trends tend to have a trajectory that we can look two months, six months out and see where they're headed. We ask people a lot of questions that are forward-looking: what are your intentions, how likely are you to do something in the future. There's ample backward-looking data that credit card companies have, and people on the web have, based on things I clicked on yesterday and bought yesterday. We don't need to augment that information.
We focus on where people are going to be spending their money next week or next month.

Bob: But look, famously, Steve Jobs did no consumer research, and he would come up with products that people literally hadn't heard of, didn't think they needed, and then would clamor for.

John: Good work if you can get it.

Bob: Okay, so he's an anomaly? He's the outlier?

John: Absolutely.

Bob: Okay. Yeah. How about Elon Musk?

John: Probably of that ilk. But I think those people are the exceptions, not the rule.

Bob: Okay. So where's it going?

John: Well, um, it's a mix of exciting and terrifying all at once. We are seeing some amazing sea changes right now, a lot of related things. Um, I don't think we fully appreciate yet the impact that social media will have on our culture forever. I think we intuitively get some of it, but I don't think we're going to be able to diagnose any of it for decades.
But one of the things that we notice a lot in our data, and everyone can kind of relate to this, is that because I'm this kind of curated version of myself on social media, I use social media as a way to make myself look funnier and smarter and a better parent and more adventurous and a better eater. And I think about that, consciously or not, when I choose what to share, where to check in. The flip side of that is that I have a high disincentive to say anything that is unpopular. And so we see, and it goes back to the comment I made earlier about the election: people who believed it was unpopular to support Donald Trump in 2016 just didn't talk about it, but they showed up on election day and did. Now we're seeing it's a little more... people are a little bolder about their support now than they were then. But what's happening is that unpopular opinions tend to be suppressed on social media, and so we have a sense of consensus that doesn't really exist.
But yes, if you remember the summer of 2015, when the Supreme Court was ruling on gay marriage, it was a very popular thing to have a rainbow-tinted profile picture as a show of solidarity. Right around that same summer, the South Carolina state legislature was debating taking the Confederate flag down from over the state capitol building. There were no equivalent Confederate flag filters over anybody's profile pictures, which is kind of surprising to me, because I grew up in a pretty rural part of Pennsylvania, where I know lots of gun-toting, right-wing, evangelical Christian types. You'd think that I would have seen an equal number of rainbow-tinted profile pictures and Confederate flag ones. But I didn't, um, because once Facebook had become awash with those rainbow filters, everybody who believed otherwise had a significant disincentive from saying anything to the contrary, because they knew they'd be open to attack.
So narratives that emerge 322 00:16:39,640 --> 00:16:41,960 Speaker 1: on social media and take hold are very, very 323 00:16:41,960 --> 00:16:44,200 Speaker 1: difficult to change, and the impact that has on the 324 00:16:44,240 --> 00:16:48,000 Speaker 1: popularity of music, the popularity of brands and fashion, 325 00:16:48,120 --> 00:16:50,800 Speaker 1: is almost immeasurable today. Okay, just so I understand, 326 00:16:51,800 --> 00:16:54,520 Speaker 1: is it a skewed view of reality, or 327 00:16:54,600 --> 00:16:57,320 Speaker 1: does it change reality? Or both? Both. I mean, I 328 00:16:57,400 --> 00:17:01,200 Speaker 1: think what I'm suggesting 329 00:17:01,320 --> 00:17:03,080 Speaker 1: is that when we look back on this period of 330 00:17:03,160 --> 00:17:06,960 Speaker 1: history, it will have changed reality. Well, I'm 331 00:17:06,960 --> 00:17:11,080 Speaker 1: a big believer that the reason the younger generation is 332 00:17:11,119 --> 00:17:14,399 Speaker 1: not as racist and gay marriage is accepted is because they 333 00:17:14,440 --> 00:17:18,800 Speaker 1: saw a rainbow of colors and different choices on 334 00:17:19,000 --> 00:17:23,800 Speaker 1: MTV, and MTV dominated. So are we saying the same 335 00:17:23,880 --> 00:17:27,920 Speaker 1: thing, that social media will ultimately affect what people believe? 336 00:17:28,320 --> 00:17:31,080 Speaker 1: There's no question about it. I think that it does. 337 00:17:33,960 --> 00:17:40,800 Speaker 1: Once a wave of popular opinion washes over, it moves people. 338 00:17:40,840 --> 00:17:43,720 Speaker 1: It gets some of them to reconsider their points of view; 339 00:17:44,000 --> 00:17:46,919 Speaker 1: it gets others to really dig their heels in. So 340 00:17:46,960 --> 00:17:50,080 Speaker 1: what are we learning with the backlash against Facebook 341 00:17:50,080 --> 00:17:54,680 Speaker 1: and social media presently?
Well, there are a lot of technical 342 00:17:54,720 --> 00:17:56,880 Speaker 1: reasons for some of that that I don't think Facebook 343 00:17:56,880 --> 00:17:59,760 Speaker 1: should be blamed for. This has all been happening 344 00:17:59,800 --> 00:18:02,560 Speaker 1: so fast. It's not like everybody saw it coming 345 00:18:02,600 --> 00:18:05,720 Speaker 1: and told Facebook to prepare itself. This was a whole 346 00:18:05,760 --> 00:18:07,679 Speaker 1: new frontier for them. I think they can fix it. 347 00:18:08,359 --> 00:18:11,120 Speaker 1: I think what we're learning about from the election through 348 00:18:11,200 --> 00:18:13,520 Speaker 1: now, and social media is driving a 349 00:18:13,560 --> 00:18:17,280 Speaker 1: lot of this, is this incessant kind of tribalism 350 00:18:17,280 --> 00:18:21,080 Speaker 1: in America that we have. 351 00:18:21,440 --> 00:18:23,920 Speaker 1: And it's affecting what we 352 00:18:23,960 --> 00:18:25,720 Speaker 1: call sort of the stay-at-home economy as well: 353 00:18:25,880 --> 00:18:27,720 Speaker 1: people don't want to go out in public as much. 354 00:18:27,720 --> 00:18:29,159 Speaker 1: We have a question that we track in 355 00:18:29,200 --> 00:18:31,320 Speaker 1: our database, which is: would you say that 356 00:18:31,359 --> 00:18:32,879 Speaker 1: you want to go out, that you're going out, more 357 00:18:32,960 --> 00:18:34,840 Speaker 1: or less than you did six months ago? Thirty-three 358 00:18:34,840 --> 00:18:37,080 Speaker 1: percent of people, the last time we looked at 359 00:18:37,080 --> 00:18:38,560 Speaker 1: the data a few weeks ago, said that they 360 00:18:38,600 --> 00:18:40,919 Speaker 1: want to go out in public 361 00:18:40,960 --> 00:18:44,680 Speaker 1: less than they did six months ago.
Why? Every time 362 00:18:44,680 --> 00:18:47,240 Speaker 1: there's a shooting, one or more people say, 363 00:18:47,280 --> 00:18:48,760 Speaker 1: I'm not going to go out today, I'm just gonna 364 00:18:48,840 --> 00:18:50,800 Speaker 1: stay home. And we've made it super easy for 365 00:18:50,800 --> 00:18:53,440 Speaker 1: people to stay home. And we're controlling our surroundings, we're 366 00:18:53,480 --> 00:18:57,080 Speaker 1: curating the people that we're around. In two thousand 367 00:18:57,040 --> 00:18:59,000 Speaker 1: and fifteen, we had a question that says, 368 00:18:59,040 --> 00:19:01,440 Speaker 1: do you generally like to be around people a lot? 369 00:19:01,640 --> 00:19:04,760 Speaker 1: It's a very simple question. A certain percent of people in two thousand 370 00:19:04,760 --> 00:19:07,119 Speaker 1: fifteen said yes, I like to be around people a lot. 371 00:19:07,600 --> 00:19:11,760 Speaker 1: Today that number has dropped, 372 00:19:11,880 --> 00:19:13,720 Speaker 1: and that might not seem like a 373 00:19:13,760 --> 00:19:16,680 Speaker 1: huge number. But what it's telling us is 374 00:19:16,720 --> 00:19:18,399 Speaker 1: people are saying, I don't want to go out and 375 00:19:18,440 --> 00:19:20,680 Speaker 1: sit in a restaurant and run a risk of sitting 376 00:19:20,720 --> 00:19:23,000 Speaker 1: next to somebody who's ranting on about some political issue 377 00:19:23,000 --> 00:19:24,680 Speaker 1: I disagree with. I'm going to stay at my home, 378 00:19:25,040 --> 00:19:27,840 Speaker 1: where I've curated my friend group on social media, 379 00:19:27,840 --> 00:19:30,399 Speaker 1: I've curated the media that I'm going to read and watch, 380 00:19:30,960 --> 00:19:33,399 Speaker 1: and I want to create a safe nest around myself.
381 00:19:33,440 --> 00:19:35,320 Speaker 1: And I think social media has helped to propagate a 382 00:19:35,359 --> 00:19:39,920 Speaker 1: lot of that. And so projecting five years out as 383 00:19:39,920 --> 00:19:45,120 Speaker 1: opposed to three months, do you have a thought on that? Yeah, 384 00:19:45,119 --> 00:19:47,920 Speaker 1: I think it's gonna continue to play out that way. 385 00:19:48,040 --> 00:19:51,320 Speaker 1: I think 386 00:19:51,320 --> 00:19:55,000 Speaker 1: a large portion of people are inherently introverted, but 387 00:19:55,080 --> 00:19:57,560 Speaker 1: it was always hard to be an introvert. Now 388 00:19:57,600 --> 00:19:59,640 Speaker 1: I can have everything delivered to my doorstep. I never 389 00:19:59,680 --> 00:20:01,280 Speaker 1: have to go shopping. I never have to leave to 390 00:20:01,280 --> 00:20:04,640 Speaker 1: see a movie. And what's happening 391 00:20:04,680 --> 00:20:07,800 Speaker 1: is the commercial marketplace is moving towards those people. 392 00:20:07,800 --> 00:20:09,120 Speaker 1: We're saying, if you don't want to leave your house, 393 00:20:09,119 --> 00:20:12,600 Speaker 1: you never have to. Companies are vying to own 394 00:20:12,680 --> 00:20:16,680 Speaker 1: my quote unquote ecosystem. They want to have my internet, 395 00:20:16,760 --> 00:20:20,159 Speaker 1: my cell phone, my content. They want my refrigerator 396 00:20:20,240 --> 00:20:22,959 Speaker 1: to talk to my doorbell, to my smartphone, to the 397 00:20:23,080 --> 00:20:26,280 Speaker 1: watch on my hand, and the companies are all fighting 398 00:20:26,280 --> 00:20:28,320 Speaker 1: to own that, so that when I choose to stay 399 00:20:28,359 --> 00:20:31,439 Speaker 1: home and never leave, I'm only using one company.
And 400 00:20:31,520 --> 00:20:35,280 Speaker 1: I think maybe five years from now that peaks and saturates, 401 00:20:35,320 --> 00:20:39,040 Speaker 1: but we're far from that. Stay right there. We'll be 402 00:20:39,040 --> 00:20:41,800 Speaker 1: back with more of my conversation with John Dick, an 403 00:20:41,840 --> 00:20:44,840 Speaker 1: expert on polling and analytics, right here on the Bob 404 00:20:44,920 --> 00:20:48,359 Speaker 1: Left Sets podcast. You're listening to the Bob Left Sets 405 00:20:48,440 --> 00:20:51,720 Speaker 1: podcast, recorded here in Venice, California, at the TuneIn Studios. 406 00:20:52,080 --> 00:20:54,160 Speaker 1: Each week, I interview a new guest and dive into 407 00:20:54,240 --> 00:20:57,520 Speaker 1: their background, their career, current events and everything in between. 408 00:20:58,119 --> 00:21:01,159 Speaker 1: If you like the podcast, subscribe, review the show, and 409 00:21:01,240 --> 00:21:03,840 Speaker 1: check out earlier episodes. You can hear them all on 410 00:21:04,000 --> 00:21:07,159 Speaker 1: TuneIn, Apple Podcasts, or your podcast player of choice. 411 00:21:07,440 --> 00:21:12,720 Speaker 1: And now, more with John Dick. In light of gun tragedies, 412 00:21:13,000 --> 00:21:15,560 Speaker 1: have you done any research that tells us 413 00:21:15,760 --> 00:21:19,520 Speaker 1: what America really thinks? Nothing that would 414 00:21:19,560 --> 00:21:24,320 Speaker 1: surprise you. I think a lot of people, and 415 00:21:24,400 --> 00:21:26,639 Speaker 1: this is not just our data.
I think most of 416 00:21:26,680 --> 00:21:28,240 Speaker 1: the data that you would see would say that 417 00:21:28,280 --> 00:21:33,480 Speaker 1: the majority of Americans support, 418 00:21:33,560 --> 00:21:36,240 Speaker 1: if not stricter gun laws, a stricter interpretation of 419 00:21:36,240 --> 00:21:39,439 Speaker 1: the gun laws that we have today. But the 420 00:21:39,560 --> 00:21:42,679 Speaker 1: political environment is such that people don't want to give 421 00:21:42,720 --> 00:21:46,520 Speaker 1: any ground. One of the unfortunate consequences 422 00:21:46,520 --> 00:21:48,639 Speaker 1: of tribalism isn't so much just that I want to 423 00:21:48,920 --> 00:21:51,639 Speaker 1: kind of organize only with people who agree with me 424 00:21:51,680 --> 00:21:54,920 Speaker 1: and believe like me, but that I start finding myself distrusting everybody else. 425 00:21:56,600 --> 00:22:01,040 Speaker 1: And that's the part of this kind of socio-cultural 426 00:22:01,119 --> 00:22:04,159 Speaker 1: shift that scares us when we look at our data. Okay, 427 00:22:04,440 --> 00:22:10,320 Speaker 1: tell my audience certain things you've learned that are counterintuitive. Well, 428 00:22:12,280 --> 00:22:16,040 Speaker 1: I certainly felt that it was counterintuitive that we 429 00:22:16,119 --> 00:22:19,200 Speaker 1: see this trend towards introversion among people, because I 430 00:22:19,200 --> 00:22:21,080 Speaker 1: thought we tended to be more social; I 431 00:22:21,280 --> 00:22:24,879 Speaker 1: thought we were more social beings than we really are. 432 00:22:24,920 --> 00:22:28,120 Speaker 1: I don't think most of us ultimately want to be. 433 00:22:28,160 --> 00:22:31,680 Speaker 1: So that kind of surprised me.
We see 434 00:22:31,720 --> 00:22:34,560 Speaker 1: relationships between things that I wouldn't have expected a few 435 00:22:34,600 --> 00:22:37,400 Speaker 1: years ago, when restaurant spending started to decline in the US, 436 00:22:37,480 --> 00:22:41,080 Speaker 1: and it was kind of inexplicable at the time. Traditionally, 437 00:22:41,119 --> 00:22:45,920 Speaker 1: for decades and decades, restaurant spending has been really 438 00:22:45,920 --> 00:22:50,800 Speaker 1: closely correlated to fuel prices, because it's disposable income 439 00:22:50,920 --> 00:22:53,080 Speaker 1: and fuel prices affect everybody. If I have 440 00:22:53,119 --> 00:22:55,400 Speaker 1: a couple extra dollars, whether I live in a rural town 441 00:22:55,400 --> 00:22:59,720 Speaker 1: in the middle part of Pennsylvania or I live in 442 00:22:59,720 --> 00:23:01,560 Speaker 1: a big city and drive a car, one of the 443 00:23:01,560 --> 00:23:03,800 Speaker 1: first things that I can notice in my bank account 444 00:23:03,800 --> 00:23:06,919 Speaker 1: is if I'm spending less money on gas. And so historically, 445 00:23:07,240 --> 00:23:10,359 Speaker 1: restaurant spending was very closely aligned with shifts in fuel prices, 446 00:23:10,400 --> 00:23:13,440 Speaker 1: but two years ago, restaurant spending was going down and 447 00:23:13,520 --> 00:23:17,119 Speaker 1: fuel prices were not going up.
And we found in 448 00:23:17,119 --> 00:23:19,119 Speaker 1: our data, which is part of the benefit of being 449 00:23:19,160 --> 00:23:21,200 Speaker 1: able to study all of these things at once, 450 00:23:21,240 --> 00:23:24,240 Speaker 1: that it turned out restaurant spending was very heavily correlated with how 451 00:23:24,320 --> 00:23:27,640 Speaker 1: much people were experiencing increases in their health 452 00:23:27,760 --> 00:23:31,399 Speaker 1: insurance premiums. 453 00:23:31,440 --> 00:23:34,199 Speaker 1: Healthcare costs were beginning to chip away at 454 00:23:34,200 --> 00:23:36,680 Speaker 1: disposable income at a faster rate than fuel prices were, 455 00:23:37,040 --> 00:23:39,840 Speaker 1: and that had never happened. How do you find that? Well, 456 00:23:39,880 --> 00:23:42,000 Speaker 1: because we're studying all of this stuff at once, we 457 00:23:42,080 --> 00:23:43,800 Speaker 1: look at all these lines on a chart to see 458 00:23:43,800 --> 00:23:46,960 Speaker 1: which one leads and which one lags the other, how 459 00:23:47,000 --> 00:23:49,200 Speaker 1: related they are, whether one seems to be a cause 460 00:23:49,240 --> 00:23:51,720 Speaker 1: of another, or whether they just tend to follow or lead. 461 00:23:52,280 --> 00:23:55,840 Speaker 1: And it was pretty evident to us in looking at 462 00:23:55,880 --> 00:23:58,760 Speaker 1: two questions. It's actually 463 00:23:58,840 --> 00:24:01,560 Speaker 1: more complicated than that, but imagine one question is: would 464 00:24:01,560 --> 00:24:03,240 Speaker 1: you say you're spending more or less to eat out? 465 00:24:03,280 --> 00:24:05,080 Speaker 1: And another one is: would you say you're spending more 466 00:24:05,160 --> 00:24:08,080 Speaker 1: or less on your health care?
And we looked at 467 00:24:08,080 --> 00:24:10,640 Speaker 1: the people who answered those two questions, and their answers 468 00:24:10,680 --> 00:24:14,919 Speaker 1: were closely aligned with one another. Okay, but sometimes there 469 00:24:14,920 --> 00:24:17,840 Speaker 1: can be false correlations. Yeah, but I mean, 470 00:24:18,160 --> 00:24:20,720 Speaker 1: really smart statisticians know how to ferret that stuff out. 471 00:24:20,800 --> 00:24:23,560 Speaker 1: How do you do it? Well, you test them. 472 00:24:23,600 --> 00:24:27,200 Speaker 1: You continue to see if they continue to 473 00:24:27,240 --> 00:24:29,879 Speaker 1: play out over time. You ask questions in different ways to 474 00:24:29,880 --> 00:24:31,960 Speaker 1: see if it's maybe just an error in the way 475 00:24:31,960 --> 00:24:38,000 Speaker 1: a person is interpreting the question. We're certainly 476 00:24:38,040 --> 00:24:40,240 Speaker 1: mindful of that. We have what's called a false 477 00:24:40,280 --> 00:24:44,159 Speaker 1: detection procedure that we use, and it's essentially a coefficient 478 00:24:44,200 --> 00:24:47,440 Speaker 1: that we use to measure whether there's a false 479 00:24:47,480 --> 00:24:51,040 Speaker 1: positive in our data. Wow, no wonder you have the 480 00:24:51,040 --> 00:24:53,480 Speaker 1: Carnegie Mellon people involved. Yeah. So did you 481 00:24:53,560 --> 00:24:57,399 Speaker 1: learn anything else about health care? Well, no, I 482 00:24:57,400 --> 00:25:02,560 Speaker 1: mean, it's a mess. I think that we 483 00:25:02,600 --> 00:25:05,680 Speaker 1: saw a period of choice that hadn't existed before because of 484 00:25:05,760 --> 00:25:08,280 Speaker 1: the Affordable Care Act, and a lot of that's uncertain 485 00:25:08,400 --> 00:25:11,240 Speaker 1: right now.
So healthcare is a gray area 486 00:25:11,240 --> 00:25:13,000 Speaker 1: at the moment, because we don't know what the future 487 00:25:13,080 --> 00:25:16,080 Speaker 1: of it is going to look like. Okay, so you're working 488 00:25:16,080 --> 00:25:21,680 Speaker 1: with Fortune 500 companies; how would they find you? 489 00:25:21,720 --> 00:25:23,960 Speaker 1: They're either referred to us by somebody else we already 490 00:25:23,960 --> 00:25:27,280 Speaker 1: work with, or we publish a lot of information. You mentioned 491 00:25:27,760 --> 00:25:30,119 Speaker 1: a little email that we do every Saturday morning that 492 00:25:30,200 --> 00:25:33,440 Speaker 1: finds its way into the hands of leaders. We 493 00:25:33,480 --> 00:25:35,800 Speaker 1: have a very, very small sales force. 494 00:25:35,880 --> 00:25:38,520 Speaker 1: We're not the kind of company that's trying to 495 00:25:38,840 --> 00:25:41,199 Speaker 1: barge our way into your office to demo something for you. 496 00:25:41,240 --> 00:25:45,359 Speaker 1: Typically somebody reaches out to us and says, hey, 497 00:25:45,440 --> 00:25:47,920 Speaker 1: I saw this study you published about this thing, and 498 00:25:47,960 --> 00:25:50,120 Speaker 1: I'd really like to understand how it affects my business. 499 00:25:50,200 --> 00:25:53,199 Speaker 1: And that's usually where a business relationship starts. Okay, so I hire you. 500 00:25:53,280 --> 00:25:56,120 Speaker 1: Let's say I'm a software company. What will you do 501 00:25:56,200 --> 00:25:58,399 Speaker 1: for me? Well, the first thing we have to do is 502 00:25:58,480 --> 00:26:01,240 Speaker 1: understand your business. How do you do that? We 503 00:26:01,359 --> 00:26:03,640 Speaker 1: just spend time together.
We have what's called 504 00:26:03,640 --> 00:26:06,520 Speaker 1: a discovery process and a solution design process, which sounds 505 00:26:06,600 --> 00:26:10,520 Speaker 1: very bureaucratic, but it's simply a process of us learning 506 00:26:11,119 --> 00:26:13,719 Speaker 1: what your blind spots really are as a business. 507 00:26:13,760 --> 00:26:15,680 Speaker 1: And no one ever knows what those 508 00:26:15,680 --> 00:26:17,600 Speaker 1: are in the first conversation, because they don't think they 509 00:26:17,600 --> 00:26:20,800 Speaker 1: have any. So we challenge 510 00:26:20,840 --> 00:26:23,080 Speaker 1: some of that. We ask what they know, and we 511 00:26:23,480 --> 00:26:25,240 Speaker 1: try to push back, and we get a sense of, 512 00:26:25,280 --> 00:26:27,160 Speaker 1: all right, where can we fill in some blanks about 513 00:26:27,200 --> 00:26:30,560 Speaker 1: their business? I told you earlier one of the unintuitive 514 00:26:30,560 --> 00:26:36,040 Speaker 1: things for me is that our biggest clients 515 00:26:36,160 --> 00:26:39,480 Speaker 1: are companies that you would think don't need any 516 00:26:39,480 --> 00:26:41,960 Speaker 1: more data. I told you we have sort of one 517 00:26:42,040 --> 00:26:44,800 Speaker 1: open square on the tech-giant bingo card of our business, 518 00:26:44,840 --> 00:26:47,919 Speaker 1: but all the others we have. And you 519 00:26:47,960 --> 00:26:50,240 Speaker 1: would think, if I mentioned the names of those companies, 520 00:26:50,280 --> 00:26:52,080 Speaker 1: you'd say, well, I thought they already knew everything 521 00:26:52,119 --> 00:26:55,280 Speaker 1: about everyone.
But what fascinates me about them, 522 00:26:55,520 --> 00:26:58,280 Speaker 1: and I think it's a fundamental truth, is that the 523 00:26:58,320 --> 00:27:00,760 Speaker 1: more data those businesses have, the more questions they have, 524 00:27:01,280 --> 00:27:03,199 Speaker 1: the more things they don't know. And I think the 525 00:27:03,240 --> 00:27:05,879 Speaker 1: reason they've become the giants they are is because they 526 00:27:05,960 --> 00:27:09,320 Speaker 1: continue to be inquisitive and they continue to look under 527 00:27:09,440 --> 00:27:12,960 Speaker 1: rocks for the next big insight, and those tend to 528 00:27:13,000 --> 00:27:14,800 Speaker 1: be our best clients. So let's say you're a big 529 00:27:14,840 --> 00:27:19,560 Speaker 1: Silicon Valley company which is collecting reams of data constantly; 530 00:27:20,720 --> 00:27:26,639 Speaker 1: what would you literally provide? Context. Sometimes a 531 00:27:26,720 --> 00:27:29,879 Speaker 1: lot of the data that those companies have tells you the what, 532 00:27:30,119 --> 00:27:33,240 Speaker 1: but it may not necessarily answer the why. And you 533 00:27:33,240 --> 00:27:37,240 Speaker 1: can answer that? Yeah. Our specialty is the 534 00:27:37,320 --> 00:27:39,639 Speaker 1: why and the what next: where do we think this 535 00:27:39,720 --> 00:27:42,480 Speaker 1: is going? Let's talk specifically about the music business. Sure. 536 00:27:42,600 --> 00:27:45,639 Speaker 1: What have you learned about the music business in your research? 537 00:27:45,680 --> 00:27:50,560 Speaker 1: It's in an interesting stage of cannibalization. 538 00:27:50,680 --> 00:27:56,640 Speaker 1: People aren't listening to more music, so it's 539 00:27:56,640 --> 00:27:59,639 Speaker 1: something of a zero-sum game for 540 00:27:59,720 --> 00:28:03,000 Speaker 1: the players in the space.
There aren't more hours 541 00:28:03,000 --> 00:28:05,320 Speaker 1: in the day. There aren't more people listening to music 542 00:28:05,359 --> 00:28:08,760 Speaker 1: than listen to it twenty years ago. So the fact 543 00:28:08,760 --> 00:28:12,399 Speaker 1: that music is more accessible people are or not listening 544 00:28:12,400 --> 00:28:14,240 Speaker 1: to more music there does not appear to be any 545 00:28:14,280 --> 00:28:18,119 Speaker 1: evidence of that, or at least new people listening to 546 00:28:18,200 --> 00:28:21,000 Speaker 1: music who aren't listening to it a lot before. And 547 00:28:21,040 --> 00:28:23,520 Speaker 1: so with all of the providers coming into the marketplace, 548 00:28:23,800 --> 00:28:26,879 Speaker 1: they are fighting over a finite pool of and now 549 00:28:26,880 --> 00:28:29,239 Speaker 1: it's a large pool, don't get me wrong, but but 550 00:28:29,280 --> 00:28:33,159 Speaker 1: there's it's finite, and there's not room for there's certainly 551 00:28:33,160 --> 00:28:36,000 Speaker 1: a room for, you know, multiple providers, but only to 552 00:28:36,040 --> 00:28:38,480 Speaker 1: the extent that they are dividing up our attention. I 553 00:28:38,480 --> 00:28:41,719 Speaker 1: think we will see some consolidation, and I don't mean 554 00:28:41,720 --> 00:28:45,520 Speaker 1: necessarily from a business standpoint, Um, one company buying another necessarily. 555 00:28:45,520 --> 00:28:47,920 Speaker 1: But I think we will see more and more music 556 00:28:47,960 --> 00:28:51,000 Speaker 1: listeners gravitating the single sources of music than well. I 557 00:28:51,040 --> 00:28:53,400 Speaker 1: believe that for different ways we live in a we 558 00:28:53,440 --> 00:28:56,080 Speaker 1: will live in a winter and losery economy. Across the board, 559 00:28:56,080 --> 00:28:58,640 Speaker 1: we have incommittee, quality, etcetera. 
You want to go where 560 00:28:58,640 --> 00:29:01,719 Speaker 1: your friends are, and the most successful companies have sixty 561 00:29:02,400 --> 00:29:05,400 Speaker 1: plus percent market share. You're online, they're all a click away, right? 562 00:29:05,800 --> 00:29:10,600 Speaker 1: And I think, well, that's a part of the virtuous circle. 563 00:29:10,640 --> 00:29:14,000 Speaker 1: It's easier, but it's also desirable. 564 00:29:14,040 --> 00:29:16,680 Speaker 1: Same thing with the at-home tech ecosystem: 565 00:29:16,720 --> 00:29:19,160 Speaker 1: if I have fewer passwords I have to remember, fewer 566 00:29:19,160 --> 00:29:22,120 Speaker 1: tech support people I have to call, fewer interfaces 567 00:29:22,160 --> 00:29:24,280 Speaker 1: I have to learn, 568 00:29:24,360 --> 00:29:26,760 Speaker 1: that's as desirable to me as my refrigerator talking to 569 00:29:26,800 --> 00:29:29,560 Speaker 1: my doorbell would be. Just the simplicity and the convenience 570 00:29:29,560 --> 00:29:32,160 Speaker 1: of it. I think that's all driving 571 00:29:32,200 --> 00:29:36,000 Speaker 1: what we expect to be a rapid pace 572 00:29:36,040 --> 00:29:39,520 Speaker 1: of consolidation in all media, but music in particular. Okay, 573 00:29:39,520 --> 00:29:42,160 Speaker 1: so let's assume there'll be fewer places that you 574 00:29:42,160 --> 00:29:44,600 Speaker 1: would ultimately gravitate to for music, or they'll have 575 00:29:44,680 --> 00:29:51,360 Speaker 1: larger market share. What about the acts themselves? Well, we 576 00:29:51,360 --> 00:29:53,520 Speaker 1: don't need our data to tell anyone, and you 577 00:29:53,600 --> 00:29:55,320 Speaker 1: know this better than we do, that it's just 578 00:29:55,560 --> 00:30:01,120 Speaker 1: much easier for people to record now, right?
579 00:30:01,360 --> 00:30:04,000 Speaker 1: It's happening at an almost immeasurable pace, because I can 580 00:30:04,040 --> 00:30:05,920 Speaker 1: just take a quick video of myself and slap it 581 00:30:05,920 --> 00:30:11,920 Speaker 1: on YouTube. And because of a 582 00:30:11,960 --> 00:30:15,320 Speaker 1: lot of the trends we see in social media, 583 00:30:15,440 --> 00:30:19,320 Speaker 1: and I know you have some thoughts on virality, 584 00:30:19,520 --> 00:30:21,520 Speaker 1: things can go from small to big very fast, but 585 00:30:21,560 --> 00:30:23,680 Speaker 1: they can also go from big to small very fast with 586 00:30:23,720 --> 00:30:28,640 Speaker 1: a misstep. And I think the adoption curve of everything, 587 00:30:28,680 --> 00:30:31,800 Speaker 1: including music, is truncated. So it's happening much faster than 588 00:30:31,800 --> 00:30:34,600 Speaker 1: it did years ago, because there are fewer barriers to entry. 589 00:30:35,040 --> 00:30:38,000 Speaker 1: The adoption curve, what is that? Well, 590 00:30:38,120 --> 00:30:41,479 Speaker 1: there are people out there who 591 00:30:41,480 --> 00:30:43,720 Speaker 1: will try everything. There are music fans who will listen 592 00:30:43,760 --> 00:30:46,640 Speaker 1: to every single artist they hear about, and we 593 00:30:46,880 --> 00:30:50,600 Speaker 1: call those early adopters. There's a second 594 00:30:50,880 --> 00:30:53,280 Speaker 1: group of people we call market mavens in our terminology, 595 00:30:53,320 --> 00:30:55,400 Speaker 1: which is: I try it and then I tell everyone 596 00:30:55,440 --> 00:30:58,080 Speaker 1: about it, and I go to social media and I say, 597 00:30:58,120 --> 00:30:59,800 Speaker 1: look at these songs I listened to, check out this 598 00:30:59,800 --> 00:31:01,680 Speaker 1: band that I like.
And those people have a lot 599 00:31:01,720 --> 00:31:04,080 Speaker 1: of power today. My sister is one of those people 600 00:31:04,080 --> 00:31:05,840 Speaker 1: in music. When she tells me about a band, a 601 00:31:05,920 --> 00:31:08,640 Speaker 1: year later they're at the Grammys; you know, she's always 602 00:31:08,640 --> 00:31:12,239 Speaker 1: been that way. But there's a third group of 603 00:31:12,240 --> 00:31:14,880 Speaker 1: people, then, who wait for those market mavens to tell 604 00:31:14,880 --> 00:31:17,120 Speaker 1: them what to think. They don't want to make decisions 605 00:31:17,120 --> 00:31:19,320 Speaker 1: for themselves, and they're not doing a ton of their own research, 606 00:31:19,400 --> 00:31:21,400 Speaker 1: going and reading reviews or any of that. They wait 607 00:31:21,440 --> 00:31:23,120 Speaker 1: for those friends they have, like I do with my 608 00:31:23,160 --> 00:31:24,920 Speaker 1: sister, who says, oh, you should go check out this 609 00:31:24,960 --> 00:31:27,520 Speaker 1: band, because I don't have time to follow music. Now, 610 00:31:27,560 --> 00:31:31,200 Speaker 1: with an overwhelming number of marketing messages, have the percentages 611 00:31:31,720 --> 00:31:36,960 Speaker 1: of early adopters, market mavens and passive listeners, have those 612 00:31:37,040 --> 00:31:40,880 Speaker 1: numbers changed? Not really. In fact, 613 00:31:41,160 --> 00:31:42,920 Speaker 1: we're working on a study on this right now. 614 00:31:42,960 --> 00:31:45,920 Speaker 1: It's remarkably consistent, the percentage of people that fall into 615 00:31:45,960 --> 00:31:48,520 Speaker 1: those buckets.
But what's happening is the progression from one 616 00:31:48,520 --> 00:31:51,160 Speaker 1: group to the next is happening much faster, because of the 617 00:31:51,160 --> 00:31:53,880 Speaker 1: connectivity between the early adopter, the market maven, 618 00:31:53,920 --> 00:31:55,640 Speaker 1: and the lemmings who wait for those market mavens to 619 00:31:55,640 --> 00:31:57,280 Speaker 1: tell them what to do. We're just much closer to 620 00:31:57,320 --> 00:32:00,280 Speaker 1: each other than we ever were before. So every time I 621 00:32:00,320 --> 00:32:03,200 Speaker 1: turn on Facebook, there's the likelihood of my sister posting 622 00:32:03,280 --> 00:32:05,240 Speaker 1: some song that she just listened to that inspires me 623 00:32:05,280 --> 00:32:07,360 Speaker 1: to listen to it, where five years ago 624 00:32:07,360 --> 00:32:08,360 Speaker 1: I had to wait for her to call me on the 625 00:32:08,360 --> 00:32:11,000 Speaker 1: phone to do that. Yes, but also I would argue 626 00:32:11,280 --> 00:32:13,760 Speaker 1: that there are so many messages that it is harder to 627 00:32:13,760 --> 00:32:16,080 Speaker 1: get somebody to click, or harder to pass on your 628 00:32:16,080 --> 00:32:20,320 Speaker 1: information today, from a marketing standpoint. Sure, and that's why 629 00:32:20,600 --> 00:32:22,720 Speaker 1: social media is so powerful, because I don't know what 630 00:32:22,760 --> 00:32:25,280 Speaker 1: else to trust. But if my sister tells me, that's different. 631 00:32:25,320 --> 00:32:27,520 Speaker 1: And that's a hard thing for brands and marketers to 632 00:32:27,560 --> 00:32:30,640 Speaker 1: get their heads around: the power of the influencer, and 633 00:32:30,680 --> 00:32:33,320 Speaker 1: I don't mean a YouTube influencer 634 00:32:33,320 --> 00:32:36,360 Speaker 1: as much as I mean somebody in my social network 635 00:32:36,400 --> 00:32:38,800 Speaker 1: who I trust.
One thing that we've seen in our 636 00:32:38,880 --> 00:32:42,440 Speaker 1: data indisputably over the last four or five years: we 637 00:32:42,520 --> 00:32:45,280 Speaker 1: track things like which of these kinds of advertising has 638 00:32:45,280 --> 00:32:48,920 Speaker 1: the most influence over your purchase decisions, TV, radio, print, whatever. 639 00:32:49,400 --> 00:32:51,320 Speaker 1: One of the lines on that chart is comments and 640 00:32:51,320 --> 00:32:53,760 Speaker 1: recommendations from my friends on social media, and that line 641 00:32:53,760 --> 00:32:56,400 Speaker 1: has climbed and climbed. It surpassed television for 642 00:32:56,440 --> 00:32:58,880 Speaker 1: the first time two summers ago and has stayed there. 643 00:32:59,240 --> 00:33:02,160 Speaker 1: So I trust that, because I don't know what other media 644 00:33:02,200 --> 00:33:04,200 Speaker 1: to trust, and that's actually getting worse, right? We can't 645 00:33:04,200 --> 00:33:06,400 Speaker 1: discern fake news from real news anymore. So the only 646 00:33:06,440 --> 00:33:08,160 Speaker 1: thing I can trust is what my sister tells me. 647 00:33:08,880 --> 00:33:13,920 Speaker 1: So what is the future of advertising? I think what 648 00:33:13,960 --> 00:33:17,240 Speaker 1: we're seeing and what we're telling our clients is that 649 00:33:17,320 --> 00:33:20,200 Speaker 1: they have to understand that adoption curve, and they have 650 00:33:20,320 --> 00:33:22,480 Speaker 1: to understand that the way you communicate to an early 651 00:33:22,520 --> 00:33:24,560 Speaker 1: adopter is different than the way you communicate to a 652 00:33:24,600 --> 00:33:26,640 Speaker 1: market maven, which is different, too, than the way you 653 00:33:26,680 --> 00:33:29,440 Speaker 1: communicate to that third tier. To what degree are people 654 00:33:29,480 --> 00:33:34,600 Speaker 1: just tuning out advertising completely? Oh, a lot.
I 655 00:33:34,640 --> 00:33:38,640 Speaker 1: think you've seen the advent of UM so called native 656 00:33:38,640 --> 00:33:41,080 Speaker 1: advertising in the last five or six years. So it's 657 00:33:41,640 --> 00:33:47,080 Speaker 1: advertising masquerading is a news article. UM. Advertising has to 658 00:33:47,080 --> 00:33:51,680 Speaker 1: be a lot less obtrusive in my life or intrusive. UM. 659 00:33:52,800 --> 00:33:56,920 Speaker 1: And what you're seeing is brands focused on creating more 660 00:33:56,960 --> 00:33:59,280 Speaker 1: of an ongoing connection with somebody so they don't feel 661 00:33:59,280 --> 00:34:01,600 Speaker 1: like I'm being advert ties to all the time. It's 662 00:34:01,680 --> 00:34:03,920 Speaker 1: it's a much more subtle process than it was twenty 663 00:34:03,960 --> 00:34:08,560 Speaker 1: years ago. Okay, we'll return to this conversation with John Dick, 664 00:34:08,680 --> 00:34:12,319 Speaker 1: CEO of Civic Science in a moment. This is Bob 665 00:34:12,400 --> 00:34:13,960 Speaker 1: left Up. So I'm a writer and you could read 666 00:34:13,960 --> 00:34:16,720 Speaker 1: me at left Steps dot com if you haven't noticed already. 667 00:34:16,800 --> 00:34:19,120 Speaker 1: I love getting a person story. I like to know 668 00:34:19,160 --> 00:34:22,520 Speaker 1: what makes them tick, specifically as it relates to successful 669 00:34:22,560 --> 00:34:25,680 Speaker 1: people who are changing the game in their respective field. 670 00:34:26,000 --> 00:34:27,719 Speaker 1: I'm eager to get to the bottom of it on 671 00:34:27,760 --> 00:34:30,120 Speaker 1: this show and that same thing, and I'd like to 672 00:34:30,160 --> 00:34:32,760 Speaker 1: invite you to attend my music media summit in Santa 673 00:34:32,800 --> 00:34:34,960 Speaker 1: Barbara the last weekend in April. 
It's gonna be a great way to connect with the movers and shakers in the world of tech and music. Go to MusicMediaSummit.com for tickets and more information. And now, more with John Dick. Let's go back to lessons you've learned. You were telling me earlier that you learned that Android users correlate with more heart problems.

John: Yeah, those are the kind of fun and quirky but often useful things in our data. When you study a hundred thousand different questions that have been answered a billion times, you find relationships between stuff. Yes, if you have an Android phone, you're more likely to have a history of heart disease in your family. That is not causality. It doesn't mean that if you switch phones you won't have heart disease. It means that the types of people who have Android phones are the types of people who tend to have a higher rate of heart disease in their family. And there's income and other kinds of proxy reasons for that, which makes lots of sense when you explain it. But if I'm a marketer at a pharmaceutical company, that's a pretty useful discovery to find in a database, that that relationship exists. One of our absolute favorites: there is a staggering correlation between how much people like Gary Busey and how much they like Kia cars. We've never been able to explain it. But I can tell you that five or six years ago, that little bit of information formed the catalyst of a television campaign for a regional network of Kia dealerships starring Gary Busey.

Bob: Did that work?

John: It worked like a charm.

Bob: Wow. Okay, a little bit more. How did you find that correlation?

John: Same thing. We just have these really smart people and smart technology that crawls around in all of this data all the time and looks for those kinds of things, the kind of unexpected insight. One of our marketing shticks is that we answer questions you never thought to ask.

Bob: Okay, so do you go further and try to get... okay, Gary Busey, he has kind of an irreverent air, doesn't give a shit about life. And if you're driving the Kia Soul, it's kind of an ugly car, but you're sort of saying the same thing.

John: Yeah, we eventually find the why. And as I mentioned, I think that's kind of the gray space we're trying to fill in the world: why is that true? Finding that it's true is one thing; understanding why it's true is another. Because the advertising or marketing or product decisions that it inspires can't just be, "Oh, I don't know why, but we're gonna do it anyway." We have to help them understand why.

Bob: Okay, so you're on a roll. Now tell us some more things you've learned in the data.

John: Oh geez. You know, there are things like millennials being more likely to pick rock in rock-paper-scissors than other generations.

Bob: Yeah. I'm trying to think, what do we learn from that?

John: Well, you can learn how to take money off of millennials at rock-paper-scissors.

Bob: Yes.

John: That one's kind of trite.
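As a loose illustration of the correlation-mining described above, here is a minimal sketch of scanning pairs of yes/no survey questions for answers that move together. The respondents, question names, and data are all invented for illustration; this is not Civic Science's actual system.

```python
# Hypothetical sketch: scan every pair of binary survey questions and flag
# the pairs whose answers co-occur more than independence would predict.
from itertools import combinations
from math import sqrt

QUESTIONS = ["likes_gary_busey", "likes_kia", "android_owner"]

# Toy respondent records: question -> 1 (yes) or 0 (no). Entirely made up.
respondents = [
    {"likes_gary_busey": 1, "likes_kia": 1, "android_owner": 1},
    {"likes_gary_busey": 1, "likes_kia": 1, "android_owner": 0},
    {"likes_gary_busey": 0, "likes_kia": 0, "android_owner": 1},
    {"likes_gary_busey": 0, "likes_kia": 0, "android_owner": 0},
    {"likes_gary_busey": 1, "likes_kia": 1, "android_owner": 1},
    {"likes_gary_busey": 0, "likes_kia": 1, "android_owner": 0},
]

def phi(q1, q2, rows):
    """Phi coefficient: Pearson correlation for two binary variables."""
    n11 = sum(1 for r in rows if r[q1] and r[q2])
    n10 = sum(1 for r in rows if r[q1] and not r[q2])
    n01 = sum(1 for r in rows if not r[q1] and r[q2])
    n00 = sum(1 for r in rows if not r[q1] and not r[q2])
    denom = sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return 0.0 if denom == 0 else (n11 * n00 - n10 * n01) / denom

# Score every question pair and surface the strongest relationship.
scores = {(a, b): phi(a, b, respondents) for a, b in combinations(QUESTIONS, 2)}
top_pair, top_phi = max(scores.items(), key=lambda kv: abs(kv[1]))
# A high |phi| only flags a relationship worth a human look:
# correlation, as the discussion stresses, is not causation.
```

On this toy data the strongest pair is likes_gary_busey / likes_kia. In practice a scan over a hundred thousand questions would also need multiple-comparison correction, since checking that many pairs guarantees some spurious hits.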
Lots of what I mentioned, the healthcare and restaurant spending, those are the more serious kinds of useful, impactful insights, irrespective of whether they're counterintuitive.

Bob: What other things might one want to learn? What if someone is an iPhone owner? What did you learn about iPhone owners?

John: Well, I can't touch that one. What I can say is, one of my favorite areas of study is height. There are tons and tons of correlations in our data tied to whether somebody is taller or shorter than their peer group, people their age and gender. And there are sociological reasons why that's true. But taller people have different brand preferences; they have different media consumption habits than shorter people do.

Bob: And why is that?

John: So, we know over history that taller people, generally when they were younger, were better at physical sports than others. That bred some confidence that manifests itself in the classroom, that maybe helped the tall person stand out in a job interview or an admissions interview for a college. And you do that millions and millions and billions of times over human history, and there are certain proclivities that tall people have that short people don't, and vice versa. And it's not always benefiting the tall people, by the way. But we have a question in our database that essentially asks, are you taller or shorter? And people answer that question because they want to know how they compare to other tall people and other short people. And when we've asked that question hundreds of thousands of times and studied millions of things about people based on their height, we find that there are indisputable relationships between your height and your consumer behavior.

Bob: For example? Can you remember any specific correlations?

John: Yeah, some of them are funny. A very obvious one: shorter people are less likely to go see live music, because they can't see it.

Bob: That's probably your answer right there.

John: Actually, the insight was that shorter people were actually more likely to watch AXS concerts on television.

Bob: Now, in the music business, we've gone to so-called festival seating, where the people stand. I personally hate it. I remember the old days when there was a seat at the Fillmore East. Now the live companies will tell you they can only put in a certain number of people because of fire marshal rules. That's a bunch of crap; they squeeze more people in if they're standing. But would their business increase if they had seats and the short people could see?

John: One would think so, yeah. Certainly, if there's an extent to which a person is choosing not to buy a ticket to a concert because they don't feel like they're going to be able to see well enough when they get there, that would seem to me to be a pretty obvious measure for them to take.

Bob: Wow, that's very stimulating. Let's go to the reverse. No, it really is, because I think this goes back to your point that you're advising or working with household-name companies, Silicon Valley companies. We think of all the data, but their insights...
I mean, there are a lot of institutional beliefs that I don't think are true, and the data can help prove otherwise.

John: There are certainly truisms that have at least evolved, if not are inherently untrue.

Bob: Like what?

John: You're really putting me on the spot with some of these. No, I mean, I'll think of... well, we talk about millennials all the time, because that's so much of what our clients want to know about. But this sort of trend towards introversion among consumers is really being driven by the millennial population. They're going to bed earlier, they're waking up earlier, they're drinking less. There are lots of trends about young people today, and by young I mean anywhere from, you know, thirty-five and under, that are perceived to be a function of their generation. "Oh, millennials aren't going to drive cars, because millennials..." Well, we want to draw a distinction between what's a generational phenomenon and what's a life-stage phenomenon. So, yeah, twenty-four-year-olds don't want to drive cars. I didn't want to drive a car when I was twenty-four either. But all of a sudden you grow up and you have kids and you've got to truck them around to soccer practice, and guess what, millennials are starting to buy cars. There were, you know, gloom-and-doom stories ten years ago that the auto manufacturers were screwed because millennials were just going to take Uber and ride public transportation everywhere they went, and it's just not true. We had a call a few years ago, maybe longer, six or seven years ago, from the director of a very large zoo, who called to ask why millennials weren't coming to the zoo. They were freaking out because millennials weren't coming to the zoo. And I said, did people that age ever come to the zoo? Right? You know, we want to draw these sort of hyperbolic conclusions about things that quote-unquote millennials are doing today. But there are twenty-two-year-old millennials. I mean, millennials are technically what, roughly eighteen to thirty-four today? Think of the life stages that encompasses. I could be twenty-two, smoking pot, sitting on my parents' couch, or I could be thirty-three with two kids and a mortgage and two cars in my garage. And so to make blanket statements about what's definitively millennial versus definitively Gen X versus baby boomer...

Bob: Okay, but let's go on a broad scale. In what ways are millennials different from the generations that preceded them?

John: Well, certainly digital connectivity and media consumption, those are absolutely real. Then there are these phenomena of the stay-at-home economy and the tribalism I mentioned before. Whether they're causing it, or whether they will simply be the first generation to live fully within the confines of it, I don't know, but it's certainly going to be at least the latter. They are definitely driving an era of people staying in echo chambers, confining themselves to a smaller group of people that they can curate, and the same thing with their media consumption.
Bob: But you can't open a major publication without people either selling or analyzing ways to interact with millennials. Like, especially millennials, you know, you can't say anything negative to them, they're gonna cry, they're gonna quit. Is that all horseshit, or is that true?

John: I think that's a little overblown. I mean, most of the people in my company fit the definition of a millennial, and I don't find that to be true at all. I think those are fun stories for people to tell. I don't have the gloomy view of, you know, millennials killing the fabric of America.

Bob: Which you read a lot of.

John: I don't. I think it's overblown. I'll tell you another thing, another truism that we can dispel, and this is a huge shift of topic: one of the biggest fallacies in the world today is that Donald Trump uses Twitter to talk to the American people. The American people are not on Twitter. Only a fraction of Americans are on Twitter at most, even among his voters, give or take. He is using Twitter to talk to the media, who carry his message to their little tribes of people who read it. And so he doesn't care. And it's actually quite brilliant; I have to tip my hat to the way he handles it. I mean, he tweets at three o'clock in the morning because he knows he's setting the media landscape for the day. He's telling the media what they're going to talk about that day. And he doesn't care what CNN says about him, because his followers aren't watching CNN anyway. People don't trust information anymore; they trust the distributors of that information. And so Donald Trump seeds his fairy dust into Twitter, and those publications run it to their individual tribes of readers. One of the more profound trends that we see today is that even as consumption of every other form of media is in decline, television ratings are down, sports are down, newspaper readership is down, radio listening is down, the one thing that is skyrocketing is the readership of political blogs, of political content online. Which is a harbinger of an ill future for so-called objective media outlets, because I don't want objective media. I want media that affirms my existing beliefs, and now there's so much variety out there that I can go find it. And it tends to be a blog that isn't trying to appeal to the masses, that can tell me exactly what I want to hear every time I click on the web page. And Donald Trump has come along in an era where he's benefiting from that immensely. So he puts out his two morning tweets, and the message is carried directly to the people he wants to carry it.

Bob: Do you believe he's consciously doing that?

John: I mean, I think so. I think he knows that he's communicating to his supporters through the megaphones of the media outlets that support him.

Bob: What has your research told you about the midterm elections?
John: Well, funny you should ask, because just today I was reading a report on some of our data that speaks to this. After sixteen months of very positive growth in consumer sentiment, we've started to see over the last month a fairly significant decline. Consumer confidence, economic sentiment, people's attitudes toward the job market and their personal finances and the outlook for the U.S. economy seem to have peaked and are now beginning to slide.

Bob: Any theory as to why that is?

John: Well, I think there was an artificial high in the first place. There was a lot of sizzle, not a whole lot of steak; there was just optimism for optimism's sake. There was a large sector of the U.S. consumer population who simply believed things would get better. They tended to be Republicans, and the people who voted for Donald Trump in the first place, even though they maybe didn't necessarily have any reason to feel that way. And so next Tuesday, in my hometown, there's a very significant midterm special congressional election. The President's going there on Saturday; Joe Biden was there yesterday. We're not in the political prognostication business, but there's reason to believe that this downturn in consumer sentiment, because we do know that it affects election cycles, could have an impact as soon as next Tuesday. It's a razor-thin margin in that election, and this could be the kind of downturn that tips it in favor of the Democrat.

Bob: And Trump himself. We read about his approval rating constantly. Is that worth anything?

John: Sure, I think so.

Bob: I mean, is it accurate?

John: The aggregate is. One thing we admire a lot about Nate Silver is that he takes into consideration a lot of different sources, and if you normalize for the biases of one versus the other, those approval numbers are fairly consistent. What matters more is who shows up to vote. His approval ratings haven't budged much at all. And frankly, you know, here's what I think: the estimate is that about thirty percent of U.S. adults actually voted for him. If you take into consideration the people who are registered to vote and the portion that actually came out and voted for him, we're talking about just over one third of the entire U.S. population. And that's roughly, give or take three or four points at a time, the share of people who still approve of him. And so public approval doesn't mean anything unless people vote, and we know that they don't. And the biggest problem that pollsters have isn't "can I measure public opinion?" It's "can I predict who's going to turn out on election day?"

Bob: Okay, we talked about millennials. What have we learned about baby boomers?

John: Well, I have to put my personal points of view aside on that one.

Bob: You consider yourself a Gen Xer?

John: I'm a Gen Xer.

Bob: So should we start with what we've learned about Gen Xers?

John: Yeah. I mean, conventional wisdom would be that they're a smaller bulge than the baby boomers; they were in the wake of the baby boomers.
They feel on some level that they missed out on Woodstock and the heyday, and they feel that the baby boomers raped and pillaged to their detriment. I think the latter is true. I'm not necessarily sure we feel like we missed out on much. I think we feel like we're responsible for cleaning up one group's mess to make things better for another group, and we're not going to get a benefit from either of them. That is not data; that is a purely subjective point of view of my own on the subject. But bringing it back to data, we do see much lower divorce rates among Gen Xers. We see it if you look at the declines in crime rates over the last twenty years in the U.S. I'm not saying that we're a more virtuous generation; we just maybe didn't have as much time. We're all very busy. What we do see, one of the biggest trends being driven by Gen X, is the shift in gender roles in the family. Household responsibilities are shifting more and more towards women, and men aren't picking up the slack. Women are making more of the larger household decisions, about real estate, banking, things that used to be traditionally male decisions. Women are making more of those decisions than they did in the past, and men aren't doing much more. I mean, there are men doing more cooking and more grocery shopping, but they're not doing it at a rate that outweighs the increased responsibility women are taking on in households. We have a piece that we wrote a couple of years ago that still gets circulated around a lot, entitled "Women Are Getting a Terrible Deal," because of that. You've got households where both spouses or partners are working. Women, while still making less money than men overall, are starting to make more money and carrying as much responsibility for households, and men are starting to be free riders a little bit more than we were before.

Bob: And to the degree you've researched this, what about the Me Too movement?
We 989 00:51:40,080 --> 00:51:44,319 Speaker 1: don't research that very much, because it's unresearchable. Your 990 00:51:44,360 --> 00:51:46,600 Speaker 1: clients don't demand it? Now, I mean, we don't, just, 991 00:51:46,719 --> 00:51:49,680 Speaker 1: you know, we don't. A lot of, not 992 00:51:49,719 --> 00:51:51,680 Speaker 1: all, the research we do is because clients demand it. A 993 00:51:51,680 --> 00:51:53,600 Speaker 1: lot of our clients say go research what's important and 994 00:51:53,640 --> 00:51:56,040 Speaker 1: come back and tell us the truth. You know, we 995 00:51:56,080 --> 00:51:59,400 Speaker 1: are out there searching for truth and then we 996 00:51:59,440 --> 00:52:01,480 Speaker 1: share it with the people we think can benefit from 997 00:52:01,480 --> 00:52:04,719 Speaker 1: the knowledge. And I think the Me Too movement is 998 00:52:04,760 --> 00:52:09,959 Speaker 1: somewhat related to this topic of male 999 00:52:10,120 --> 00:52:14,640 Speaker 1: entitlement that is pervasive, or has been pervasive for years. 1000 00:52:14,640 --> 00:52:16,640 Speaker 1: I think that's starting to lift. I just don't think 1001 00:52:16,640 --> 00:52:18,280 Speaker 1: there's a whole lot we can add to the discussion 1002 00:52:18,320 --> 00:52:24,799 Speaker 1: about it. It's a huge problem. It's wrong. 1003 00:52:24,840 --> 00:52:27,360 Speaker 1: There's just not a lot of nuance that any data 1004 00:52:27,400 --> 00:52:30,520 Speaker 1: is going to add about how right or wrong 1005 00:52:30,680 --> 00:52:32,800 Speaker 1: it is. Okay, so what about baby boomers? 1006 00:52:32,840 --> 00:52:35,400 Speaker 1: You remember any insights on the baby boomers?
Well, I 1007 00:52:35,400 --> 00:52:40,719 Speaker 1: mean, yes. So baby boomers are much closer to 1008 00:52:40,800 --> 00:52:43,880 Speaker 1: their kids than generations before them, and that has a 1009 00:52:43,880 --> 00:52:47,480 Speaker 1: lot of effect on how they're making decisions about their 1010 00:52:47,480 --> 00:52:51,920 Speaker 1: health care and everything, down to their consumer decisions. We 1011 00:52:52,000 --> 00:52:55,080 Speaker 1: work with a number of healthcare companies who sell devices. 1012 00:52:55,680 --> 00:52:58,840 Speaker 1: I'll use an example of, like, a sleep apnea company. 1013 00:52:59,320 --> 00:53:01,399 Speaker 1: They have a sleep apnea machine. So for years 1014 00:53:01,400 --> 00:53:03,520 Speaker 1: and years and years, you would advertise to the person 1015 00:53:03,560 --> 00:53:07,000 Speaker 1: who suffered from sleep apnea. Well, should you really, or 1016 00:53:07,000 --> 00:53:10,560 Speaker 1: should you advertise to the spouse who lives with the 1017 00:53:10,600 --> 00:53:13,720 Speaker 1: person with sleep apnea? And so we've seen that shift, of course. 1018 00:53:14,239 --> 00:53:17,520 Speaker 1: But across other areas of healthcare, when it 1019 00:53:17,520 --> 00:53:19,959 Speaker 1: comes to medical devices, say maybe not sleep 1020 00:53:19,960 --> 00:53:22,120 Speaker 1: apnea, where I can't sleep because the person beside me 1021 00:53:22,160 --> 00:53:26,480 Speaker 1: is snoring, but other ailments like COPD or what have you, 1022 00:53:26,680 --> 00:53:32,719 Speaker 1: the millennial has significant influence over the baby 1023 00:53:32,760 --> 00:53:36,000 Speaker 1: boomer's decisions, because they're closer than they were before.
1024 00:53:36,239 --> 00:53:39,960 Speaker 1: So we are challenging our clients to think a 1025 00:53:40,000 --> 00:53:45,359 Speaker 1: lot about advertising to, or influencing, baby boomers by influencing 1026 00:53:45,400 --> 00:53:47,920 Speaker 1: the people who influence them. And that is a 1027 00:53:47,960 --> 00:53:51,040 Speaker 1: trend that we've seen shift quite a bit. So ironically, 1028 00:53:51,080 --> 00:53:54,720 Speaker 1: because baby boomers think they know everything, they don't. Well, 1029 00:53:54,920 --> 00:53:57,840 Speaker 1: they may still think they do, but they are influenced 1030 00:53:57,880 --> 00:53:59,719 Speaker 1: by other things. They are, they are, and a lot 1031 00:53:59,800 --> 00:54:02,719 Speaker 1: of it is driven by the closeness of the relationships 1032 00:54:02,760 --> 00:54:04,759 Speaker 1: they've maintained with their kids, which is a very good thing, 1033 00:54:04,920 --> 00:54:10,279 Speaker 1: by the way. They're 1034 00:54:10,280 --> 00:54:15,239 Speaker 1: either reliant on their younger children, or they've just had 1035 00:54:15,239 --> 00:54:18,759 Speaker 1: a closer relationship as their kids got older. And 1036 00:54:18,760 --> 00:54:21,399 Speaker 1: when you hear the stories of the millennials who are still, 1037 00:54:21,640 --> 00:54:24,360 Speaker 1: you know, living in their parents' houses, one of 1038 00:54:24,360 --> 00:54:26,440 Speaker 1: the positive ends of that is the relationships between them 1039 00:54:26,440 --> 00:54:31,240 Speaker 1: tend to be pretty good, pretty strong. Okay, we'll return 1040 00:54:31,280 --> 00:54:34,360 Speaker 1: to this conversation with John Dick, CEO of Civic Science, 1041 00:54:34,640 --> 00:54:37,799 Speaker 1: in a moment. Hi, this is Bob Lefsetz.
My 1042 00:54:37,840 --> 00:54:40,400 Speaker 1: guests come to the TuneIn studios in Venice, California 1043 00:54:40,600 --> 00:54:42,799 Speaker 1: to have these conversations with me. And if you ever 1044 00:54:42,800 --> 00:54:44,520 Speaker 1: want to see what they look like, or where we tape 1045 00:54:44,600 --> 00:54:48,000 Speaker 1: the show, check out the photos and videos. Search at 1046 00:54:48,040 --> 00:54:52,319 Speaker 1: TuneIn on Twitter, Facebook and Instagram. And now more 1047 00:54:52,400 --> 00:54:58,000 Speaker 1: with CEO of Civic Science, John Dick. And what about 1048 00:54:58,000 --> 00:55:00,799 Speaker 1: more macro trends? You mentioned cars. What is the 1049 00:55:00,840 --> 00:55:03,759 Speaker 1: future of automobiles? Because we read all the time in 1050 00:55:03,800 --> 00:55:06,960 Speaker 1: the straight media that, you know, autonomous driving is coming. 1051 00:55:08,080 --> 00:55:11,640 Speaker 1: I think, okay, so that's definitely something we studied very closely. 1052 00:55:12,320 --> 00:55:14,800 Speaker 1: The vast majority of people are still very scared about 1053 00:55:14,840 --> 00:55:17,279 Speaker 1: autonomous vehicles. Particularly women do not want to get in 1054 00:55:17,320 --> 00:55:19,520 Speaker 1: a car that's driving itself. At the same time, 1055 00:55:19,600 --> 00:55:22,520 Speaker 1: we see things flip overnight. Like, we heard 1056 00:55:23,120 --> 00:55:26,040 Speaker 1: digital photography was gonna replace film for at least ten years, 1057 00:55:26,200 --> 00:55:28,080 Speaker 1: and then within the space of twelve months, it seemed like 1058 00:55:28,120 --> 00:55:31,960 Speaker 1: film disappeared and digital came upon us. So a lot 1059 00:55:31,960 --> 00:55:34,640 Speaker 1: of people are fearful, but then when the product comes 1060 00:55:34,640 --> 00:55:36,719 Speaker 1: in, it influences. Like AOL.
Well, you know, 1061 00:55:36,960 --> 00:55:38,560 Speaker 1: no one was online, then all of a sudden, for 1062 00:55:38,560 --> 00:55:40,799 Speaker 1: a couple of years, everyone was buying a computer just 1063 00:55:40,880 --> 00:55:44,000 Speaker 1: to get online. So I'm just asking: when people are fearful 1064 00:55:44,040 --> 00:55:46,520 Speaker 1: of something, could the switch be faster than it used 1065 00:55:46,560 --> 00:55:48,040 Speaker 1: to be? It could be, but it also, in the 1066 00:55:48,080 --> 00:55:50,400 Speaker 1: case of autonomous vehicles, could go the other direction too. 1067 00:55:50,800 --> 00:55:54,480 Speaker 1: It only takes one hacker in some Eastern Bloc country 1068 00:55:54,560 --> 00:55:56,280 Speaker 1: to take over a bunch of cars with a computer 1069 00:55:56,360 --> 00:55:57,799 Speaker 1: and drive them into a crowd of people, and then no 1070 00:55:57,800 --> 00:55:59,319 Speaker 1: one's ever gonna want to get into a driverless 1071 00:55:59,400 --> 00:56:03,000 Speaker 1: vehicle for twenty years. I think that there's enough 1072 00:56:03,080 --> 00:56:08,000 Speaker 1: fear and skepticism about autonomous vehicles and artificial intelligence 1073 00:56:08,000 --> 00:56:10,600 Speaker 1: in general that a couple of bad PR nightmares from 1074 00:56:10,640 --> 00:56:12,440 Speaker 1: one of those cars doing what it's not supposed to 1075 00:56:12,520 --> 00:56:15,080 Speaker 1: could kill it. There's no reason to believe that 1076 00:56:15,080 --> 00:56:18,160 Speaker 1: will happen, but you could certainly foresee a situation 1077 00:56:18,239 --> 00:56:21,680 Speaker 1: where whatever momentum driverless cars are gaining could be stopped 1078 00:56:21,680 --> 00:56:25,640 Speaker 1: in its tracks.
I think it's more likely we'll see 1079 00:56:25,680 --> 00:56:29,919 Speaker 1: an evolution towards autonomous vehicles, like we see an evolution toward 1080 00:56:29,960 --> 00:56:34,720 Speaker 1: alternative fuels or alternative energy. You've got huge 1081 00:56:35,000 --> 00:56:37,800 Speaker 1: infrastructure in the automobile industry and the fuel industry that 1082 00:56:37,840 --> 00:56:41,200 Speaker 1: are going to slow a lot of that progress down. 1083 00:56:41,239 --> 00:56:43,560 Speaker 1: And then you've got a lot of people who are 1084 00:56:43,640 --> 00:56:48,759 Speaker 1: just skeptical about, or simply uncomfortable with, the idea 1085 00:56:48,800 --> 00:56:50,640 Speaker 1: of being in a car that they can't control. So 1086 00:56:50,680 --> 00:56:52,360 Speaker 1: I don't expect it to be an overnight thing. And 1087 00:56:52,400 --> 00:56:54,759 Speaker 1: there's nothing in our evidence... The evidence in our data 1088 00:56:54,840 --> 00:56:58,560 Speaker 1: is that the percentage of people who are comfortable with 1089 00:56:58,640 --> 00:57:01,120 Speaker 1: a driverless car has not grown much in the last 1090 00:57:01,160 --> 00:57:04,200 Speaker 1: two years. It's the same group of people. I vastly 1091 00:57:04,239 --> 00:57:07,000 Speaker 1: disagree with this, okay, just because I think that's what 1092 00:57:07,080 --> 00:57:11,000 Speaker 1: people believe. But I think when, you know, we have 1093 00:57:11,120 --> 00:57:14,120 Speaker 1: the Tesla accident in Florida where the guy crashed into 1094 00:57:14,160 --> 00:57:17,480 Speaker 1: a car, these stories are really amplified. And of course 1095 00:57:17,480 --> 00:57:19,840 Speaker 1: we're talking about the pace of change; it 1096 00:57:20,000 --> 00:57:22,439 Speaker 1: happens more quickly these days.
But if you go back 1097 00:57:22,480 --> 00:57:25,680 Speaker 1: to even the commercial jet age in the sixties, 1098 00:57:26,040 --> 00:57:29,240 Speaker 1: they did not have the flight control that they have today, 1099 00:57:29,240 --> 00:57:33,000 Speaker 1: and they had accidents. Okay, people were fearful. They're still 1100 00:57:33,040 --> 00:57:35,600 Speaker 1: fearful today, even though the odds of having an accident are 1101 00:57:35,640 --> 00:57:37,840 Speaker 1: really low. But the barrier to entry of getting a 1102 00:57:37,840 --> 00:57:41,600 Speaker 1: ticket dropped because of deregulation in airlines. There's a 1103 00:57:41,600 --> 00:57:44,160 Speaker 1: lot of things that may not be directly applicable. But 1104 00:57:44,240 --> 00:57:47,440 Speaker 1: when the person next door gets a car, or, even 1105 00:57:47,720 --> 00:57:51,000 Speaker 1: better, can call up a car like Uber, and 1106 00:57:51,000 --> 00:57:52,160 Speaker 1: you don't have to own a car, you don't have 1107 00:57:52,200 --> 00:57:54,320 Speaker 1: to have a garage, I think the pace of change 1108 00:57:54,640 --> 00:57:56,600 Speaker 1: happens very quickly. I mean, I look at this like 1109 00:57:56,640 --> 00:58:00,080 Speaker 1: the Internet. Most of the public was not Internet savvy, 1110 00:58:00,520 --> 00:58:04,600 Speaker 1: and over the course of one to five years, everybody 1111 00:58:04,640 --> 00:58:07,680 Speaker 1: was Internet savvy. So I'm not sure I agree with that. 1112 00:58:07,680 --> 00:58:09,880 Speaker 1: What about climate change? You learn anything with climate change? 1113 00:58:09,960 --> 00:58:12,680 Speaker 1: It's not something we study a lot. People are 1114 00:58:12,680 --> 00:58:14,880 Speaker 1: pretty set in their mindsets about that. We don't see a 1115 00:58:14,920 --> 00:58:17,680 Speaker 1: lot of change.
I will go back, so I appreciate 1116 00:58:17,720 --> 00:58:20,080 Speaker 1: your argument on the driving. And let 1117 00:58:20,120 --> 00:58:21,680 Speaker 1: me say that I'll be the first person to call 1118 00:58:21,720 --> 00:58:24,280 Speaker 1: that driverless car to come pick me up. So I'm 1119 00:58:24,360 --> 00:58:26,640 Speaker 1: hoping it comes as fast as possible, because I could 1120 00:58:26,720 --> 00:58:28,640 Speaker 1: do away with my car in a second, and I 1121 00:58:28,640 --> 00:58:30,400 Speaker 1: think there's a chance that that could happen. All I'm 1122 00:58:30,400 --> 00:58:31,840 Speaker 1: saying is there's also a chance it could go the 1123 00:58:31,920 --> 00:58:34,080 Speaker 1: other direction pretty quickly. But I guess what I'm saying 1124 00:58:34,160 --> 00:58:37,840 Speaker 1: is something that's irrelevant of driverless cars. One, to what 1125 00:58:38,040 --> 00:58:43,360 Speaker 1: degree are stories being amplified that ultimately have no impact? Okay, 1126 00:58:43,360 --> 00:58:45,760 Speaker 1: there was a story back when these portals used to 1127 00:58:45,840 --> 00:58:48,800 Speaker 1: be of much greater impact. I think it was on Yahoo. 1128 00:58:48,840 --> 00:58:51,400 Speaker 1: They had a big thing saying tongue splitting was a 1129 00:58:51,440 --> 00:58:54,280 Speaker 1: big thing. Okay, well, you literally split the end of 1130 00:58:54,280 --> 00:58:57,440 Speaker 1: your tongue, whatever. In reality, there were fewer than, I believe, 1131 00:58:57,720 --> 00:58:58,920 Speaker 1: certainly fewer than ten, I think it was 1132 00:58:58,960 --> 00:59:02,040 Speaker 1: fewer than five people who'd ever done it. There wasn't really 1133 00:59:02,080 --> 00:59:04,840 Speaker 1: a trend, but people were writing about it. It's sort 1134 00:59:04,840 --> 00:59:09,240 Speaker 1: of the flip side of the...
People with unpopular opinions 1135 00:59:09,280 --> 00:59:12,840 Speaker 1: weren't putting them on social media, so it's these 1136 00:59:12,840 --> 00:59:15,320 Speaker 1: stories that are amplified. This is something that's changed with 1137 00:59:15,320 --> 00:59:17,760 Speaker 1: the Internet. There used to be something called a turntable hit. Well, 1138 00:59:17,800 --> 00:59:20,680 Speaker 1: all the gatekeepers were going to bang something and 1139 00:59:20,680 --> 00:59:22,760 Speaker 1: make it look like it's popular, but the public 1140 00:59:22,800 --> 00:59:25,400 Speaker 1: isn't buying it. Now we can see whether the public 1141 00:59:25,440 --> 00:59:27,960 Speaker 1: buys. So I see... I mean, I think the New 1142 00:59:28,040 --> 00:59:29,760 Speaker 1: York Times is the best we've got. I don't want 1143 00:59:29,760 --> 00:59:31,600 Speaker 1: to make it about right or left. It's just 1144 00:59:31,680 --> 00:59:33,920 Speaker 1: that they have boots on the ground in all 1145 00:59:33,960 --> 00:59:38,320 Speaker 1: these places other people don't. Okay, but now more than ever, 1146 00:59:38,840 --> 00:59:41,440 Speaker 1: when the mainstream media prints... I'm not talking about printing 1147 00:59:41,640 --> 00:59:45,200 Speaker 1: hard news. Okay, there was a war here, that's something different. 1148 00:59:45,720 --> 00:59:48,400 Speaker 1: But when they print softer news, I think it has 1149 00:59:48,480 --> 00:59:51,280 Speaker 1: less impact and less accuracy than it ever did before. 1150 00:59:51,520 --> 00:59:58,480 Speaker 1: That's indisputable, and it's getting worse by the day.
1151 00:59:59,040 --> 01:00:02,200 Speaker 1: Two weeks ago, I had a half dozen friends 1152 01:00:02,200 --> 01:00:05,080 Speaker 1: of mine on social media share a story about 1153 01:00:05,080 --> 01:00:07,800 Speaker 1: the idea that the students in Parkland were, what were they, 1154 01:00:07,840 --> 01:00:11,360 Speaker 1: crisis actors. 1155 01:00:11,360 --> 01:00:13,480 Speaker 1: The one article that I saw had been 1156 01:00:13,480 --> 01:00:16,360 Speaker 1: shared a hundred and seventeen thousand times. The average Facebook 1157 01:00:16,440 --> 01:00:19,560 Speaker 1: user has three hundred and sixty-eight friends. That means 1158 01:00:19,720 --> 01:00:22,560 Speaker 1: roughly, what, thirty-seven-some million people saw that article 1159 01:00:22,640 --> 01:00:24,960 Speaker 1: from a friend, leading them to believe it was true. Right. 1160 01:00:25,040 --> 01:00:27,600 Speaker 1: So the scale at which 1161 01:00:27,640 --> 01:00:30,280 Speaker 1: misinformation can spread is... And we will 1162 01:00:30,320 --> 01:00:33,040 Speaker 1: retract from that as consumers, because we're only gonna get 1163 01:00:33,040 --> 01:00:35,080 Speaker 1: burned by that so many times. So you're absolutely right. 1164 01:00:35,120 --> 01:00:38,400 Speaker 1: I think that's happening, and I think 1165 01:00:38,440 --> 01:00:43,800 Speaker 1: five misguided stories about bad driverless cars probably could 1166 01:00:43,800 --> 01:00:48,520 Speaker 1: spoil some group of people. But the 1167 01:00:48,520 --> 01:00:50,880 Speaker 1: pace of misinformation, I don't know where that ends. By 1168 01:00:50,920 --> 01:00:53,360 Speaker 1: the way, I hope that people wise up and 1169 01:00:53,400 --> 01:00:55,919 Speaker 1: become more scrutinizing in what they believe, but maybe they won't.
1170 01:00:56,080 --> 01:00:58,200 Speaker 1: I don't believe people. You know, it's interesting you're selling 1171 01:00:58,240 --> 01:01:00,720 Speaker 1: that an analysis. I don't believe the average person has 1172 01:01:00,760 --> 01:01:03,320 Speaker 1: the power of analysis. Oh no, That's why I was 1173 01:01:03,320 --> 01:01:05,400 Speaker 1: saying earlier. I think one of the biggest challenges we 1174 01:01:05,480 --> 01:01:09,000 Speaker 1: face as an erosion of trust of data. We talked 1175 01:01:09,040 --> 01:01:11,080 Speaker 1: so much about big data and data science and all 1176 01:01:11,080 --> 01:01:13,920 Speaker 1: of that, but I think the average consumer distrusts numbers 1177 01:01:14,040 --> 01:01:17,840 Speaker 1: because based on who puts those numbers in front of them. Um. 1178 01:01:17,960 --> 01:01:21,040 Speaker 1: We I mentioned to you earlier when we were talking 1179 01:01:21,080 --> 01:01:23,720 Speaker 1: that we were we've been approached from by media outlets 1180 01:01:23,720 --> 01:01:26,400 Speaker 1: about sort of publishing or syndicating our numbers, and we 1181 01:01:26,440 --> 01:01:28,160 Speaker 1: just won't do it because all of a sudden, we 1182 01:01:28,240 --> 01:01:32,120 Speaker 1: lose we we we inherit the perception of bias that 1183 01:01:32,200 --> 01:01:34,800 Speaker 1: those outlets have if they put their names. But the irony, 1184 01:01:34,840 --> 01:01:38,080 Speaker 1: and I'm sure you're hearing it anyway, is people believe 1185 01:01:38,120 --> 01:01:40,960 Speaker 1: you're biased anyway. True, true, But I can sleep better 1186 01:01:40,960 --> 01:01:42,480 Speaker 1: at night if I have some control over well, I know. 1187 01:01:42,520 --> 01:01:46,160 Speaker 1: I mean, I I write about Spotify, and every day 1188 01:01:46,200 --> 01:01:47,680 Speaker 1: I get people said, well, you're on the payroll. I'm 1189 01:01:47,680 --> 01:01:49,800 Speaker 1: not on the perrol. It never took a cent. 
Okay, 1190 01:01:49,800 --> 01:01:51,520 Speaker 1: I'm not saying that therefore the opposite is gonna happen, that I'll 1191 01:01:51,600 --> 01:01:54,640 Speaker 1: take the money. But even if you are literally lily-white clean, 1192 01:01:54,720 --> 01:01:57,440 Speaker 1: these days people believe you're not. Oh, I mean, we'll 1193 01:01:57,640 --> 01:02:00,480 Speaker 1: put one blog post out about a topic, 1194 01:02:00,600 --> 01:02:02,760 Speaker 1: and on our Facebook page we'll be, in the same thread, 1195 01:02:02,840 --> 01:02:05,240 Speaker 1: by one person accused of being a right-wing shill, 1196 01:02:05,360 --> 01:02:07,720 Speaker 1: and in the next comment we'll be a left-wing shill. 1197 01:02:07,800 --> 01:02:10,640 Speaker 1: I mean it. People will find the story they want 1198 01:02:10,640 --> 01:02:14,200 Speaker 1: to find if your data doesn't precisely affirm their existing belief. 1199 01:02:14,320 --> 01:02:16,280 Speaker 1: It's a sad state of affairs. I mean, I grew 1200 01:02:16,320 --> 01:02:18,680 Speaker 1: up in an era where there were many fewer media outlets 1201 01:02:18,840 --> 01:02:21,000 Speaker 1: and you could agree on the facts. You can't even 1202 01:02:21,000 --> 01:02:23,160 Speaker 1: agree on the facts, and even now you cannot know, 1203 01:02:23,320 --> 01:02:25,320 Speaker 1: and it's hard. So any other macro 1204 01:02:25,520 --> 01:02:28,320 Speaker 1: trends that you can give us? You're talking about 1205 01:02:28,400 --> 01:02:31,600 Speaker 1: the cocooning, which ironically is what Faith Popcorn said 1206 01:02:31,600 --> 01:02:33,880 Speaker 1: a few decades back. But now we have the ability, 1207 01:02:33,920 --> 01:02:36,640 Speaker 1: with all these Instacarts and Netflixes, to do that. 1208 01:02:36,880 --> 01:02:40,720 Speaker 1: Any other macro trends in the public?
Well, so I 1209 01:02:40,800 --> 01:02:47,120 Speaker 1: mentioned the aspirational nature of social 1210 01:02:47,160 --> 01:02:51,560 Speaker 1: media and what that's doing. The impact that's having 1211 01:02:51,600 --> 01:02:56,480 Speaker 1: on brands I think is pretty dramatic. 1212 01:02:56,520 --> 01:02:59,160 Speaker 1: I gave a talk at a 1213 01:02:59,280 --> 01:03:02,320 Speaker 1: conference about a year ago, a beauty industry conference, 1214 01:03:02,320 --> 01:03:04,480 Speaker 1: and I sat with a woman who was an executive 1215 01:03:04,480 --> 01:03:06,000 Speaker 1: of a big beauty company, and she was telling a 1216 01:03:06,000 --> 01:03:08,040 Speaker 1: story about how the night before she had driven her son... 1217 01:03:08,280 --> 01:03:09,760 Speaker 1: She was taking her son, we were in New York, 1218 01:03:10,080 --> 01:03:13,680 Speaker 1: she was taking her son to soccer practice, and I 1219 01:03:13,680 --> 01:03:16,720 Speaker 1: picture her in her Land Rover in Westchester, right? And 1220 01:03:17,000 --> 01:03:19,240 Speaker 1: he hadn't eaten dinner yet, so she drove him through 1221 01:03:19,280 --> 01:03:21,680 Speaker 1: the drive-through at a fast food restaurant to get 1222 01:03:21,720 --> 01:03:24,960 Speaker 1: him some burgers. And she gets to the soccer fields 1223 01:03:25,080 --> 01:03:28,360 Speaker 1: and all the suburban Westchester moms were standing there, and 1224 01:03:28,400 --> 01:03:30,120 Speaker 1: she has a panic attack that these moms are going 1225 01:03:30,160 --> 01:03:31,840 Speaker 1: to see that she'd taken her son to this fast food place. 1226 01:03:33,080 --> 01:03:35,440 Speaker 1: So she's telling the story and we're all laughing, but 1227 01:03:35,520 --> 01:03:38,520 Speaker 1: she stands up and she's shoving the stuff in 1228 01:03:38,560 --> 01:03:41,080 Speaker 1: her bag so that the moms won't see it.
And so, 1229 01:03:41,720 --> 01:03:46,280 Speaker 1: we've been materialists ever since one person had a nicer 1230 01:03:46,320 --> 01:03:48,640 Speaker 1: loincloth than the other, right? It goes back, 1231 01:03:48,680 --> 01:03:50,240 Speaker 1: you know, as far back as you can go. 1232 01:03:50,320 --> 01:03:54,360 Speaker 1: Janis Joplin wrote about it in nineteen seventy. 1233 01:03:54,400 --> 01:03:57,919 Speaker 1: But we've never had the tools, 1234 01:03:58,080 --> 01:04:00,400 Speaker 1: or the fear, that our purchase 1235 01:04:00,440 --> 01:04:03,280 Speaker 1: and consumption decisions were on public display like they are now. 1236 01:04:03,480 --> 01:04:05,080 Speaker 1: And this was a case of a woman driving to 1237 01:04:05,120 --> 01:04:07,640 Speaker 1: a parking lot with food in her car. But with 1238 01:04:07,680 --> 01:04:10,040 Speaker 1: the cameras and the social media and the sharing and 1239 01:04:10,080 --> 01:04:11,760 Speaker 1: the posting and all that, the decisions that I make 1240 01:04:11,800 --> 01:04:14,560 Speaker 1: about where I eat and where I buy are 1241 01:04:14,600 --> 01:04:17,000 Speaker 1: on public display and subject to scrutiny like they've never 1242 01:04:17,040 --> 01:04:20,240 Speaker 1: been before. So we've always had guilty pleasures, and I 1243 01:04:20,360 --> 01:04:23,280 Speaker 1: may choose not to eat at a particular 1244 01:04:23,320 --> 01:04:26,240 Speaker 1: fast food restaurant because it's unhealthy for me. But if 1245 01:04:26,280 --> 01:04:27,919 Speaker 1: I do, I just don't post about it or brag 1246 01:04:27,960 --> 01:04:30,600 Speaker 1: about it.
But what about that woman who, next time 1247 01:04:30,640 --> 01:04:34,920 Speaker 1: her son hasn't eaten dinner, actively chooses not to 1248 01:04:34,960 --> 01:04:37,240 Speaker 1: go to that fast food restaurant because of the potential 1249 01:04:37,240 --> 01:04:39,200 Speaker 1: trauma of being seen taking her son there? And now 1250 01:04:39,240 --> 01:04:41,800 Speaker 1: that very large fast food restaurant has lost 1251 01:04:41,880 --> 01:04:44,880 Speaker 1: a customer, or at least a visit. And think about 1252 01:04:44,920 --> 01:04:46,920 Speaker 1: how often that might be happening, at scale. And 1253 01:04:46,960 --> 01:04:49,280 Speaker 1: you look at some of these large brands, the blue-chip 1254 01:04:49,320 --> 01:04:52,040 Speaker 1: CPG companies and others who have... There are a lot 1255 01:04:52,080 --> 01:04:54,520 Speaker 1: of reasons they can explain some of the 1256 01:04:54,520 --> 01:04:56,120 Speaker 1: declines that they've had, but that has to be a 1257 01:04:56,120 --> 01:04:58,720 Speaker 1: big part of it: it's just not cool 1258 01:04:58,800 --> 01:05:00,800 Speaker 1: to go there, it's not as cool to eat there. 1259 01:05:00,840 --> 01:05:03,040 Speaker 1: And I think that's going to affect music as well. 1260 01:05:03,080 --> 01:05:05,440 Speaker 1: I think being able to brag about the 1261 01:05:05,800 --> 01:05:08,520 Speaker 1: concert I was at, that's always been there, but 1262 01:05:08,560 --> 01:05:10,520 Speaker 1: it's never been amplified the way that it is now. 1263 01:05:11,440 --> 01:05:15,360 Speaker 1: Staying with the same thing, because this crosses bounds: we 1264 01:05:15,560 --> 01:05:20,360 Speaker 1: learn that acquisition is down and experiences are up, especially 1265 01:05:20,400 --> 01:05:25,560 Speaker 1: amongst millennials. Is that in your data? Absolutely, very true.
Yes. 1266 01:05:25,840 --> 01:05:31,400 Speaker 1: Wanting the experience... Restaurant spending is down, but the types 1267 01:05:31,440 --> 01:05:33,560 Speaker 1: of restaurants that people are choosing to eat at are 1268 01:05:33,600 --> 01:05:36,560 Speaker 1: more experiential, so certain types of restaurants are doing well. 1269 01:05:37,040 --> 01:05:39,760 Speaker 1: Travel is certainly up. It's 1270 01:05:39,840 --> 01:05:42,480 Speaker 1: quite a big blind spot for the Fed 1271 01:05:42,560 --> 01:05:45,520 Speaker 1: in terms of forecasting consumer spending, because a lot of 1272 01:05:45,520 --> 01:05:48,560 Speaker 1: those categories aren't measured by a lot of the federal 1273 01:05:48,640 --> 01:05:52,000 Speaker 1: resources. You know, it's not store 1274 01:05:52,040 --> 01:05:54,880 Speaker 1: receipts from Target and Walmart anymore. There's money being spent 1275 01:05:54,960 --> 01:05:57,600 Speaker 1: somewhere that's not as visible, and that 1276 01:05:57,760 --> 01:06:00,880 Speaker 1: is a fundamentally true trend. And it's 1277 01:06:00,920 --> 01:06:04,400 Speaker 1: not just millennials. It's happening in Gen X as well. Okay, 1278 01:06:04,400 --> 01:06:06,480 Speaker 1: so if someone wants to check you out, they can 1279 01:06:06,480 --> 01:06:08,240 Speaker 1: listen to this podcast. Is there anywhere they can go 1280 01:06:08,240 --> 01:06:11,360 Speaker 1: online to see some of your data, other than your quizzes? 1281 01:06:11,520 --> 01:06:15,520 Speaker 1: CivicScience dot com. Pretty straightforward. Anybody who would 1282 01:06:15,520 --> 01:06:19,640 Speaker 1: like to... You mentioned earlier that little Saturday email 1283 01:06:19,640 --> 01:06:22,480 Speaker 1: that I send out.
It's kind of invitation 1284 01:06:22,560 --> 01:06:26,520 Speaker 1: only, but it is an awesome 1285 01:06:26,560 --> 01:06:29,080 Speaker 1: list of people who read it, and if anybody would 1286 01:06:29,120 --> 01:06:31,640 Speaker 1: like to be added to that, they could reach 1287 01:06:31,680 --> 01:06:35,520 Speaker 1: out to you maybe to do that. But otherwise, yes, 1288 01:06:35,560 --> 01:06:38,439 Speaker 1: CivicScience dot com, and we write a lot. 1289 01:06:38,720 --> 01:06:42,280 Speaker 1: Our blog is a lot of content 1290 01:06:42,360 --> 01:06:45,680 Speaker 1: every week: new things we're studying, trends worth finding. 1291 01:06:45,720 --> 01:06:48,280 Speaker 1: We do have some partnerships with some media outlets that 1292 01:06:48,320 --> 01:06:51,080 Speaker 1: take our information and give it a spotlight for us. 1293 01:06:51,120 --> 01:06:54,000 Speaker 1: But yeah, they can certainly reach out to us 1294 01:06:54,000 --> 01:06:55,240 Speaker 1: if they want to be added to any of the 1295 01:06:55,240 --> 01:06:57,480 Speaker 1: distribution lists of these kinds of things that we're writing about. 1296 01:06:57,640 --> 01:06:59,560 Speaker 1: And once again, we haven't mentioned the names of the 1297 01:06:59,600 --> 01:07:03,760 Speaker 1: companies that employ Civic Science, but I know those names, 1298 01:07:03,760 --> 01:07:06,600 Speaker 1: and believe me, they're all the alphabet soup that you 1299 01:07:06,720 --> 01:07:10,400 Speaker 1: believe they are. So once again, John Dick, CEO of 1300 01:07:10,400 --> 01:07:12,040 Speaker 1: Civic Science, thanks so much for being here on the 1301 01:07:12,040 --> 01:07:14,760 Speaker 1: Bob Lefsetz Podcast. It was my pleasure, Bob, thank you.
1302 01:07:17,120 --> 01:07:19,320 Speaker 1: That wraps up this week's episode of the Bob Lefsetz 1303 01:07:19,320 --> 01:07:22,320 Speaker 1: Podcast, recorded here in Venice, California at the TuneIn 1304 01:07:22,320 --> 01:07:26,080 Speaker 1: Studios. I hope you enjoyed this fascinating conversation with 1305 01:07:26,120 --> 01:07:30,000 Speaker 1: the CEO of Civic Science, John Dick, about market analytics, 1306 01:07:30,040 --> 01:07:33,080 Speaker 1: the future of tech, the music industry, and consumer habits. 1307 01:07:33,560 --> 01:07:36,800 Speaker 1: As always, I welcome your feedback. Email me at bob 1308 01:07:36,840 --> 01:07:52,560 Speaker 1: at lefsetz dot com.