1 00:00:11,880 --> 00:00:15,360 Speaker 1: I'm Manny, this is Devin, and this is No Such Thing. 2 00:00:15,640 --> 00:00:18,880 Speaker 2: The show where we settle the arguments in your life by actually 3 00:00:18,920 --> 00:00:23,439 Speaker 2: doing the research. On today's episode, are you happy or not? 4 00:00:25,000 --> 00:00:28,440 Speaker 2: A deep dive into those bathroom reaction buttons. 5 00:00:32,640 --> 00:00:38,760 Speaker 3: There's no such thing, no such thing, no such thing. 6 00:00:40,720 --> 00:00:51,040 Speaker 2: No such thing, no such thing. All right, So today's episode 7 00:00:51,159 --> 00:00:54,800 Speaker 2: was actually inspired by a listener named Emma. 8 00:00:57,960 --> 00:01:01,560 Speaker 4: Hey, guys, my question is what is the purpose of 9 00:01:01,800 --> 00:01:05,200 Speaker 4: immediate-reaction in-person reviews, like when they ask you 10 00:01:05,240 --> 00:01:07,000 Speaker 4: to hit a smiley or frowny face after you 11 00:01:07,040 --> 00:01:09,759 Speaker 4: go to the bathroom. I was just asked to fill 12 00:01:09,760 --> 00:01:12,360 Speaker 4: one out at the post office. I know that most 13 00:01:12,400 --> 00:01:15,160 Speaker 4: online reviews help with Google ratings, but it doesn't seem 14 00:01:15,200 --> 00:01:17,600 Speaker 4: like those in-person ones would contribute to that. And 15 00:01:17,600 --> 00:01:19,720 Speaker 4: it's also hard to believe that they actually care enough 16 00:01:19,720 --> 00:01:21,679 Speaker 4: to look at the reviews in these places. 17 00:01:22,200 --> 00:01:31,520 Speaker 5: Thanks, bye. The bathroom one is really interesting to me, 18 00:01:31,720 --> 00:01:33,720 Speaker 5: so many other ones make more sense. Like, you know, 19 00:01:33,840 --> 00:01:36,080 Speaker 5: now when you do tap to pay, sometimes they, 20 00:01:36,240 --> 00:01:39,080 Speaker 5: like at the restaurant, they've got the thumbs 21 00:01:38,880 --> 00:01:40,679 Speaker 1: up, thumbs down. I usually skip it, whatever.
22 00:01:40,760 --> 00:01:42,520 Speaker 2: But that makes sense because it's like, oh, we're getting 23 00:01:42,560 --> 00:01:43,680 Speaker 2: a lot of thumbs down. 24 00:01:43,640 --> 00:01:46,360 Speaker 1: Based on the immediate. Yeah, it's immediate. Why is 25 00:01:46,360 --> 00:01:49,480 Speaker 1: there a thumbs down? You know, while you're doing it. But the 26 00:01:49,520 --> 00:01:53,080 Speaker 1: bathroom one is a physical button, in airports. 27 00:01:53,400 --> 00:01:55,400 Speaker 6: Yeah, that's where I first noticed it. Yeah, I feel 28 00:01:55,440 --> 00:01:56,960 Speaker 6: like stadiums. 29 00:01:56,480 --> 00:01:58,800 Speaker 1: Yes, I've seen it. Now, like, how did we do? 30 00:01:58,880 --> 00:02:01,160 Speaker 7: Because it's like gas stations, random places sometimes, and. 31 00:02:01,200 --> 00:02:03,160 Speaker 6: Those kind of super public places. 32 00:02:03,800 --> 00:02:05,559 Speaker 3: It's not like our choices matter. 33 00:02:05,800 --> 00:02:08,360 Speaker 1: Yes, we need... it's not an option. 34 00:02:09,320 --> 00:02:12,440 Speaker 6: My thought when I read this question was that 35 00:02:12,520 --> 00:02:15,240 Speaker 6: those things probably are barely even doing anything, or no 36 00:02:15,240 --> 00:02:17,160 Speaker 6: one's looking at it. It's purely just like, 37 00:02:17,639 --> 00:02:19,160 Speaker 6: maybe I'm mad because I had to wait in line, 38 00:02:19,160 --> 00:02:20,560 Speaker 6: and now I get to do this and it makes 39 00:02:20,560 --> 00:02:23,480 Speaker 6: me feel better and it might stop me from going off. 40 00:02:23,639 --> 00:02:25,000 Speaker 7: Yes, going off online or something. 41 00:02:25,000 --> 00:02:28,080 Speaker 1: I don't gotta go find a person, I'm already... I'm done. 42 00:02:28,360 --> 00:02:31,520 Speaker 3: I'm done. So this would be somewhat of a conspiracy. 43 00:02:32,240 --> 00:02:32,480 Speaker 1: Yeah.
44 00:02:32,560 --> 00:02:35,280 Speaker 3: I think they're putting that out there and they know 45 00:02:35,400 --> 00:02:36,079 Speaker 3: that it doesn't do anything. 46 00:02:35,960 --> 00:02:38,000 Speaker 1: It's like the button thing at a crosswalk. 47 00:02:38,560 --> 00:02:40,720 Speaker 2: Oh yeah, where, you know, so many of them are 48 00:02:40,760 --> 00:02:44,880 Speaker 2: not even connected, right, Yeah. 49 00:02:43,520 --> 00:02:45,720 Speaker 8: Only two hundred and fifty nine of the twelve hundred 50 00:02:45,720 --> 00:02:49,000 Speaker 8: crosswalk buttons in San Francisco actually change a traffic pattern. 51 00:02:49,120 --> 00:02:51,440 Speaker 2: Meanwhile, in New York City, just one hundred of the 52 00:02:51,440 --> 00:02:53,320 Speaker 2: one thousand worked in twenty eighteen. 53 00:02:54,320 --> 00:02:56,120 Speaker 1: But you get there and you press the button and 54 00:02:56,160 --> 00:02:57,280 Speaker 1: you think, okay. 55 00:02:56,960 --> 00:03:00,160 Speaker 6: I pressed the button to change it... someone's got to see it. 56 00:03:00,040 --> 00:03:02,160 Speaker 2: And then thirty seconds later it changes, and you're like, 57 00:03:02,160 --> 00:03:02,880 Speaker 2: that was because of me. 58 00:03:03,960 --> 00:03:07,240 Speaker 6: Yeah, they would have just, you know, never stopped. 59 00:03:08,840 --> 00:03:10,640 Speaker 2: So maybe it's something like that, like 60 00:03:10,639 --> 00:03:12,080 Speaker 2: you're saying, it's like, okay. 61 00:03:11,840 --> 00:03:13,960 Speaker 6: Yeah, kind of just like TSA, as like, kind 62 00:03:14,000 --> 00:03:16,919 Speaker 6: of, yes, doing something. 63 00:03:16,840 --> 00:03:19,560 Speaker 2: Like, we care about your opinion on this thing. Yeah, 64 00:03:19,680 --> 00:03:22,000 Speaker 2: go ahead, press that button. But it's not even connected. 65 00:03:22,160 --> 00:03:25,440 Speaker 6: Do you guys ever press those buttons?
I was, I was, 66 00:03:25,720 --> 00:03:29,080 Speaker 6: I have, I think, but again, not thinking it's 67 00:03:29,120 --> 00:03:29,840 Speaker 6: gonna do anything. 68 00:03:29,880 --> 00:03:32,840 Speaker 7: Maybe almost more just like something to do. 69 00:03:33,000 --> 00:03:36,720 Speaker 6: Yeah, you know, I've got a curious mind. 70 00:03:37,920 --> 00:03:39,480 Speaker 3: You just want to let your opinions be known. It's 71 00:03:39,560 --> 00:03:41,360 Speaker 3: just like a fidget spinner on the way out of 72 00:03:41,360 --> 00:03:41,840 Speaker 3: the bathroom. 73 00:03:42,520 --> 00:03:43,720 Speaker 7: Yeah, I guess. 74 00:03:43,720 --> 00:03:46,720 Speaker 6: I'm also thinking, like, if I have hit them, I've 75 00:03:46,760 --> 00:03:50,440 Speaker 6: never hit the happy one, probably always hit the deep red. 76 00:03:50,600 --> 00:03:51,000 Speaker 7: I like that. 77 00:03:51,040 --> 00:03:51,560 Speaker 1: I do like that. 78 00:03:51,640 --> 00:03:54,760 Speaker 7: It's not just happy and sad. It's like there's four 79 00:03:55,200 --> 00:03:57,240 Speaker 7: usually, and the in-between ones, so it's. 80 00:03:57,080 --> 00:04:03,480 Speaker 6: really upset, kind of upset. Okay, really, I like that gradient. 81 00:04:09,320 --> 00:04:11,640 Speaker 2: So it's been a few weeks since we had our 82 00:04:11,720 --> 00:04:17,279 Speaker 2: last discussion about those buttons in the bathroom. So I 83 00:04:17,400 --> 00:04:21,000 Speaker 2: found the company behind most of those devices. 84 00:04:21,240 --> 00:04:25,479 Speaker 1: Their name is Happy or Not and I spoke with 85 00:04:25,600 --> 00:04:26,839 Speaker 1: Scott Erickson. 86 00:04:26,560 --> 00:04:30,520 Speaker 8: And I am the VP of US sales and global 87 00:04:30,600 --> 00:04:31,800 Speaker 8: channels at Happy or.
88 00:04:31,720 --> 00:04:34,719 Speaker 2: Not. Happy or Not, like I said, is the company 89 00:04:34,720 --> 00:04:36,839 Speaker 2: behind most of those instant reaction buttons. 90 00:04:37,160 --> 00:04:41,360 Speaker 1: It is a Finnish-based company. Hmmm, which is funny. 91 00:04:41,000 --> 00:04:47,720 Speaker 2: Because Finland, for the last eight years, has been ranked the happiest country. Happy 92 00:04:47,080 --> 00:04:49,119 Speaker 1: or Not, pretty happy. 93 00:04:49,279 --> 00:04:53,960 Speaker 3: Maybe the ability to give feedback in so many public places 94 00:04:53,560 --> 00:04:54,520 Speaker 1: has made them more happy. 95 00:04:54,560 --> 00:04:57,760 Speaker 3: It's due to them feeling heard and respected. 96 00:04:58,760 --> 00:05:01,479 Speaker 2: So the founder actually came up with the initial idea 97 00:05:01,520 --> 00:05:02,920 Speaker 2: when he was a teenager, a 98 00:05:02,880 --> 00:05:07,680 Speaker 8: guy named Heikki Väänänen, and he was actually shopping in 99 00:05:07,760 --> 00:05:10,040 Speaker 8: a retail store somewhere in Finland and 100 00:05:10,000 --> 00:05:12,360 Speaker 2: he couldn't find someone to help him. So we've all 101 00:05:12,400 --> 00:05:16,120 Speaker 2: been in a place where you're like, oh, like, you 102 00:05:16,160 --> 00:05:17,520 Speaker 2: want to complain, but like you don't want to go 103 00:05:17,600 --> 00:05:18,240 Speaker 2: find a manager. 104 00:05:18,279 --> 00:05:21,599 Speaker 1: You're not going to, like, put in a formal complaint. Yeah. 105 00:05:21,680 --> 00:05:23,440 Speaker 2: So he was like, man, it would be cool if 106 00:05:23,480 --> 00:05:25,400 Speaker 2: there was a quick way to get feedback. 107 00:05:24,920 --> 00:05:27,760 Speaker 8: And he had this moment of inspiration, like, there's got 108 00:05:27,920 --> 00:05:33,440 Speaker 8: to be a way to easily share that sentiment with 109 00:05:33,520 --> 00:05:39,120 Speaker 8: that particular company.
110 00:05:39,279 --> 00:05:40,800 Speaker 2: He was a teenager at the time, so he didn't 111 00:05:40,839 --> 00:05:43,200 Speaker 2: really do anything about it, but he came back to 112 00:05:43,240 --> 00:05:46,159 Speaker 2: the idea. And the cool thing in Finland is, if you 113 00:05:46,160 --> 00:05:49,039 Speaker 2: have, like, a startup idea, the government will actually give 114 00:05:49,080 --> 00:05:49,680 Speaker 2: you some money. 115 00:05:50,240 --> 00:05:50,600 Speaker 1: Wow. 116 00:05:50,680 --> 00:05:53,719 Speaker 2: They got a lot of startup money from the government 117 00:05:53,760 --> 00:05:57,880 Speaker 2: of Finland to put this idea together. So the company 118 00:05:57,920 --> 00:06:01,000 Speaker 2: has a few different devices. The one we're most familiar with is the 119 00:06:01,160 --> 00:06:06,039 Speaker 2: four-emoji version. They also have, like, iPads and other things. 120 00:06:06,680 --> 00:06:08,560 Speaker 2: But I realized last time we did a pretty bad 121 00:06:08,640 --> 00:06:12,440 Speaker 2: job of actually explaining what it looks like. So I'm 122 00:06:12,480 --> 00:06:15,240 Speaker 2: actually gonna send a photo here. 123 00:06:16,080 --> 00:06:20,080 Speaker 3: What I'm looking at here is essentially a stand with 124 00:06:20,200 --> 00:06:23,080 Speaker 3: a device on the top of the stand, and 125 00:06:23,120 --> 00:06:27,760 Speaker 3: it says please rate our service today. And there's four 126 00:06:28,279 --> 00:06:32,480 Speaker 3: emoji options: two smiley faces, one really dark green, one 127 00:06:32,560 --> 00:06:36,640 Speaker 3: lighter green; two frowny faces, one kind of red and 128 00:06:36,640 --> 00:06:38,360 Speaker 3: then the final one is very red. 129 00:06:38,480 --> 00:06:40,080 Speaker 1: So this is what we screwed up last time. There's 130 00:06:40,080 --> 00:06:41,200 Speaker 1: no neutral option here. 131 00:06:41,720 --> 00:06:44,160 Speaker 3: Yeah, this is really happy or not.
Yeah, you gotta 132 00:06:44,240 --> 00:06:44,919 Speaker 3: pick a side. 133 00:06:44,920 --> 00:06:47,560 Speaker 8: There's no "meh," so we take the neutral out of it. 134 00:06:47,640 --> 00:06:51,360 Speaker 8: You're either really happy, you're happy, you're unhappy, or you're 135 00:06:51,400 --> 00:06:55,320 Speaker 8: really unhappy. So we truly do force that person, you 136 00:06:55,360 --> 00:06:57,880 Speaker 8: got to pick one, exactly, to tell us what side 137 00:06:57,960 --> 00:06:58,360 Speaker 8: of the fence 138 00:06:58,400 --> 00:06:58,760 Speaker 1: they're on. 139 00:06:58,800 --> 00:07:00,880 Speaker 8: There's no middle ground for us, because we don't think 140 00:07:00,880 --> 00:07:03,599 Speaker 8: the middle ground actually does anything, as you're describing. 141 00:07:03,800 --> 00:07:06,120 Speaker 2: We've done plenty of surveys where there's a lot 142 00:07:06,160 --> 00:07:08,920 Speaker 2: of neutral. Yes, and they're annoying. And I always 143 00:07:08,680 --> 00:07:11,120 Speaker 7: said I use them, yeah, but they probably get it 144 00:07:11,160 --> 00:07:12,200 Speaker 7: and they're like, what. 145 00:07:16,720 --> 00:07:17,960 Speaker 3: Yeah, so yeah. 146 00:07:18,000 --> 00:07:20,760 Speaker 2: The big thing is there's no neutral option on the device. 147 00:07:21,400 --> 00:07:23,760 Speaker 2: You gotta, you know, you gotta pick: you happy or not? 148 00:07:25,120 --> 00:07:27,600 Speaker 2: And the most shocking thing to me is someone is 149 00:07:27,640 --> 00:07:29,360 Speaker 2: actually looking at this data. 150 00:07:31,320 --> 00:07:34,080 Speaker 1: How real time is this? Is this something that you're collecting? 151 00:07:34,120 --> 00:07:37,880 Speaker 2: I press the button and that data is being transferred automatically? 152 00:07:38,080 --> 00:07:40,120 Speaker 1: Does it happen in batches? Is it?
153 00:07:40,200 --> 00:07:41,800 Speaker 2: You know, a couple of days later that you're able 154 00:07:41,800 --> 00:07:44,120 Speaker 2: to sort of look at it collectively, or is it, you 155 00:07:44,080 --> 00:07:45,160 Speaker 1: know, just in the moment? 156 00:07:45,320 --> 00:07:49,160 Speaker 8: It is indeed real time. And all of our devices 157 00:07:49,280 --> 00:07:51,760 Speaker 8: have a SIM card in them, so they act like 158 00:07:51,800 --> 00:07:57,080 Speaker 8: a cellular device, so they're transmitting via cellular network. As 159 00:07:57,160 --> 00:08:00,280 Speaker 8: soon as you make it live, it's transmitting data 160 00:08:00,320 --> 00:08:04,320 Speaker 8: live in real time, which again is the true essence 161 00:08:04,360 --> 00:08:06,640 Speaker 8: of what we're trying to do for our customers. 162 00:08:07,800 --> 00:08:11,360 Speaker 2: And a big thing too, is that they're collecting this 163 00:08:11,480 --> 00:08:13,840 Speaker 2: data as the thing is happening. Right, A lot of 164 00:08:13,840 --> 00:08:17,760 Speaker 2: surveys are asking you about a moment in your past. 165 00:08:18,320 --> 00:08:20,800 Speaker 2: You know, you're leaving the bathroom and you are immediately 166 00:08:20,880 --> 00:08:23,240 Speaker 2: like, was that bathroom clean or not? You're not thinking 167 00:08:23,360 --> 00:08:27,360 Speaker 2: back to it. That's powerful. Was the bathroom clean? It's very powerful. 168 00:08:26,920 --> 00:08:30,640 Speaker 6: Well, because it's like, especially something seemingly as innocuous as 169 00:08:30,640 --> 00:08:33,000 Speaker 6: an airport bathroom, even if I have a, you know, 170 00:08:33,080 --> 00:08:35,400 Speaker 6: say there's no toilet paper or something's missing, I'm not 171 00:08:35,440 --> 00:08:39,240 Speaker 6: happy with the situation.
Yeah, if someone reached out a 172 00:08:39,280 --> 00:08:42,280 Speaker 6: month later like, hey, we know you flew into LAX, 173 00:08:43,160 --> 00:08:46,120 Speaker 6: what'd you think? Unless something really horrible happened, I'm going 174 00:08:46,200 --> 00:08:48,520 Speaker 6: to just be like, yeah, I'm not doing this. 175 00:08:48,280 --> 00:08:53,720 Speaker 3: Click. Now, real time, meaning when I press the 176 00:08:53,880 --> 00:08:57,280 Speaker 3: frowny face, it's gonna be like nine fourteen p.m. Okay, wow, 177 00:08:57,320 --> 00:08:57,640 Speaker 3: that is 178 00:08:57,640 --> 00:09:00,000 Speaker 6: very... Not only that, but someone, if they're looking at it, 179 00:09:00,080 --> 00:09:03,000 Speaker 6: could see that, could see it right, like a lot 180 00:09:03,040 --> 00:09:06,280 Speaker 6: of it, and yeah, wow, yeah. 181 00:09:06,320 --> 00:09:08,960 Speaker 2: So let me, let me play this bite that's going 182 00:09:09,000 --> 00:09:10,800 Speaker 2: to talk about how to sort of look at this 183 00:09:10,920 --> 00:09:12,160 Speaker 2: data in real time. 184 00:09:12,640 --> 00:09:15,760 Speaker 8: Circling back to my restroom use case example. So we're 185 00:09:15,800 --> 00:09:21,560 Speaker 8: helping those airport operations teams monitor the cleanliness of those 186 00:09:21,640 --> 00:09:25,040 Speaker 8: restrooms where our devices are deployed, and then on the 187 00:09:25,080 --> 00:09:28,200 Speaker 8: back end, we're collecting all of those feedback data points 188 00:09:28,840 --> 00:09:31,600 Speaker 8: and displaying it to them in a clean, simple to 189 00:09:31,720 --> 00:09:35,319 Speaker 8: use analytics dashboard so they can monitor the trends every 190 00:09:35,360 --> 00:09:38,080 Speaker 8: single day, all day long.
And then we even have 191 00:09:38,120 --> 00:09:41,880 Speaker 8: an alert mechanism, so if things are trending south, where 192 00:09:41,920 --> 00:09:44,960 Speaker 8: we know there could be a problem with feedback, we 193 00:09:45,040 --> 00:09:47,960 Speaker 8: can alert staff in real time so they can take 194 00:09:48,000 --> 00:09:51,080 Speaker 8: the necessary action to go improve the cleanliness of that 195 00:09:51,160 --> 00:09:52,120 Speaker 8: specific restroom. 196 00:09:52,679 --> 00:09:53,000 Speaker 3: Mm-hmm. 197 00:09:53,679 --> 00:09:55,480 Speaker 2: So you could send out, basically, like you're saying, a 198 00:09:59,080 --> 00:10:00,760 Speaker 2: push alert: hey, you know, the bathroom by Terminal A 199 00:10:01,440 --> 00:10:04,800 Speaker 2: is getting a lot of, uh, you know, 200 00:10:01,440 --> 00:10:04,800 Speaker 1: mad faces, exactly. So say Manny goes in there, 201 00:10:06,280 --> 00:10:09,400 Speaker 2: just fucks it up, bad situation, nasty shit all over 202 00:10:09,440 --> 00:10:11,000 Speaker 2: the floor. Which has never happened. Never. 203 00:10:11,720 --> 00:10:13,680 Speaker 1: I'm just saying this for example. 204 00:10:13,480 --> 00:10:15,880 Speaker 7: Yeah, it's not you, man. Not you, it's a guy 205 00:10:15,760 --> 00:10:18,760 Speaker 1: named Manny in the bathroom of an airport. 206 00:10:18,840 --> 00:10:25,600 Speaker 6: Sure, sure, a handsome tall guy. 207 00:10:25,880 --> 00:10:29,120 Speaker 2: So then there's, you know, fifteen, twenty minutes where everybody 208 00:10:29,200 --> 00:10:34,360 Speaker 2: just hits red, rare smiley face, horrible, nasty, so unhappy. 209 00:10:35,200 --> 00:10:36,920 Speaker 2: The people who work at that airport could get an 210 00:10:36,920 --> 00:10:39,120 Speaker 2: alert on their phone saying, oh, go check out the bathroom 211 00:10:39,120 --> 00:10:42,880 Speaker 2: in Terminal
A, we got a lot of reds, 212 00:10:43,040 --> 00:10:45,000 Speaker 2: and that is, that's not the norm at this time 213 00:10:45,000 --> 00:10:45,360 Speaker 2: of day. 214 00:10:45,520 --> 00:10:47,640 Speaker 3: This is kind of game changing to me, that this 215 00:10:47,720 --> 00:10:50,079 Speaker 3: is happening in real time, the fact 216 00:10:49,920 --> 00:10:51,760 Speaker 1: that anyone is looking at it at any point. 217 00:10:51,840 --> 00:10:54,040 Speaker 6: I basically thought those things weren't even plugged in, yes, 218 00:10:54,240 --> 00:10:58,320 Speaker 6: or whatever it would be. I thought it was literally like 219 00:10:58,400 --> 00:11:01,120 Speaker 6: a board game or something with a little button. 220 00:11:02,320 --> 00:11:03,440 Speaker 1: Yeah. 221 00:11:03,480 --> 00:11:06,240 Speaker 3: Now here's a question. Have either of you seen what 222 00:11:06,320 --> 00:11:09,200 Speaker 3: happens to the screen after you press one of the buttons? 223 00:11:09,559 --> 00:11:13,040 Speaker 3: Does it say... like, it's not even a screen, there's 224 00:11:13,080 --> 00:11:15,880 Speaker 3: literally, there's literally four buttons, so it doesn't say, like, 225 00:11:15,960 --> 00:11:17,160 Speaker 3: your response has been sent. 226 00:11:17,280 --> 00:11:19,280 Speaker 7: It doesn't even... I don't even think they light up 227 00:11:19,280 --> 00:11:19,800 Speaker 7: when you press it. 228 00:11:20,320 --> 00:11:23,360 Speaker 6: It's literally, it's like if you pressed the light switch 229 00:11:23,480 --> 00:11:24,280 Speaker 6: and nothing happened. 230 00:11:24,520 --> 00:11:26,440 Speaker 7: Yeah, that's how it would feel. 231 00:11:28,280 --> 00:11:29,840 Speaker 1: It's like pressing a button on a remote. 232 00:11:29,960 --> 00:11:32,199 Speaker 7: I see, yeah, yeah, there's no TV. 233 00:11:33,880 --> 00:11:36,679 Speaker 1: Yeah, there's no TV to see if it's registering. 234 00:11:36,240 --> 00:11:38,440 Speaker 3: But there is something happening.
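The alerting described here, staff getting paged when a restroom's recent feedback trends negative at an unusual rate, could be sketched roughly as below. The class name, window size, and threshold are all invented for illustration; this is not HappyOrNot's actual logic.

```python
from collections import deque

class RestroomAlert:
    """Fire a staff alert when recent feedback for one restroom trends negative.

    Hypothetical sketch: keep the last `window` votes and page staff once
    the share of unhappy presses crosses `threshold` (with a minimum vote
    count so a single early red face can't trigger it).
    """

    def __init__(self, window: int = 20, threshold: float = 0.5, min_votes: int = 5):
        self.votes = deque(maxlen=window)  # True = unhappy press
        self.threshold = threshold
        self.min_votes = min_votes

    def record(self, unhappy: bool) -> bool:
        """Register one press; return True if this press should trigger an alert."""
        self.votes.append(unhappy)
        if len(self.votes) < self.min_votes:
            return False
        share = sum(self.votes) / len(self.votes)
        return share >= self.threshold

alert = RestroomAlert(window=20, threshold=0.5)
presses = [False, False, True, True, True, True]  # a sudden run of red faces
fired = [alert.record(p) for p in presses]
# the alert starts firing once unhappy presses reach half of the recent votes
```

The rolling window is what makes "not the norm at this time of day" workable: old votes age out, so the alert reflects the last few minutes, not the whole day.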
You're turning on a TV 235 00:11:38,559 --> 00:11:40,200 Speaker 3: somewhere, exactly. 236 00:11:40,600 --> 00:11:41,120 Speaker 7: You couldn't be. 237 00:11:41,240 --> 00:11:43,240 Speaker 1: You could be. No, you definitely are. 238 00:11:44,480 --> 00:11:47,239 Speaker 2: And they have, they have versions that are more complicated, 239 00:11:47,240 --> 00:11:49,040 Speaker 2: like iPads, stuff that's more thorough. But 240 00:11:49,080 --> 00:11:51,920 Speaker 2: the ones that we see, the most popular 241 00:11:51,920 --> 00:11:53,240 Speaker 2: ones, are just the four buttons. 242 00:11:53,240 --> 00:11:56,320 Speaker 1: There's not even a question. That's it. It's simple, simple. 243 00:11:57,280 --> 00:11:58,559 Speaker 3: I feel like we have to test this. 244 00:11:59,200 --> 00:11:59,959 Speaker 1: You're gonna go press it. 245 00:12:00,480 --> 00:12:02,959 Speaker 3: It's maybe not worth going through airport security, but we 246 00:12:03,160 --> 00:12:05,400 Speaker 3: just start hitting that red button. 247 00:12:05,800 --> 00:12:06,840 Speaker 7: You'd need such scale. 248 00:12:07,320 --> 00:12:12,439 Speaker 1: Yes. The thing too is they can protect against people gaming it. 249 00:12:14,320 --> 00:12:16,800 Speaker 6: If you were pressing a million times all at once, yeah, 250 00:12:16,840 --> 00:12:19,040 Speaker 6: that's not going to really register versus, okay, we got 251 00:12:19,080 --> 00:12:19,680 Speaker 6: it over an hour. 252 00:12:19,520 --> 00:12:20,480 Speaker 1: And it takes time 253 00:12:20,520 --> 00:12:21,840 Speaker 3: for something to register. 254 00:12:22,120 --> 00:12:24,280 Speaker 2: Yeah, so a manager can't just go and just press 255 00:12:24,400 --> 00:12:25,880 Speaker 2: green, green, green, green, green. 256 00:12:26,000 --> 00:12:29,160 Speaker 3: Wow, they've thought of everything. 257 00:12:30,360 --> 00:12:34,520 Speaker 2: But there's also something very shocking about these devices.
258 00:12:34,559 --> 00:12:37,160 Speaker 1: Shocking. There's a camera in them. 259 00:12:37,360 --> 00:12:39,040 Speaker 7: Oh, what the fuck. 260 00:12:39,120 --> 00:12:42,719 Speaker 3: This I did not know. Camera in a bathroom, immediately 261 00:12:42,800 --> 00:12:43,920 Speaker 3: raising some red flags. 262 00:12:44,880 --> 00:12:46,720 Speaker 2: All right, we are going to take a quick break, 263 00:12:46,840 --> 00:12:49,400 Speaker 2: and when we get back, what the hell are cameras 264 00:12:49,440 --> 00:13:02,840 Speaker 2: doing in these devices? There's also something very shocking about 265 00:13:02,840 --> 00:13:06,480 Speaker 2: these devices. Shocking. There's a camera in them. 266 00:13:06,679 --> 00:13:09,840 Speaker 3: Oh, what the fuck. This I did not know. Camera 267 00:13:10,000 --> 00:13:13,240 Speaker 3: in a bathroom, immediately raising some red flags. 268 00:13:13,960 --> 00:13:16,560 Speaker 8: So there is a camera in the device, and we're 269 00:13:16,559 --> 00:13:21,200 Speaker 8: not taking photos of anyone, but we are assessing, via 270 00:13:21,320 --> 00:13:26,400 Speaker 8: some AI technology, to allow us to predict that person's 271 00:13:27,080 --> 00:13:33,760 Speaker 8: age and gender, and that can generate a number of benefits. 272 00:13:33,960 --> 00:13:37,880 Speaker 8: But one of the main benefits is automatically filtering out 273 00:13:37,920 --> 00:13:41,000 Speaker 8: feedback from certain age groups. As you can imagine, a 274 00:13:41,000 --> 00:13:43,160 Speaker 8: lot of kids are drawn to using these devices out 275 00:13:43,200 --> 00:13:49,240 Speaker 8: there in the wild. Exactly. So that kind of technology 276 00:13:49,280 --> 00:13:52,319 Speaker 8: allows us to automatically filter out all of the feedback 277 00:13:52,400 --> 00:13:55,559 Speaker 8: from that age group, because we know our customers probably 278 00:13:55,559 --> 00:13:57,760 Speaker 8: don't view that as true feedback.
279 00:13:58,320 --> 00:14:01,480 Speaker 2: So when you're... so, are you using this 280 00:14:01,640 --> 00:14:07,040 Speaker 2: age and gender data just to weed stuff out, or do 281 00:14:07,080 --> 00:14:09,640 Speaker 2: you also use it to say, hey, we're noticing that, 282 00:14:10,080 --> 00:14:13,280 Speaker 2: you know, a lot of older people are giving, you know, 283 00:14:13,520 --> 00:14:17,120 Speaker 2: negative reviews to, you know, let's say a restaurant. You know, 284 00:14:17,240 --> 00:14:19,280 Speaker 2: maybe for that segment the music is too loud 285 00:14:19,360 --> 00:14:21,240 Speaker 2: or something. You know, it's like, how are 286 00:14:21,320 --> 00:14:22,800 Speaker 2: y'all using that age data? 287 00:14:23,720 --> 00:14:27,640 Speaker 8: Yeah, so filtering out by age is one definitive use case. 288 00:14:27,720 --> 00:14:30,400 Speaker 8: But let me give you another practical use case with 289 00:14:30,440 --> 00:14:35,080 Speaker 8: one of our customers in Europe, an airport specifically. They're using 290 00:14:35,120 --> 00:14:39,680 Speaker 8: this device to measure feedback in their, like, food court area, 291 00:14:39,840 --> 00:14:44,320 Speaker 8: and they had transitioned to requiring passengers to order basically 292 00:14:44,360 --> 00:14:47,360 Speaker 8: through kind of a QR code approach, and as 293 00:14:47,360 --> 00:14:50,840 Speaker 8: you can imagine, the older generation is not totally thrilled 294 00:14:50,840 --> 00:14:54,720 Speaker 8: with that kind of ordering model, and the feedback, through 295 00:14:54,760 --> 00:14:59,560 Speaker 8: again this demographic-based approach, highlighted that that category 296 00:14:59,600 --> 00:15:03,640 Speaker 8: of demographic was just not happy with that change 297 00:15:03,640 --> 00:15:05,920 Speaker 8: that they implemented in the airport.
So they fixed it, 298 00:15:06,160 --> 00:15:08,840 Speaker 8: and take a wild guess what happened: feedback went back 299 00:15:08,880 --> 00:15:11,520 Speaker 8: the other direction as far as the general satisfaction of 300 00:15:12,240 --> 00:15:13,760 Speaker 8: passengers in that area. 301 00:15:15,280 --> 00:15:17,960 Speaker 7: Well, yeah, because my thought was obviously the kids thing. 302 00:15:18,000 --> 00:15:19,320 Speaker 6: But then I was like, oh, you could also then 303 00:15:19,320 --> 00:15:22,200 Speaker 6: filter out that kind of, you know, the Karens, you know, 304 00:15:23,640 --> 00:15:25,400 Speaker 6: be like, you know, they're not going 305 00:15:25,320 --> 00:15:28,280 Speaker 7: to be satisfied either way. Yeah, so like, yeah, 306 00:15:28,440 --> 00:15:29,640 Speaker 7: we're going to cut out the trash. 307 00:15:29,840 --> 00:15:33,920 Speaker 6: Yeah, we're going to cut out this segment, anyone under fourteen. 308 00:15:34,040 --> 00:15:37,040 Speaker 3: Grok, please remove under-fourteen boys and forty-plus-year-old 309 00:15:37,040 --> 00:15:39,600 Speaker 3: white women from the responses. 310 00:15:39,120 --> 00:15:40,920 Speaker 7: Yeah, they're messing up the data too much. It's 311 00:15:40,960 --> 00:15:42,960 Speaker 7: not fair, the outliers. 312 00:15:43,280 --> 00:15:46,080 Speaker 3: Yeah, that is crazy though, the camera thing. I mean, 313 00:15:46,280 --> 00:15:49,160 Speaker 3: it makes sense from their use case of wanting to 314 00:15:49,200 --> 00:15:53,160 Speaker 3: filter out people who are more likely to be just 315 00:15:53,200 --> 00:15:55,640 Speaker 3: fucking around on them. Yeah, but that does raise some 316 00:15:55,680 --> 00:15:59,240 Speaker 3: concerns about just like, okay, my child is being recorded 317 00:15:59,240 --> 00:16:00,280 Speaker 3: in the bathroom.
318 00:16:00,040 --> 00:16:02,600 Speaker 2: And this is not a device that you would... I 319 00:16:02,640 --> 00:16:06,200 Speaker 2: had surmised that there's no indication that there's a camera. 320 00:16:06,240 --> 00:16:08,040 Speaker 6: When I think of them... Oh my god, I'm sure 321 00:16:08,080 --> 00:16:10,480 Speaker 6: there's a small dot somewhere. Now I'll keep an eye out. 322 00:16:10,480 --> 00:16:12,280 Speaker 6: But I imagine this is something that came along later. 323 00:16:12,520 --> 00:16:18,680 Speaker 2: So yeah, they actually started introducing cameras post-COVID. So, 324 00:16:19,120 --> 00:16:22,480 Speaker 2: remember, COVID happens in twenty twenty. Oh, we don't want 325 00:16:22,480 --> 00:16:26,440 Speaker 2: to touch stuff. Yes. So in March of twenty twenty one, 326 00:16:26,920 --> 00:16:32,720 Speaker 2: they introduced a device that recognized gestures versus touching an 327 00:16:32,760 --> 00:16:34,880 Speaker 2: actual button. So you would hold up a certain amount 328 00:16:34,920 --> 00:16:37,800 Speaker 2: of fingers, which would indicate whether you're happy or not. 329 00:16:37,840 --> 00:16:39,960 Speaker 2: So you'd put up, like, two fingers, which would be 330 00:16:40,080 --> 00:16:42,640 Speaker 2: like a smiley face, and then it would recognize it 331 00:16:42,680 --> 00:16:46,080 Speaker 2: on the device. So that was how it started, and 332 00:16:46,120 --> 00:16:50,120 Speaker 2: then it later evolved, so that in twenty twenty three 333 00:16:50,240 --> 00:16:56,120 Speaker 2: they started using a similar system to capture demographic data, 334 00:16:56,280 --> 00:17:01,280 Speaker 2: like he was saying, age and gender. And now it's 335 00:17:01,320 --> 00:17:05,439 Speaker 2: on all their devices, but the people who are buying them, 336 00:17:05,480 --> 00:17:07,960 Speaker 2: so like, you know, the stores and airports or whatever, 337 00:17:08,160 --> 00:17:11,000 Speaker 2: they have to opt into having it on.
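The age-based filtering Scott describes, dropping votes the model attributes to kids before anything reaches the dashboard, could be sketched like this. The record shape and the fourteen-year cutoff are illustrative assumptions taken from the hosts' joke, not HappyOrNot's documented behavior.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    rating: int          # 1 (very unhappy) .. 4 (very happy); no neutral option
    predicted_age: int   # from the device's on-board age estimate
    timestamp: str

def filter_feedback(records, min_age=14):
    """Drop votes whose predicted age falls below the cutoff.

    Hypothetical sketch: kids mashing the buttons are excluded before
    anything is aggregated for the customer's dashboard.
    """
    return [r for r in records if r.predicted_age >= min_age]

records = [
    Feedback(rating=1, predicted_age=8,  timestamp="21:14"),
    Feedback(rating=4, predicted_age=35, timestamp="21:15"),
    Feedback(rating=2, predicted_age=62, timestamp="21:16"),
]
kept = filter_feedback(records)
# only the age-35 and age-62 votes survive the filter
```

The same demographic field supports the airport food-court story above: instead of dropping a segment, you group ratings by predicted age band and notice one band trending unhappy.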
338 00:17:11,560 --> 00:17:12,399 Speaker 7: Oh, I see. 339 00:17:12,840 --> 00:17:15,600 Speaker 6: And to be fair, as far as the bathroom concern goes, 340 00:17:16,040 --> 00:17:19,880 Speaker 6: they're always positioned at the entryway. Yeah, yeah, you're not 341 00:17:19,320 --> 00:17:22,359 Speaker 6: going... it's even around the corner from where you'd be, 342 00:17:22,400 --> 00:17:23,240 Speaker 6: even by the sinks. 343 00:17:23,320 --> 00:17:26,200 Speaker 3: Yeah, yeah, but you can imagine... but yeah, certainly, oh yeah, 344 00:17:26,240 --> 00:17:27,520 Speaker 3: these days... politicians. 345 00:17:27,560 --> 00:17:31,080 Speaker 2: Hey, and I asked them. I didn't even notice the thing 346 00:17:31,119 --> 00:17:35,720 Speaker 2: had a camera. How are you protecting my security, my privacy? Yes. 347 00:17:35,840 --> 00:17:40,000 Speaker 2: So I said, well, let me ask you this. So you're, 348 00:17:40,080 --> 00:17:43,359 Speaker 2: like you said, you're not taking or storing photos of people. 349 00:17:43,359 --> 00:17:47,639 Speaker 2: So what happens with... how do you protect customer data? 350 00:17:47,880 --> 00:17:49,880 Speaker 2: Because I would imagine most people don't even realize that 351 00:17:49,880 --> 00:17:53,159 Speaker 2: that's happening when they're pressing the buttons. So how do 352 00:17:53,200 --> 00:17:55,679 Speaker 2: you make sure that that data is not, you know, 353 00:17:55,920 --> 00:17:56,919 Speaker 2: doesn't end up somewhere? 354 00:17:56,960 --> 00:17:59,320 Speaker 8: Yeah, this is getting probably a little bit technical, but 355 00:17:59,480 --> 00:18:02,880 Speaker 8: just generally speaking, we're not storing anything on the actual 356 00:18:02,920 --> 00:18:07,040 Speaker 8: device nor in the cloud. So rest assured, to your listeners, 357 00:18:07,080 --> 00:18:11,840 Speaker 8: there's no privacy concerns of any kind.
We're simply just 358 00:18:11,920 --> 00:18:15,760 Speaker 8: taking what we call a vector analysis of that person's face, 359 00:18:17,040 --> 00:18:19,840 Speaker 8: and that allows us to map it against a predictive 360 00:18:20,480 --> 00:18:25,880 Speaker 8: model to predict the age and the gender. 361 00:18:26,920 --> 00:18:27,320 Speaker 1: Gotcha. 362 00:18:27,400 --> 00:18:30,000 Speaker 2: So it's not—I would imagine the devices are not 363 00:18:30,240 --> 00:18:33,560 Speaker 2: being trained on the data that's being inputted. It's 364 00:18:33,640 --> 00:18:36,879 Speaker 2: taking sort of whatever prompt you have already given it 365 00:18:36,920 --> 00:18:39,639 Speaker 2: and applying it in that instant, and then the photo 366 00:18:39,720 --> 00:18:40,040 Speaker 2: is gone. 367 00:18:41,320 --> 00:18:42,480 Speaker 1: Yes, does that make sense? 368 00:18:42,640 --> 00:18:45,840 Speaker 8: Yeah, you're describing it correctly, although there is no photo—there's 369 00:18:46,200 --> 00:18:48,840 Speaker 8: no—there's no record of any kind that is maintained. 370 00:18:49,000 --> 00:18:50,240 Speaker 1: So, not gonna lie, you know. 371 00:18:50,320 --> 00:18:53,040 Speaker 2: I went into this thinking I was just talking about buttons. 372 00:18:53,119 --> 00:18:55,240 Speaker 2: Now we're talking about AI and privacy concerns. 373 00:18:55,440 --> 00:18:56,919 Speaker 1: Yeah, Sonny, you know. 374 00:18:56,920 --> 00:18:59,960 Speaker 2: I reached out to Jason Koebler, he's a tech reporter 375 00:19:00,119 --> 00:19:04,240 Speaker 2: at 404 Media, to get some insight into, like, 376 00:19:04,359 --> 00:19:07,520 Speaker 1: should we be concerned about this? You know, what's happening here? 377 00:19:07,640 --> 00:19:08,920 Speaker 3: So I'm concerned. 378 00:19:09,200 --> 00:19:10,520 Speaker 1: He was pretty shocked.
379 00:19:10,200 --> 00:19:13,919 Speaker 2: that there were cameras, so he had some privacy concerns, 380 00:19:15,480 --> 00:19:17,959 Speaker 2: not only about, like, the disclosures that they're recording you, 381 00:19:18,000 --> 00:19:23,080 Speaker 2: but, like, how they're processing this data. So I reached back 382 00:19:23,119 --> 00:19:25,120 Speaker 2: out to Happy or Not after I did the interview, 383 00:19:25,640 --> 00:19:29,440 Speaker 2: and they reiterated what Scott said during the interview: that 384 00:19:29,520 --> 00:19:34,840 Speaker 2: the system does not collect, store, or transmit identifiable images 385 00:19:35,119 --> 00:19:39,040 Speaker 2: or personal data. The AI detects a face and it 386 00:19:39,080 --> 00:19:42,679 Speaker 2: creates a one-hundred-and-twenty-eight-dimensional vector based on 387 00:19:42,800 --> 00:19:46,800 Speaker 2: it in real time. No actual photograph of the feedback 388 00:19:46,840 --> 00:19:50,960 Speaker 2: giver's face is saved or uploaded anywhere. Happy or Not 389 00:19:51,040 --> 00:19:57,760 Speaker 2: does not collect any personally identifiable data. So Jason's follow- 390 00:19:57,840 --> 00:20:00,320 Speaker 2: up to that was that, okay, if you're using an 391 00:20:00,320 --> 00:20:04,760 Speaker 2: AI system to do this—first of all, AI being 392 00:20:04,800 --> 00:20:07,800 Speaker 2: able to detect people's ages and genders, he said, is 393 00:20:07,880 --> 00:20:12,240 Speaker 2: notoriously, like, not reliable. I also followed up with Matty Belichick, 394 00:20:12,240 --> 00:20:15,600 Speaker 2: who you may remember as the AI researcher from Shell Game, 395 00:20:15,680 --> 00:20:19,000 Speaker 2: who says, yeah, Jason is correct. These models are notoriously 396 00:20:19,080 --> 00:20:22,359 Speaker 2: not great at detecting age and gender, even in perfect 397 00:20:22,480 --> 00:20:26,200 Speaker 2: lab settings.
He says the performance varies a lot by 398 00:20:26,440 --> 00:20:29,760 Speaker 2: race and gender, saying they tend to perform worse with 399 00:20:29,840 --> 00:20:32,560 Speaker 2: women of color. And he said, if they're not training 400 00:20:32,560 --> 00:20:35,719 Speaker 2: these models on this new data, their models are going 401 00:20:35,760 --> 00:20:41,920 Speaker 2: to be even worse. So Happy or Not responded saying 402 00:20:43,040 --> 00:20:47,920 Speaker 2: that accuracy was validated through a combination of controlled testing 403 00:20:48,080 --> 00:20:52,800 Speaker 2: and real-world deployment prior to launch, ensuring performance across 404 00:20:52,800 --> 00:20:56,040 Speaker 2: a range of environments and use cases, with age and 405 00:20:56,119 --> 00:21:04,760 Speaker 2: gender recognition achieving up to ninety five percent accuracy. Now 406 00:21:05,560 --> 00:21:07,400 Speaker 2: I'm gonna call out that "up to." 407 00:21:08,480 --> 00:21:10,640 Speaker 3: Yeah, I mean, that could be literally anything under that. 408 00:21:10,920 --> 00:21:15,000 Speaker 2: That could be one percent. It could be one percent— 409 00:21:15,040 --> 00:21:18,840 Speaker 2: it could be worse. "Up to"—"up to" means the most 410 00:21:18,880 --> 00:21:20,879 Speaker 2: accurate it's going to be is ninety five percent. 411 00:21:21,040 --> 00:21:23,760 Speaker 1: It's never gonna get to one hundred. So in certain 412 00:21:23,800 --> 00:21:25,040 Speaker 1: situations, we're talking, you 413 00:21:25,000 --> 00:21:27,800 Speaker 6: know, stadiums—we're talking thousands; airports, even more. Yeah, 414 00:21:27,840 --> 00:21:30,400 Speaker 6: talking thousands of people every day. It could—it could 415 00:21:30,400 --> 00:21:31,760 Speaker 6: be thousands slipping through the cracks. 416 00:21:31,800 --> 00:21:34,240 Speaker 1: It could be "up to" being wrongly aged. It could 417 00:21:34,240 --> 00:21:38,160 Speaker 1: be "up to," but, you know, who knows.
418 00:21:38,200 --> 00:21:40,480 Speaker 2: And this is the sort of thing where it's like, yeah, 419 00:21:40,480 --> 00:21:42,679 Speaker 2: they're not going to share it with me, their AI 420 00:21:43,080 --> 00:21:44,760 Speaker 2: data and video and— 421 00:21:44,960 --> 00:21:45,920 Speaker 1: Hopefully not. 422 00:21:46,119 --> 00:21:49,080 Speaker 2: Yeah, I hope—we should hope not. But it is, 423 00:21:49,600 --> 00:21:51,560 Speaker 2: you know—we also asked about, like, what is the 424 00:21:51,640 --> 00:21:53,760 Speaker 2: likelihood that, you know, something like this can be hacked? 425 00:21:53,800 --> 00:21:57,760 Speaker 2: They said everything's encrypted, you know, on the device itself; 426 00:21:57,800 --> 00:22:01,679 Speaker 2: you can't get access to the data. So, you know, 427 00:22:01,720 --> 00:22:03,480 Speaker 2: they said they put in a lot of precautions to 428 00:22:03,560 --> 00:22:07,239 Speaker 2: make sure that whatever demographic data they are collecting on 429 00:22:07,280 --> 00:22:10,600 Speaker 2: you is not traceable back to you. There's no photos 430 00:22:10,720 --> 00:22:14,440 Speaker 2: or live streams or anything that is stored anywhere. 431 00:22:15,119 --> 00:22:15,479 Speaker 6: Hmmm. 432 00:22:15,840 --> 00:22:17,880 Speaker 2: And I did—but I did ask the disclosure question, 433 00:22:17,960 --> 00:22:19,639 Speaker 2: of just sort of like, okay, all this is fine, 434 00:22:19,720 --> 00:22:22,560 Speaker 2: but, like, shouldn't you guys just say, like, hey, here's 435 00:22:22,560 --> 00:22:23,119 Speaker 2: a camera here? 436 00:22:23,400 --> 00:22:24,920 Speaker 1: Yeah, and they're like— 437 00:22:24,880 --> 00:22:28,920 Speaker 2: that's basically on the people who are buying these devices. 438 00:22:29,720 --> 00:22:32,920 Speaker 2: Based on wherever you are, you should follow whatever local 439 00:22:33,000 --> 00:22:35,320 Speaker 2: rules there are around having to disclose
440 00:22:35,160 --> 00:22:38,160 Speaker 3: Yeah, that information. Yeah, because I wondered if—I mean, 441 00:22:38,200 --> 00:22:39,440 Speaker 3: it does sound 442 00:22:39,880 --> 00:22:44,000 Speaker 6: illegal. When you go—when you go into, say, a 443 00:22:44,040 --> 00:22:46,320 Speaker 6: bar or something, and they're filming something or whatever, and 444 00:22:46,400 --> 00:22:48,000 Speaker 6: it has the sign on the front, like, it 445 00:22:47,960 --> 00:22:48,680 Speaker 1: might be recorded. 446 00:22:48,840 --> 00:22:50,240 Speaker 7: Yeah, I guess it's basically that rule. 447 00:22:50,280 --> 00:22:54,320 Speaker 6: And maybe—I guess an airport probably does have that, yeah, yeah, 448 00:22:54,560 --> 00:22:56,359 Speaker 6: you know, or whatever—like, most have it. 449 00:22:56,760 --> 00:22:59,000 Speaker 2: We gotta look out to see if they do. Yeah, because 450 00:22:59,000 --> 00:23:01,320 Speaker 2: there's probably—they would probably say it's the same. 451 00:23:01,560 --> 00:23:02,560 Speaker 7: It's all under one thing. 452 00:23:02,680 --> 00:23:05,760 Speaker 2: Yeah, like, maybe they've got security cameras, and there's a 453 00:23:06,200 --> 00:23:08,199 Speaker 2: camera in the airport, and that camera is not going 454 00:23:08,240 --> 00:23:10,840 Speaker 2: to be used, you know, if there's a trial. Yeah, 455 00:23:11,000 --> 00:23:12,960 Speaker 2: right, right—if you stole something, they're not gonna be 456 00:23:12,960 --> 00:23:14,600 Speaker 2: able to pull the Happy or Not information. 457 00:23:15,000 --> 00:23:18,240 Speaker 6: That's the real test, theoretically. Well, here's what we do: 458 00:23:19,080 --> 00:23:24,320 Speaker 6: commit a crime—yes—by one of these things, no other witnesses. Yes, 459 00:23:24,600 --> 00:23:25,440 Speaker 6: you see what happens. 460 00:23:26,160 --> 00:23:29,080 Speaker 7: So I'll rob one of you guys—or kill—what do they do? 461 00:23:29,200 --> 00:23:32,000 Speaker 1: Kill one of us.
Yeah, go to a bathroom with a 462 00:23:32,040 --> 00:23:32,600 Speaker 1: Happy or Not. 463 00:23:32,960 --> 00:23:36,240 Speaker 7: Yeah, we know how to kill from episode three. 464 00:23:38,720 --> 00:23:40,679 Speaker 2: You go on these devices and you do, like, a 465 00:23:40,800 --> 00:23:43,640 Speaker 2: Jinx confession, you know, like the Friedmans. 466 00:23:43,320 --> 00:23:45,320 Speaker 7: I've killed—I just killed this man in front of 467 00:23:45,320 --> 00:23:45,760 Speaker 7: the Happy or Not. 468 00:23:49,560 --> 00:23:54,440 Speaker 6: My experience was great—green smile. 469 00:23:54,800 --> 00:23:57,280 Speaker 2: That'll be the true test, because then we'll know. Because then 470 00:23:57,280 --> 00:23:59,359 Speaker 2: you'll get, like, the feds—you know, you get police 471 00:23:59,440 --> 00:24:01,639 Speaker 2: trying to—hey, you gotta give us this information. So 472 00:24:01,680 --> 00:24:04,320 Speaker 2: if it is being stored anywhere, that would be—yes, 473 00:24:04,359 --> 00:24:05,080 Speaker 2: we will find out. 474 00:24:05,119 --> 00:24:07,800 Speaker 3: We could even, like, pose as, like, yeah, someone who's 475 00:24:08,160 --> 00:24:12,640 Speaker 3: looking for information—just email them, like, hey, I 476 00:24:12,600 --> 00:24:15,800 Speaker 6: just got—my friend just got murdered in this bathroom. 477 00:24:15,960 --> 00:24:17,600 Speaker 7: Hey, hey, can we see the footage? 478 00:24:17,760 --> 00:24:25,040 Speaker 3: I'm the FBI, the Federal Bureau of Investigation. 479 00:24:25,920 --> 00:24:29,000 Speaker 1: A man was murdered in a JFK bathroom, and I will need a 480 00:24:28,920 --> 00:24:29,720 Speaker 7: view of the Happy or Not. 481 00:24:31,240 --> 00:24:32,399 Speaker 6: Could you slide me that link? 482 00:24:33,400 --> 00:24:35,080 Speaker 7: Please unencrypt the vector. 483 00:24:36,080 --> 00:24:38,679 Speaker 6: Please unencrypt the hundred-and-twenty-eight-dimensional vector. 484 00:24:38,960 --> 00:24:40,480 Speaker 7: I think that'll really help with this case.
485 00:24:41,600 --> 00:24:44,320 Speaker 2: Okay. So after this episode aired, we got some more 486 00:24:44,359 --> 00:24:48,120 Speaker 2: clarification from Happy or Not. I initially asked them: are 487 00:24:48,280 --> 00:24:52,560 Speaker 2: cameras standard on all of their new devices? They responded saying, yeah, 488 00:24:52,640 --> 00:24:57,320 Speaker 2: cameras are standard on all new Smiley Touch devices. But 489 00:24:57,400 --> 00:25:01,480 Speaker 2: what they didn't clarify is that Smiley Touch is the 490 00:25:01,600 --> 00:25:04,840 Speaker 2: name just for their touchscreen devices, not all of their 491 00:25:04,880 --> 00:25:08,280 Speaker 2: devices that you can touch. So if you see a 492 00:25:08,280 --> 00:25:12,320 Speaker 2: touchscreen, it has a camera; if you don't see a touchscreen, 493 00:25:12,600 --> 00:25:13,600 Speaker 2: it does not have a camera. 494 00:25:13,800 --> 00:25:16,480 Speaker 1: Okay, that's your episode. 495 00:25:18,720 --> 00:25:22,160 Speaker 2: I asked him, basically, what percentage of people who pass 496 00:25:22,240 --> 00:25:24,600 Speaker 2: these devices actually press a button? 497 00:25:25,000 --> 00:25:28,600 Speaker 8: Yeah, again, it does vary by the sector and the 498 00:25:28,720 --> 00:25:31,800 Speaker 8: use case. But just as kind of a high- 499 00:25:31,920 --> 00:25:35,240 Speaker 8: level average to give you, we're in kind of 500 00:25:35,280 --> 00:25:39,159 Speaker 8: the ten percent neighborhood. In certain environments we see twenty 501 00:25:39,160 --> 00:25:47,640 Speaker 8: to thirty percent engagement, so it can be really, really high. 502 00:25:45,359 --> 00:25:47,399 Speaker 1: Which to me doesn't seem that high. 503 00:25:47,600 --> 00:25:49,240 Speaker 6: No, but if you think where they are—yeah, I 504 00:25:49,240 --> 00:25:53,119 Speaker 6: mean, so, he said—let me see—airports and stadiums.
505 00:25:53,119 --> 00:25:54,720 Speaker 3: Ten percent of people that go to the bathroom at 506 00:25:54,720 --> 00:25:55,800 Speaker 3: the airport—I mean, that's thousands. 507 00:25:55,840 --> 00:25:56,440 Speaker 1: That's a lot of people. 508 00:25:57,000 --> 00:25:59,119 Speaker 2: He put me on the spot, asked me—let me 509 00:25:59,119 --> 00:26:01,280 Speaker 2: ask you, how many people are there in the world? 510 00:26:01,680 --> 00:26:02,960 Speaker 7: Oh, it's like eight billion 511 00:26:03,040 --> 00:26:05,800 Speaker 1: now. Okay, I got that right. I just pulled that off 512 00:26:05,920 --> 00:26:06,760 Speaker 1: the top of my head. 513 00:26:06,960 --> 00:26:09,639 Speaker 7: I only know—I only know it from doing the 514 00:26:09,720 --> 00:26:11,439 Speaker 7: seven-foot episode everyone should listen to. 515 00:26:13,400 --> 00:26:15,960 Speaker 6: Because growing up, yeah, it used to always be, oh, 516 00:26:16,040 --> 00:26:17,960 Speaker 6: seven billion—seven million. So they've 517 00:26:17,720 --> 00:26:23,840 Speaker 2: grown. Anyway, this is a long way of saying— 518 00:26:24,440 --> 00:26:27,240 Speaker 1: they've gotten two billion responses. 519 00:26:27,680 --> 00:26:30,000 Speaker 2: So he was comparing this to, say, like, you know, 520 00:26:30,000 --> 00:26:32,240 Speaker 2: when you get those, like, email surveys or whatever after 521 00:26:32,280 --> 00:26:34,000 Speaker 2: the fact, and, like, what percentage 522 00:26:33,640 --> 00:26:36,000 Speaker 1: of people actually fill those out? It is 523 00:26:35,960 --> 00:26:39,399 Speaker 8: not unheard of for email surveys, as far as the 524 00:26:39,400 --> 00:26:43,040 Speaker 8: engagement level, to be in, like, the quarter-percent to 525 00:26:43,160 --> 00:26:44,919 Speaker 8: half-a-percent response range, right?
526 00:26:44,840 --> 00:26:47,760 Speaker 3: I'm offended when they even send me those. What? Yeah, 527 00:26:47,920 --> 00:26:50,120 Speaker 3: or, like, you go to McDonald's or something and it's 528 00:26:50,160 --> 00:26:52,960 Speaker 3: like, you get a free sandwich—you can get ten free sandwiches— 529 00:26:53,000 --> 00:26:53,600 Speaker 3: fill out the survey. 530 00:26:53,720 --> 00:26:54,000 Speaker 7: I'm like— 531 00:26:55,320 --> 00:26:59,560 Speaker 2: And another interesting thing: I would assume, like Manny was saying earlier, 532 00:26:59,680 --> 00:27:01,520 Speaker 2: you know, with no neutral option, if people are going 533 00:27:01,520 --> 00:27:03,600 Speaker 2: to press these buttons, I would assume, for 534 00:27:03,520 --> 00:27:05,520 Speaker 3: the most part, they'll be more likely to be upset. 535 00:27:05,640 --> 00:27:07,960 Speaker 2: Yeah, I'm gonna complain—this is my way of complaining 536 00:27:07,960 --> 00:27:08,960 Speaker 2: without actually complaining. 537 00:27:09,080 --> 00:27:12,560 Speaker 7: Yeah, versus me leaving a review or doing something else. 538 00:27:13,440 --> 00:27:17,320 Speaker 2: But my boy Scott said that's not the case. People 539 00:27:17,480 --> 00:27:21,160 Speaker 2: assume that they're just gonna get negative reactions; they get 540 00:27:21,200 --> 00:27:23,400 Speaker 2: a lot of positive. Fascinating. 541 00:27:23,480 --> 00:27:26,520 Speaker 8: A lot of people think they're just deployed to gather 542 00:27:26,720 --> 00:27:29,439 Speaker 8: negative feedback, and it's actually quite the opposite. You'd be 543 00:27:29,480 --> 00:27:35,000 Speaker 8: surprised how heavily weighted the positive feedback actually is. Really, 544 00:27:35,040 --> 00:27:38,480 Speaker 8: and I think as humans we're kind of wired to 545 00:27:38,600 --> 00:27:42,120 Speaker 8: want to share feedback, good or bad, but people generally 546 00:27:42,240 --> 00:27:45,000 Speaker 8: like to share positive feedback, without a doubt.
547 00:27:45,080 --> 00:27:48,680 Speaker 6: I could see, at a store, like a convenience store, 548 00:27:48,760 --> 00:27:52,639 Speaker 6: you know, the service—the cashier is just friendly, nice, whatever, easy. 549 00:27:53,080 --> 00:27:54,800 Speaker 6: I can see smashing a positive for sure. 550 00:27:58,040 --> 00:28:00,480 Speaker 2: All right, so that was Happy or Not. We're gonna 551 00:28:00,520 --> 00:28:03,920 Speaker 2: take a break. When we come back, we're bringing back 552 00:28:04,040 --> 00:28:07,320 Speaker 2: one of my favorite segments: Dev's Hot Takes. 553 00:28:09,440 --> 00:28:11,359 Speaker 7: All right, a recurring segment. We've done it once. 554 00:28:12,080 --> 00:28:31,080 Speaker 1: This is our second time doing it. All right, welcome back. 555 00:28:31,960 --> 00:28:35,439 Speaker 2: It is time, ladies and gentlemen—start the music—for 556 00:28:35,520 --> 00:28:48,080 Speaker 2: Dev's Hot Takes. Sometimes we do this dead silence, everyone's waiting. 557 00:28:48,240 --> 00:28:50,360 Speaker 2: Sometimes we do these at the end of episodes. The 558 00:28:50,400 --> 00:28:51,560 Speaker 2: people have been asking for 559 00:28:51,560 --> 00:28:53,560 Speaker 7: them all the time. Sometimes we do them 560 00:28:55,040 --> 00:29:00,280 Speaker 2: like this. So, fellas, last night I went to the movies. 561 00:29:00,400 --> 00:29:01,760 Speaker 1: Oh, nice. 562 00:29:01,800 --> 00:29:04,600 Speaker 2: I saw—I saw Project Hail Mary. Beautiful film. You should 563 00:29:04,640 --> 00:29:07,719 Speaker 2: bring baby Lula to it. Good family film. Okay, I 564 00:29:07,800 --> 00:29:11,920 Speaker 2: got there late, which at the movies is early, so 565 00:29:11,960 --> 00:29:14,160 Speaker 2: I got there about fifteen, twenty minutes in. 566 00:29:14,680 --> 00:29:17,000 Speaker 1: You know, trailers were still showing. 567 00:29:17,560 --> 00:29:19,320 Speaker 7: Probably just started the trailers, and they sat through, like,
568 00:29:19,280 --> 00:29:20,800 Speaker 1: Fifteen twenty minutes more trailers. 569 00:29:20,920 --> 00:29:21,280 Speaker 3: Damn. 570 00:29:22,120 --> 00:29:24,920 Speaker 2: Here's here's a thing about when you go to the movies. 571 00:29:25,240 --> 00:29:29,400 Speaker 2: It's one thing to show thirty forty minutes to trailers. Yeah, 572 00:29:29,440 --> 00:29:32,960 Speaker 2: it should be illegal, I know, to show God damn 573 00:29:33,920 --> 00:29:40,640 Speaker 2: Amazon commercials, Geico commercials. Yeah, you should not be allowed 574 00:29:40,680 --> 00:29:42,960 Speaker 2: to show a commercial once the trailers start. 575 00:29:43,000 --> 00:29:45,520 Speaker 6: Yeah, if you're doing that in the pre yeah, the 576 00:29:45,560 --> 00:29:49,080 Speaker 6: pre pre moment before the trailers you have whatever Guico, 577 00:29:49,200 --> 00:29:50,720 Speaker 6: Amazon and Maria Minunos. 578 00:29:50,800 --> 00:29:51,920 Speaker 1: Yes, that's fine. 579 00:29:52,000 --> 00:29:55,640 Speaker 7: And trivia like trivia trivia. Yeah, that's that's fine. Once 580 00:29:55,680 --> 00:29:57,040 Speaker 7: it's time and its trailer time. 581 00:29:57,200 --> 00:30:01,000 Speaker 6: Once I'm watching real trailers, real commercial paid to be here. 582 00:30:01,160 --> 00:30:05,080 Speaker 2: I don't need to see a double ad when it's 583 00:30:05,120 --> 00:30:08,040 Speaker 2: twenty five minutes after the movie was already supposed to start. 584 00:30:08,120 --> 00:30:09,600 Speaker 3: So wait, and this is a long n I don't 585 00:30:09,600 --> 00:30:12,520 Speaker 3: think I've run into this. You saw trailers and then 586 00:30:12,600 --> 00:30:14,160 Speaker 3: you saw Adams mixed. 587 00:30:15,200 --> 00:30:16,200 Speaker 1: They try to fool you. 
588 00:30:16,320 --> 00:30:18,120 Speaker 6: Yeah, so it'll be, like, whatever—it'll be like a 589 00:30:18,200 --> 00:30:21,360 Speaker 6: Rolex commercial or something in there, and it's like—and 590 00:30:21,600 --> 00:30:23,880 Speaker 6: they're—they're, like, so highly produced for it. 591 00:30:23,880 --> 00:30:24,320 Speaker 7: They start— 592 00:30:24,320 --> 00:30:25,960 Speaker 6: you're like—yeah, well, they didn't—they didn't show the title, 593 00:30:26,000 --> 00:30:28,240 Speaker 6: and you're like, what movie is this? And then it's like, oh, 594 00:30:28,360 --> 00:30:30,880 Speaker 6: it's a Range Rover commercial. It's the one—there's, like— 595 00:30:30,920 --> 00:30:34,320 Speaker 6: who's the guy from White Lotus? Theo James. Yeah, he'll 596 00:30:34,360 --> 00:30:38,959 Speaker 6: be there driving around, not in a movie. Wow, I 597 00:30:38,960 --> 00:30:39,680 Speaker 6: haven't run into this. 598 00:30:39,680 --> 00:30:40,480 Speaker 3: What theater did you go to? 599 00:30:40,920 --> 00:30:44,200 Speaker 2: It's—you know, we're Regal heads here. So if I'm president, 600 00:30:44,480 --> 00:30:46,560 Speaker 2: I'm passing a law. First of all, you should have 601 00:30:46,640 --> 00:30:49,640 Speaker 2: to tell people what time the movie actually starts. Well, yeah, 602 00:30:49,680 --> 00:30:51,400 Speaker 2: I need to plan my day. If you're not going 603 00:30:51,440 --> 00:30:53,959 Speaker 2: to start the movie till eight twenty—I had about, 604 00:30:54,080 --> 00:30:56,400 Speaker 2: you know, fifteen, twenty minutes' worth of stuff I could 605 00:30:56,400 --> 00:30:57,520 Speaker 2: have done before seeing the movie. 606 00:30:57,640 --> 00:30:59,800 Speaker 3: Yeah, and what's—what's even at play? 607 00:30:59,840 --> 00:31:02,959 Speaker 6: The thing is, like, that it—it kind of varies as 608 00:31:03,000 --> 00:31:04,920 Speaker 6: far as when it'll actually start.
Because sometimes if you're 609 00:31:05,480 --> 00:31:07,800 Speaker 6: not—not at, like, a prime-time seven p.m. or 610 00:31:07,800 --> 00:31:08,760 Speaker 6: something, that's gonna 611 00:31:08,560 --> 00:31:09,640 Speaker 7: have the standard trailers. 612 00:31:09,680 --> 00:31:12,000 Speaker 6: But if you go sometimes to a noon showing or something, 613 00:31:13,200 --> 00:31:14,520 Speaker 6: there might not even be any trailers. 614 00:31:14,600 --> 00:31:16,080 Speaker 1: Yeah, no trailers—they'll just show 615 00:31:15,880 --> 00:31:18,160 Speaker 6: it, which is nice—you'd be, like, halfway through the movie. Yeah, 616 00:31:18,200 --> 00:31:21,920 Speaker 6: so, like, if you're used to the AMC-Regal lifestyle, 617 00:31:22,160 --> 00:31:23,520 Speaker 6: basically you got thirty minutes. 618 00:31:24,280 --> 00:31:26,160 Speaker 7: You can't count on that luxury all the time. 619 00:31:26,560 --> 00:31:32,280 Speaker 2: I got burned back during—what's that movie, Widows? Windows? 620 00:31:32,360 --> 00:31:37,320 Speaker 6: Widows, Widows, the Steve McQueen. No—Windows? What's the one with the 621 00:31:37,360 --> 00:31:43,280 Speaker 6: bank, with the lady with the hair? Oh, well, I 622 00:31:43,320 --> 00:31:45,160 Speaker 6: was like, wow, you're talking about a movie from five— 623 00:31:45,240 --> 00:31:46,760 Speaker 6: like, eight years ago. 624 00:31:47,400 --> 00:31:50,720 Speaker 1: Oh, Weapons. I showed up—Windows. 625 00:31:51,320 --> 00:31:53,280 Speaker 7: I mean, they're looking through the windows. 626 00:31:53,840 --> 00:31:56,120 Speaker 1: To be fair, there are no weapons in the movie. 627 00:31:56,200 --> 00:31:57,840 Speaker 6: Hey, there's the one that they imagine, and it's not 628 00:31:57,880 --> 00:31:58,680 Speaker 6: about a school shooting. 629 00:31:59,600 --> 00:32:03,160 Speaker 2: Yeah, yeah. But I showed up to that ten minutes late. 630 00:32:03,480 --> 00:32:05,719 Speaker 2: I was like, oh, I got plenty of time.
The 631 00:32:05,760 --> 00:32:06,960 Speaker 2: movie had already started. 632 00:32:07,200 --> 00:32:09,680 Speaker 6: That happened to me. It should be—it should be 633 00:32:09,680 --> 00:32:11,240 Speaker 6: on the ticket, or it should alert 634 00:32:11,080 --> 00:32:13,600 Speaker 2: you. Yes, it should be online or something—say it's 635 00:32:13,600 --> 00:32:17,320 Speaker 2: gonna start fifteen, twenty minutes late. No reason not to. And I get worried— 636 00:32:17,400 --> 00:32:19,000 Speaker 2: you know, I'm—I'm someone who likes to be there 637 00:32:19,040 --> 00:32:21,760 Speaker 2: even before. No risks. I said ten minutes—you're always 638 00:32:21,760 --> 00:32:23,960 Speaker 2: safe with ten minutes. Yeah, I show up, the kids 639 00:32:24,000 --> 00:32:25,320 Speaker 2: are already gone. 640 00:32:26,080 --> 00:32:27,320 Speaker 7: Damn, I didn't even see the kids. 641 00:32:27,400 --> 00:32:28,560 Speaker 1: Thank God, I know the kids. 642 00:32:28,840 --> 00:32:31,200 Speaker 7: What the hell is going on here? What's the problem? 643 00:32:31,280 --> 00:32:34,560 Speaker 2: I've been to Regal movies where it's been thirty minutes 644 00:32:35,000 --> 00:32:38,720 Speaker 2: of trailers, and people show up thirty minutes into the movie. 645 00:32:39,280 --> 00:32:41,120 Speaker 1: You were an hour late at that point. 646 00:32:41,160 --> 00:32:42,880 Speaker 6: That is on you. Like, they're treating it like it's 647 00:32:42,880 --> 00:32:44,760 Speaker 6: the twenties or something, and it's just like, you just 648 00:32:44,760 --> 00:32:46,520 Speaker 6: show up to the movies and you just sit through whatever. 649 00:32:47,200 --> 00:32:48,720 Speaker 7: It's like, what's going on? 650 00:32:48,800 --> 00:32:53,320 Speaker 2: You paid twenty five dollars to be an hour late. 651 00:32:53,520 --> 00:32:55,400 Speaker 1: I don't get it. Well, the thing is, I'll say, I 652 00:32:55,440 --> 00:32:57,320 Speaker 1: want my money back, I'm gonna see another showtime.
653 00:32:57,440 --> 00:33:00,000 Speaker 6: Yeah, especially now you can—you can basically refund any tickets. 654 00:33:00,040 --> 00:33:00,200 Speaker 1: Yeah. 655 00:33:00,200 --> 00:33:02,680 Speaker 7: I feel like the thing with the 656 00:33:02,440 --> 00:33:06,040 Speaker 3: commercials that's so sinister, I feel like, is that we've got, 657 00:33:06,400 --> 00:33:09,520 Speaker 3: you know, kind of a social—somewhat of a social 658 00:33:09,520 --> 00:33:12,960 Speaker 3: agreement that if you pay money for something, you don't 659 00:33:13,000 --> 00:33:16,760 Speaker 3: get advertisements, exactly. You pay for the Netflix without ads, 660 00:33:16,840 --> 00:33:19,120 Speaker 3: or, like, you pay for YouTube Premium or whatever. 661 00:33:19,560 --> 00:33:22,680 Speaker 6: If I'm paying for this movie ticket, and I have 662 00:33:22,840 --> 00:33:23,760 Speaker 6: to see ads? 663 00:33:24,400 --> 00:33:27,160 Speaker 3: Trailers are another thing, because they're trailers for other 664 00:33:27,280 --> 00:33:31,280 Speaker 3: movies you might be interested in, unlike commercials for cars 665 00:33:30,880 --> 00:33:32,880 Speaker 7: and shit. That's disgusting. 666 00:33:32,960 --> 00:33:35,360 Speaker 6: You should be paying me to show me commercials 667 00:33:35,360 --> 00:33:36,240 Speaker 6: in the theater I'm in. 668 00:33:36,720 --> 00:33:40,200 Speaker 7: We know—I know, we love the Nicole Kidman thing. 669 00:33:40,280 --> 00:33:40,880 Speaker 1: I get it. 670 00:33:41,080 --> 00:33:44,000 Speaker 7: But I'm here already. Yeah, show that on TV. 671 00:33:44,520 --> 00:33:46,080 Speaker 6: Yeah, why don't they show that on TV? 672 00:33:47,360 --> 00:33:52,640 Speaker 7: Hello? Like, this is cool? Yeah, I'm here. You don't 673 00:33:52,680 --> 00:33:53,480 Speaker 7: need to tell me it's cool. 674 00:33:53,680 --> 00:33:55,600 Speaker 6: It's like a pep talk. 675 00:33:55,000 --> 00:33:57,000 Speaker 7: Makes no sense.
676 00:33:57,160 --> 00:34:01,360 Speaker 1: Thank— Hey. 677 00:34:01,760 --> 00:34:05,000 Speaker 2: In case you missed it, Manny's new book, Colored People 678 00:34:05,120 --> 00:34:07,760 Speaker 2: Time, is out anywhere you buy books. We're going to 679 00:34:07,840 --> 00:34:10,440 Speaker 2: link to it in the show notes. If you enjoy the 680 00:34:10,520 --> 00:34:13,959 Speaker 2: humor and heart of this show, you'll love it. 681 00:34:13,880 --> 00:34:15,000 Speaker 1: It's a collection of essays. 682 00:34:15,600 --> 00:34:17,239 Speaker 2: I really love it, and I'm not just saying that 683 00:34:17,360 --> 00:34:20,760 Speaker 2: because he's my friend. No Such Thing is a production 684 00:34:20,920 --> 00:34:24,400 Speaker 2: of Kaleidoscope Content. Our executive producers are Kate Osborne and 685 00:34:24,480 --> 00:34:25,480 Speaker 2: Mangesh Hattikudur. 686 00:34:25,800 --> 00:34:29,320 Speaker 1: The show was created by Manny, Noah, and me, Devin. 687 00:34:29,520 --> 00:34:33,319 Speaker 2: Theme and credits song by Manny, mixing for this episode by 688 00:34:33,440 --> 00:34:36,880 Speaker 2: Steve Bone, and thank you to Tyler Hale for some 689 00:34:37,040 --> 00:34:40,600 Speaker 2: additional production on this episode. Our guest speaker was 690 00:34:40,600 --> 00:34:44,000 Speaker 2: Scott Erickson from Happy or Not. There's actually a great 691 00:34:44,040 --> 00:34:46,880 Speaker 2: New Yorker piece about Happy or Not by David Owen 692 00:34:46,880 --> 00:34:49,320 Speaker 2: that we're going to link to in our show notes, 693 00:34:49,520 --> 00:34:52,279 Speaker 2: so check that out. 694 00:34:53,320 --> 00:34:55,440 Speaker 1: Visit no such thing dot show to subscribe to 695 00:34:55,440 --> 00:34:56,080 Speaker 1: our newsletter.
696 00:34:56,560 --> 00:35:00,640 Speaker 2: If you have feedback for us or a question, you can email 697 00:35:00,719 --> 00:35:05,160 Speaker 2: us at manny noah devin at gmail dot com, or if 698 00:35:05,160 --> 00:35:07,400 Speaker 2: you're in the US, you can leave us a voicemail— 699 00:35:07,719 --> 00:35:09,879 Speaker 2: call the number in our show notes, and who knows, 700 00:35:09,960 --> 00:35:13,640 Speaker 2: we may do an entire episode based on your question. Emma, 701 00:35:13,880 --> 00:35:15,680 Speaker 2: thanks for writing in. We had a lot of fun 702 00:35:15,719 --> 00:35:18,240 Speaker 2: getting to the bottom of this one. We'll see y'all 703 00:35:18,280 --> 00:35:18,880 Speaker 2: next week. 704 00:35:19,600 --> 00:35:24,400 Speaker 1: He knows, he knows such things.