Speaker 1: From WBZ News Radio in Boston, this is New England Weekend. Each and every week, right here, we come together and we talk about all the topics important to you and the place where you live. Great to have you with us this week. As always, I'm Nicole Davis. Most of the time, or even all the time, you might not realize it, but there is a force working behind the scenes on your phone, computer, and even the price tags at the grocery store. It's quietly shaping the cost of all kinds of merchandise. We're talking about algorithmic pricing, and it's all about using automated systems to literally set prices in real time. Now, dynamic pricing has been around for some time. Think about surging rideshare prices when it gets busy, but this takes it to a whole new level. Shopping apps and websites are tracking your data to give you personalized ads and discounts, and in many cases to shape what you end up paying compared to others who want the exact same thing. Noah Giansiracusa is an author. He's a mathematician and professor at Bentley University, and he is here to help us make a bit more sense of all this.
So Noah, it's great to have you. Give us more insight, I guess to start, about what exactly companies are doing here.

Speaker 2: Well, there's two questions: what are they doing, and what could they be doing? And we don't know a lot about what they are doing, because it's really hard to find out. So the dynamic pricing that you mentioned, dynamic just meaning changing, a lot of people have experienced this with airlines, for instance. You know, when you try to buy a ticket, let's say to go visit your family for the holidays, you check and you see the price, and then you check just a few days later and it might be much higher or much lower. So that rapid pace of changing, that's the dynamic aspect, and that means the airlines are using some kind of algorithm, some computer system that reads in lots of factors, like the demand, maybe even the weather, current events, could be lots of things we don't know, and it's adjusting the prices based on that. So it's very dynamic, it's rapidly changing. That's just one of the things that we're seeing.
It's been around for a while, but as you can imagine, with AI and things, as computers get more powerful and as data gets more available, these systems get more potent, more powerful and more complex, like it's harder to know what they're doing.

Speaker 1: Yeah, and I've noticed that it's happening in grocery stores, which kind of threw me for a loop, because you would think that the price of food would be the price of food. I mean, maybe if it's out of season, that's one thing. How is this getting rolled out in grocery stores, though?

Speaker 2: Well, yeah, that's a really interesting question. So grocery stores, a lot of retailers, would like to be able to do what the airlines are doing, but there's a difference. When you buy your plane ticket, you're usually on a web page, so they can just use their computer to update the price as quickly as they want, you know, from one second to the next. A grocery store can't do that, right? You go and you see the price of apples, and it's printed out; it is what it is.

Speaker 1: Right.
Speaker 2: But one of the things a lot of grocery stores are, I should say some grocery stores are, doing is switching to digital price tags. So you'll just see, like, a little computer, or not even a computer, almost like a little calculator display, that just shows the current price, and the manager can sit at their desk and just click a button and change the price. Or, if they want, they can take the human out of the loop and just use a computer algorithm that'll calculate based on various factors. You know, it might look at the weather in Florida and decide, oh, there's a storm coming, let's raise the price of orange juice. So being able to change these prices very quickly is something a lot of stores want to do, and grocery stores seem to be jumping on that bandwagon very quickly.

Speaker 1: I don't know, I've heard somebody talk about how it's not really very moral, you know, because you're taking advantage of a situation. It's almost predatory, some people say, just to try and make an extra buck.
But unfortunately, you're also dealing in an environment where the price is already high on everything. So there's got to be a lot of pushback to this.

Speaker 2: There is a lot of pushback, and I was just talking to a journalist in Maryland, where apparently the state has introduced some legislation to limit these grocery-store digital price tags, suggesting they only be able to update once per day, like a twenty-four-hour speed limit on them. I thought that's kind of an interesting solution. I'm not sure how I feel about it, but that's an interesting thought.

Speaker 1: Yeah, and we actually have a law here. I was doing some research here. A bill has been introduced by Rep. Lindsay Sabadosa. It's called An Act relative to surveillance pricing in grocery stores, and it's very similar to that. Right now it's in committee and kind of just being considered on Beacon Hill. But I would imagine that a lot of legislatures around the country are going to start taking action on this.

Speaker 2: There has been a lot of interest, even in the last few months.
I've had a lot of people reaching out to talk, as you did, to find out more. But let me just mention, you hit on a really important word. You said surveillance there, rather than dynamic, yes. And I think that's something important for our listeners to really understand: what is surveillance pricing, sometimes called personalized pricing? What does that mean, and how is that different than dynamic? So, the way I like to think about it, dynamic just means rapidly changing. As we said, this could be the weather changes, then suddenly airline prices change, or orange juice's price changes. That's dynamic. But once we talk about surveillance, I think the word comes from what some people call surveillance capitalism, which is this idea that a lot of tech companies, like Facebook and Google, which historically were the main ones, have been surveilling, which means sort of like slurping up a lot of personal data about people to try to target us with ads. And, you know, we see this when we're online, we get a lot of targeted ads. They're based on what we search for, what we watch on YouTube, lots of information.
Well, now surveillance pricing is kind of the same idea, but instead of using that data and these algorithms to target us with ads, they're using it to set our prices. So they might have some idea that I'm going to be willing to pay more for my milk and eggs than you are, and based on that they can decide to charge me more. So as much as the dynamic pricing is uncomfortable and potentially predatory, I think it's the surveillance where it really gets creepy. Because just imagine, I don't know, just to be extreme, I'm not claiming this happens, but if you and I watch different TikTok videos, do you think we should pay different prices at the grocery store? That's absurd. If they can get that data and they can use it, there's currently no law stopping them from doing that.

Speaker 1: Yeah, I mean, what do they say? If it's a free service like TikTok or Facebook or Instagram, I mean, you are the product, essentially; your data is the product.

Speaker 2: That's right. But now we're talking about just regular stores, like going to the grocery store and buying your produce. Wow.
So now, if they're thinking about or heading towards using a lot of that same technology, what you just described, like, okay, I understand: Google's free, and giving up my data, they target me, you know, it's sort of a deal with the devil, in a way. But I never agreed to that kind of deal with the grocery store or with the airlines.

Speaker 1: No.

Speaker 2: So a lot of companies are interested in using my personal data, or our personal data, to maximize the prices that we're willing to pay. A little creepy, it is.

Speaker 1: It's completely creepy.

Speaker 2: I have to point out, because it's so tempting to say, oh, this is horrible, we should ban all kinds of personalized pricing. But I always have to tell people, and I've even mentioned this to, like, lawmakers, to say, be really careful, because guess what: senior citizen discounts are technically personalized pricing. Giving people a lower price based on their age is a form of personalized pricing.
So if we passed a law that said personalized pricing is horrible and creepy and predatory, we need to ban it, we're going to lose some of those discounts. Or a return customer: if you shop at the same store and you're in sort of a loyalty program, you've established a loyalty and you kind of deserve the discounts. But that's also based on your personal data of where you shop. So it's very hard to write laws, and that's what I think we're facing, that will carve out the bad, creepy, predatory versions of this while leaving the good ones that we want. So we have to be very cautious.

Speaker 1: Yeah, I mean, it makes more sense that this would happen with online stores like Amazon or Target. I know that, you know, when I buy things at Target, suddenly, magically, all my ads have to do with Target and this and that, what I've bought. But when it comes to not just grocery stores but more brick-and-mortar stores, are you noticing that business owners are adopting this, or are they kind of pumping the brakes a little bit to see where it all hashes out?
Speaker 2: I mean, I would say everyone is moving more towards data and AI in various forms, right? That's just kind of universal. But this specifically, like changing prices based on people's past behavior and data, we're not seeing a lot of that yet. But the issue, I would say, is that the technological infrastructure has been established, so for the companies that want to go down that path, there's not much stopping them. If you want to buy personal data on people, you can do that on the internet. If you want to feed that into algorithms, you can do that. There are no laws preventing it. The data is available. So it's almost like this eerie, wait-and-see moment, where a lot of us are kind of creeped out by this. And I would say not a lot of stores are doing it, but we're worried, because there's nothing really stopping more people from trying it.

Speaker 1: No, that's true. So say somebody wants to try and protect themselves from this, although, like you said, it is a double-edged sword, because I, if I was a senior, would love my senior discount.
I would love to get my Dunkin' rewards or whatever it is. How are we supposed to navigate this at this point?

Speaker 2: You know, it's tricky, and I would say the fact that it's so hard on the individual consumer is what suggests that we need a broader approach. My general philosophy in life is, if something is really hard for the individual, like protecting yourself from this pricing, that's a big sign that we need laws to help us, because we can't all put in the effort. That said, what can you do? You can try comparison shopping, but in a slightly different way than you're used to. We all know about going to store A and store B and seeing which one is cheaper, and, you know, buying our milk at one store or eggs at the other store, so we're used to that. You can do a similar thing where you can open up a new browser tab or a browser window that's in, like, a private, incognito mode, where it doesn't have your past personal data attached to it, and you can just go on whatever store you're looking at on their web page and see, am I getting lower or higher prices?
So that's, so far for online, the main one. For in person, there hasn't been a lot attached to personal data, because, you know, if I walk into Walmart, they don't have my full TikTok viewing history. Now, what they have is my loyalty card, where they know what I've bought in the past. And that's more like what you said, where I kind of know, if I'm using a loyalty card, that they're keeping track of what I've bought, and I'm kind of buying into that. But they're not usually changing the prices there. It's more I'll see the price on the shelf and maybe, again, a discount. So I would say, broadly, people, you know, we're okay with getting discounts. We just don't like getting things, you know, with higher prices.

Speaker 1: Well, who would, right? Especially in this day and age. Yes, I'll gladly pay more for this than I absolutely have to.

Speaker 2: That makes me think of an interesting point that also I think we need to keep in mind. And this is, so when we're having these conversations, it's really easy to feel like, oh, this is just this horrible, evil thing, why can't we just get rid of it?
But here's another example where it may not be as bad as it sounds. There's something called a pain point, which is sort of a theoretical concept in marketing or economics. It's the highest amount someone is willing to pay for a product. So maybe my pain point for a gallon of milk is five dollars; maybe yours is eight dollars. You're more willing to pay. So the idea is, oh, they could use this surveillance pricing to charge you eight dollars for your milk and me five dollars, because they know if they charged me eight dollars, I wouldn't buy it. So that seems kind of creepy, right? But you could also turn it around and say maybe the store was going to charge eight dollars, but then they say, okay, Nicole's going to buy the milk for eight dollars, but Noah, we know he's only going to buy it if it goes down to five, so let's go ahead and lower his milk price down to five. So they could use it the other way and help me buy the products that I wouldn't be able to buy at my current budget.
So there are some potential positives, and we have to be really careful to not get rid of those when we try to get at the bad.

Speaker 1: That's true. I've also heard that if you are shopping online and if you put stuff in your cart and you just leave it there for a couple of days, I mean, not only is it a good idea just to wait a couple of days if you really don't need it, but, you know, it can sometimes essentially trigger an email or trigger a message like, hey, come back, we're going to give you fifteen percent off. It's almost like they're waiting to see which way you're going to go.

Speaker 2: Yeah, I think absolutely, and maybe the price has even changed while it's sitting in the cart. Yeah, it's funny, because it's so high-tech and so modern, and yet what you're describing, doesn't that sound so kind of old-fashioned? Like, you know, imagine, like, an old marketplace, and you go, nah, I'm not interested in this, and you start to walk away, and then they try to, like, no, no, no, come on, for you, a discount. Yeah, so it's kind of like bartering and haggling, in a way.
We're kind of returning to it. But let me point out a really important difference. I'll give your listeners a fancy word for this, which is information asymmetry. And all I mean by that is, if I walk into an old market, you know, imagine like an old bazaar, and I see some hat that I want to buy, the person selling it can tell I'm not from this town, and they might try to take advantage of me and charge more. Okay, so that's kind of bartering, where I can see how they look, they can see how I look; that's all the information we have. But imagine going onto, let's say, an airline web page, and they have all kinds of data of what I've looked at, what I've purchased, when I've flown; maybe they've figured out where my family is based on social media stuff. They have a huge amount of information about me. How much information do I have about this airline? How much agency and power do I have? So it's that inequality. It's the fact that, yeah, this is sort of like the old-school bartering and haggling that we've seen in movies and things.
But it's so different when they have so much information about us and we have almost no information about them. And that's kind of the key, I think, the key point.

Speaker 1: Yeah, this has me thinking, should people consider investing in VPNs? I mean, you mentioned incognito mode. Should you go into looking into a more privacy-focused web? How do you think people should work, at least online, to try to get around this?

Speaker 2: Great point, and I'm glad you brought it up. I should have as well, but I forgot. A VPN is something that certainly people are recommending exploring. But I would say, realistically, we're a little premature. This does not seem to be a very prevalent phenomenon yet. Okay? It's more that we are concerned it may become one, and we want to cut it off before it becomes pervasive. So right now, it's more like government agencies are exploring to try to see where it is, because it's very hard to find. We don't have a list of which companies are doing it, how much they're doing it, what impact it has.
It's very unknown. Like, all we can do, all government agencies can do, is what we do: go and click and try and say, hey, my price is different yesterday than today, or my spouse's price on this app is different than mine, what's going on? So we really don't know the extent of it. But I wouldn't recommend jumping into something like a VPN yet, though I think these are the tools we need to keep in mind as we move forward.

Speaker 1: If it becomes worse, it's just so much more stress when, you know, buying things online. Wasn't it supposed to be easier for us? At the end of the day, it's just so...

Speaker 2: ...much more stress. I mean, it is stressful, but just think of it this way: at the end of the day, you get to decide if you're willing to pay the price or not. True, you know, we're very worried that, oh, my next-door neighbor might get a different price than me with this personalized pricing, because maybe their data is different. But just, at the end of the day, if you're willing to pay a price for the product, then okay, do it.
And if you're not, then don't. Nobody's forcing 340 00:15:34,440 --> 00:15:36,560 Speaker 2: me to buy these things. So in a way, 341 00:15:36,640 --> 00:15:39,000 Speaker 2: it's not as horrible as it sounds. Like, maybe 342 00:15:39,600 --> 00:15:41,600 Speaker 2: I'm getting ripped off a little bit because of this 343 00:15:41,760 --> 00:15:43,640 Speaker 2: or that, who knows. But at the end of 344 00:15:43,640 --> 00:15:45,200 Speaker 2: the day, I get to decide if I'm willing to 345 00:15:45,200 --> 00:15:47,520 Speaker 2: pay that price. And that's, you know, that's how it is. 346 00:15:47,760 --> 00:15:49,800 Speaker 2: They can never force me to buy something I don't want. 347 00:15:49,880 --> 00:15:52,320 Speaker 2: I don't want people to be overly stressed and, 348 00:15:53,080 --> 00:15:56,360 Speaker 2: you know, worried. I think technology moves forward, and 349 00:15:56,400 --> 00:15:58,480 Speaker 2: a lot of it is an adjustment phase where we 350 00:15:58,600 --> 00:16:01,080 Speaker 2: have to get used to a changing environment, and 351 00:16:01,120 --> 00:16:03,960 Speaker 2: we do need government to help smooth the bumps and 352 00:16:04,000 --> 00:16:06,960 Speaker 2: regulate the really bad excesses. But part of it is 353 00:16:07,000 --> 00:16:09,160 Speaker 2: just kind of accepting that things do change and 354 00:16:09,200 --> 00:16:13,800 Speaker 2: this is the next phase in what commerce looks like, perhaps. 355 00:16:13,440 --> 00:16:16,080 Speaker 1: Yeah, AI is here to stay, it's not going anywhere. 356 00:16:16,120 --> 00:16:17,760 Speaker 1: Yeah, absolutely, it's true. 357 00:16:17,880 --> 00:16:19,840 Speaker 2: How weird was it when Google and 358 00:16:19,840 --> 00:16:22,480 Speaker 2: all these amazing things were free online on the internet? 359 00:16:22,720 --> 00:16:25,240 Speaker 2: We didn't understand it.
Twenty years later, 360 00:16:25,280 --> 00:16:27,600 Speaker 2: you don't even think about it. It's just perfectly natural. 361 00:16:28,040 --> 00:16:29,600 Speaker 2: It takes time to acclimate. 362 00:16:29,320 --> 00:16:31,440 Speaker 1: It's true, and there are all 363 00:16:31,440 --> 00:16:34,120 Speaker 1: these different conversations we can have about data 364 00:16:34,240 --> 00:16:36,960 Speaker 1: and how, for the past twenty years, we 365 00:16:37,040 --> 00:16:40,120 Speaker 1: may not have realized what we were contributing to make 366 00:16:40,160 --> 00:16:42,520 Speaker 1: ourselves the product and all of this. But that's going 367 00:16:42,600 --> 00:16:44,480 Speaker 1: to have to be for a different interview. I'm gonna 368 00:16:44,480 --> 00:16:47,120 Speaker 1: have to grab you back another time. But you are 369 00:16:47,960 --> 00:16:50,440 Speaker 1: definitely right on the pulse when it comes to this stuff. 370 00:16:50,480 --> 00:16:53,080 Speaker 1: How can people find the work that you've done? 371 00:16:53,400 --> 00:16:56,440 Speaker 1: How can people learn more about you and 372 00:16:56,440 --> 00:16:58,320 Speaker 1: your thoughts on these issues? 373 00:16:58,240 --> 00:17:01,040 Speaker 2: Well, this particular issue has just been coming up a lot, 374 00:17:01,120 --> 00:17:03,000 Speaker 2: and it's really funny how I first got into it. 375 00:17:03,560 --> 00:17:05,680 Speaker 2: I wrote a book that came out last summer called 376 00:17:05,760 --> 00:17:08,800 Speaker 2: Robin Hood Math, and it's all about trying to take 377 00:17:08,920 --> 00:17:10,840 Speaker 2: math out of the hands of the elites and give 378 00:17:10,880 --> 00:17:14,520 Speaker 2: it to ordinary people. And it was my editor, so 379 00:17:14,560 --> 00:17:16,960 Speaker 2: this is not a math person, not an algorithm person.
380 00:17:17,200 --> 00:17:20,480 Speaker 2: A book person. They said, I keep hearing about this dynamic 381 00:17:20,480 --> 00:17:23,120 Speaker 2: pricing stuff. Can you add a couple of paragraphs about that? 382 00:17:23,720 --> 00:17:24,960 Speaker 2: And I had never heard of it at the time, 383 00:17:25,000 --> 00:17:27,080 Speaker 2: and I said, sure, I'll throw them in. Turns out, 384 00:17:27,119 --> 00:17:28,720 Speaker 2: out of the whole book, you know, whatever, two hundred 385 00:17:28,760 --> 00:17:31,600 Speaker 2: and fifty pages, it's those two paragraphs that people keep 386 00:17:31,640 --> 00:17:33,879 Speaker 2: asking me about. They want to know about it, and 387 00:17:33,920 --> 00:17:35,720 Speaker 2: it just really made me realize this is a topic 388 00:17:35,800 --> 00:17:39,520 Speaker 2: that really gets into our, I don't know, 389 00:17:39,520 --> 00:17:45,000 Speaker 2: our private, intimate experience of shopping and existing on the internet, 390 00:17:45,119 --> 00:17:47,600 Speaker 2: so it's really important. But to go back 391 00:17:47,640 --> 00:17:50,200 Speaker 2: to your question. So I'm at Bentley University, which is 392 00:17:50,240 --> 00:17:53,600 Speaker 2: in Waltham. So if you search my name, Noah Giansiracusa, 393 00:17:53,640 --> 00:17:55,480 Speaker 2: I know it's a long name, sorry about that, you 394 00:17:55,520 --> 00:17:57,760 Speaker 2: can get to my university web page and it connects 395 00:17:57,760 --> 00:18:01,080 Speaker 2: to everything. And honestly, I'm completely, 396 00:18:01,160 --> 00:18:04,159 Speaker 2: genuinely, authentically saying this: I love hearing from people.
So 397 00:18:04,320 --> 00:18:08,040 Speaker 2: if any listeners have found issues where they 398 00:18:08,080 --> 00:18:10,200 Speaker 2: think there might be some of this weird, fishy 399 00:18:10,200 --> 00:18:12,480 Speaker 2: pricing stuff, let me know, because I'm happy to explore 400 00:18:12,560 --> 00:18:15,360 Speaker 2: further if there's anything I can help you navigate. 401 00:18:16,840 --> 00:18:18,480 Speaker 2: I do this because I'm really passionate about it. So 402 00:18:18,800 --> 00:18:20,720 Speaker 2: please, please, please find me and reach out to me, 403 00:18:20,760 --> 00:18:21,520 Speaker 2: and let's talk. 404 00:18:21,720 --> 00:18:24,200 Speaker 1: It's very obvious that you're very passionate about this and 405 00:18:24,280 --> 00:18:26,240 Speaker 1: you care, and I'm grateful that you took the time 406 00:18:26,720 --> 00:18:28,280 Speaker 1: to be with us and explain this and kind of 407 00:18:28,280 --> 00:18:30,600 Speaker 1: break this down a little bit, because this impacts all 408 00:18:30,640 --> 00:18:33,360 Speaker 1: of us. And thank you for doing the work. And, 409 00:18:33,480 --> 00:18:36,520 Speaker 1: you know, Noah G I A N dot com. That's 410 00:18:36,640 --> 00:18:40,520 Speaker 1: Noah G I A N dot com. That's your website. Noah, 411 00:18:40,520 --> 00:18:42,520 Speaker 1: this has been really helpful. Thank you so 412 00:18:42,600 --> 00:18:43,360 Speaker 1: much for taking the time. 413 00:18:43,400 --> 00:18:45,680 Speaker 2: Glad to hear it. It's honestly been my pleasure. Thank 414 00:18:45,680 --> 00:18:46,399 Speaker 2: you so much, Nicole. 415 00:18:47,400 --> 00:18:50,120 Speaker 1: Be sure to bundle up, stay safe and warm and healthy, 416 00:18:50,240 --> 00:18:52,520 Speaker 1: and join us again next week for another edition of 417 00:18:52,560 --> 00:18:57,040 Speaker 1: the show. I'm Nicole Davis from WBZ News Radio on iHeartRadio.