1 00:00:01,320 --> 00:00:04,800 Speaker 1: Welcome to Stuff You Should Know from HowStuffWorks 2 00:00:04,800 --> 00:00:13,520 Speaker 1: dot com. Hey, and welcome to the podcast. I'm Josh Clark, 3 00:00:13,560 --> 00:00:18,600 Speaker 1: and there's Charles W. "Chuck" Bryant. Jerome's to my right, uh, 4 00:00:19,600 --> 00:00:22,360 Speaker 1: and this is Stuff You Should... There's probably some people going, 5 00:00:22,560 --> 00:00:26,720 Speaker 1: did they replace Jerry with somebody named Jerome? Yeah, that's 6 00:00:26,760 --> 00:00:31,200 Speaker 1: just the nickname we have for Jerry. Uh, it's a... 7 00:00:31,320 --> 00:00:35,519 Speaker 1: it's an arcane reference to the librarian at the beginning 8 00:00:35,520 --> 00:00:40,520 Speaker 1: of Ghostbusters whose uncle thought he was Saint Jerome. No, really, 9 00:00:41,240 --> 00:00:43,480 Speaker 1: that's what I've always been referencing. I thought we were 10 00:00:43,520 --> 00:00:46,159 Speaker 1: on the same page about it. We also call her Jarre's. 11 00:00:47,320 --> 00:00:52,760 Speaker 1: That's a reference to nothing, right, Jerry? All right, that's 12 00:00:52,800 --> 00:00:56,640 Speaker 1: a... that was a great Jerry. Um, Chuck. Yes, 13 00:00:56,960 --> 00:01:01,360 Speaker 1: have you ever been on the Internet of Things? Mm. Well, 14 00:01:03,680 --> 00:01:06,640 Speaker 1: you can't really be on the Internet of Things. I 15 00:01:06,680 --> 00:01:10,440 Speaker 1: don't understand. Explain. Well, the Internet of Things, my friend, 16 00:01:10,560 --> 00:01:15,320 Speaker 1: is really just a collection of interconnected devices to 17 00:01:15,440 --> 00:01:21,679 Speaker 1: make your life simpler and less private, pretty 18 00:01:21,760 --> 00:01:27,600 Speaker 1: much fraught with potential complications, but more convenient.
Allegedly, 19 00:01:28,360 --> 00:01:30,480 Speaker 1: there's a lot of people who also say this is 20 00:01:30,520 --> 00:01:32,480 Speaker 1: all just a bunch of navel-gazing in a lot 21 00:01:32,480 --> 00:01:35,760 Speaker 1: of ways, you know. Like, do we really need these apps? 22 00:01:35,760 --> 00:01:40,480 Speaker 1: I ran across a vape for smoking weed, as 23 00:01:40,520 --> 00:01:44,160 Speaker 1: they call it, that has an app. Uh, it's 24 00:01:44,240 --> 00:01:48,160 Speaker 1: a vaping system for smoking marijuana. 25 00:01:49,160 --> 00:01:52,560 Speaker 1: But it's just a little pipe, and, um, it 26 00:01:52,640 --> 00:01:56,600 Speaker 1: has an app that goes with it that remotely controls 27 00:01:56,640 --> 00:01:59,680 Speaker 1: the heat settings and stuff. Oh, does it, like, track 28 00:02:00,240 --> 00:02:03,800 Speaker 1: things like, you've smoked one ounce of weed this month? No, 29 00:02:04,000 --> 00:02:09,080 Speaker 1: which would be pretty intriguing. I guess that would be. Yeah, 30 00:02:09,280 --> 00:02:11,160 Speaker 1: and that's the point. There's a 31 00:02:11,160 --> 00:02:13,440 Speaker 1: lot of stuff that you can point to and say, 32 00:02:13,480 --> 00:02:15,919 Speaker 1: this is pretty neat. This thing feeds my cat while 33 00:02:15,960 --> 00:02:19,359 Speaker 1: I spend twenty-three hours a day at work. Yeah, 34 00:02:19,400 --> 00:02:21,760 Speaker 1: it's nice that I can keep this cat alive 35 00:02:21,800 --> 00:02:24,359 Speaker 1: that I have no connection with any longer because I'm 36 00:02:24,400 --> 00:02:26,800 Speaker 1: at work all the time. But this machine feeds it 37 00:02:26,840 --> 00:02:29,359 Speaker 1: because I can control it remotely with an app. Right. 38 00:02:29,400 --> 00:02:31,200 Speaker 1: But if you have something tracking how much weed 39 00:02:31,200 --> 00:02:33,400 Speaker 1: you smoke?
You're either smoking far too much or not 40 00:02:33,440 --> 00:02:36,320 Speaker 1: nearly enough, or you have too much money, is another 41 00:02:36,320 --> 00:02:39,600 Speaker 1: way to put it. Yeah. Uh, you know what, my 42 00:02:39,639 --> 00:02:42,880 Speaker 1: friend Clay, I don't know if you've met Clay, but 43 00:02:42,960 --> 00:02:48,000 Speaker 1: he told me something. The way, as far as being 44 00:02:48,040 --> 00:02:51,280 Speaker 1: concerned about security and privacy rights and things, is, what 45 00:02:51,320 --> 00:02:53,120 Speaker 1: they do is they sell it to you first as 46 00:02:53,160 --> 00:02:57,840 Speaker 1: a convenience, and then before you know it, you know, 47 00:02:59,160 --> 00:03:02,560 Speaker 1: dot dot dot. Clay never finished the sentence. He did, 48 00:03:02,760 --> 00:03:05,040 Speaker 1: but, you know, I don't remember exactly how he finished it. 49 00:03:05,240 --> 00:03:09,240 Speaker 1: Clay said this in... Oh, yeah, I think I'd known 50 00:03:09,360 --> 00:03:12,919 Speaker 1: Clay like way before, like he was probably just talking 51 00:03:12,960 --> 00:03:17,440 Speaker 1: about, geez, I don't even know, like a credit card 52 00:03:17,639 --> 00:03:20,840 Speaker 1: or something, and he was just like, just be aware, man. 53 00:03:20,960 --> 00:03:22,080 Speaker 1: He was like, they sell it to you as a 54 00:03:22,080 --> 00:03:25,880 Speaker 1: convenience, and before you know it, everyone is doing it that way, 55 00:03:26,040 --> 00:03:29,480 Speaker 1: and it's fraught with complications, was essentially what he was saying. Yeah, 56 00:03:29,600 --> 00:03:32,520 Speaker 1: I think that's very prescient, because that's exactly the point 57 00:03:32,560 --> 00:03:35,080 Speaker 1: that we're at right now.
And we'll talk a little more, 58 00:03:35,200 --> 00:03:37,760 Speaker 1: well, a lot more, I'm sure, about security and privacy 59 00:03:37,800 --> 00:03:40,200 Speaker 1: and all that stuff. But ultimately, it's like you said, 60 00:03:40,200 --> 00:03:42,840 Speaker 1: the Internet of Things is this... a lot of other 61 00:03:42,880 --> 00:03:45,080 Speaker 1: people call it the Internet. Like, this is just the 62 00:03:45,120 --> 00:03:47,280 Speaker 1: next wave of the Internet. This is where the Internet 63 00:03:47,320 --> 00:03:51,040 Speaker 1: is going. But I guess the best description of 64 00:03:51,080 --> 00:03:58,000 Speaker 1: it is, it's a series of interconnected, um, machines, devices that 65 00:03:58,160 --> 00:04:01,360 Speaker 1: sense the environment in a lot of cases, um, 66 00:04:01,560 --> 00:04:03,840 Speaker 1: can carry out some sort of function, usually it is 67 00:04:03,840 --> 00:04:08,720 Speaker 1: a sensing function, and, um, can communicate with central servers, 68 00:04:08,800 --> 00:04:11,440 Speaker 1: usually in the cloud, via the Internet. It's about as 69 00:04:11,440 --> 00:04:13,560 Speaker 1: simple as that. Like, that's the Internet of Things. Yeah, 70 00:04:13,600 --> 00:04:15,920 Speaker 1: like, if you think, boy, this all sounds weird and 71 00:04:16,080 --> 00:04:18,840 Speaker 1: I don't use stuff like that: if you have anything 72 00:04:18,920 --> 00:04:21,240 Speaker 1: that has the word smart in front of it, then 73 00:04:21,279 --> 00:04:24,040 Speaker 1: you're probably using the Internet of Things already. Right, if 74 00:04:24,080 --> 00:04:29,400 Speaker 1: you've got a smartphone that talks to a smart thermostat or 75 00:04:29,440 --> 00:04:33,440 Speaker 1: a smart smoke detector, or if you wear an exercise tracker, 76 00:04:34,000 --> 00:04:37,039 Speaker 1: that's the Internet of Things. Right, exactly.
And the 77 00:04:37,040 --> 00:04:40,279 Speaker 1: there's a lot of obvious steps that are right there 78 00:04:40,279 --> 00:04:43,480 Speaker 1: on the horizon, coming after this, like the idea 79 00:04:43,560 --> 00:04:47,520 Speaker 1: that, um, your refrigerator will be able to be like, oh, 80 00:04:47,680 --> 00:04:50,560 Speaker 1: these guys are almost out of cashew milk, I'll 81 00:04:50,680 --> 00:04:54,080 Speaker 1: contact Amazon and have it delivered in two hours. And 82 00:04:54,120 --> 00:04:56,600 Speaker 1: you're like, I don't drink cashew milk anymore, fridge! 83 00:04:56,720 --> 00:04:59,920 Speaker 1: Stupid fridge. How many times do I have to tell 84 00:05:00,120 --> 00:05:03,840 Speaker 1: you? But that's pretty neat. I mean, I think 85 00:05:03,880 --> 00:05:06,359 Speaker 1: a lot of this... it's both good and a 86 00:05:06,360 --> 00:05:09,359 Speaker 1: little creepy. Like, that'd be cool if I showed up 87 00:05:09,400 --> 00:05:11,280 Speaker 1: at home one day and I was out of milk 88 00:05:11,279 --> 00:05:14,880 Speaker 1: and it was waiting on my doorstep. What kind of milk 89 00:05:14,920 --> 00:05:17,159 Speaker 1: would you want waiting on your doorstep? Oh, cashew milk. 90 00:05:17,160 --> 00:05:19,599 Speaker 1: It doesn't matter. Oh, you don't even refrigerate it? No way, 91 00:05:19,680 --> 00:05:21,760 Speaker 1: because it's not milk, it's cashew juice. Yes, that 92 00:05:21,960 --> 00:05:25,640 Speaker 1: is what they should call it. Um, I don't 93 00:05:25,680 --> 00:05:28,120 Speaker 1: disagree with you, right? Like, I'm sure it is pretty neat. 94 00:05:28,160 --> 00:05:31,040 Speaker 1: It's pretty cool, and in five years it'll be totally 95 00:05:31,320 --> 00:05:36,200 Speaker 1: second nature to us. Right.
But in my experience, 96 00:05:36,279 --> 00:05:40,599 Speaker 1: the more mechanized, the more automated, the more convenient, and 97 00:05:40,640 --> 00:05:45,240 Speaker 1: I just made air quotes, life gets, the more difficult 98 00:05:45,279 --> 00:05:47,919 Speaker 1: it is to keep up with, the less simple it is, 99 00:05:48,240 --> 00:05:52,240 Speaker 1: and the more horrific it is when something breaks down. Interesting. 100 00:05:52,320 --> 00:05:56,640 Speaker 1: So within simplification, you think it becomes more complicated? Yeah, 101 00:05:56,640 --> 00:05:59,560 Speaker 1: because you rely on machines that can break, and 102 00:05:59,640 --> 00:06:02,680 Speaker 1: when they break, you're like, I forgot how to order 103 00:06:02,760 --> 00:06:05,840 Speaker 1: cashew milk. Where do you get that stuff? So you 104 00:06:05,880 --> 00:06:09,680 Speaker 1: think we're headed for Idiocracy? Uh, to an extent. But 105 00:06:09,760 --> 00:06:11,479 Speaker 1: I think it's more than that in the 106 00:06:11,520 --> 00:06:16,040 Speaker 1: short term. I think it's just that it's so 107 00:06:16,120 --> 00:06:19,560 Speaker 1: much easier to walk to a grocery store and buy 108 00:06:19,600 --> 00:06:21,839 Speaker 1: cashew milk and walk back home than it is 109 00:06:21,880 --> 00:06:25,120 Speaker 1: to, um, ensure that your fridge has all of the 110 00:06:25,279 --> 00:06:29,760 Speaker 1: updated firmware and make sure that it's ordering correctly 111 00:06:29,800 --> 00:06:32,320 Speaker 1: from Amazon, and to make sure Amazon gets it there. And 112 00:06:32,520 --> 00:06:34,560 Speaker 1: you're just relying on all these other components rather than 113 00:06:34,600 --> 00:06:36,840 Speaker 1: your own two feet and the idea that the people 114 00:06:36,880 --> 00:06:38,520 Speaker 1: at the store are gonna have your cashew milk.
115 00:06:39,000 --> 00:06:41,919 Speaker 1: I see that in a way, but I also disagree 116 00:06:41,960 --> 00:06:44,839 Speaker 1: in a way. Like, for instance, I have 117 00:06:44,880 --> 00:06:46,479 Speaker 1: a few things in my life that I've set to 118 00:06:46,480 --> 00:06:52,880 Speaker 1: auto-order, like air filters, baby formula, stuff like that, 119 00:06:53,000 --> 00:06:55,479 Speaker 1: my fridge water filter. Like, this stuff gets shipped to 120 00:06:55,480 --> 00:06:58,560 Speaker 1: me automatically, and it's wonderful because I don't have to 121 00:06:58,560 --> 00:07:00,920 Speaker 1: think about it at all. So does your fridge order 122 00:07:00,960 --> 00:07:04,200 Speaker 1: it itself, or do you just put a timer on, like, Amazon? 123 00:07:04,440 --> 00:07:08,279 Speaker 1: It's on a timer. To these... you know, not even Amazon, 124 00:07:08,360 --> 00:07:10,680 Speaker 1: like, you know, the fridge filter company. You can just 125 00:07:10,720 --> 00:07:13,560 Speaker 1: set it to auto-deliver, like, every sixty days. So 126 00:07:13,560 --> 00:07:15,280 Speaker 1: it's nice because I don't have to think about it. 127 00:07:15,320 --> 00:07:17,440 Speaker 1: The only thing that's missing is the camera, or the 128 00:07:17,480 --> 00:07:21,600 Speaker 1: device itself being hooked up, telling the company, hey, my 129 00:07:21,720 --> 00:07:25,720 Speaker 1: air filter is over, or, you know, spent, um, which 130 00:07:25,720 --> 00:07:28,320 Speaker 1: would be pretty awesome, to tell you the truth, if 131 00:07:28,640 --> 00:07:31,560 Speaker 1: your air filter could be like, actually, we're at... 132 00:07:32,120 --> 00:07:35,400 Speaker 1: we need to go ahead and order.
However, you know what, 133 00:07:35,480 --> 00:07:37,680 Speaker 1: that's a little... I never really thought about that being 134 00:07:37,680 --> 00:07:40,880 Speaker 1: a little dicey, because, you know, you need a new 135 00:07:40,880 --> 00:07:44,200 Speaker 1: water filter, trust me. Like, really? But what if you 136 00:07:44,200 --> 00:07:47,160 Speaker 1: could get another month out of this one that you're 137 00:07:47,280 --> 00:07:49,280 Speaker 1: throwing away? And if you're ordering early, then they can sell 138 00:07:49,320 --> 00:07:53,240 Speaker 1: another two a year to you. Exactly. That's what I 139 00:07:53,280 --> 00:07:55,360 Speaker 1: assume when you're trusting them, like when they're like, you 140 00:07:55,400 --> 00:07:57,680 Speaker 1: have to change your oil every three thousand miles, 141 00:07:57,960 --> 00:07:59,800 Speaker 1: or, yeah, you need to change your water filter every 142 00:07:59,800 --> 00:08:02,360 Speaker 1: three months. Like, I never change my oil. Yeah, I forget. 143 00:08:02,400 --> 00:08:07,280 Speaker 1: That's for chumps. It's a scam. Uh, anyway. All right, 144 00:08:07,320 --> 00:08:11,640 Speaker 1: so that's a little personal overview. The phrase Internet of 145 00:08:11,680 --> 00:08:15,280 Speaker 1: Things was actually coined, they think, in the late nineties 146 00:08:15,280 --> 00:08:18,600 Speaker 1: by a guy named Kevin Ashton, who worked for P&G. I'll 147 00:08:18,640 --> 00:08:20,560 Speaker 1: bet he has a T-shirt that says I coined 148 00:08:20,560 --> 00:08:23,240 Speaker 1: the term Internet of Things. Well, just so he can 149 00:08:23,280 --> 00:08:25,680 Speaker 1: invite people to punch him in the face or give 150 00:08:25,760 --> 00:08:28,680 Speaker 1: him a hug, depending on who you are.
He worked 151 00:08:28,680 --> 00:08:32,400 Speaker 1: for Procter and Gamble, and he had a presentation, uh, 152 00:08:32,480 --> 00:08:34,480 Speaker 1: there at work, where he said, you know what we 153 00:08:34,480 --> 00:08:38,360 Speaker 1: should do? We should put, um, radio frequency ID tags, 154 00:08:38,480 --> 00:08:41,000 Speaker 1: RFIDs, on lipstick in the store, 155 00:08:41,679 --> 00:08:44,040 Speaker 1: and have that hooked up to a machine where 156 00:08:44,040 --> 00:08:47,560 Speaker 1: we could automatically send that information and say, hey, the 157 00:08:47,559 --> 00:08:50,000 Speaker 1: store is running low on lipstick, get a shipment over there. 158 00:08:50,520 --> 00:08:54,000 Speaker 1: And he coined the term Internet of Things supposedly in 159 00:08:54,040 --> 00:08:57,600 Speaker 1: that meeting, apparently. But prior 160 00:08:57,640 --> 00:08:59,720 Speaker 1: to that, the nineties were a big time for something 161 00:08:59,760 --> 00:09:04,760 Speaker 1: called ubiquitous computing, which is basically the predecessor to 162 00:09:05,280 --> 00:09:07,960 Speaker 1: the idea of the Internet of Things, where, like, computers 163 00:09:07,960 --> 00:09:12,040 Speaker 1: would just be integrated into our lives totally and completely, um, 164 00:09:12,160 --> 00:09:14,760 Speaker 1: and the Internet of Things is in that vein a 165 00:09:14,760 --> 00:09:17,360 Speaker 1: little. But from stuff I've read, there's a lot 166 00:09:17,360 --> 00:09:20,280 Speaker 1: of people who were like, it didn't quite fulfill the 167 00:09:20,280 --> 00:09:23,240 Speaker 1: promise of ubiquitous computing.
This is just kind of like 168 00:09:23,640 --> 00:09:27,400 Speaker 1: life's slightly more convenient now thanks to this. But 169 00:09:27,640 --> 00:09:31,960 Speaker 1: his original idea, Kevin Ashton's, makes total and complete sense, 170 00:09:32,000 --> 00:09:35,960 Speaker 1: you know? Like, how many sales do 171 00:09:36,040 --> 00:09:39,160 Speaker 1: you miss when your lipstick thing is empty, until you 172 00:09:39,200 --> 00:09:40,880 Speaker 1: find out it's empty and then get to 173 00:09:40,920 --> 00:09:44,240 Speaker 1: refill it? Like, if the last lipstick can 174 00:09:44,280 --> 00:09:45,960 Speaker 1: be like, hey, I'm the last one here, you guys 175 00:09:45,960 --> 00:09:50,080 Speaker 1: better send some replacements. It's great. That makes perfect, total sense, 176 00:09:50,080 --> 00:09:53,240 Speaker 1: and that was ultimately the original basis of the Internet 177 00:09:53,240 --> 00:09:58,199 Speaker 1: of Things. It's taking, um, dumb things and making them, 178 00:09:58,240 --> 00:10:01,200 Speaker 1: like you said, smart, by giving them the ability to 179 00:10:01,400 --> 00:10:06,680 Speaker 1: sense their surroundings and communicate that data to a central 180 00:10:06,720 --> 00:10:10,240 Speaker 1: server, where it's analyzed and then the proper people are alerted. Yeah, 181 00:10:10,280 --> 00:10:12,480 Speaker 1: and here's the thing that I also never considered: 182 00:10:14,040 --> 00:10:16,320 Speaker 1: they must have discovered that there was a need for that. 183 00:10:17,000 --> 00:10:19,400 Speaker 1: And the only need I can imagine would be that 184 00:10:19,880 --> 00:10:23,240 Speaker 1: the employees were so bad at realizing that they were 185 00:10:23,320 --> 00:10:26,120 Speaker 1: running out of stock that they would go several days 186 00:10:26,240 --> 00:10:29,840 Speaker 1: without having lipstick on hand.
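[Editor's note: the shelf-alert idea described here, tags report counts and a threshold triggers a restock message to a central server, can be sketched in a few lines. This is only an illustration of the concept; the product names, store ID, and threshold below are invented, not anything from P&G.]

```python
import json

def restock_alerts(tag_counts, threshold=5):
    """Given RFID-derived shelf counts, return the products that need restocking."""
    return sorted(name for name, count in tag_counts.items() if count < threshold)

# Hypothetical shelf state assembled from tag reads.
shelf = {"orange lipstick": 2, "red lipstick": 12, "pink lipstick": 4}

# The message a central server would receive, serialized as JSON.
alert = json.dumps({"store": "store-42", "restock": restock_alerts(shelf)})
print(alert)
```

The point of the design is that the "smart" part is trivial: the shelf only has to count and compare against a threshold; deciding what to ship stays with the people (or server) receiving the alert.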
So they were like, these 187 00:10:29,880 --> 00:10:33,480 Speaker 1: people can't even do that. Well, what if... let's say 188 00:10:33,520 --> 00:10:36,040 Speaker 1: that you have a person whose job it is to 189 00:10:36,160 --> 00:10:41,079 Speaker 1: restock lipstick, right? And they go one week and there's 190 00:10:41,160 --> 00:10:44,400 Speaker 1: like five million tubes of orange, so much orange they 191 00:10:44,440 --> 00:10:47,280 Speaker 1: just take a handful and throw it away, just 192 00:10:47,280 --> 00:10:49,120 Speaker 1: because they don't want their bosses to feel bad about 193 00:10:49,120 --> 00:10:51,720 Speaker 1: the orange lipstick that's not being sold, tubes of orange. 194 00:10:52,160 --> 00:10:55,040 Speaker 1: They come back the next week and it's all gone, 195 00:10:55,360 --> 00:10:58,040 Speaker 1: sold out. And then it's erratic like that. You've got 196 00:10:58,160 --> 00:11:02,440 Speaker 1: an employee you're sending there, um, and it's just hit or miss 197 00:11:02,480 --> 00:11:04,240 Speaker 1: whether you just wasted a bunch of gas and the 198 00:11:04,280 --> 00:11:08,640 Speaker 1: employee's time, rather than being alerted, like, now you guys 199 00:11:08,679 --> 00:11:11,959 Speaker 1: can come before it's too late. Sure. That's the thing 200 00:11:12,000 --> 00:11:15,599 Speaker 1: that makes the most sense to me. Kevin Ashton: genius. 201 00:11:16,559 --> 00:11:18,280 Speaker 1: Maybe so. I meant to look him up to see 202 00:11:18,320 --> 00:11:20,480 Speaker 1: what he was doing these days, if he was just 203 00:11:20,600 --> 00:11:23,320 Speaker 1: wearing the T-shirt. Or he might be a professional 204 00:11:24,200 --> 00:11:28,800 Speaker 1: term coiner, you know. That hasn't worked out quite as well. 205 00:11:28,880 --> 00:11:31,520 Speaker 1: To sit back in the fishnet of the future.
Yeah, 206 00:11:31,600 --> 00:11:33,960 Speaker 1: that kind of caught on a little 207 00:11:33,960 --> 00:11:37,359 Speaker 1: bit early in the two thousands, but then it just, yeah, 208 00:11:37,480 --> 00:11:44,520 Speaker 1: it went the way of the dodo. Which was coined by... Yeah, 209 00:11:44,760 --> 00:11:48,800 Speaker 1: that's been around for a while. It's the dodo. Um, 210 00:11:48,960 --> 00:11:52,760 Speaker 1: the dodo died. All right, so let's talk about it a 211 00:11:52,800 --> 00:11:55,439 Speaker 1: little bit. What you've got essentially is a 212 00:11:55,480 --> 00:11:58,760 Speaker 1: step-by-step system that many times starts 213 00:11:58,760 --> 00:12:03,360 Speaker 1: with a smartphone that's connected to the internet. Uh, then 214 00:12:03,400 --> 00:12:05,240 Speaker 1: you have other pieces of hardware in your home that 215 00:12:05,280 --> 00:12:08,040 Speaker 1: are also connected to the internet, and there is most 216 00:12:08,040 --> 00:12:11,520 Speaker 1: likely an app for that hardware on your smartphone, and 217 00:12:11,559 --> 00:12:14,640 Speaker 1: then that's usually sent out to the cloud. It's not 218 00:12:15,200 --> 00:12:19,160 Speaker 1: some guy or some lady sitting in a room 219 00:12:19,960 --> 00:12:24,640 Speaker 1: looking at your data. He's at eight thousand steps today. Just 220 00:12:24,640 --> 00:12:28,120 Speaker 1: following behind you, counting? No, they're looking at the 221 00:12:28,200 --> 00:12:31,680 Speaker 1: data from your wearable. Yeah, or just in your guest room. 222 00:12:31,920 --> 00:12:36,840 Speaker 1: That'd be kind of neat. I don't know. No, it depends 223 00:12:36,840 --> 00:12:39,520 Speaker 1: on whether they, like, get along with you, or if 224 00:12:39,559 --> 00:12:43,920 Speaker 1: they drink your cashew milk. Well, Todd drinks all 225 00:12:43,960 --> 00:12:46,160 Speaker 1: the cashew milk. So that's why Todd's not wanted.
226 00:12:46,160 --> 00:12:49,480 Speaker 1: He's being counterproductive. So then it usually goes to the cloud, 227 00:12:50,120 --> 00:12:52,240 Speaker 1: which is where we send our data these days, where 228 00:12:52,280 --> 00:12:54,800 Speaker 1: it's analyzed. And that's a big part 229 00:12:54,880 --> 00:12:57,280 Speaker 1: of the Internet of Things too, Chuck: the cloud. Because 230 00:12:57,520 --> 00:13:00,000 Speaker 1: that means that you don't have to analyze the data 231 00:13:00,800 --> 00:13:03,240 Speaker 1: in the little machine, in the little sensor. All it 232 00:13:03,280 --> 00:13:06,160 Speaker 1: has to do is sense stuff and create that data 233 00:13:06,240 --> 00:13:08,480 Speaker 1: and then send it to the cloud, where you're basically 234 00:13:08,520 --> 00:13:12,199 Speaker 1: outsourcing all the analysis, which takes a lot more computing power. 235 00:13:12,440 --> 00:13:14,400 Speaker 1: So that was a big development, that there's such a 236 00:13:14,440 --> 00:13:16,360 Speaker 1: thing as the cloud now. Yeah, it kind of 237 00:13:16,360 --> 00:13:20,439 Speaker 1: puts the smart into it all. Um, and if you're 238 00:13:20,440 --> 00:13:22,679 Speaker 1: wondering how big it is right now, that depends on 239 00:13:22,760 --> 00:13:25,800 Speaker 1: who you ask, but some say between fifteen and twenty- 240 00:13:25,840 --> 00:13:30,360 Speaker 1: five billion devices already that are connected. And, uh, some 241 00:13:30,440 --> 00:13:36,280 Speaker 1: people say by twenty... it could be anywhere from fifty 242 00:13:36,440 --> 00:13:40,800 Speaker 1: billion to a trillion devices connected, depending on how much 243 00:13:41,120 --> 00:13:43,199 Speaker 1: it catches on, you know, how everyday it becomes. 244 00:13:43,600 --> 00:13:45,520 Speaker 1: But it's headed that way for sure. I don't 245 00:13:45,520 --> 00:13:48,120 Speaker 1: think there's any going back from this.
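[Editor's note: the division of labor described here, the sensor only senses and packages data while the heavier analysis is outsourced to the cloud, could be sketched like this. It's a minimal illustration of the idea, not any particular product's protocol; the device name and alert threshold are made up.]

```python
import json
from statistics import mean

# Device side: a "dumb" sensor just packages its readings as JSON.
def sensor_payload(device_id, readings):
    return json.dumps({"device": device_id, "readings": readings})

# "Cloud" side: parse the payload and do the analysis the device can't afford to.
def analyze(payload, limit=70.0):
    data = json.loads(payload)
    avg = mean(data["readings"])
    return {"device": data["device"], "avg": avg, "alert": avg > limit}

msg = sensor_payload("thermostat-1", [68.0, 71.5, 73.0])
print(analyze(msg))  # the server decides whether anyone needs to be alerted
```

The design choice the hosts are pointing at is exactly this split: the device's job ends at `sensor_payload`, so it can be cheap and low-power, while `analyze` can grow arbitrarily sophisticated on the server without touching the hardware.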
I think they're 246 00:13:48,160 --> 00:13:53,439 Speaker 1: going to stick some sort of, um, computing hardware 247 00:13:53,800 --> 00:14:00,679 Speaker 1: that taps into the Internet on everything. Everything? Okay. And 248 00:14:00,679 --> 00:14:04,439 Speaker 1: it doesn't necessarily have to be a smart thermostat, 249 00:14:04,600 --> 00:14:07,520 Speaker 1: like, they have devices now that you can tag to 250 00:14:07,960 --> 00:14:12,000 Speaker 1: everyday items to keep up with things. Yeah, or let's 251 00:14:12,040 --> 00:14:15,720 Speaker 1: say cameras. Let's say you have a security system 252 00:14:15,760 --> 00:14:18,520 Speaker 1: at your house that you can view from your smartphone 253 00:14:18,520 --> 00:14:21,440 Speaker 1: from anywhere in the world, and maybe it automatically calls 254 00:14:21,520 --> 00:14:26,120 Speaker 1: the police. That's the Internet of Things, right? You know. Yeah, 255 00:14:26,200 --> 00:14:28,160 Speaker 1: I mean, there's a lot of great applications for it. 256 00:14:28,200 --> 00:14:32,160 Speaker 1: And again, like, this is just... we're in the 257 00:14:32,280 --> 00:14:36,040 Speaker 1: nascent period of this, like the stuff that's like, wow, 258 00:14:36,120 --> 00:14:40,440 Speaker 1: holy cow, I have a smart doorbell, that's amazing, and 259 00:14:40,480 --> 00:14:44,120 Speaker 1: it's awesome and it works really well. But 260 00:14:44,480 --> 00:14:49,200 Speaker 1: you basically can't apply your imagination to predict 261 00:14:49,680 --> 00:14:52,320 Speaker 1: what's even gonna be fifteen years into the future as 262 00:14:52,320 --> 00:14:55,080 Speaker 1: far as the Internet of Things goes. Like, just the 263 00:14:55,080 --> 00:15:00,240 Speaker 1: change in, uh, how we deal and interact 264 00:15:00,320 --> 00:15:05,360 Speaker 1: with the Internet and our surroundings is... it's inestimable.
Yeah, 265 00:15:05,360 --> 00:15:07,800 Speaker 1: who wrote this one? Was it Strickland? No, this was 266 00:15:08,360 --> 00:15:11,640 Speaker 1: Bernadette Johnson. Well, she wrote a line in here that 267 00:15:11,640 --> 00:15:13,640 Speaker 1: I'm just gonna read, because it really kind of hit 268 00:15:13,680 --> 00:15:16,720 Speaker 1: home for me. She said, we've essentially given common physical 269 00:15:16,760 --> 00:15:22,280 Speaker 1: objects both computing power and senses. And that explains it 270 00:15:22,320 --> 00:15:24,280 Speaker 1: to a T. Yeah, you did a good job with this. 271 00:15:24,480 --> 00:15:27,080 Speaker 1: There's a lot of information. Like, you could do anything. 272 00:15:27,640 --> 00:15:29,960 Speaker 1: You can make anything smart that you wanted. You could 273 00:15:29,960 --> 00:15:35,200 Speaker 1: have a smart can opener, smart tube of lipstick, yeah, exactly, 274 00:15:35,520 --> 00:15:38,280 Speaker 1: or smart tube of toothpaste, that, you know, when you're 275 00:15:38,440 --> 00:15:41,600 Speaker 1: squeezing the end and... hey, how do you get all 276 00:15:41,600 --> 00:15:44,160 Speaker 1: the toothpaste out? What's your method? Oh, you know, they 277 00:15:44,200 --> 00:15:48,320 Speaker 1: make a little, um, remover, I can't remember what it's called. 278 00:15:48,480 --> 00:15:50,960 Speaker 1: You got a machine? No, well, yeah, I 279 00:15:51,000 --> 00:15:54,400 Speaker 1: have a smart toothpaste remover. It's like a little... it 280 00:15:54,440 --> 00:15:56,960 Speaker 1: looks like, remember those candy lips, the wax lips? 281 00:15:57,760 --> 00:16:00,600 Speaker 1: It looks like those, but it's in my pocket right now. Okay, 282 00:16:00,600 --> 00:16:03,720 Speaker 1: so pull it out. Um.
It has, like, 283 00:16:03,760 --> 00:16:07,000 Speaker 1: a slit in the lips. You put the tube of 284 00:16:07,080 --> 00:16:09,480 Speaker 1: toothpaste, like the end of it, in there, 285 00:16:09,560 --> 00:16:11,600 Speaker 1: and you just kind of tilt it at an angle a 286 00:16:11,600 --> 00:16:13,880 Speaker 1: little bit and put pressure, and you just slide it 287 00:16:13,920 --> 00:16:17,240 Speaker 1: along and it pushes it to the front. Yeah. 288 00:16:17,480 --> 00:16:20,440 Speaker 1: I use my toothbrush to do the same thing. How? 289 00:16:20,760 --> 00:16:24,320 Speaker 1: Oh, you slide the toothbrush across the toothpaste tube on 290 00:16:24,360 --> 00:16:27,720 Speaker 1: the sink and just, you know, use it as a, 291 00:16:27,800 --> 00:16:33,840 Speaker 1: uh, flattener. Squeezer. Squeezer, yeah. That's a good one. Interesting. Yeah, 292 00:16:33,880 --> 00:16:37,760 Speaker 1: I never realized you just hacked your toothbrush. What 293 00:16:37,840 --> 00:16:39,640 Speaker 1: do you, uh, what do you pay for something like 294 00:16:39,640 --> 00:16:42,880 Speaker 1: that you got? Like a dollar, something like that? Yeah, yeah, nothing, 295 00:16:43,000 --> 00:16:45,240 Speaker 1: nothing too much. Well, boy, I guess I got away 296 00:16:45,280 --> 00:16:49,920 Speaker 1: with it. See, now, if there was 297 00:16:49,960 --> 00:16:53,640 Speaker 1: some sort of computing chip on this thing that calculated 298 00:16:53,680 --> 00:16:56,160 Speaker 1: how much toothpaste was left and sent that information to 299 00:16:56,200 --> 00:17:00,480 Speaker 1: my app, that would be a smart toothpaste squeezer. That's it.
300 00:17:00,840 --> 00:17:02,640 Speaker 1: Or you come home and open the mailbox and there's 301 00:17:02,640 --> 00:17:04,920 Speaker 1: a new toothpaste, and you're like, I didn't even know 302 00:17:05,040 --> 00:17:07,760 Speaker 1: I was out, because I haven't been using my 303 00:17:07,840 --> 00:17:12,720 Speaker 1: lippy device, you know. Yeah, I forgot to brush my 304 00:17:12,760 --> 00:17:15,680 Speaker 1: teeth for three weeks. Let's take a break. Yeah, it's 305 00:17:15,680 --> 00:17:17,679 Speaker 1: getting a little silly. Yeah. And you go brush your 306 00:17:17,680 --> 00:17:20,480 Speaker 1: teeth and we'll meet back in here. And how long 307 00:17:20,520 --> 00:17:22,520 Speaker 1: does it take you to brush your teeth? Like, seven, 308 00:17:22,560 --> 00:17:24,760 Speaker 1: eight seconds? Great, we'll be back here in eight seconds. 309 00:17:40,800 --> 00:17:43,680 Speaker 1: By the way, seven or eight seconds is not nearly 310 00:17:43,800 --> 00:17:46,160 Speaker 1: enough time. I just know that's washing your hands; you're 311 00:17:46,160 --> 00:17:50,440 Speaker 1: supposed to do the alphabet. Either for washing your hands 312 00:17:51,480 --> 00:17:54,159 Speaker 1: or brushing your teeth, you know. 313 00:17:54,160 --> 00:17:56,119 Speaker 1: You're supposed to do, like, three minutes for brushing 314 00:17:56,160 --> 00:18:01,200 Speaker 1: your teeth. I've got the... we won't name-check here 315 00:18:01,600 --> 00:18:06,080 Speaker 1: and buzz-market, but I've got a mechanical toothbrush, an 316 00:18:06,160 --> 00:18:09,919 Speaker 1: electric toothbrush, and, you know, it 317 00:18:10,000 --> 00:18:13,080 Speaker 1: beeps and you divide your mouth up into four zones.
318 00:18:13,200 --> 00:18:14,800 Speaker 1: Oh yeah, I think I have the same: top left, 319 00:18:14,800 --> 00:18:16,639 Speaker 1: top right, bottom left, bottom right, and it just beeps 320 00:18:16,680 --> 00:18:21,920 Speaker 1: and, like, vibrates. Oh really? It's already vibrating, though. 321 00:18:22,000 --> 00:18:24,920 Speaker 1: How can you tell the difference? It changes its vibration. Really? Yeah. 322 00:18:25,040 --> 00:18:27,520 Speaker 1: How weird. I guess maybe it's a pause in the 323 00:18:27,640 --> 00:18:29,639 Speaker 1: vibration, now that I think about it. I just assumed it 324 00:18:29,640 --> 00:18:32,400 Speaker 1: was a vibration. Yeah, the same deal, though. Yeah. And 325 00:18:32,520 --> 00:18:35,320 Speaker 1: again, there are smart toothbrushes that can keep up with how 326 00:18:35,680 --> 00:18:37,800 Speaker 1: much you brush your teeth and that kind of stuff. 327 00:18:37,840 --> 00:18:42,680 Speaker 1: Are there really? Uh, there are WiFi-connected toothbrushes that connect 328 00:18:42,680 --> 00:18:45,800 Speaker 1: to an app. You know, my brother-in-law, the 329 00:18:46,800 --> 00:18:50,919 Speaker 1: Marine Corps general, um, he... I used to laugh at 330 00:18:50,960 --> 00:18:53,200 Speaker 1: him because I was in his bathroom once and opened 331 00:18:53,200 --> 00:18:56,200 Speaker 1: the drawer, and he had a log of his razor use, 332 00:18:56,520 --> 00:18:59,800 Speaker 1: a shaving log, like how many times he'd used 333 00:18:59,800 --> 00:19:04,240 Speaker 1: the razor. And it's basically a pre-smart, pre-Internet of Things 334 00:19:04,280 --> 00:19:06,960 Speaker 1: smart razor, because I'm sure they have those now, to 335 00:19:07,240 --> 00:19:09,720 Speaker 1: alert you, like, when you should change your razor blades. 336 00:19:10,880 --> 00:19:12,399 Speaker 1: I haven't heard of that one, but I wouldn't be 337 00:19:12,400 --> 00:19:14,440 Speaker 1: surprised if there was. I just thought it was very funny.
338 00:19:14,440 --> 00:19:16,080 Speaker 1: I mean, it said a lot about who he was. 339 00:19:16,840 --> 00:19:19,520 Speaker 1: Oh yeah, you know. And I'm not surprised that he's a 340 00:19:19,560 --> 00:19:21,600 Speaker 1: Marine Corps general if he's keeping up with stuff like that. 341 00:19:21,720 --> 00:19:24,120 Speaker 1: I can see him, like, um, I've met him, 342 00:19:24,119 --> 00:19:26,359 Speaker 1: he's a great guy, just sitting at the edge of 343 00:19:26,359 --> 00:19:29,880 Speaker 1: the bed right before bedtime, petting his cat fifty times, 344 00:19:29,880 --> 00:19:31,440 Speaker 1: no more, no less, and then putting it in the 345 00:19:31,440 --> 00:19:33,200 Speaker 1: foot locker at the end of the bed for the night, 346 00:19:33,240 --> 00:19:35,600 Speaker 1: and, like, tucking himself in. Yeah, he's the one that 347 00:19:35,640 --> 00:19:38,680 Speaker 1: turned me on to peeing sitting down, too. So, oh yeah, 348 00:19:38,760 --> 00:19:41,280 Speaker 1: I owe a great debt to him. Nice, hats off. Is he 349 00:19:41,400 --> 00:19:43,960 Speaker 1: coming to our DC show again? Uh, no, man, they're 350 00:19:44,000 --> 00:19:47,520 Speaker 1: transferring overseas for the first time ever. Are they 351 00:19:47,520 --> 00:19:52,639 Speaker 1: coming to our UK shows? No, not that overseas. I'm 352 00:19:52,640 --> 00:19:55,920 Speaker 1: not allowed to say where he's going. Okay, that's cool. 353 00:19:56,000 --> 00:19:58,359 Speaker 1: I'm with you, top secret, huh. I can tell you 354 00:19:58,400 --> 00:20:01,520 Speaker 1: off the air. That said, I see what you're saying, 355 00:20:01,520 --> 00:20:04,280 Speaker 1: it probably doesn't matter. I'm just respecting his privacy while 356 00:20:04,280 --> 00:20:06,160 Speaker 1: telling everyone that he taught me to pee sitting down. 357 00:20:07,600 --> 00:20:10,160 Speaker 1: You're really gung ho about starting that national convo, huh?
358 00:20:10,920 --> 00:20:13,359 Speaker 1: I just think it's important, you know, I need to 359 00:20:13,359 --> 00:20:16,600 Speaker 1: be talking about it. No mistakes, that's the motto. Or 360 00:20:16,600 --> 00:20:21,920 Speaker 1: the tagline: no drips, no runs, no errors. Nice. Um, 361 00:20:22,440 --> 00:20:23,960 Speaker 1: did we just take a break and we came back 362 00:20:24,000 --> 00:20:29,320 Speaker 1: with this garbage? Maybe we should start over again, Jerry. 363 00:20:29,400 --> 00:20:35,160 Speaker 1: All right, well, let's talk about the tech, how about that? Yeah, 364 00:20:35,400 --> 00:20:41,200 Speaker 1: telemetry, nothing new. No, apparently. And again, the whole, 365 00:20:41,640 --> 00:20:43,960 Speaker 1: the whole basis of the Internet of Things is what's 366 00:20:43,960 --> 00:20:48,359 Speaker 1: called machine-to-machine, um, communication. Right, you have, like, 367 00:20:48,400 --> 00:20:51,000 Speaker 1: your smart lipstick just sitting there sensing that it's the 368 00:20:51,080 --> 00:20:53,600 Speaker 1: last tube. It can sense all day long, and it's 369 00:20:53,600 --> 00:20:57,720 Speaker 1: still a dumb lipstick. It can't tell you, 370 00:20:57,920 --> 00:21:01,320 Speaker 1: right, unless it can communicate that data to the people 371 00:21:01,359 --> 00:21:03,679 Speaker 1: who need to know that stuff. And they do that 372 00:21:03,720 --> 00:21:06,960 Speaker 1: through machine-to-machine communication. And like you're saying, telemetry 373 00:21:07,080 --> 00:21:09,640 Speaker 1: was the original version of that, which apparently dates way back. 374 00:21:09,680 --> 00:21:14,160 Speaker 1: Yeah, it comes from the Greek, um, tele means 375 00:21:14,240 --> 00:21:20,240 Speaker 1: remote and, uh, metron means measure.
And that's where 376 00:21:20,480 --> 00:21:23,840 Speaker 1: you're basically in a remote area, you would measure something 377 00:21:24,440 --> 00:21:29,120 Speaker 1: and then send that, uh, via, back then, telephone line, right, 378 00:21:29,200 --> 00:21:33,800 Speaker 1: like an Arctic station, or something set up to watch animals, 379 00:21:33,840 --> 00:21:36,240 Speaker 1: like, deep in the jungle or something. It's like, there's 380 00:21:36,240 --> 00:21:38,080 Speaker 1: a wild beast, and off we go, there's a wild beast. 381 00:21:38,480 --> 00:21:42,760 Speaker 1: That was early telemetry, exactly. And that's essentially just an 382 00:21:42,800 --> 00:21:45,879 Speaker 1: extension of what we're doing now. Yeah, now it's an 383 00:21:45,880 --> 00:21:48,600 Speaker 1: extension of that. Well, we've built upon it. 384 00:21:48,680 --> 00:21:50,840 Speaker 1: I mean, think, like, the first dial-up stuff, that 385 00:21:50,880 --> 00:21:53,879 Speaker 1: was, I would guess, probably telemetry, you know, the series 386 00:21:53,920 --> 00:21:57,879 Speaker 1: of, like, tones and all that. I mean, you're sending 387 00:21:57,880 --> 00:22:01,120 Speaker 1: signals from one machine to another saying, let me online. 388 00:22:01,280 --> 00:22:04,880 Speaker 1: Yeah, it's your problem. Why are you so slow? Uh, 389 00:22:04,880 --> 00:22:08,200 Speaker 1: and what's allowed the Internet of Things to take root? Um, 390 00:22:08,320 --> 00:22:12,320 Speaker 1: very simply, it gets more complicated, but: the invention of 391 00:22:12,359 --> 00:22:16,879 Speaker 1: the World Wide Web by Mr. Tim Berners-Lee. Oof, man. 392 00:22:17,280 --> 00:22:23,919 Speaker 1: Uh, then the ubiquitousness, is that a word? Ubiquity? Ubiquitousness, 393 00:22:24,080 --> 00:22:29,040 Speaker 1: ubiquitousness of WiFi. Although, yeah, I think ubiquity is, yeah, 394 00:22:29,280 --> 00:22:34,200 Speaker 1: is that right?
Yeah. WiFi, I do the Porky Pig 395 00:22:34,200 --> 00:22:39,280 Speaker 1: thing where I just skip it. Um, the widespread 396 00:22:39,400 --> 00:22:42,280 Speaker 1: nature of WiFi all of a sudden, where, um, you 397 00:22:42,320 --> 00:22:44,600 Speaker 1: don't have to be physically connected to something, that really 398 00:22:44,600 --> 00:22:49,080 Speaker 1: advanced things. Speed. And then, like we already said, the cloud. Yeah, 399 00:22:49,160 --> 00:22:51,080 Speaker 1: I think the cloud is the thing that really kind 400 00:22:51,119 --> 00:22:54,400 Speaker 1: of allows it more than anything else. If 401 00:22:54,440 --> 00:22:58,040 Speaker 1: you had to have that kind of computing power right 402 00:22:58,080 --> 00:23:01,080 Speaker 1: there in the sensor, then it would be very limiting. 403 00:23:01,240 --> 00:23:03,440 Speaker 1: You couldn't put it on just anything, and it would 404 00:23:03,480 --> 00:23:05,840 Speaker 1: be a lot more expensive. The things that they're 405 00:23:05,840 --> 00:23:10,239 Speaker 1: adding to, you know, normal inanimate objects to make 406 00:23:10,280 --> 00:23:15,400 Speaker 1: them smart are very cheap to produce. They just need 407 00:23:15,440 --> 00:23:19,600 Speaker 1: a few components. They need computing hardware, they need sensors, 408 00:23:19,920 --> 00:23:22,760 Speaker 1: they need communication hardware, and then they need some sort 409 00:23:22,760 --> 00:23:24,960 Speaker 1: of power source, which you can get from the 410 00:23:25,000 --> 00:23:28,040 Speaker 1: machine itself that you plug in, right. So, like, if 411 00:23:28,040 --> 00:23:30,560 Speaker 1: it's a smart coffee maker, it can draw power from 412 00:23:30,600 --> 00:23:33,520 Speaker 1: the plug that the coffee maker runs off of.
413 00:23:34,040 --> 00:23:36,560 Speaker 1: And then it needs Internet access, which, if you have 414 00:23:36,600 --> 00:23:38,679 Speaker 1: a smart coffee maker but you don't have internet at 415 00:23:38,720 --> 00:23:41,719 Speaker 1: your house, you made a poor decision in your coffee 416 00:23:41,720 --> 00:23:45,000 Speaker 1: maker purchase. Like, pretty much everything comes with internet access 417 00:23:45,000 --> 00:23:48,920 Speaker 1: at this point, right, in the Western world. Sure. Uh, 418 00:23:48,960 --> 00:23:50,879 Speaker 1: the other thing we've kind of been talking about is, 419 00:23:51,040 --> 00:23:54,719 Speaker 1: um, your own devices in your home. But 420 00:23:55,400 --> 00:23:57,200 Speaker 1: you don't have to be just hooked up to things 421 00:23:57,240 --> 00:23:59,439 Speaker 1: that you own. You can hook up to other, like, 422 00:23:59,640 --> 00:24:04,639 Speaker 1: city devices. Like, let's say your town has, um, devices 423 00:24:04,680 --> 00:24:08,119 Speaker 1: that monitor traffic conditions. You can tap into that. I 424 00:24:08,119 --> 00:24:10,560 Speaker 1: guess that's what Waze is, right? Or is that all 425 00:24:10,600 --> 00:24:13,159 Speaker 1: self-reported? I looked it up and, um, it seems to 426 00:24:13,160 --> 00:24:16,280 Speaker 1: be all self-reported. But there's something called, like, Waze 427 00:24:16,400 --> 00:24:19,560 Speaker 1: Citizen or something like that, and it appears to 428 00:24:19,600 --> 00:24:22,440 Speaker 1: be Waze trying to get smart cities to let them 429 00:24:22,440 --> 00:24:26,080 Speaker 1: tap into their information, like traffic cams and stuff like that.
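[Editor's note: the handful of parts listed above, computing hardware, sensors, communication hardware, power, and an internet connection, can be sketched as a toy model. Everything here, the class name, the pretend coffee maker, the 92-degree reading, is hypothetical, just illustrating how little a smart gadget actually has to do on-device.]

```python
import json
import time

class SmartSensor:
    """Toy model of an IoT device: sensing + a little compute + comms.

    Power and the physical radio are outside the sketch; the sensing
    hardware and the network link are modeled as plain callables.
    """

    def __init__(self, device_id, read_sensor, send):
        self.device_id = device_id      # identity on the network
        self.read_sensor = read_sensor  # sensor hardware, as a callable
        self.send = send                # communication hardware, as a callable

    def report(self):
        # The tiny bit of onboard compute: package and timestamp a reading.
        reading = {"device": self.device_id,
                   "value": self.read_sensor(),
                   "ts": time.time()}
        # Ship it off as JSON; the cloud does the heavy lifting.
        self.send(json.dumps(reading))
        return reading

# Usage: a pretend smart coffee maker that always reads 92 degrees C.
messages = []
maker = SmartSensor("coffee-maker-1", lambda: 92, messages.append)
maker.report()
print(messages[0])
```

The point of the shape is that the expensive part (storage, analysis) lives elsewhere, which is why the add-on hardware can stay cheap.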
Yeah, 430 00:24:26,320 --> 00:24:28,840 Speaker 1: and apparently that's already a thing. Like, if you just 431 00:24:28,920 --> 00:24:31,720 Speaker 1: leave your phone open or, like, the Bluetooth on, if 432 00:24:31,720 --> 00:24:35,440 Speaker 1: you're driving through a smart city with, um, traffic sensors, 433 00:24:35,760 --> 00:24:39,560 Speaker 1: it basically uses your phone's information while you're in traffic 434 00:24:39,880 --> 00:24:42,919 Speaker 1: as real-time traffic information, because your phone has something 435 00:24:43,000 --> 00:24:46,000 Speaker 1: like an accelerometer in it, so it knows how 436 00:24:46,040 --> 00:24:49,320 Speaker 1: fast you're moving at any given point. And if it's 437 00:24:49,800 --> 00:24:52,960 Speaker 1: giving that information to just a panel on the side 438 00:24:52,960 --> 00:24:55,160 Speaker 1: of the street, that panel can put all that info 439 00:24:55,240 --> 00:24:58,359 Speaker 1: together and be like, oh, Peachtree's, like, super backed 440 00:24:58,440 --> 00:25:01,080 Speaker 1: up right now. And then if Waze can get their 441 00:25:01,080 --> 00:25:03,919 Speaker 1: hands on that information, they can send it out to 442 00:25:04,000 --> 00:25:06,719 Speaker 1: their users. But for right now, Waze, as far as 443 00:25:06,760 --> 00:25:10,040 Speaker 1: I know, is a social app. 444 00:25:10,680 --> 00:25:14,720 Speaker 1: It relies on its users to update conditions. Which, by 445 00:25:14,720 --> 00:25:17,480 Speaker 1: the way, Waze, I think, might be the best app 446 00:25:17,520 --> 00:25:20,080 Speaker 1: of the twenty-first century so far. I didn't start 447 00:25:20,200 --> 00:25:22,000 Speaker 1: using it, I had it on my phone for a while 448 00:25:22,040 --> 00:25:27,400 Speaker 1: but I never used it, but I have a little bit recently. Um, 449 00:25:27,840 --> 00:25:29,920 Speaker 1: I don't interact with it much.
I'll 450 00:25:29,960 --> 00:25:32,040 Speaker 1: just set it to tell me where to go, like, 451 00:25:32,080 --> 00:25:35,120 Speaker 1: I don't report accidents and things. Does that mean I'm 452 00:25:35,119 --> 00:25:38,840 Speaker 1: a bad user? I mean, you're using, like, 453 00:25:38,880 --> 00:25:42,119 Speaker 1: the efforts of other people without contributing. I mean, the 454 00:25:42,160 --> 00:25:44,280 Speaker 1: point is for everybody to contribute, it makes it 455 00:25:44,359 --> 00:25:46,480 Speaker 1: more robust. But it's not like they're going to show 456 00:25:46,560 --> 00:25:50,560 Speaker 1: up at your house. Do you pull over on the highway? 457 00:25:50,600 --> 00:25:55,480 Speaker 1: That's probably the one big thing about Waze, 458 00:25:55,560 --> 00:25:57,480 Speaker 1: is that, like, you're not supposed to be using it 459 00:25:57,560 --> 00:25:59,920 Speaker 1: in that situation. If you're the driver, you're supposed to 460 00:25:59,920 --> 00:26:02,040 Speaker 1: be the passenger. But they're also kind of telling you 461 00:26:02,080 --> 00:26:04,240 Speaker 1: to, you know. Well, I mean, there's a thing, 462 00:26:04,440 --> 00:26:07,200 Speaker 1: when it comes up or when you try to start it, 463 00:26:07,280 --> 00:26:10,719 Speaker 1: it'll say, like, are you a passenger? And they just 464 00:26:10,760 --> 00:26:13,800 Speaker 1: assume that you're going to be truthful about that. Yeah. 465 00:26:13,840 --> 00:26:15,640 Speaker 1: That's like the websites that say, like, tell us you're 466 00:26:15,640 --> 00:26:18,480 Speaker 1: twenty-one by clicking here, exactly, and then welcome to 467 00:26:18,520 --> 00:26:21,280 Speaker 1: the party. Yeah, and the twenty-year-old's like, aw, shoot, 468 00:26:21,119 --> 00:26:24,080 Speaker 1: I can't click on this for another two weeks. I'll be 469 00:26:24,080 --> 00:26:28,680 Speaker 1: back in two weeks. Um.
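[Editor's note: the crowdsourcing described above, phones in traffic reporting their speed, a roadside panel or an app averaging the reports per road segment, boils down to a grouping-and-averaging step. This sketch is purely illustrative: the report format is invented, and real systems anonymize, filter outliers, and weight samples far more carefully.]

```python
from collections import defaultdict
from statistics import mean

def segment_speeds(reports):
    """Average the speeds that phones report for each road segment.

    `reports` is a list of (segment, mph) pairs -- a made-up shape for
    the data, standing in for anonymized per-vehicle speed samples.
    """
    by_segment = defaultdict(list)
    for segment, mph in reports:
        by_segment[segment].append(mph)
    return {seg: mean(speeds) for seg, speeds in by_segment.items()}

# Three phones crawling down Peachtree, one cruising on the connector.
reports = [("peachtree", 4), ("peachtree", 6), ("peachtree", 5),
           ("connector", 55)]
print(segment_speeds(reports))  # Peachtree averages 5 mph: super backed up
```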
The other thing I was 470 00:26:28,720 --> 00:26:31,400 Speaker 1: wondering, too, like, what if, the lipstick as an example, 471 00:26:32,000 --> 00:26:36,520 Speaker 1: what if that's open, uh, to where you can 472 00:26:36,560 --> 00:26:38,159 Speaker 1: look at your app and say, like, well, no, this 473 00:26:38,200 --> 00:26:41,000 Speaker 1: store is out of lipstick. That'd be cool, you know. Yeah, 474 00:26:41,160 --> 00:26:42,480 Speaker 1: instead of having to call and talk to a 475 00:26:42,560 --> 00:26:45,040 Speaker 1: dumb person. I know, it's awful to wait for them to 476 00:26:45,040 --> 00:26:47,600 Speaker 1: go look with their eyes. Well, but you know, I 477 00:26:47,840 --> 00:26:49,399 Speaker 1: get it, though, because nowadays you call and 478 00:26:49,440 --> 00:26:50,760 Speaker 1: say, hey, I want to check and see if you 479 00:26:50,760 --> 00:26:54,040 Speaker 1: have something in stock, and you're usually met with, all right, 480 00:26:54,920 --> 00:26:58,119 Speaker 1: hold on, and not, like, sure, I'd be happy to 481 00:26:58,119 --> 00:27:02,320 Speaker 1: go check that for you, sir. They're like, oh man, I can't 482 00:27:02,359 --> 00:27:05,199 Speaker 1: wait to get outsourced to a robot, right. But then 483 00:27:05,240 --> 00:27:07,040 Speaker 1: you get to the store and they don't have it 484 00:27:07,040 --> 00:27:08,720 Speaker 1: in stock, and you're like, I called and asked someone. 485 00:27:08,760 --> 00:27:12,520 Speaker 1: They're like, who'd you talk to? There's no one by 486 00:27:12,600 --> 00:27:14,880 Speaker 1: that name that works here. Sorry. And he's, like, 487 00:27:14,960 --> 00:27:18,840 Speaker 1: covering his name tag with his hand. Um. And, Chuck.
488 00:27:19,960 --> 00:27:22,960 Speaker 1: Speaking of smart cities, you know, traffic info is a 489 00:27:22,960 --> 00:27:28,240 Speaker 1: big one. So are, um, smart traffic lights, which I 490 00:27:28,240 --> 00:27:32,280 Speaker 1: wish had been around starting when I was sixteen, because 491 00:27:32,359 --> 00:27:34,320 Speaker 1: there are few things to me that 492 00:27:34,359 --> 00:27:36,359 Speaker 1: are more of a waste of time than sitting at 493 00:27:36,440 --> 00:27:39,520 Speaker 1: a traffic light when there's no traffic going through. Decatur, 494 00:27:39,640 --> 00:27:42,119 Speaker 1: where I lived for a while, is famous for its lights 495 00:27:42,119 --> 00:27:45,400 Speaker 1: not being timed or tripped or whatever, and you're 496 00:27:45,560 --> 00:27:48,720 Speaker 1: there forever. It's horrible. Yeah, Decatur does have really long 497 00:27:48,800 --> 00:27:51,880 Speaker 1: lights, and yeah, they don't even have the, we found 498 00:27:51,920 --> 00:27:54,280 Speaker 1: out it's either a metal detector or a weight sensor, 499 00:27:54,920 --> 00:27:57,520 Speaker 1: where there's, like, the lines where they obviously 500 00:27:57,520 --> 00:28:01,680 Speaker 1: cut out the, um, hardtop in front 501 00:28:01,680 --> 00:28:04,760 Speaker 1: of you in front of a light. If you don't 502 00:28:04,760 --> 00:28:08,480 Speaker 1: even have that, that's a problem. But even those ones 503 00:28:08,520 --> 00:28:11,240 Speaker 1: that have sensors don't always, like, trip immediately. 504 00:28:11,280 --> 00:28:13,800 Speaker 1: It should be, like, it should be a lot smarter than that. 505 00:28:13,840 --> 00:28:16,000 Speaker 1: And that's part of what they ought to re-engineer. Why does that 506 00:28:16,040 --> 00:28:19,000 Speaker 1: guy have to stop, right? You know Decatur's motto? When 507 00:28:19,000 --> 00:28:22,880 Speaker 1: you drive in, it says, Decatur: what's your hurry? Really?
No, 508 00:28:23,320 --> 00:28:26,440 Speaker 1: I can kind of see it, actually. Slow down, Decatur. 509 00:28:26,560 --> 00:28:29,440 Speaker 1: What's with all the baby strollers, would be another one. 510 00:28:30,119 --> 00:28:34,360 Speaker 1: Off-road baby strollers. Why do they have these big off-road, 511 00:28:34,400 --> 00:28:37,320 Speaker 1: like, jogging baby strollers with the huge tires? You've 512 00:28:37,359 --> 00:28:41,320 Speaker 1: got one of those. Yeah, but that's, well, no, I 513 00:28:41,320 --> 00:28:43,040 Speaker 1: mean, I don't really go off-road, but that's 514 00:28:43,040 --> 00:28:47,520 Speaker 1: because the sidewalks in my neighborhood are awful. They suck. 515 00:28:47,600 --> 00:28:49,440 Speaker 1: Yeah, it's off-road. Might as well be, like, tree 516 00:28:49,480 --> 00:28:52,160 Speaker 1: roots growing up everywhere, that kind of thing. Yeah, they're like, oh, 517 00:28:52,200 --> 00:28:54,120 Speaker 1: this tree's never gonna grow, let's put a sidewalk 518 00:28:54,200 --> 00:28:58,960 Speaker 1: right next to it. Um, well, you're talking about smart cities. 519 00:28:59,000 --> 00:29:01,840 Speaker 1: The other cool part, actually, that they could do, 520 00:29:01,920 --> 00:29:05,680 Speaker 1: they might actually be doing this, is infrastructure, like embedding 521 00:29:05,960 --> 00:29:10,160 Speaker 1: sensors into sidewalks. Well, that's a good example.
Like a 522 00:29:10,200 --> 00:29:14,000 Speaker 1: sidewalk that becomes cracked or broken, or a bridge that becomes 523 00:29:14,160 --> 00:29:16,880 Speaker 1: weak at one point, it can send a signal and say, hey, 524 00:29:17,320 --> 00:29:20,000 Speaker 1: maybe you should come check out this bridge. And then 525 00:29:20,000 --> 00:29:24,520 Speaker 1: eventually they will send a signal to the robot sidewalk crew, 526 00:29:25,200 --> 00:29:27,760 Speaker 1: who will come out and repair the sidewalk, and everything 527 00:29:27,800 --> 00:29:30,720 Speaker 1: will look perfect all the time thanks to the robots. Right, 528 00:29:30,840 --> 00:29:33,680 Speaker 1: but there's forty robots and, like, thirty of them are 529 00:29:33,720 --> 00:29:38,360 Speaker 1: just standing around, not working. And where did they learn to 530 00:29:38,400 --> 00:29:42,880 Speaker 1: smoke cigarettes? It always seemed weird, I got so rubbed the wrong way 531 00:29:42,880 --> 00:29:45,840 Speaker 1: by cigarette breaks, because I didn't smoke. Yeah, and I was 532 00:29:45,880 --> 00:29:48,520 Speaker 1: always like, you know, I'm just gonna go stand outside, 533 00:29:48,520 --> 00:29:51,400 Speaker 1: and the boss would be like, you can't do that. Yeah, 534 00:29:51,440 --> 00:29:53,760 Speaker 1: I mean, you could take a break, but cigarette breaks 535 00:29:53,760 --> 00:29:55,680 Speaker 1: weren't even real breaks. Someone's just like, I need a smoke, 536 00:29:55,960 --> 00:29:57,520 Speaker 1: but you can never go, you know, I'm just gonna 537 00:29:57,520 --> 00:30:00,200 Speaker 1: go stand outside for five minutes. Like, you have to 538 00:30:00,200 --> 00:30:03,200 Speaker 1: be killing yourself to make that allowed. Yeah. No, 539 00:30:03,400 --> 00:30:05,440 Speaker 1: that dawned on me when I was a smoker, too. Like, 540 00:30:05,720 --> 00:30:08,160 Speaker 1: that's when I was young, and like, uh, you know, 541 00:30:08,280 --> 00:30:12,920 Speaker 1: I was more angry back then about injustice. Yeah.
So again, 542 00:30:13,000 --> 00:30:15,840 Speaker 1: we can sit here basically all day and talk about, 543 00:30:16,000 --> 00:30:18,800 Speaker 1: you know, devices and applications for this kind of thing. 544 00:30:19,240 --> 00:30:22,000 Speaker 1: But there are some hurdles coming up that need 545 00:30:22,040 --> 00:30:25,840 Speaker 1: to be addressed, um, pretty soon, and we'll talk about 546 00:30:25,880 --> 00:30:45,120 Speaker 1: those right after this. All right, Chuckers, we're back. Hurdles. So 547 00:30:45,240 --> 00:30:49,760 Speaker 1: right now, there are some immediate hurdles, including the idea that 548 00:30:50,240 --> 00:30:56,560 Speaker 1: a lot of smart technology operates using totally different languages, 549 00:30:56,640 --> 00:31:00,320 Speaker 1: different protocols, different everything, um, so that if you have 550 00:31:00,360 --> 00:31:03,400 Speaker 1: a house full of different smart gadgets, you probably have 551 00:31:03,480 --> 00:31:05,960 Speaker 1: an app for every single one of them, rather than 552 00:31:06,000 --> 00:31:10,120 Speaker 1: one integrated app. And that's, it's 553 00:31:10,120 --> 00:31:12,480 Speaker 1: not a hurdle, like, you can have that many apps, 554 00:31:12,880 --> 00:31:16,080 Speaker 1: but the idea of it being seamlessly integrated into just 555 00:31:16,360 --> 00:31:19,800 Speaker 1: one part of your phone would be great. And if 556 00:31:19,840 --> 00:31:21,960 Speaker 1: they could talk to one another without you having to 557 00:31:22,040 --> 00:31:28,200 Speaker 1: control it. You know, like, um, the light 558 00:31:28,280 --> 00:31:32,800 Speaker 1: sensor on your light shade notices that the sun's starting 559 00:31:32,840 --> 00:31:35,000 Speaker 1: to go down, so it opens your blinds a 560 00:31:35,000 --> 00:31:38,680 Speaker 1: little bit, right. And when that happens, your smart
561 00:31:38,720 --> 00:31:41,160 Speaker 1: cantaloupe slicer knows that you like a slice of 562 00:31:41,200 --> 00:31:44,120 Speaker 1: cantaloupe before dinner, so it slices up the cantaloupe. 563 00:31:44,120 --> 00:31:46,520 Speaker 1: And they're all talking to one another. So it's not 564 00:31:46,600 --> 00:31:48,880 Speaker 1: like everything's on a timer and things happen at once. 565 00:31:49,160 --> 00:31:52,280 Speaker 1: It's happening because one thing is sensing this and it's 566 00:31:52,320 --> 00:31:54,720 Speaker 1: relaying that information to the other devices in your house 567 00:31:54,800 --> 00:31:58,280 Speaker 1: as well. That's not happening right now. Yeah, what 568 00:31:58,360 --> 00:32:04,200 Speaker 1: I need is, I need, between seven and nine a.m., 569 00:32:04,280 --> 00:32:09,080 Speaker 1: I need my toilet to flush about every eight minutes. Yeah, 570 00:32:09,600 --> 00:32:13,720 Speaker 1: man alive. That would be my smart house. Or as 571 00:32:13,800 --> 00:32:17,040 Speaker 1: soon as the coffee starts brewing, seven minutes later, the 572 00:32:17,040 --> 00:32:20,360 Speaker 1: toilet flushes. That happens. See, coffee is good for that. 573 00:32:20,440 --> 00:32:22,440 Speaker 1: It's great for that. But that's another thing that's coming 574 00:32:22,520 --> 00:32:24,560 Speaker 1: very soon, too: smart toilets. I can tell you, like, 575 00:32:25,480 --> 00:32:27,760 Speaker 1: you've got a lot of bilirubin in here. Yeah, 576 00:32:27,880 --> 00:32:31,120 Speaker 1: what's up with that? It'll say, what's up with that? 577 00:32:32,160 --> 00:32:34,640 Speaker 1: Uh, so basically, what you're talking about is systems that 578 00:32:34,640 --> 00:32:36,680 Speaker 1: aren't integrated, because it's a bunch of different companies with 579 00:32:36,680 --> 00:32:40,600 Speaker 1: all their own devices.
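[Editor's note: the chain described above, one device's reading triggering other devices' actions with no central timer, is essentially publish/subscribe, the pattern most smart-home hubs are built on. This is a minimal sketch; the bus, the topic name, and the devices (including the sadly fictional cantaloupe slicer) are all invented for illustration.]

```python
class HomeBus:
    """Tiny publish/subscribe hub: devices react to events, not schedules."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Fan the event out to every device listening on this topic.
        for callback in self.subscribers.get(topic, []):
            callback(payload)

log = []
bus = HomeBus()

# The light sensor will publish "sun_setting"; these devices react to it.
bus.subscribe("sun_setting", lambda event: log.append("blinds: opening a little"))
bus.subscribe("sun_setting", lambda event: log.append("slicer: slicing cantaloupe"))

# The sensor notices the light dropping and publishes one event...
bus.publish("sun_setting", {"lux": 120})
print(log)  # ...and both devices acted on it, no timers involved
```

The design point: the sensor never needs to know which devices exist, which is exactly why a shared protocol (rather than one app per vendor) matters.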
But there are companies trying to 580 00:32:40,640 --> 00:32:46,720 Speaker 1: come together to, um, join up with open-source platforms, uh, 581 00:32:46,800 --> 00:32:49,400 Speaker 1: and one of them, created by Qualcomm, is called the 582 00:32:49,480 --> 00:32:54,920 Speaker 1: AllSeen Alliance, which, like, it sounds like something 583 00:32:54,960 --> 00:32:58,760 Speaker 1: from a future horror movie. Sounds really creepy, the 584 00:32:58,800 --> 00:33:02,200 Speaker 1: AllSeen Alliance. Or Newspeak. Like, you might 585 00:33:02,200 --> 00:33:04,160 Speaker 1: as well just say, we want 586 00:33:04,160 --> 00:33:07,160 Speaker 1: a camera in every room of your home so we 587 00:33:07,200 --> 00:33:08,600 Speaker 1: can all just talk to each other and make your 588 00:33:08,640 --> 00:33:15,120 Speaker 1: life simpler. Just relax, lay back. Apple's HomeKit, they always 589 00:33:15,320 --> 00:33:18,160 Speaker 1: make it sound cute and not creepy, and it's probably 590 00:33:18,200 --> 00:33:21,400 Speaker 1: creepier than the AllSeen Alliance. Yeah, a bunch 591 00:33:21,400 --> 00:33:24,280 Speaker 1: of people have one. Google, ARM, Samsung. There's one 592 00:33:24,320 --> 00:33:27,000 Speaker 1: called Evrythng, that's missing a couple of vowels. Wink is 593 00:33:27,040 --> 00:33:29,840 Speaker 1: a big one, um, it's a big one right now. 594 00:33:30,400 --> 00:33:32,800 Speaker 1: It controls some stuff, like, I think, Philips lights, and 595 00:33:32,840 --> 00:33:35,240 Speaker 1: it works with Nest, maybe, or something like that. It 596 00:33:35,280 --> 00:33:37,720 Speaker 1: does, like, two things, so it's, like, cutting-edge right now.
597 00:33:38,080 --> 00:33:41,200 Speaker 1: But, um, as the author of this article, Bernadette Johnson, 598 00:33:41,240 --> 00:33:44,640 Speaker 1: points out, none are all-encompassing. Which, I saw that 599 00:33:44,680 --> 00:33:46,800 Speaker 1: and I was like, Mitch Hedberg would have liked that sentence: 600 00:33:47,400 --> 00:33:49,840 Speaker 1: none are all-encompassing. Yeah, he said he was, like, 601 00:33:49,880 --> 00:33:53,320 Speaker 1: trying new words, and, um, rather than, like, totally, he 602 00:33:53,360 --> 00:33:56,000 Speaker 1: said he was saying totally too much. They'd be, like, 603 00:33:56,400 --> 00:33:59,560 Speaker 1: Mitch, do you like s'mores? And he'd be, like, all-encompassingly. 604 00:34:02,080 --> 00:34:05,000 Speaker 1: So another hurdle, um, that we are already getting around: 605 00:34:05,240 --> 00:34:09,040 Speaker 1: back in the nineties, we started to realize 606 00:34:09,040 --> 00:34:12,640 Speaker 1: we were running out of IP addresses. The standard IP 607 00:34:12,800 --> 00:34:17,320 Speaker 1: address was the, uh, well, it still is in some ways, 608 00:34:17,400 --> 00:34:21,480 Speaker 1: the IPv4. Yes, IPv4. 609 00:34:21,920 --> 00:34:24,080 Speaker 1: And in the nineties they got smart. It wasn't like 610 00:34:24,200 --> 00:34:27,360 Speaker 1: the Y2K bug, where we're like, oh my gosh, 611 00:34:27,520 --> 00:34:30,520 Speaker 1: things are going to be different in a month. Uh,
612 00:34:30,560 --> 00:34:32,839 Speaker 1: they got on this a while ago and created 613 00:34:33,000 --> 00:34:40,839 Speaker 1: IPv6, and, uh, basically created, uh, 614 00:34:41,040 --> 00:34:46,279 Speaker 1: potentially, what's the number? An undecillion number, three 615 00:34:46,800 --> 00:34:52,520 Speaker 1: hundred and forty undecillion addresses. Undecillion, that is one with thirty-six zeros 616 00:34:52,600 --> 00:34:57,239 Speaker 1: behind it, and enough to give, uh, IP addresses to 617 00:34:57,440 --> 00:35:00,120 Speaker 1: everyone on the planet times ten to the 618 00:35:00,200 --> 00:35:03,879 Speaker 1: twenty-eighth power, right. So basically they said, we don't want 619 00:35:03,880 --> 00:35:06,600 Speaker 1: to run out ever again. Well, the funny thing is, 620 00:35:06,719 --> 00:35:08,920 Speaker 1: Chuck, is when they came up with 621 00:35:09,000 --> 00:35:11,920 Speaker 1: IPv4, that came with four point 622 00:35:11,960 --> 00:35:16,440 Speaker 1: two nine five billion possible addresses. Yeah, they're like, that's 623 00:35:16,760 --> 00:35:20,440 Speaker 1: all we'll ever need. And then what, within, like, thirty, 624 00:35:20,880 --> 00:35:24,719 Speaker 1: thirty-five years, they started to really run out. And apparently 625 00:35:24,760 --> 00:35:26,680 Speaker 1: there was a prediction that in two thousand fifteen 626 00:35:26,719 --> 00:35:28,680 Speaker 1: we were gonna straight up run out of 627 00:35:28,880 --> 00:35:33,240 Speaker 1: IPv4 Internet addresses, or IP addresses, um. And apparently 628 00:35:33,280 --> 00:35:36,360 Speaker 1: that was a cliff we avoided, obviously, because 629 00:35:36,480 --> 00:35:39,600 Speaker 1: we're still making things that have their own IP addresses.
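[Editor's note: the numbers discussed above check out, and they come from address width. IPv4 addresses are 32 bits, IPv6 addresses are 128 bits, so the totals are just powers of two:]

```python
ipv4 = 2 ** 32   # 4,294,967,296 -- the "4.295 billion" figure
ipv6 = 2 ** 128  # about 3.4 x 10**38, i.e. roughly 340 undecillion
                 # (an undecillion is 1 followed by 36 zeros)

print(f"IPv4 addresses: {ipv4:,}")
print(f"IPv6 addresses: {ipv6:,}")

# Split among roughly 7.5 billion people, that's on the order of
# 10**28 addresses each -- "ten to the twenty-eighth power".
per_person = ipv6 / 7.5e9
print(f"Per person: {per_person:.1e}")
```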
630 00:35:39,960 --> 00:35:42,239 Speaker 1: I thought they did run out. No, they used 631 00:35:42,400 --> 00:35:45,720 Speaker 1: different things to mitigate it, including this network address 632 00:35:45,760 --> 00:35:49,440 Speaker 1: translation, which really kind of opened things up, and that's where 633 00:35:49,480 --> 00:35:53,720 Speaker 1: a router identifies a whole network as a single IP address 634 00:35:53,960 --> 00:35:57,000 Speaker 1: and then leaves it to the local network to decide 635 00:35:57,320 --> 00:35:59,719 Speaker 1: where the information is supposed to go once 636 00:35:59,760 --> 00:36:02,480 Speaker 1: it's inside the network. See what I'm saying? 637 00:36:02,560 --> 00:36:04,480 Speaker 1: But to the server, to the rest of the Internet, 638 00:36:04,520 --> 00:36:07,560 Speaker 1: that whole network, which can be a ton of computers, 639 00:36:08,080 --> 00:36:11,600 Speaker 1: um, is just one IP address. So you just reduced 640 00:36:11,680 --> 00:36:14,320 Speaker 1: it by that many computers that are on that local network. 641 00:36:15,000 --> 00:36:17,919 Speaker 1: That was a big one. But then also building, um, 642 00:36:18,120 --> 00:36:21,399 Speaker 1: new things on the IPv6 platform has 643 00:36:21,440 --> 00:36:23,520 Speaker 1: helped mitigate it a little bit, too. So I think 644 00:36:23,560 --> 00:36:25,920 Speaker 1: it's a cliff that we came very close to but 645 00:36:25,960 --> 00:36:28,920 Speaker 1: avoided going over. Well, it doesn't matter now, because 646 00:36:29,040 --> 00:36:31,520 Speaker 1: IPv6 is the new way forward.
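[Editor's note: the network address translation described above, a whole local network hiding behind one public address, comes down to the router keeping a port-mapping table. A toy sketch, with every address and port number made up (the public IP is from the documentation range), and ignoring the timeouts and protocol details a real NAT handles:]

```python
class NatRouter:
    """Toy NAT: many private hosts share one public IP via a port map."""

    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.table = {}         # public port -> (private ip, private port)
        self.next_port = 40000  # arbitrary starting port for mappings

    def outbound(self, private_ip, private_port):
        # Rewrite the source: the wider internet only ever sees the
        # public IP plus a port the router picked.
        public_port = self.next_port
        self.next_port += 1
        self.table[public_port] = (private_ip, private_port)
        return (self.public_ip, public_port)

    def inbound(self, public_port):
        # Replies arrive at the public IP; the table says which
        # machine inside the local network they actually belong to.
        return self.table[public_port]

nat = NatRouter("203.0.113.7")
seen_by_internet = nat.outbound("192.168.1.20", 5000)  # laptop's request
print(seen_by_internet)          # the internet sees only the public address
print(nat.inbound(seen_by_internet[1]))  # the reply is routed back inside
```

So to the rest of the internet, any number of computers on the local network consume exactly one public IPv4 address, which is the savings the discussion describes.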
It is, 647 00:36:31,520 --> 00:36:34,480 Speaker 1: but there's a lot of stuff still in use that's 648 00:36:34,520 --> 00:36:37,680 Speaker 1: plenty good for the next couple of years that has, 649 00:36:37,840 --> 00:36:40,880 Speaker 1: like, IPv4 650 00:36:40,960 --> 00:36:44,400 Speaker 1: addresses, that still needs to, like, they're like, me too. Well, 651 00:36:44,480 --> 00:36:47,359 Speaker 1: that doesn't matter, they're compatible now. Well, they're 652 00:36:47,360 --> 00:36:49,520 Speaker 1: working to make them compatible. I think they're already a 653 00:36:49,560 --> 00:36:54,040 Speaker 1: long way down that road. They're using both seamlessly, pretty much. Um, 654 00:36:54,480 --> 00:36:57,080 Speaker 1: there's a great Wired, well, I think so, there's 655 00:36:57,120 --> 00:36:59,960 Speaker 1: a great Wired article about it. Uh, and they basically say, 656 00:37:00,000 --> 00:37:04,759 Speaker 1: at first they weren't entirely compatible, uh, you had to 657 00:37:04,800 --> 00:37:06,760 Speaker 1: have some sort of layer in between to make them 658 00:37:06,800 --> 00:37:10,840 Speaker 1: basically be friends. And, um, they're still working on it, 659 00:37:10,840 --> 00:37:14,080 Speaker 1: it's not, like, finished, but it said so far the 660 00:37:14,080 --> 00:37:17,760 Speaker 1: transition has been pretty seamless. Like, you're interacting 661 00:37:17,800 --> 00:37:20,160 Speaker 1: with IPv6 right now, you don't even know it. Yeah, 662 00:37:20,200 --> 00:37:22,279 Speaker 1: I would assume, like, if you have something that was 663 00:37:22,320 --> 00:37:24,480 Speaker 1: made in the last couple of years, it's probably 664 00:37:24,600 --> 00:37:28,520 Speaker 1: IPv6, that's what I would guess. So that's 665 00:37:28,520 --> 00:37:31,880 Speaker 1: pretty neat. Undecillion, I didn't even know that was a thing. 666 00:37:32,000 --> 00:37:33,759 Speaker 1: I didn't either, I had to look it up.
It's like, 667 00:37:33,760 --> 00:37:39,160 Speaker 1: what the heck, is that a typo? So I 668 00:37:39,239 --> 00:37:41,200 Speaker 1: don't think we can put it off any longer, Chuck. 669 00:37:41,800 --> 00:37:46,120 Speaker 1: There's a lot of security and privacy concerns that crop 670 00:37:46,200 --> 00:37:49,839 Speaker 1: up from just the presence of the Internet of Things. Right, 671 00:37:50,120 --> 00:37:52,120 Speaker 1: if you have a bunch of sensors in your house 672 00:37:52,200 --> 00:37:56,280 Speaker 1: collecting data on everything from, you know, how many 673 00:37:56,320 --> 00:38:00,400 Speaker 1: times you toss and turn in your sleep, to, um, 674 00:38:00,440 --> 00:38:03,480 Speaker 1: you know, how many minutes the toilet needs to be 675 00:38:03,520 --> 00:38:07,520 Speaker 1: flushed in intervals, to whether you're moving around your house 676 00:38:07,600 --> 00:38:10,520 Speaker 1: or not, whether you're home. There's a lot of sensors 677 00:38:10,560 --> 00:38:15,840 Speaker 1: in even now the standard home in the United States, um, 678 00:38:15,920 --> 00:38:19,640 Speaker 1: that are collecting data, and there's not a 679 00:38:19,719 --> 00:38:23,759 Speaker 1: lot of regulation on what happens to that data, who 680 00:38:23,760 --> 00:38:26,239 Speaker 1: has access to that data, how safe that data has 681 00:38:26,280 --> 00:38:31,600 Speaker 1: to be. And, um, it's just wide open for government surveillance, hackers, 682 00:38:32,440 --> 00:38:36,840 Speaker 1: targeted ads. I mean, if you're paranoid about government stuff, 683 00:38:36,920 --> 00:38:39,880 Speaker 1: then this probably worries you. Hacking is a whole different 684 00:38:40,320 --> 00:38:43,480 Speaker 1: can of worms, like, everyone should worry about that. Yeah, 685 00:38:43,640 --> 00:38:46,560 Speaker 1: I feel like everyone should worry about government surveillance as well, 686 00:38:47,560 --> 00:38:50,960 Speaker 1: big time. Some people,
you know, I think that's bunk. 687 00:38:51,680 --> 00:38:57,920 Speaker 1: But government surveillance? Well, some people. Yeah, sure, they are fools. 688 00:38:58,400 --> 00:39:00,920 Speaker 1: Maybe people are utter fools and the world is full 689 00:39:00,960 --> 00:39:03,879 Speaker 1: of fools. That's crazy to me. How could you not? 690 00:39:05,480 --> 00:39:09,040 Speaker 1: I mean, like, there's, there's testimony from the head of 691 00:39:09,080 --> 00:39:12,960 Speaker 1: the NSA. There was Snowden releasing the PRISM files. Like, 692 00:39:12,960 --> 00:39:15,520 Speaker 1: how could anyone just say, no, that's not the case? 693 00:39:15,560 --> 00:39:17,960 Speaker 1: Well, what you're saying is everyone should be as 694 00:39:18,040 --> 00:39:20,440 Speaker 1: up on this as I am. And that's, that's the 695 00:39:20,480 --> 00:39:25,360 Speaker 1: foolish statement. You poll people on the street today, fifty of 696 00:39:25,400 --> 00:39:29,960 Speaker 1: them would say that's just, that's just conspiracy stuff. Oh, 697 00:39:30,080 --> 00:39:32,280 Speaker 1: I see. I thought you were saying, like, people knew 698 00:39:32,520 --> 00:39:34,560 Speaker 1: and, and they were saying, like, no, this isn't a 699 00:39:34,600 --> 00:39:37,120 Speaker 1: real thing. No, I think, I gotcha, most people probably 700 00:39:37,120 --> 00:39:39,759 Speaker 1: have their head in the sand. Oh, totally agree. I 701 00:39:39,760 --> 00:39:44,160 Speaker 1: see what you mean. I still think they're foolish and, 702 00:39:44,160 --> 00:39:48,560 Speaker 1: and sad, because there's the one thing that would press 703 00:39:49,320 --> 00:39:53,000 Speaker 1: security into the Internet of Things.
There isn't any right now, 704 00:39:53,160 --> 00:39:55,840 Speaker 1: there's virtually none, but I mean, it's all got to 705 00:39:55,880 --> 00:40:00,319 Speaker 1: be self-installed, not by the person but by the company, like, hey, 706 00:40:00,320 --> 00:40:02,000 Speaker 1: we know you're probably worried, so we've done this, this, 707 00:40:02,080 --> 00:40:04,760 Speaker 1: and this, right. And the way that that will happen 708 00:40:04,920 --> 00:40:07,920 Speaker 1: is if people say, oh, that brand is not very secure, 709 00:40:08,000 --> 00:40:10,400 Speaker 1: I'm going to go to their competitor, which is super secure, 710 00:40:10,760 --> 00:40:13,080 Speaker 1: or if they get sued, sure, and that will cause 711 00:40:13,160 --> 00:40:15,640 Speaker 1: brands, which are self-regulating right now as far 712 00:40:15,680 --> 00:40:19,080 Speaker 1: as security goes, to become more secure. But if people 713 00:40:19,160 --> 00:40:21,399 Speaker 1: are unaware of it or just don't think that kind 714 00:40:21,400 --> 00:40:23,480 Speaker 1: of stuff is going on, then there's not gonna be 715 00:40:23,520 --> 00:40:25,400 Speaker 1: any call for that, and they'll be able to continue 716 00:40:25,440 --> 00:40:28,120 Speaker 1: to put sensors, sensors in our home, devices in our 717 00:40:28,120 --> 00:40:31,160 Speaker 1: home that can eavesdrop on us, that can detect 718 00:40:31,239 --> 00:40:34,680 Speaker 1: all sorts of different things about us, without any thought 719 00:40:34,760 --> 00:40:37,880 Speaker 1: for security whatsoever. Yeah, well, they have a great example 720 00:40:37,920 --> 00:40:40,120 Speaker 1: in here, and it's not just the Internet of Things, this 721 00:40:40,239 --> 00:40:44,319 Speaker 1: is already happening.
Um, with Target. A dad in two 722 00:40:44,400 --> 00:40:49,520 Speaker 1: thousand twelve got mad because his teenage daughter was getting 723 00:40:50,080 --> 00:40:54,239 Speaker 1: baby ads targeted toward her, and he was like, why 724 00:40:54,239 --> 00:40:56,120 Speaker 1: are you trying to get my teenage daughter to have 725 00:40:56,160 --> 00:40:57,840 Speaker 1: a baby? Why do you keep sending her this stuff? 726 00:40:58,840 --> 00:41:02,279 Speaker 1: And he found out she was pregnant, and, well, he 727 00:41:02,320 --> 00:41:05,920 Speaker 1: actually apologized to them. But I still think, like, he 728 00:41:06,000 --> 00:41:09,200 Speaker 1: had a beef, because what I thought is they were 729 00:41:09,239 --> 00:41:14,560 Speaker 1: just using her search, uh, information to target ads, like, 730 00:41:14,800 --> 00:41:17,680 Speaker 1: which is what goes on all the time. But she wasn't. 731 00:41:17,800 --> 00:41:23,120 Speaker 1: What Target does is, they have every customer. Every time 732 00:41:23,160 --> 00:41:25,840 Speaker 1: you shop at Target with a credit card, you have 733 00:41:25,880 --> 00:41:29,360 Speaker 1: a guest ID number that says, oh, here's that 734 00:41:29,360 --> 00:41:32,320 Speaker 1: credit card from Josh Clark again, he's back in my store. 735 00:41:32,719 --> 00:41:37,319 Speaker 1: Here's all the things that he's bought. Uh. So let's 736 00:41:37,360 --> 00:41:42,000 Speaker 1: target ads at him, simply by shopping there without using cash, 737 00:41:42,040 --> 00:41:44,560 Speaker 1: which, I didn't know that happened. Well, yeah, and it 738 00:41:44,680 --> 00:41:46,200 Speaker 1: used to be you had to sign up for, like, 739 00:41:46,239 --> 00:41:49,719 Speaker 1: a rewards program, like a Kroger card. That I understand. 740 00:41:50,480 --> 00:41:53,480 Speaker 1: This is the same thing without you opting into it. 741 00:41:53,960 --> 00:41:56,200 Speaker 1: You know. Yeah, it's just tracking your credit card.
And 742 00:41:56,960 --> 00:41:58,720 Speaker 1: I think it's the New York Times. I read one article 743 00:41:58,760 --> 00:42:01,840 Speaker 1: where eventually they quit talking to the New York Times, 744 00:42:01,960 --> 00:42:05,400 Speaker 1: but he got a little information at first, uh. And 745 00:42:05,640 --> 00:42:07,479 Speaker 1: he said he talked to a Target employee that said, 746 00:42:07,680 --> 00:42:11,000 Speaker 1: here's a hypothetical example. Let's say this. There's a girl 747 00:42:11,360 --> 00:42:14,600 Speaker 1: in Atlanta shopping here. She buys a cocoa butter lotion. 748 00:42:15,200 --> 00:42:17,720 Speaker 1: She buys a big purse that could be a diaper bag. 749 00:42:18,320 --> 00:42:23,000 Speaker 1: She buys magnesium supplements, uh, and a bright blue rug. 750 00:42:23,680 --> 00:42:26,520 Speaker 1: They might just surmise, hey, I bet this lady's pregnant, 751 00:42:26,640 --> 00:42:29,359 Speaker 1: and she's gonna have a boy, because she's gonna buy 752 00:42:29,480 --> 00:42:32,200 Speaker 1: that bright blue rug. He's gonna smell like cocoa butter. 753 00:42:32,640 --> 00:42:34,759 Speaker 1: And, yeah, and so, you know what, I bet you 754 00:42:34,880 --> 00:42:38,640 Speaker 1: she's due in August too, uh, determined by what she's purchasing. 755 00:42:38,920 --> 00:42:42,600 Speaker 1: So let's start bombarding her, bombarding her with ads.
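The hypothetical the Target employee describes is, in shape, a weighted scoring model over what's in a shopper's basket. The sketch below is invented for illustration only (the products, weights, and threshold are made up, and Target's actual model is proprietary), but it shows the idea: each signal product nudges a prediction score, and a shopper who crosses a threshold gets the targeted ads.

```python
# Hypothetical purchase-based scoring sketch. Weights and threshold
# are invented; this is the shape of the idea, not Target's model.
SIGNAL_WEIGHTS = {
    "cocoa butter lotion": 0.25,
    "oversized purse": 0.15,
    "magnesium supplements": 0.30,
    "bright blue rug": 0.10,
}

TARGETING_THRESHOLD = 0.5

def prediction_score(basket):
    """Sum the weights of every signal product in the basket."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in basket)

basket = ["cocoa butter lotion", "oversized purse",
          "magnesium supplements", "bright blue rug", "milk"]
score = prediction_score(basket)
print(round(score, 2))                      # 0.8
print(score >= TARGETING_THRESHOLD)         # True -> send the baby ads
```

Note how a non-signal item like milk contributes nothing; it's the combination of otherwise innocuous purchases that trips the threshold, which is exactly what made the real story feel creepy.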
And 756 00:42:42,640 --> 00:42:44,240 Speaker 1: that just seems a little creepy if you're not opting 757 00:42:44,320 --> 00:42:46,839 Speaker 1: in. I mean, it's a little creepy anyway if 758 00:42:46,840 --> 00:42:51,360 Speaker 1: you get, like, the shopper's card, but you're, you're saying, like, sure, 759 00:42:51,480 --> 00:42:53,840 Speaker 1: I'll take a little bit of a discount in exchange 760 00:42:53,840 --> 00:42:57,520 Speaker 1: for you checking my spending habits, or you willingly 761 00:42:57,600 --> 00:43:00,640 Speaker 1: check the box on terms and conditions without reading it. Well, 762 00:43:00,719 --> 00:43:03,240 Speaker 1: that's, I think terms and conditions should be a whole 763 00:43:03,280 --> 00:43:06,920 Speaker 1: other episode. Man, there's a documentary about that. Yeah, that's, 764 00:43:07,200 --> 00:43:11,520 Speaker 1: I think it's called Terms and Conditions May Apply, and it's, maybe, 765 00:43:11,520 --> 00:43:13,879 Speaker 1: but boy, it is creepy. Yeah, you know, I think, 766 00:43:15,400 --> 00:43:17,440 Speaker 1: no way, and they make it that long so that 767 00:43:17,560 --> 00:43:20,600 Speaker 1: no one would. But it's, yeah. And there's actually, we've 768 00:43:20,600 --> 00:43:23,480 Speaker 1: read this, uh, Guardian article. Did you check that out, 769 00:43:23,520 --> 00:43:26,520 Speaker 1: that I sent you? So in the Guardian article there's 770 00:43:26,560 --> 00:43:31,799 Speaker 1: a mention of Samsung, which had, uh, they had terms 771 00:43:31,800 --> 00:43:33,800 Speaker 1: and conditions for their TV.
I think it was in 772 00:43:33,840 --> 00:43:37,120 Speaker 1: two thousand fourteen or fifteen that it came out, and it says, 773 00:43:37,200 --> 00:43:40,160 Speaker 1: in the terms and conditions for the TV that you 774 00:43:40,239 --> 00:43:43,560 Speaker 1: bring into your home, please be aware that if your 775 00:43:43,640 --> 00:43:48,040 Speaker 1: spoken words include personal or other sensitive information, that information 776 00:43:48,080 --> 00:43:51,000 Speaker 1: will be among the data captured and transmitted to a 777 00:43:51,080 --> 00:43:55,000 Speaker 1: third party through your use of voice recognition. Which means 778 00:43:55,200 --> 00:43:58,880 Speaker 1: your TV is listening to you and transmitting your conversations, 779 00:43:59,200 --> 00:44:01,400 Speaker 1: or at the very least keywords from your conversations, in 780 00:44:01,440 --> 00:44:04,319 Speaker 1: your voice, to somebody else who can figure out how 781 00:44:04,360 --> 00:44:06,720 Speaker 1: to target ads, who can put you on a government 782 00:44:06,760 --> 00:44:09,040 Speaker 1: watch list, who can do anything. Which means that you're 783 00:44:09,080 --> 00:44:12,279 Speaker 1: talking normally in your own home and your TV's eavesdropping 784 00:44:12,280 --> 00:44:15,719 Speaker 1: on you. Yeah, and you, uh, you think, you say 785 00:44:15,760 --> 00:44:18,959 Speaker 1: that's okay. Not you, but one says that's fine, because 786 00:44:19,000 --> 00:44:20,719 Speaker 1: I don't want to touch my remote. I just want 787 00:44:20,760 --> 00:44:25,160 Speaker 1: to say, turn up volume. You know, yeah, I can't 788 00:44:25,160 --> 00:44:29,200 Speaker 1: be bothered to use my finger. Find Nicolas Cage movies. 789 00:44:29,520 --> 00:44:35,560 Speaker 1: Comes up a bad one. He didn't make bad movies, dude. Kidding. 790 00:44:36,840 --> 00:44:39,760 Speaker 1: I used to love Andy Samberg's Nick Cage on SNL. 791 00:44:40,440 --> 00:44:42,120 Speaker 1: I don't think I ever saw that one.
So funny. 792 00:44:42,280 --> 00:44:45,799 Speaker 1: I like Nick Cage's Tiny Elvis. Oh yeah, that was good. 793 00:44:45,880 --> 00:44:49,799 Speaker 1: He was so bizarre. He's, I love Nicolas Cage because 794 00:44:49,840 --> 00:44:53,040 Speaker 1: he's unabashed. He will do some really great smaller movies 795 00:44:53,080 --> 00:44:55,640 Speaker 1: where you're like, man, this dude is an amazing actor, 796 00:44:56,120 --> 00:44:58,640 Speaker 1: and then he'll do the worst garbage you can imagine 797 00:44:59,080 --> 00:45:02,279 Speaker 1: for money, yes, and just, it's like, yeah, I wanna 798 00:45:02,560 --> 00:45:04,919 Speaker 1: buy eight new motorcycles. But I think he's a great 799 00:45:04,960 --> 00:45:07,719 Speaker 1: example of what a good director can do with an 800 00:45:07,760 --> 00:45:10,200 Speaker 1: actor if they know what they're doing with them, because 801 00:45:10,239 --> 00:45:13,000 Speaker 1: he does virtually the same thing in all, all movies. 802 00:45:13,080 --> 00:45:15,440 Speaker 1: It's just how much more he's doing it and how 803 00:45:15,520 --> 00:45:17,880 Speaker 1: much he's reined in, or how good the script is. 804 00:45:18,120 --> 00:45:21,439 Speaker 1: He comes in, he's like, you want Cage or Age. 805 00:45:22,360 --> 00:45:24,280 Speaker 1: I mean, you're right. He has made some great movies. 806 00:45:24,320 --> 00:45:29,680 Speaker 1: But man, he has made some bad ones. Wow. Oh wait, 807 00:45:29,680 --> 00:45:31,480 Speaker 1: hold on, yeah, let's get back on track. There's one 808 00:45:31,480 --> 00:45:34,040 Speaker 1: other thing too. There's a big debate going on right now, Chuck, 809 00:45:34,080 --> 00:45:38,319 Speaker 1: about whether your phone is eavesdropping on you for, 810 00:45:38,520 --> 00:45:41,439 Speaker 1: at the very least, targeted ads.
Again, if you think 811 00:45:41,480 --> 00:45:46,040 Speaker 1: that your phone is not eavesdropping on you, you're, 812 00:45:46,040 --> 00:45:52,319 Speaker 1: you're deluding yourself. Your phone, your TV, your laptop, everything 813 00:45:52,400 --> 00:45:54,680 Speaker 1: around you that is connected to the Internet and has 814 00:45:54,719 --> 00:45:58,480 Speaker 1: a microphone and/or a video camera is eavesdropping 815 00:45:58,800 --> 00:46:02,560 Speaker 1: on you. And you don't care, right? I do care, 816 00:46:02,719 --> 00:46:04,600 Speaker 1: but I also have a feeling, like, what, 817 00:46:04,600 --> 00:46:07,200 Speaker 1: what can I do? I know, you can not 818 00:46:07,200 --> 00:46:09,120 Speaker 1: have a smartphone. That's part of it. And there's, 819 00:46:09,160 --> 00:46:11,120 Speaker 1: there's a, that's a big thing. There's a, there's a 820 00:46:11,160 --> 00:46:13,279 Speaker 1: trade-off. It's like, okay, I want to be able 821 00:46:13,320 --> 00:46:16,520 Speaker 1: to read Twitter every thirty seconds and just be like, 822 00:46:16,760 --> 00:46:22,160 Speaker 1: that was boring, and then do it again thirty seconds later. Um, 823 00:46:22,160 --> 00:46:25,240 Speaker 1: and I'm willing to trade that ability for the idea 824 00:46:25,280 --> 00:46:27,200 Speaker 1: that, yeah, I'm being listened to, and the, the 825 00:46:27,239 --> 00:46:30,120 Speaker 1: gamble that, well, I mean, I guess I'm not saying 826 00:46:30,120 --> 00:46:33,480 Speaker 1: anything that important, you know. But I mean, like, that's, 827 00:46:33,640 --> 00:46:36,799 Speaker 1: that's, that's wrong. Like, that's wrong. Well, or the people 828 00:46:36,840 --> 00:46:38,319 Speaker 1: say, like, well, if you have nothing to hide, 829 00:46:38,320 --> 00:46:40,920 Speaker 1: then, you know. That's, that's a fallacy.
That's a logical 830 00:46:40,960 --> 00:46:44,200 Speaker 1: fallacy that a lot of the people collecting that data 831 00:46:44,280 --> 00:46:47,400 Speaker 1: bank on. It has, it still has a chilling effect 832 00:46:47,400 --> 00:46:49,640 Speaker 1: on, on society at large. And if they ever do 833 00:46:49,760 --> 00:46:54,800 Speaker 1: want something on you, brother, they got it. Man, I'm sorry, 834 00:46:54,800 --> 00:47:00,600 Speaker 1: I'm worked up. I was driving. I mean, I'm gonna 835 00:47:00,680 --> 00:47:05,719 Speaker 1: be cool. Then that light didn't change green. It all 836 00:47:05,760 --> 00:47:08,360 Speaker 1: went south. It's a dumb traffic light. All right. So 837 00:47:08,360 --> 00:47:11,319 Speaker 1: we talked a little bit about hackers. Um, and we're 838 00:47:11,320 --> 00:47:14,000 Speaker 1: not just talking about stealing your information or tapping into 839 00:47:14,000 --> 00:47:18,359 Speaker 1: your bank. Um. What about if, if, if 840 00:47:18,360 --> 00:47:22,040 Speaker 1: your grandmother, who is a shut-in, has this great 841 00:47:22,160 --> 00:47:25,680 Speaker 1: new smart health system that is hooked up to her 842 00:47:25,719 --> 00:47:30,239 Speaker 1: body and alerts her doctor if something's wrong, or she's low 843 00:47:30,280 --> 00:47:32,799 Speaker 1: on meds. Uh. These are all great things, but what 844 00:47:32,840 --> 00:47:36,680 Speaker 1: if someone can hack into that and, you know, 845 00:47:36,840 --> 00:47:39,640 Speaker 1: hack the grandma system so it doesn't alert? Then 846 00:47:39,640 --> 00:47:42,400 Speaker 1: her life is literally at stake.
Or what if you 847 00:47:42,440 --> 00:47:45,319 Speaker 1: have, and this has happened too, what if you go 848 00:47:45,360 --> 00:47:48,440 Speaker 1: into your baby's room and on your baby monitor you hear 849 00:47:48,520 --> 00:47:50,760 Speaker 1: some guy's voice on the other end, yelling 850 00:47:50,800 --> 00:47:54,320 Speaker 1: and screaming curse words in Russian. I think that's happened. 851 00:47:54,320 --> 00:47:58,280 Speaker 1: To your baby? He's just like, what's this guy's problem? Yeah, 852 00:47:58,320 --> 00:48:03,359 Speaker 1: he usually tells me nice stories. Where's Sergey wants back? 853 00:48:04,600 --> 00:48:08,120 Speaker 1: It's all really creepy, man. You know, one of my heroes, 854 00:48:08,239 --> 00:48:12,480 Speaker 1: Charles C. Mann, who, I've never heard of him, he 855 00:48:12,600 --> 00:48:15,239 Speaker 1: also, he wrote an article in Vanity Fair called Look 856 00:48:15,280 --> 00:48:17,720 Speaker 1: Out, He's Got a Phone, and it was all about 857 00:48:17,800 --> 00:48:20,400 Speaker 1: the ways that the Internet of Things could be hacked 858 00:48:20,400 --> 00:48:23,080 Speaker 1: to, like, basically really threaten somebody. Like, if you've got 859 00:48:23,080 --> 00:48:26,600 Speaker 1: a smart pacemaker or smart insulin pump, those things could 860 00:48:26,600 --> 00:48:29,200 Speaker 1: be hacked, you know, and that's a, that's something 861 00:48:29,200 --> 00:48:31,880 Speaker 1: that we're gonna have to deal with, or we're dealing 862 00:48:31,880 --> 00:48:34,360 Speaker 1: with now, as it, as it stands.
Well, one of 863 00:48:34,400 --> 00:48:36,799 Speaker 1: the things that could help, and what should be going 864 00:48:36,880 --> 00:48:40,279 Speaker 1: on, is these devices at the very least should be 865 00:48:40,320 --> 00:48:44,399 Speaker 1: giving you options on how much data they get their 866 00:48:44,400 --> 00:48:48,680 Speaker 1: hands on, how it's stored, uh, and what the expiration 867 00:48:48,800 --> 00:48:52,280 Speaker 1: date on that is. Like, if you quit using this device, 868 00:48:52,480 --> 00:48:56,160 Speaker 1: do they still have your information, or might they still be collecting 869 00:48:56,200 --> 00:49:00,160 Speaker 1: it too? Yeah, absolutely. Or, uh, one I didn't think 870 00:49:00,160 --> 00:49:04,880 Speaker 1: about, when these systems are no longer supported, like, you know, 871 00:49:04,920 --> 00:49:07,920 Speaker 1: the company shuts down or something, it needs to have 872 00:49:08,000 --> 00:49:13,360 Speaker 1: a suicide, uh, measure programmed in, to where it, like, 873 00:49:13,400 --> 00:49:16,680 Speaker 1: it kills itself after it's not supported anymore, and it 874 00:49:16,719 --> 00:49:23,080 Speaker 1: should do it gruesomely. What about economics? Well, as you 875 00:49:23,080 --> 00:49:26,520 Speaker 1: can imagine, if there's hundreds of billions of devices on 876 00:49:26,560 --> 00:49:30,280 Speaker 1: the horizon being connected, it's going to have a pretty 877 00:49:30,280 --> 00:49:34,040 Speaker 1: big economic impact. And they're talking about something on the 878 00:49:34,160 --> 00:49:37,799 Speaker 1: order of, what was it, four point three trillion dollars 879 00:49:37,840 --> 00:49:44,400 Speaker 1: in value, up from nine billion? It seems low to me. Yeah, 880 00:49:44,640 --> 00:49:47,160 Speaker 1: you know, like, think about just, in cashew milk 881 00:49:47,200 --> 00:49:54,640 Speaker 1: alone, a trillion dollars. Yeah. Well, it's also costing some companies. Um.
882 00:49:54,760 --> 00:49:58,359 Speaker 1: How so? Well, if you've heard of Square, yeah, you 883 00:49:58,400 --> 00:50:00,200 Speaker 1: probably pay for a lot of things with Square 884 00:50:00,239 --> 00:50:02,840 Speaker 1: these days. It's a great thing because it allows a 885 00:50:02,880 --> 00:50:06,080 Speaker 1: small business. Previously there was only one way to make 886 00:50:06,080 --> 00:50:10,440 Speaker 1: credit card transactions. You had to get a fairly expensive 887 00:50:10,480 --> 00:50:15,200 Speaker 1: system, uh, or a cash register that, you know, 888 00:50:15,239 --> 00:50:17,279 Speaker 1: made it all possible, and you had to, they kind 889 00:50:17,280 --> 00:50:19,080 Speaker 1: of had you over a barrel a little bit. Then 890 00:50:19,080 --> 00:50:20,920 Speaker 1: Square came along and said, now, you know what, you 891 00:50:20,960 --> 00:50:23,319 Speaker 1: don't need that stuff. Let's democratize this. Yeah, we have 892 00:50:23,320 --> 00:50:25,680 Speaker 1: the Internet now. Here's some competition. All you need is this 893 00:50:25,760 --> 00:50:28,920 Speaker 1: little thing to plug into your tablet and you 894 00:50:28,960 --> 00:50:31,080 Speaker 1: can swipe it right there in the cab or in 895 00:50:31,120 --> 00:50:35,359 Speaker 1: the place of business and avoid the middleman, 896 00:50:35,719 --> 00:50:38,680 Speaker 1: or use PayPal, and basically skirt these companies that have 897 00:50:38,760 --> 00:50:42,000 Speaker 1: kind of been ripping you off, um, as a business, 898 00:50:42,000 --> 00:50:44,160 Speaker 1: and then that business passes the cost on to you 899 00:50:44,200 --> 00:50:47,080 Speaker 1: as a customer, so in a way. Or they're like, 900 00:50:47,160 --> 00:50:49,120 Speaker 1: we don't take that credit card, their fees are too 901 00:50:49,239 --> 00:50:54,520 Speaker 1: high, or whatever. Yeah, exactly.
Um, but you're right, it's 902 00:50:54,719 --> 00:50:58,239 Speaker 1: the democratization, which is good. I mean, it's great. It's 903 00:50:58,280 --> 00:51:01,520 Speaker 1: opened up a lot of, a lot. It's taken Etsy 904 00:51:01,600 --> 00:51:05,959 Speaker 1: into the real world. That's right, you know. And isn't 905 00:51:05,960 --> 00:51:09,120 Speaker 1: that awesome? Well, yeah. Um, there's also worry that it 906 00:51:09,120 --> 00:51:11,280 Speaker 1: could cost jobs. Like you said, what if the lipstick, 907 00:51:11,560 --> 00:51:16,040 Speaker 1: um, stocker gets fired? Um, because he threw away all 908 00:51:16,080 --> 00:51:18,480 Speaker 1: those orange tubes of lipstick. Yeah, he deserves that. He 909 00:51:18,520 --> 00:51:20,920 Speaker 1: didn't do his job well enough. Um. Well, that's it. 910 00:51:21,120 --> 00:51:24,160 Speaker 1: Like, this, I think this article kind of just kind 911 00:51:24,160 --> 00:51:26,680 Speaker 1: of glosses over that issue, and it's a big issue 912 00:51:26,680 --> 00:51:28,719 Speaker 1: in and of itself. I don't know, but I don't 913 00:51:28,719 --> 00:51:30,520 Speaker 1: think it glossed over it so much as there's a 914 00:51:30,840 --> 00:51:34,480 Speaker 1: school of thought, a very, like, prominent school of thought 915 00:51:34,560 --> 00:51:38,560 Speaker 1: that says, no, that's not what happens. People get different 916 00:51:38,640 --> 00:51:41,719 Speaker 1: jobs and learn new things. And the one example they 917 00:51:41,840 --> 00:51:44,560 Speaker 1: used in here, which I think makes sense, is, um, 918 00:51:44,719 --> 00:51:47,440 Speaker 1: ATMs. ATMs popped up everywhere and 919 00:51:47,480 --> 00:51:49,359 Speaker 1: people were like, oh, well, there's not gonna be any more 920 00:51:49,360 --> 00:51:51,359 Speaker 1: bank tellers.
No one needs to go to a bank. 921 00:51:51,480 --> 00:51:55,239 Speaker 1: And, well, yeah, but they actually increased in number, right? Yeah, 922 00:51:55,239 --> 00:51:57,440 Speaker 1: they did, uh. And they think part of the reason 923 00:51:57,520 --> 00:52:02,120 Speaker 1: is because banks could open more branches, because they didn't 924 00:52:02,120 --> 00:52:05,160 Speaker 1: need to staff them with fourteen bankers, they just needed 925 00:52:05,200 --> 00:52:08,760 Speaker 1: a couple. Yeah, but more branches meant ultimately more tellers, 926 00:52:09,360 --> 00:52:11,600 Speaker 1: just not in one place. The thing is, I, 927 00:52:11,360 --> 00:52:14,279 Speaker 1: I would be very curious to know whether that was 928 00:52:14,480 --> 00:52:18,440 Speaker 1: an anomaly, like that, you know, if, if typically in 929 00:52:18,480 --> 00:52:22,239 Speaker 1: an industry that gets replaced by a machine, a good 930 00:52:22,239 --> 00:52:25,600 Speaker 1: one, like an ATM, that works pretty well, um, 931 00:52:25,640 --> 00:52:28,440 Speaker 1: if they, if they actually, if jobs actually go up, 932 00:52:28,520 --> 00:52:30,600 Speaker 1: or if that was just, like, one of the very 933 00:52:30,680 --> 00:52:33,480 Speaker 1: rare examples of it. Well, I think it, it depends 934 00:52:33,520 --> 00:52:36,319 Speaker 1: on your industry. If you're one of the people that 935 00:52:36,400 --> 00:52:38,680 Speaker 1: did that thing, you're like, well, I lost my job 936 00:52:38,719 --> 00:52:42,160 Speaker 1: to a robot. If you build the robots, you're like, 937 00:52:42,239 --> 00:52:45,120 Speaker 1: I got a job because I'm now building robots. Right.
938 00:52:45,120 --> 00:52:47,000 Speaker 1: And again, I think we talked about this, I don't 939 00:52:47,040 --> 00:52:51,960 Speaker 1: remember in what episode, but if you are getting rid 940 00:52:51,960 --> 00:52:54,960 Speaker 1: of an industry and, and putting a lot of people 941 00:52:54,960 --> 00:53:00,440 Speaker 1: out of their, their employment, their careers. Um, I'm not 942 00:53:00,520 --> 00:53:04,200 Speaker 1: against automating stuff like that, but I think part and 943 00:53:04,280 --> 00:53:07,040 Speaker 1: parcel with that is to figure out a way to 944 00:53:07,120 --> 00:53:10,719 Speaker 1: take those out-of-work people and train them to 945 00:53:10,800 --> 00:53:14,680 Speaker 1: go into new fields, or just to, to, um, build 946 00:53:14,680 --> 00:53:17,960 Speaker 1: the stuff that, that took over their jobs or whatever. 947 00:53:18,360 --> 00:53:20,640 Speaker 1: But you can't just be like, best of luck, we 948 00:53:20,719 --> 00:53:22,560 Speaker 1: figured out a way for a robot to do what 949 00:53:22,600 --> 00:53:26,279 Speaker 1: you're doing. Go, uh, go get hooked on OxyContin and 950 00:53:26,320 --> 00:53:29,520 Speaker 1: go die. Wait, what was that? Because we 951 00:53:29,600 --> 00:53:31,960 Speaker 1: got a great listener mail about that. Yeah, that's, that's 952 00:53:32,000 --> 00:53:35,160 Speaker 1: what I was talking about. A certain amount of people 953 00:53:35,200 --> 00:53:39,320 Speaker 1: from this industry, it was the Kentucky coal, coal industry, 954 00:53:39,480 --> 00:53:41,560 Speaker 1: like, that were then cross-trained to do computer work. 955 00:53:42,600 --> 00:53:44,480 Speaker 1: Wasn't even that long ago, but I can't remember exactly 956 00:53:44,520 --> 00:53:46,200 Speaker 1: what it was. But that's exactly what I mean. Like, 957 00:53:46,280 --> 00:53:48,920 Speaker 1: that's number one. That's a role of government.
In my opinion, 958 00:53:49,280 --> 00:53:51,040 Speaker 1: it's one of the, it's one of the clear things 959 00:53:51,040 --> 00:53:52,760 Speaker 1: that you can look at and be like, oh, yeah, 960 00:53:52,800 --> 00:53:55,959 Speaker 1: that's what government's for. They're supposed to invest in infrastructure 961 00:53:55,960 --> 00:54:00,279 Speaker 1: and education to, to, um, keep people employed so that 962 00:54:00,400 --> 00:54:05,160 Speaker 1: everyone can earn a decent wage. That's, that's my soapbox. 963 00:54:05,200 --> 00:54:07,840 Speaker 1: This has basically been one long soapbox, hasn't it? I 964 00:54:07,880 --> 00:54:10,800 Speaker 1: don't think so. Well, it's the Internet of Things. There's 965 00:54:10,880 --> 00:54:16,280 Speaker 1: literally nothing more to say about it. Okay, no more. Uh, 966 00:54:16,320 --> 00:54:18,359 Speaker 1: I'm just kidding. And if you want to know more 967 00:54:18,360 --> 00:54:19,879 Speaker 1: about that kind of stuff, you should go check out 968 00:54:19,880 --> 00:54:24,480 Speaker 1: our compadre Jonathan Strickland's podcast, TechStuff. I guarantee he talks 969 00:54:24,520 --> 00:54:28,000 Speaker 1: about the Internet of Things every other week, I would imagine. Um. 970 00:54:28,080 --> 00:54:30,480 Speaker 1: If you want to know more about it in the meantime, 971 00:54:30,560 --> 00:54:33,520 Speaker 1: you can look up this article on HowStuffWorks 972 00:54:33,520 --> 00:54:36,200 Speaker 1: dot com by typing Internet of Things in the search bar. 973 00:54:36,760 --> 00:54:39,680 Speaker 1: And since I said things, it's time for listener 974 00:54:39,719 --> 00:54:42,200 Speaker 1: mail. No, sir. Oh yeah, we already did listener mail. 975 00:54:42,360 --> 00:54:44,960 Speaker 1: Well, that's right, but we have a bonus, because now 976 00:54:44,960 --> 00:54:57,600 Speaker 1: we're gonna finish up with part two of Administrative Details.
Okay, 977 00:54:58,000 --> 00:55:00,239 Speaker 1: all right, again, if you're new to the show, this is 978 00:55:00,239 --> 00:55:02,480 Speaker 1: when we thank people for the nice things they send us. 979 00:55:03,000 --> 00:55:07,080 Speaker 1: And it goes a little something like this. Uh. PETA, 980 00:55:08,440 --> 00:55:12,279 Speaker 1: the organization People for the Ethical Treatment of Animals, they 981 00:55:12,320 --> 00:55:15,880 Speaker 1: sent us a cat care package after our cat podcast, 982 00:55:16,600 --> 00:55:19,919 Speaker 1: and I think probably partially because of my soapbox on 983 00:55:20,520 --> 00:55:24,000 Speaker 1: declawing and outdoor cats. They're like, this guy, yeah, send 984 00:55:24,040 --> 00:55:26,520 Speaker 1: him some cat stuff. Give him some cat stuff, stat. 985 00:55:26,800 --> 00:55:28,439 Speaker 1: So thanks for that. Put a cat in a box 986 00:55:28,480 --> 00:55:32,600 Speaker 1: and mail it to him. That's PETA's way. We got 987 00:55:32,640 --> 00:55:38,840 Speaker 1: a postcard from China from Mary Kate Mueller. Thanks a 988 00:55:38,840 --> 00:55:42,280 Speaker 1: lot for that, Mary Kate. We appreciate it. Beautiful. Lisa 989 00:55:42,360 --> 00:55:46,040 Speaker 1: of Black Bow Sweets sent us some candied pecans. Dude, 990 00:55:46,680 --> 00:55:49,759 Speaker 1: those are dangerous. Oh yeah, they did not last long 991 00:55:49,760 --> 00:55:52,920 Speaker 1: in the Clark house. No, almost didn't make it on the 992 00:55:53,000 --> 00:55:59,040 Speaker 1: ride home. I had to be like, man, they're good. Uh. 993 00:55:59,080 --> 00:56:02,840 Speaker 1: Aaron Supper sent us a bottle of Sonoma County Distilling 994 00:56:02,880 --> 00:56:06,279 Speaker 1: Company's West of Kentucky Bourbon number one. I haven't tried 995 00:56:06,320 --> 00:56:08,080 Speaker 1: it. Is it good? I have not tried it yet either, 996 00:56:08,120 --> 00:56:09,919 Speaker 1: but I'm very much looking forward to it.
So thanks 997 00:56:09,920 --> 00:56:14,040 Speaker 1: a lot, Aaron. And speaking of whiskey, uh, Thirty-Three Books. 998 00:56:14,760 --> 00:56:18,280 Speaker 1: Dave from Thirty-Three Books sent us a whiskey tasting 999 00:56:18,320 --> 00:56:23,160 Speaker 1: set, which is a little, uh, I think imported from Ireland 1000 00:56:23,160 --> 00:56:26,640 Speaker 1: even. A little whiskey tasting glass and book for notes, 1001 00:56:27,080 --> 00:56:29,800 Speaker 1: and a pen even. Yeah, it's everything you need, everything 1002 00:56:29,840 --> 00:56:33,560 Speaker 1: you need to taste whiskey. So thanks, Dave, for that. Um, 1003 00:56:33,680 --> 00:56:37,560 Speaker 1: we got a postcard from Caitlin and her fiancé from 1004 00:56:37,600 --> 00:56:41,839 Speaker 1: the Mayo Clinic. Remember the Helen Brach mention? And I think, 1005 00:56:41,920 --> 00:56:46,240 Speaker 1: like, I think some Unsolved Mysteries. We heard the episode 1006 00:56:46,239 --> 00:56:49,920 Speaker 1: on Unsolved Mysteries long ago. Yeah, but we got a 1007 00:56:49,960 --> 00:56:54,719 Speaker 1: Mayo Clinic postcard. Yeah, why not? Robin and Aaron sent 1008 00:56:54,840 --> 00:56:58,960 Speaker 1: us some coasters, some Detroit coasters, because they know, even 1009 00:56:58,960 --> 00:57:01,640 Speaker 1: though we poked fun at Detroit, we secretly love Detroit. Yeah, 1010 00:57:02,280 --> 00:57:05,319 Speaker 1: thanks, Robin and Aaron, for that. Mark Singleton over at 1011 00:57:05,400 --> 00:57:09,040 Speaker 1: Rudolph Foods sent us a ton of pork rinds and 1012 00:57:09,200 --> 00:57:10,960 Speaker 1: a bunch of great gear to go with it. So 1013 00:57:11,120 --> 00:57:14,600 Speaker 1: we can wear a camouflage hat while we eat our pork 1014 00:57:14,600 --> 00:57:19,680 Speaker 1: rinds. Right, as it should be.
Sam Mechling of Jeppson's 1015 00:57:19,760 --> 00:57:24,000 Speaker 1: Malört of Chicago. Yeah, sent us bottles of Malört. And 1016 00:57:24,040 --> 00:57:27,240 Speaker 1: if you've never heard of Malört, it is Chicago's own 1017 00:57:28,040 --> 00:57:35,600 Speaker 1: special, uh, liqueur. It's something, it is. It's, um, known 1018 00:57:35,760 --> 00:57:44,040 Speaker 1: for its, ah, harsh aftertaste. Guess that's a good way 1019 00:57:44,040 --> 00:57:45,840 Speaker 1: to put it. Yeah. The great thing about Malört, though, 1020 00:57:45,880 --> 00:57:49,200 Speaker 1: is they know the deal. They're not like, this is 1021 00:57:49,240 --> 00:57:53,400 Speaker 1: so delicious, you're just never gonna have anything better. It's more, 1022 00:57:53,960 --> 00:57:57,120 Speaker 1: you seem to be having a good day, let's change that. Um, yeah, 1023 00:57:57,160 --> 00:57:59,960 Speaker 1: but Malört, I actually have gotten... this is the best thing 1024 00:58:00,000 --> 00:58:02,720 Speaker 1: you can say: I've gotten used to it. And, um, 1025 00:58:02,760 --> 00:58:05,400 Speaker 1: it's an interesting taste. You should try it out. Well, 1026 00:58:05,440 --> 00:58:07,480 Speaker 1: thanks to the dudes who sent us that. We appreciate it. 1027 00:58:07,560 --> 00:58:10,480 Speaker 1: Thank you, Sam. Um, speaking of booze, I think we 1028 00:58:10,560 --> 00:58:13,120 Speaker 1: mentioned it the other day, but also again, thank you 1029 00:58:13,160 --> 00:58:15,760 Speaker 1: to the people at Spring Forty-Four, which is a 1030 00:58:15,800 --> 00:58:18,960 Speaker 1: Colorado distillery, for the Old Tom gin that they sent us. Yeah, 1031 00:58:19,000 --> 00:58:22,960 Speaker 1: that was just beautiful. Yeah, and it actually just ran out. Yeah, 1032 00:58:23,120 --> 00:58:28,160 Speaker 1: so, just throwing that out there. Badger 1033 00:58:28,360 --> 00:58:31,440 Speaker 1: body products. Um, a competitor of my own wife, even.
1034 00:58:31,560 --> 00:58:34,440 Speaker 1: That was Dave who sent us those, Dave from Badger 1035 00:58:34,480 --> 00:58:38,000 Speaker 1: body products out of New Hampshire, sent us shaving stuff and 1036 00:58:38,080 --> 00:58:41,720 Speaker 1: sunscreen and beard oils and such. That was Dave Morrell. 1037 00:58:41,960 --> 00:58:44,640 Speaker 1: He was the beer guy. Yeah, he worked at SweetWater. Yeah, 1038 00:58:44,680 --> 00:58:46,840 Speaker 1: so he used to bring us SweetWater. He's a great 1039 00:58:46,880 --> 00:58:48,640 Speaker 1: guy, and like all beer guys, he ended up in 1040 00:58:48,640 --> 00:58:53,200 Speaker 1: New Hampshire. Um, but Emily, actually my own wife, who 1041 00:58:53,200 --> 00:58:56,400 Speaker 1: has her own natural body product company... and she usually 1042 00:58:56,400 --> 00:58:59,600 Speaker 1: poo-poos those, because people say they're natural and aren't. She went, oh, 1043 00:58:59,720 --> 00:59:02,440 Speaker 1: she's like, Badger's good, actually. Yeah, she's like, they make 1044 00:59:02,480 --> 00:59:05,040 Speaker 1: good stuff. So I use their beard oil now. Well, 1045 00:59:05,040 --> 00:59:08,400 Speaker 1: thanks a lot, David. I've been using the, um... they 1046 00:59:08,400 --> 00:59:12,640 Speaker 1: have a bug repellent sunblock that works, smells awesome, 1047 00:59:12,880 --> 00:59:15,040 Speaker 1: smells like citronella, that works like a charm. So 1048 00:59:15,280 --> 00:59:18,439 Speaker 1: thank you for that. Pie Lady and Son, oh yeah, 1049 00:59:18,440 --> 00:59:20,840 Speaker 1: out of New York. They sent us pie, dude, and 1050 00:59:20,920 --> 00:59:23,840 Speaker 1: they were just getting started with their shipping program, and 1051 00:59:24,000 --> 00:59:26,080 Speaker 1: Pie Lady and Son, I have to tell you, it 1052 00:59:26,160 --> 00:59:30,000 Speaker 1: worked great. They showed up fresh and delicious, and by 1053 00:59:30,040 --> 00:59:32,680 Speaker 1: delicious I mean really, really delicious.
Yeah, thanks a lot 1054 00:59:32,720 --> 00:59:34,680 Speaker 1: for that, guys. Yeah, so you can support them as well. 1055 00:59:34,760 --> 00:59:38,000 Speaker 1: Pie Lady and Son, out of New York City. Um, 1056 00:59:38,120 --> 00:59:41,320 Speaker 1: Zach Detmore sent us some beautiful cherry, walnut, and 1057 00:59:41,400 --> 00:59:44,680 Speaker 1: maple wood boxes. Those are great. Yeah, I got mine on 1058 00:59:44,760 --> 00:59:49,120 Speaker 1: my desk. Matt Dent sent us his... he's a, he's 1059 00:59:49,120 --> 00:59:51,880 Speaker 1: a comic strip guy who's created the Willie who comics, 1060 00:59:52,360 --> 00:59:54,320 Speaker 1: around for twenty-five years. I know, I saw that book. 1061 00:59:54,320 --> 00:59:57,000 Speaker 1: It's amazing. Yeah. He sent us a big collector's edition. 1062 00:59:57,200 --> 01:00:01,840 Speaker 1: Congratulations, Matt. And Chuck, somebody, uh, made us a 1063 01:00:01,920 --> 01:00:05,000 Speaker 1: longboard that's got Stuff You Should Know on it. A longboard. 1064 01:00:05,000 --> 01:00:08,840 Speaker 1: It's amazing. I don't know, we lost the correspondence. We 1065 01:00:08,840 --> 01:00:10,720 Speaker 1: don't know who made it. Yes, so if you made it, 1066 01:00:10,880 --> 01:00:12,920 Speaker 1: send that in and we'll read your name. Yeah, well, 1067 01:00:13,240 --> 01:00:15,520 Speaker 1: thank you, but thank you very much for the longboard. 1068 01:00:15,800 --> 01:00:20,320 Speaker 1: Our buttery butter buddy, Tyler Murphy, he's our butter, he's 1069 01:00:20,320 --> 01:00:23,920 Speaker 1: our butter, our bread and butter, uh, from South Dakota, 1070 01:00:24,000 --> 01:00:28,800 Speaker 1: sent us Instant Empire shirts and records, which is really cool. 1071 01:00:29,400 --> 01:00:31,360 Speaker 1: And I just realized that Tyler's emails have been going 1072 01:00:31,400 --> 01:00:34,400 Speaker 1: to my spam folder.
I emailed him today because I 1073 01:00:34,440 --> 01:00:37,000 Speaker 1: never look in there and I happened to for something else. 1074 01:00:37,320 --> 01:00:39,200 Speaker 1: There's a bunch of emails from Tyler. I'm like, dude, 1075 01:00:39,320 --> 01:00:45,720 Speaker 1: so sorry. Yeah. Um, Hillary Lowser and Mike. Dude, 1076 01:00:45,960 --> 01:00:47,520 Speaker 1: I don't know if Mike's a Lowser or not, 1077 01:00:49,120 --> 01:00:51,600 Speaker 1: but Hillary and Mike have been with us for years. They 1078 01:00:51,600 --> 01:00:54,720 Speaker 1: are also from the Dakotas and travel to see our 1079 01:00:54,760 --> 01:00:57,640 Speaker 1: shows in Seattle, and they're wonderful people. Hillary is a 1080 01:00:57,680 --> 01:01:00,200 Speaker 1: teacher, and they, as always, sent us this just 1081 01:01:00,280 --> 01:01:04,360 Speaker 1: delicious Flathead Lake cheese. Dude, that is the best cheese 1082 01:01:04,400 --> 01:01:06,680 Speaker 1: on the planet, I think. You got better cheese 1083 01:01:06,720 --> 01:01:09,160 Speaker 1: than Flathead Lake? Send it in. Let us be the 1084 01:01:09,240 --> 01:01:14,480 Speaker 1: judge. Rachel Stone, who is an artist from Australia's east coast, 1085 01:01:14,560 --> 01:01:17,960 Speaker 1: she has a site called Land of Wonderful dot com. 1086 01:01:18,280 --> 01:01:20,360 Speaker 1: She sent us some lovely handmade cards and letters. So 1087 01:01:20,400 --> 01:01:23,000 Speaker 1: thanks a lot, Rachel. Uh, and then finally... you got 1088 01:01:23,040 --> 01:01:26,400 Speaker 1: any more? Nope, last one. Emily and the crew at 1089 01:01:26,440 --> 01:01:30,040 Speaker 1: Kickapoo Joy drinks.
Oh yeah, they have their Kickapoo Joy 1090 01:01:30,080 --> 01:01:35,280 Speaker 1: Juice, and they're Atlanta-based, and they make all-natural drinks, 1091 01:01:35,400 --> 01:01:38,920 Speaker 1: juices and sodas and things, Kickapoo Joy Juice, and they sent us 1092 01:01:38,960 --> 01:01:41,280 Speaker 1: a box, and that was super nice. Thanks a lot, guys. 1093 01:01:41,360 --> 01:01:44,040 Speaker 1: Thanks to everybody who sent us stuff. We appreciate it 1094 01:01:44,160 --> 01:01:47,080 Speaker 1: every time, so thank you. Uh, if you want to 1095 01:01:47,160 --> 01:01:49,320 Speaker 1: hang out with us on social media, you can go 1096 01:01:49,520 --> 01:01:53,400 Speaker 1: to SYSK Podcast on Instagram and Twitter. 1097 01:01:53,720 --> 01:01:55,680 Speaker 1: You can go to Facebook dot com slash Stuff You 1098 01:01:55,720 --> 01:01:58,080 Speaker 1: Should Know. You can send us an email to stuff 1099 01:01:58,120 --> 01:02:00,360 Speaker 1: podcast at how stuff works dot com, and as always, 1100 01:02:00,360 --> 01:02:02,000 Speaker 1: join us at our home on the web, Stuff You 1101 01:02:02,000 --> 01:02:10,120 Speaker 1: Should Know dot com. For more on this and thousands 1102 01:02:10,200 --> 01:02:12,520 Speaker 1: of other topics, visit how stuff works dot com.