Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks dot com.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and iHeartRadio, and I love all things tech. And Happy New Year! For the last several years, I've set aside an episode at the end of the calendar year or the beginning of the new year to make predictions about what will follow in the months ahead. I do pretty well, and most of the time I, you know, end up being at least partly right about a few things and entirely wrong about other things. The future is pretty hard to predict, even for someone who's really smart. So for a goofus like me, it's nearly a coin-flip situation in some cases. So this year I thought I would change it up a little bit. I thought I would give myself a break. Rather than make predictions of what I think might or might not happen over the next year, I am going to talk about what I hope to see happen. So these are based off of stuff that happens to be going on today and the outcomes that I hope to see from those things that are going on. I don't necessarily believe that all or even most of these outcomes will happen, but I figure we can all engage in a little wishful thinking now and then, as long as we remember that it is in fact wishful and not necessarily realistic, and it gives us something to work for. Now, I'll warn you, some of you might find this episode to be preachy, and I understand that, and I can't really argue against it, though my intent isn't to preach, but to beseech, perhaps, or just kind of explain where I'm coming from. However, if you find the whole idea to be off-putting, I won't blame you for skipping this episode. I understand. I would rather you spend your time doing something you enjoy doing, and it's totally fine. There's no hard feelings on my end.
So if you think it ends up sounding preachy and you don't want to listen to it, that's fine. You don't have to write me or tweet me to let me know, because, you know, I'm doing it anyway. So here we go.

With that in mind, let's start with transparency. If twenty eighteen had one super strong theme for some of the biggest companies out there, and I'm talking about companies like Google, Facebook, Twitter, Apple, that kind of thing, it was that there was a lack of transparency in general, and that can make a tricky situation turn into a disastrous one. So, for example, it's bad if you discover a data breach, right? If you find out someone has managed to get hold of information that should have been under protection, that's bad. But that bad situation can be made much worse if you then try to keep that information quiet, if you try to keep it away from the public until it is no longer possible to hide it and then it all becomes public. That sort of behavior breeds distrust and anger, and it should, because the general public tends to be the unknowing victims of these breaches, so it's only fair to let them know what has happened as soon as possible so that they can, you know, make moves to protect themselves. So my hope for twenty nineteen is that companies adopt a more transparent approach in general, not just for data breaches, but for other stuff too. They should think ahead about the possible impact that their decisions are going to have on their customers and the public at large. They should explain those decisions as best they can without, you know, obviously getting to the point where you're revealing too much about your strategy or, you know, getting into insider trading territory or something like that. They should consider the consequences and make better choices, and be clear about those choices when communicating them to the world at large.
So, if a company experiences a data breach, I would hope that after the initial investigation, which I think is still necessary, if nothing else just to get a handle on what the scope of the breach was, after that initial investigation, the company probably should come forward and let people know, so that they can in fact go out and try to make sure their stuff is as safe as it can be, and to help mitigate the amount of harm that could be done otherwise. Likewise, if a company decides to make a change in policy that's going to affect customers or employees, I would like to see them do that in a clear and honest way. When Apple admitted that it had been throttling the performance of older models of the iPhone, ostensibly to preserve the battery life of those old phones, that was a bit too late, because people had already figured out that Apple was throttling those old phones, and they had already assigned motivation to Apple. A lot of people believed that the whole reason it was done was that the company was trying to convince people to upgrade to a newer iPhone model. You know, you just sit there and say, wow, my old iPhone just doesn't run as fast as it used to, I guess I need a new one. That was what people were saying Apple's intent was, and maybe that was true. But assuming Tim Cook was sincere when he said that Apple wasn't trying to push people into buying new phones, but rather they were throttling the old ones so that they wouldn't burn through battery life so quickly (that was the official explanation), even if that's true, that explanation came too late. People had already made up and believed another story, which, again, might also possibly be true. Sometimes I think it's necessary for a company to make a change in policy in order to do business or to optimize the technology, and sometimes those changes, while necessary, are not popular with customers.
But I think it's for that reason that companies need to be more transparent about the whole thing. I think people are more likely to accept a change, even if it's begrudgingly, if they at least understand the reasoning behind the change. Otherwise they get the feeling that companies are either hoping their customers are ignorant or stupid or unobservant, that we're all marks, in other words, that we're all suckers. And if you feel a company holds you in contempt, you're not likely to feel particularly warm toward it. So I hope in twenty nineteen we see companies treat the general public with a little more respect in these policies. I also hope to see companies respond more quickly to concerns that have legitimacy behind them: if customers have a legitimate complaint about a product or service, or if employees object to corporate policies, or, and I think this is incredibly important, if an employee brings allegations of sexual harassment or sexual discrimination against a fellow employee or an executive. I want companies to treat those events with the attention they deserve, and not to be slow in doing so. In twenty eighteen there were stories of companies failing to do this, or at least failing to do it adequately and in a timely way, until public scrutiny and criticism essentially forced them to change their policies. Google was faced with enormous pressure, both from within the company and from outside the company, to change its mandatory arbitration policy. That was the policy that required employees to try and resolve conflicts, including allegations of sexual harassment and discrimination, internally, inside Google. That usually meant that very little was being done about the problems, and it perpetuated a harmful culture in some departments. Now, Google has since backed off this policy. They've decided to make the arbitration optional, not mandatory, which is a really good step.
135 00:07:47,000 --> 00:07:49,520 Speaker 1: I want to see companies like Google be more responsive, 136 00:07:49,600 --> 00:07:51,560 Speaker 1: and that is to say, I want there to be 137 00:07:51,720 --> 00:07:56,880 Speaker 1: a reckoning, but not like a reckoning against employees, but 138 00:07:57,240 --> 00:08:00,560 Speaker 1: against bad policies. I want these companies to prove that 139 00:08:00,600 --> 00:08:03,920 Speaker 1: they take these matters seriously and will investigate and react 140 00:08:03,920 --> 00:08:07,480 Speaker 1: in appropriate ways. I think that changing the culture within 141 00:08:07,560 --> 00:08:10,160 Speaker 1: those companies will be a step in the right direction 142 00:08:10,200 --> 00:08:13,400 Speaker 1: to create workplaces that are more productive and positive in general, 143 00:08:13,640 --> 00:08:17,240 Speaker 1: and then that improvement in workplace culture will manifest in 144 00:08:17,280 --> 00:08:20,240 Speaker 1: the actual work. So, in other words, companies are going 145 00:08:20,280 --> 00:08:23,840 Speaker 1: to get better results with better work environments. Now along 146 00:08:23,880 --> 00:08:26,560 Speaker 1: that same vein, I want to see a continuation of 147 00:08:26,600 --> 00:08:30,960 Speaker 1: a trend of seeing a greater representation in technology at 148 00:08:31,000 --> 00:08:35,720 Speaker 1: all levels for anyone who's not young and male. And 149 00:08:36,000 --> 00:08:39,880 Speaker 1: that's essentially what I mean by representation. Uh, young, male 150 00:08:39,960 --> 00:08:43,920 Speaker 1: and white would be even more specific, although in technology 151 00:08:43,960 --> 00:08:49,200 Speaker 1: we're seeing other folks besides just Caucasian men in the field. 152 00:08:49,400 --> 00:08:50,920 Speaker 1: I want to see more of that. I want to 153 00:08:50,920 --> 00:08:58,800 Speaker 1: see people who are of all ethnicities, genders, sexual orientations. 154 00:08:58,800 --> 00:09:02,080 Speaker 1: I want to see more of everybody in technology. I 155 00:09:02,120 --> 00:09:04,120 Speaker 1: think that the world of technology needs to be a 156 00:09:04,160 --> 00:09:07,840 Speaker 1: reflection of the world around us. That way, the stuff 157 00:09:07,880 --> 00:09:12,319 Speaker 1: that we're making tends to be representative of who we 158 00:09:12,400 --> 00:09:17,280 Speaker 1: are as a collective group, and it doesn't leave people out. 159 00:09:18,160 --> 00:09:21,280 Speaker 1: You also see less chance for things like bias to 160 00:09:21,400 --> 00:09:25,040 Speaker 1: be inserted into the design process. I talked about that 161 00:09:25,080 --> 00:09:29,400 Speaker 1: with artificial intelligence and machine learning and about how bias, 162 00:09:29,480 --> 00:09:33,120 Speaker 1: even unconscious bias, can find its way into these sorts 163 00:09:33,160 --> 00:09:37,760 Speaker 1: of systems and that can be harmful in the long run. Well, 164 00:09:38,000 --> 00:09:42,840 Speaker 1: with increased representation, that can be something we can reduce. 165 00:09:43,280 --> 00:09:45,960 Speaker 1: And I'm not talking about just throwing people in there 166 00:09:45,960 --> 00:09:48,000 Speaker 1: for the sake of throwing them in there. 
And I'm not talking about just throwing people in there for the sake of throwing them in there. There are lots of people who are very interested and qualified in these fields who are finding it difficult to get a real working job in those spaces. I want to see that change. So I also want to see changes in our various institutions to encourage greater representation, and I'm talking everything from education to employment to entertainment, because we've constructed this reality and then we've reinforced this reality, this idea that to be in technology means that you need to be a guy. That tends to be the messaging that we make with our various systems and even our entertainment and culture. I don't believe it's necessarily real. Some women have no interest in these fields, that's true. There are some women who have no interest in technology, but there are some men who don't have an interest in technology either. Some women are very much interested and accomplished in these fields. But these women must frequently work against a reinforced system that discourages their participation, working even harder than men are in order to just be in the field. I'm not saying that they have to work harder than men to succeed, but rather that they face a system that does not favor them, and they have to work to get past that. And I'm talking about women versus men, but a lot of these concepts apply to those other factors, like ethnicity or sexual orientation. We've constructed ideas of where people should fit based upon whatever categories they seem to belong to, and in my mind, that has been a huge problem, not just for people who find themselves having to overcome those preconceptions, but for all of us. Now, I really believe that technology improves as we include more people in the design and construction of that technology. So by encouraging a wider spectrum of people into the field, we bring in new perspectives, we bring in new ideas, new approaches. Our stuff improves because we're getting more of the best ideas and implementations.
When we discourage people from going into these fields that they are otherwise interested in, we're denying ourselves the benefit of their work. So remember, the first computers weren't electronic. They weren't even electromechanical. The first computers were women who were calculating ballistics tables for the military. Let's not do anything to discourage that representation. Now, that's the super heavy stuff I wanted to talk about. I got it all out of the way first thing. When we come back, we'll go on to some more general ideas about technology that aren't quite so heavy handed. But first, let's take a quick break to thank our sponsor.

Speaker 1: All right, let's talk about some of the other wishes I have for twenty nineteen. One of those wishes is something that a lot of people have been asking for for a very long time. I want to see, in twenty nineteen, Twitter introduce the option to edit a tweet within a certain amount of time of having posted that tweet. That way, when someone posts something that has a typo in it, or maybe it's not a fully formed thought, maybe they accidentally posted it before they had even finished writing out the tweet, maybe they posted something and thought, oh man, no, that really needs a little more context. Sure, you could add another tweet to that, you could thread tweets to explain it. But it would be great if you could edit it. It would be nice if you could do so so that the tweet isn't immediately seized upon by the general Twitter public, which is pretty judgmental. If you've been on Twitter, you can see that a bad tweet can get a lot of very negative reaction very quickly, and it might be the case that someone didn't intend for the tweet to come across the way it did. Now, in some of those cases it's pretty undeniable someone's trolling or just espousing terrible beliefs or thoughts or whatever. But in other cases, I think it really just genuinely comes across as a goof.
So it would be really nice if we could edit it. Now, I don't think the edit feature should allow people to change tweets years after they made those tweets. Maybe the time limit could be a couple of minutes from the posted tweet, just a couple of minutes max. And I wouldn't even mind if the tweet indicated that it had been edited, so that if anyone saw the tweet after the edit, it would actually show this tweet has been edited at such and such timestamp. That's fine. Just let me fix my typos without having to delete and repost a message or follow up with a "done goofed" tweet.
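Since no such Twitter feature exists, here is only a rough sketch of the behavior being wished for, with a hypothetical Tweet type, a two-minute window, and a visible edited-at timestamp, all mirroring the wish above:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

EDIT_WINDOW = timedelta(minutes=2)  # "just a couple of minutes max"

@dataclass
class Tweet:
    text: str
    posted_at: datetime
    edited_at: Optional[datetime] = None  # surfaced to readers when set

    def edit(self, new_text: str) -> bool:
        """Allow an edit only inside the window, and mark it visibly."""
        now = datetime.now(timezone.utc)
        if now - self.posted_at > EDIT_WINDOW:
            return False  # window closed: delete and repost instead
        self.text = new_text
        self.edited_at = now  # clients would show "edited at <timestamp>"
        return True

tweet = Tweet("Teh future is hard to predict", datetime.now(timezone.utc))
print(tweet.edit("The future is hard to predict"))  # True: inside the window
print(tweet.edited_at is not None)                  # True: flagged as edited
```

After the window closes, the only options are the familiar ones: delete and repost, or reply with a correction.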
We should also see in twenty nineteen some early deployment and rollout of five G technology. I haven't really talked about five G on TechStuff very much yet, so I'll have to do an episode on it pretty soon. But it is the next generation of wireless data transmission protocols, and it is a promising technology. We're gonna see some pretty wicked speed from this thing, like a gigabit per second download rate for users, which is really fast for wireless. I mean, I consider it really fast, period. I can't get a gigabit download on my service, and I have the best that I can possibly have in my neighborhood. I'm pretty sure we won't see global five G coverage until about twenty twenty at the earliest, but we should at least have some areas covered in twenty nineteen, and some technology incorporating five G in it. So my wish list is for me to be in that coverage area so I can change how I access the Internet at home. Yes, this is a selfish wish that is ultimately all about me.
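For perspective on that gigabit figure, the arithmetic is quick; the four-gigabyte movie below is just an example size, not a number from the episode:

```python
# Back-of-the-envelope: what a one-gigabit-per-second link means in practice.
link_gbps = 1.0                 # the advertised 5G download rate
movie_gb = 4.0                  # an example file size, in gigabytes
movie_gigabits = movie_gb * 8   # one byte is eight bits
seconds = movie_gigabits / link_gbps
print(f"{movie_gb} GB at {link_gbps} Gbps is about {seconds:.0f} seconds")
# 4.0 GB at 1.0 Gbps is about 32 seconds, under ideal conditions;
# real-world throughput would be lower.
```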
More generally, I look forward to seeing how five G can support new technologies, including expansive Internet of Things technologies. If I had one big wish for it, it would be that all the people building the tech that's going to be running on five G are doing so with security in mind, and that the tech would be really strong against hackers and people who want to leverage and exploit that system, because it presents an incredibly tempting target and a potentially rich environment for people to exploit. So we've got to be careful out there. I think we're gonna see a lot more large scale renewable energy projects come online in twenty nineteen, so my wish is that these are successful and can show how renewable energy, when properly deployed, can help offset our need to depend upon fossil fuels. I'm pretty much convinced that renewable energy has to be at least part of our strategy to meet our energy needs moving forward, at least until we can crack sustainable fusion. My wish is that the global community, the US included, would invest more in building out renewable energy power plants, and that we use that to decrease the load on fossil fuel burning power plants, rather than just generate and then gobble up extra energy. That is a big wish, because typically, when it comes to stuff like electricity, we just tend to use more if we generate more. We don't tend to say, oh, well, now we have more electricity than we need, so we can shift some of our production to renewables or whatever. So my wish is that we buck that trend and we really start moving off of fossil fuels as much as we can. And on a related note, my wish is that we as a whole will develop and adopt strategies to reduce carbon dioxide emissions, whether that's, you know, embracing electric vehicles more, and then also making sure that the way we generate electricity doesn't use a lot of fossil fuels, or also that we come up with more strategies to capture carbon from the atmosphere. And this would help us mitigate the effects of climate change. Now, I want to be clear: even if I'm being at my most optimistic, there's no way that we're going to reverse the trend on any sort of near term basis for climate change. For that, we're talking generations down the road. But we can work to reduce climate change's impact in the lifetime that we have, and then we can also set the stage for recovery for future generations. It requires us to be a little selfless, because we're gonna have to make some sacrifices along the way in order to make that happen, but I think it's absolutely necessary if we want to avoid global catastrophe. Another wish I hope to see fulfilled in twenty nineteen is some good advances in autonomous car technology. I did all those episodes where I talked about where we are and where we need to be. We've seen a ton of progress so far, don't get me wrong. I mean, I am incredibly impressed with where we are, but I also feel we're not nearly as far along as would be indicated by the launch of something like Waymo One, or at least the way it's marketed, or the deployment of Tesla's Autopilot feature. These things give us, I think, an unrealistic expectation of where we are with autonomous car technology. We're not nearly as far along as I think a lot of people just assume. Now, I hope that the various programs that are in place, like Waymo One, are able to gather more information and build out better and better models so that we can get to autonomous cars we feel safe with on the road in wide deployment. And I'm hoping we see a lot of that in twenty nineteen, that we see a lot of advancement toward that. I honestly don't think we're gonna get to a future where autonomous cars are going to integrate seamlessly with traffic for another probably five or six years,
I would guess, at least. And that's me being kind of conservative; I think, really, I'm actually being generous, because I think it's going to be more like ten years before we're really, really certain. And that's because driving is such a complicated process, and so many different things can happen that you can't easily plan for when you're building out an autonomous system. But I think we're going to see a lot more advancement in it next year, just not to the point where we're going to have super duper robot cars. Now, shortly after this episode airs, I'm going to be in Las Vegas, Nevada, to attend the twenty nineteen CES. So here are some things I hope I will see at CES. There are some things I know I'm gonna see. I'm going to see the latest in televisions. You know, they're gonna be ultra high definition, beautiful four K and eight K sets. I'm going to see all that kind of stuff there. I hope I'm going to see some truly cool innovations in some technology that's just kind of hitting a slump right now, stuff like virtual reality and augmented reality. You know, those just haven't really been able to take hold the way people were hoping just a couple of years ago, and I still think both technologies are really incredible. They're also technologies that tend to be hard to promote, for a lot of reasons. One is that they all tend to be pretty expensive. Most of the time, you have to pair the headset with some other piece of technology, such as a game console or a high performing computer or a smartphone. So if you're just starting out and you don't have any of these things, you may have to buy several pieces of equipment, not just the headset, or a headset and a controller. And for consoles and computers, that typically means you've also got a physical cable connecting the headset to the device.
There are some wireless solutions out there, but there aren't many, so I'm hoping I'll see some wireless tech on display for AR and VR at CES, though that in itself is a challenge when you go to a show like CES, because there are just hundreds of companies showing off wireless technologies, so there's a lot of potential interference at CES. But I hope to see it anyway. Beyond all these technical and economic challenges to getting VR and AR out the door, there's another big obstacle. It's really hard to promote these technologies without having someone actually experience them. It's one of those things that you kind of have to try to get a feel for how incredible it could be. Also, I tend to be a little reticent to try these kinds of technologies at CES, because you're in an enormous convention with tens of thousands of people, many of whom have tried that technology before you got there, which seems like a really good way to get conjunctivitis. But my wish is to see better VR and AR technology and some really cool applications this year at CES. There are a lot of outlets predicting that we're gonna see various foldable screens at CES, which could be cool. I'm talking about things like foldable phones and laptops that take advantage of OLED technology, screens that can actually fold themselves, so it's not just a hinged thing, but something that can fold and bend because of the nature of the OLED technology. My wish is to see really innovative designs that take full advantage of this quality before the novelty factor wears off.

The nice thing about CES wishes is that I'm going to find out if they come true or not right at the very start of the year, so that will get out of the way pretty early on. One other big wish is that I would love to see some innovative, engaging new social networking platforms emerge in twenty nineteen.
I feel like Facebook's year provided a lot of lessons to learn about how to conduct business, how to treat customers, both on the user side and advertisers. But I also think that once you have a really large established company like Facebook, it can be really hard to create actionable plans based on those lessons. It can be hard to enact those lessons. I'd love to see a new take on social networking that would be guided by those lessons, and I'd love to see that get traction. That being said, there have been plenty of social networking sites that have tried to rise up to challenge Facebook over the years, and none of them have had much success, at least not in the United States. Some of them were smaller endeavors, like Diaspora, but others were from big companies, like Apple, which had Ping (do you remember Ping?), or Google with Google Plus, which is gonna shut down in twenty nineteen after a couple of big data breaches became public. But someone's got to be able to come up with a new approach that's engaging enough to get people to adopt it. I mean, we think of Facebook as being enormous, but MySpace was enormous before Facebook came along, so it's not unprecedented. It just requires a social networking site that is compelling, it's easy to use, you know, it's well designed, and it's very transparent in its policies. That's what's needed. And even then, after that, you still have to convince people to go over there and use it. That's the toughest part, I guess. But I'm hoping that we see that. If nothing else, it would add more pressure on Facebook to make changes that would be positive for all of its users.

I have a few more wishes I want to share, but first let's take another quick break to thank our sponsor.

Speaker 1: I hope to see a lot more focus and enthusiasm around science and technology from an educational perspective in twenty nineteen. We're starting to see some of that stuff now, which is great. I want that trend to continue.
There's a fun documentary from twenty eighteen called Science Fair, which in parts seems to revel in absurdity, because in some cases they're likening science to something like being a rock star. But I think that's awesome, and to be fair, I think the documentary brings that out too. I don't think they're making fun of the kids, and it's not like this is unprecedented. There were figures like Thomas Edison, Nikola Tesla, Albert Einstein. They were all treated as not just brilliant people, but celebrities. I think elevating science and technology in this way, not as a cult of personality, mind you, but as something that's just really interesting to pursue, is a fantastic idea. I want to see more people from all backgrounds encouraged to dive into that world. I want to see schools and organizations invest in it to give young people the chance to try out new ideas. Now, I'm nowhere near the first person to say this, but one of the most amazing things about young people is that they don't know what is impossible, so sometimes they find ways to accomplish things that older people like myself have long dismissed as being outside of our capabilities. And so, time and again, young people prove that old folks like me are wrong, that we're narrow minded, that we're not considering all the potential, and that the impossible is in fact achievable in some cases. This is a source of unending inspiration for me, and so I want to see twenty nineteen bring with it a culture that supports and encourages that kind of participation.

In the field of entertainment, getting into some real fluffy stuff, I would love to see some hopeful science fiction. I think dark science fiction definitely has its place, and I love it. A lot of my favorite novels are in that kind of dystopian sci-fi realm, like Nineteen Eighty-Four, Brave New World, or Fahrenheit 451.
Science fiction has often been the vehicle that authors have used to warn us about potentially catastrophic scenarios that could come about, sometimes due to misuse of technology, sometimes despite our ability to use technology. And I don't think that those stories should stop. We have a need for stuff like Black Mirror that reminds us that we need to be careful. But I'd also love to see more hopeful, aspirational science fiction stories that contain not only conflict, but wonder and innovation. I'd like to see something closer to Star Trek: The Next Generation. Now, I think it's good to have both kinds of stories out there. On the one hand, you want to remind everyone that technology by itself can be misused, or it can be designed in such a way that it causes harm, whether intentionally or otherwise. But on the other hand, I think we also have a need for inspirational stories that remind us that this isn't necessarily the only outcome for the future. We can shape that future through our choices, and if we make good choices, we're more likely to have a good future as a result. So here's to hoping for more aspirational sci-fi in twenty nineteen.

We're gonna see several new premium streaming video services emerge in twenty nineteen, and we're not really sure what's going to happen with Hulu next year as Disney asserts majority control over the service. My hope, though, is that we continue to see innovative programming on these various services. Now, it can be frustrating as a consumer to see the proliferation of so many services scattering the various types of content we want to see across numerous subscriptions. I mean, one of the big motivating factors for cord cutting is that you don't want to pay for all the stuff you don't want to see.
But then, if all the stuff you want to see gets spread out over competing services, the only way to see all of it is to subscribe to all those competing services, which just seems like it's complicated in a different way. So my hope is that more storytellers are going to get cool opportunities to bring their ideas to life thanks to those streaming services. I don't wish for any of them to necessarily just go away. I mean, obviously, people's livelihoods depend upon these things. I want it to work in such a way that it makes sense for consumers, and I want to see more stories get a chance to come to life, though I'm sure I'll still be grouchy about how many of those services I'll be subscribing to, because when I see that notification pop up in my email each month, I just think, but I need them. Oh, and since Disney has now acquired Fox, I kind of hope we start seeing stuff like the X-Men get incorporated into the Marvel Cinematic Universe. That's kind of outside the scope of TechStuff; that's just what Jonathan hopes to see in theaters moving forward. But sticking with entertainment, I'm going to get really granular here. Here's another Jonathan-specific wish. I hope that Bethesda is able to sort out and salvage Fallout 76. I'm one of a relatively small percentage of players who enjoys the game, or at least I haven't grown so frustrated that I won't play the game. But I think anyone who is being intellectually honest has to admit that game has a lot of issues. Bethesda has already started to address some of those; they've rolled out various patches. I would love to see Fallout 76 reach a point where people playing the game don't feel like they have to justify their decision to play the game. So, in other words, I want to see Fallout 76 reach a point at which the average critic would say it's a good game.
Maybe not a great game, but a good game, by the end of twenty nineteen. That might take a lot of work, considering how hard a lot of critics slagged this game, and again, I can't blame them for their reactions. I don't hate it, but I definitely see the reasoning behind the criticism. I'm also hoping that we see many more high quality podcasts emerge in twenty nineteen, whether they are ongoing series or limited runs. One of the things I love about podcasts is that they can give people who have incredible stories to tell a platform upon which they can share those stories. Whether it's an investigative podcast that dives deep into a subject or an event in order to tell a really compelling story, or it's a more general podcast that covers topics like technology, or it's a comedy podcast that's just really made to make people laugh, we're in a real golden age of content right now. There's way too much out there right now for anyone to listen to all of it. But I'm still hoping that twenty nineteen will bring with it some amazing shows with different voices telling important or entertaining stories, or that some of those podcasts that are already out there and are already amazing but are largely unknown can rise up to the surface so that more people can discover them. I still really love listening to podcasts. Even as a podcaster who does this all the time, I still think it's one of the most entertaining forms of connecting with people. So I definitely want to see more of those. As for things I'm not wishing for in twenty nineteen, I'm not wishing for flying cars. I know people are still working on that, but honestly, we are still dealing with trying to get terrestrial autonomous cars working. I can't even imagine what happens when we get the flying cars in there.
In 569 00:32:22,400 --> 00:32:25,680 Speaker 1: some ways, flying can actually be easier for automated systems, 570 00:32:25,720 --> 00:32:28,680 Speaker 1: particularly when you're at a high enough altitude where there 571 00:32:28,720 --> 00:32:32,000 Speaker 1: are very few potential obstacles. But once you talk about 572 00:32:32,040 --> 00:32:34,280 Speaker 1: a vehicle that should be able to operate both on 573 00:32:34,400 --> 00:32:38,080 Speaker 1: the ground level and in the air above, say, a city, 574 00:32:38,320 --> 00:32:41,440 Speaker 1: you encounter some really difficult problems that we absolutely have 575 00:32:41,600 --> 00:32:44,440 Speaker 1: to solve before we can safely roll out, so to speak, 576 00:32:44,880 --> 00:32:49,320 Speaker 1: so ambitious a technology. I don't necessarily think flying cars 577 00:32:49,360 --> 00:32:52,120 Speaker 1: will never be a thing, but I think we just 578 00:32:52,200 --> 00:32:54,160 Speaker 1: need to kind of put that on the back burner 579 00:32:54,240 --> 00:32:56,880 Speaker 1: for like a decade or so to make sure we've 580 00:32:56,920 --> 00:33:00,640 Speaker 1: got the biggest challenges sorted out. I'm also not wishing 581 00:33:00,680 --> 00:33:04,200 Speaker 1: for more marketing speak around the concept of digital transformation. 582 00:33:04,440 --> 00:33:07,520 Speaker 1: Now, I can get behind the idea. 583 00:33:07,640 --> 00:33:13,000 Speaker 1: Generally speaking, it suggests that we leverage technology, particularly digital technologies, 584 00:33:13,280 --> 00:33:15,800 Speaker 1: to do work and solve problems. It's sort of an 585 00:33:15,800 --> 00:33:19,480 Speaker 1: extension of the old paperless office concept, in which computers 586 00:33:19,520 --> 00:33:21,040 Speaker 1: would remove the need for us to have all those 587 00:33:21,120 --> 00:33:25,360 Speaker 1: darned paper files and everything. But the marketing around this 588 00:33:25,480 --> 00:33:28,720 Speaker 1: concept tends to be very shallow. It tends to 589 00:33:28,720 --> 00:33:31,200 Speaker 1: be, let's throw technology at the problem and that will 590 00:33:31,240 --> 00:33:34,520 Speaker 1: fix everything, which I don't think is ever really true. 591 00:33:34,840 --> 00:33:39,120 Speaker 1: I've seen businesses, organizations, and schools jump on this kind 592 00:33:39,160 --> 00:33:41,920 Speaker 1: of idea because it almost seems like it's a shortcut 593 00:33:42,000 --> 00:33:46,160 Speaker 1: to success, right? Oh, if we just throw computers 594 00:33:46,440 --> 00:33:49,160 Speaker 1: at it, things will be better. But it's not just 595 00:33:49,280 --> 00:33:52,440 Speaker 1: the technology that makes things better. It's the application of 596 00:33:52,520 --> 00:33:57,000 Speaker 1: technology that's really important. Not the acquisition of tech, 597 00:33:57,520 --> 00:33:59,520 Speaker 1: but the way you apply it. So I'd like to 598 00:33:59,520 --> 00:34:04,520 Speaker 1: see more attention devoted to implementing technology purposefully and thoughtfully, 599 00:34:04,640 --> 00:34:07,200 Speaker 1: rather than just, let's get the coolest, newest toys 600 00:34:07,240 --> 00:34:09,400 Speaker 1: that are out there and give them to everybody. That 601 00:34:09,440 --> 00:34:12,640 Speaker 1: doesn't necessarily fix everything.
In fact, at least in the 602 00:34:12,680 --> 00:34:14,640 Speaker 1: short term, it can make things worse as everyone just 603 00:34:14,680 --> 00:34:18,480 Speaker 1: tries to figure out how the darn thing works. And 604 00:34:18,560 --> 00:34:21,480 Speaker 1: I don't think we're gonna see any enormous leaps in 605 00:34:21,680 --> 00:34:26,040 Speaker 1: AI in twenty nineteen. We'll see improvement, obviously, that will continue. 606 00:34:26,080 --> 00:34:29,000 Speaker 1: We'll see innovation, we'll see interesting ways in which people 607 00:34:29,000 --> 00:34:31,759 Speaker 1: are going to apply AI. But I don't think we're 608 00:34:31,760 --> 00:34:34,840 Speaker 1: gonna get the world's first self-aware, conscious machine or 609 00:34:34,840 --> 00:34:36,520 Speaker 1: anything like that. But this is a wish list, not 610 00:34:36,560 --> 00:34:41,399 Speaker 1: a predictions list. So my wish is that if we 611 00:34:41,640 --> 00:34:46,279 Speaker 1: do get some sort of self-aware, conscious machine, which, 612 00:34:46,320 --> 00:34:48,359 Speaker 1: by the way, I think is almost impossible, but if 613 00:34:48,440 --> 00:34:51,239 Speaker 1: we do get it, my wish is that it is 614 00:34:51,280 --> 00:34:56,400 Speaker 1: a genuinely helpful construct, one that will augment our ability 615 00:34:56,480 --> 00:34:59,400 Speaker 1: to learn and to grow as people. I would absolutely 616 00:34:59,400 --> 00:35:02,600 Speaker 1: love for AI to help us become better human beings. That 617 00:35:02,600 --> 00:35:05,400 Speaker 1: would be my wish. And I know that's a 618 00:35:05,480 --> 00:35:09,319 Speaker 1: huge wish on top of the possibility of this AI 619 00:35:09,560 --> 00:35:12,719 Speaker 1: even existing. To make it a benevolent one that makes 620 00:35:12,800 --> 00:35:15,640 Speaker 1: us better people, that's a huge, huge thing. It's not 621 00:35:15,680 --> 00:35:18,399 Speaker 1: even close to being listed as one of the more 622 00:35:18,480 --> 00:35:22,680 Speaker 1: likely scenarios in the wake of a sophisticated artificial intelligence emerging. 623 00:35:23,239 --> 00:35:26,400 Speaker 1: But we are talking about wishes here. What about you, guys? 624 00:35:26,600 --> 00:35:29,440 Speaker 1: What do you wish for in twenty nineteen? What are 625 00:35:29,440 --> 00:35:32,880 Speaker 1: you hoping to see or experience? What are you hoping 626 00:35:32,880 --> 00:35:35,320 Speaker 1: will happen in the world of tech? I'm curious to 627 00:35:35,400 --> 00:35:38,680 Speaker 1: hear your thoughts. Also, if you have any suggestions 628 00:35:38,680 --> 00:35:41,320 Speaker 1: for future episodes of tech Stuff, whether it's a company 629 00:35:41,360 --> 00:35:44,680 Speaker 1: or a technology, or maybe there's someone I should interview, let me know. 630 00:35:44,880 --> 00:35:47,759 Speaker 1: Send me an email. The address is tech Stuff at how 631 00:35:47,800 --> 00:35:51,359 Speaker 1: stuff works dot com, or drop by our website, that's 632 00:35:51,440 --> 00:35:54,759 Speaker 1: tech Stuff podcast dot com. You're gonna find other ways 633 00:35:54,800 --> 00:35:58,000 Speaker 1: to contact me there, including social media. There's also a 634 00:35:58,040 --> 00:36:00,760 Speaker 1: link to our merchandise store that's over at TeePublic 635 00:36:00,840 --> 00:36:04,239 Speaker 1: dot com slash tech stuff. Every item you purchase goes 636 00:36:04,280 --> 00:36:06,799 Speaker 1: to help the show, and we greatly appreciate it.
And 637 00:36:06,840 --> 00:36:15,640 Speaker 1: I'll talk to you again really soon. For more on this 638 00:36:15,640 --> 00:36:18,160 Speaker 1: and thousands of other topics, visit how stuff works 639 00:36:18,160 --> 00:36:28,480 Speaker 1: dot com