1 00:00:04,440 --> 00:00:12,320 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:15,600 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:15,640 --> 00:00:19,239 Speaker 1: I'm an executive producer with iHeartRadio, and how the tech 4 00:00:19,320 --> 00:00:23,200 Speaker 1: are ya? Here we are on a Friday again. That 5 00:00:23,280 --> 00:00:27,560 Speaker 1: means it's time for a classic episode. This episode originally 6 00:00:27,640 --> 00:00:31,680 Speaker 1: published on June fifteenth, twenty sixteen. It is called AI 7 00:00:31,960 --> 00:00:36,720 Speaker 1: Assistants and You. This is clearly an episode that could 8 00:00:36,880 --> 00:00:40,720 Speaker 1: use an update now. Twenty sixteen, man, boy, we've got 9 00:00:40,760 --> 00:00:43,400 Speaker 1: a lot more to say about AI assistants these days, 10 00:00:43,400 --> 00:00:46,960 Speaker 1: but I'll leave that for the outro. Let's take a listen. 11 00:00:48,040 --> 00:00:53,320 Speaker 2: So we want to talk today. We being me and you. 12 00:00:53,320 --> 00:00:54,080 Speaker 2: You can talk back. 13 00:00:54,120 --> 00:00:57,000 Speaker 1: I just won't be able to hear you about these 14 00:00:57,040 --> 00:01:00,800 Speaker 1: personal digital assistants, but not the PDAs of the past. 15 00:01:00,840 --> 00:01:04,000 Speaker 1: We want to talk about the Siris and the Cortanas 16 00:01:04,000 --> 00:01:07,360 Speaker 1: and the Google Assistants and things of that nature. And 17 00:01:07,400 --> 00:01:10,399 Speaker 1: I want to specifically look into how are these going 18 00:01:10,440 --> 00:01:13,160 Speaker 1: to be incorporated into our lives in the future, and 19 00:01:13,200 --> 00:01:15,000 Speaker 1: what are some of the concerns we have and what 20 00:01:15,160 --> 00:01:19,720 Speaker 1: differentiates all these products that have been sort of coming 21 00:01:19,800 --> 00:01:20,840 Speaker 1: into their own 22 00:01:20,720 --> 00:01:21,959 Speaker 2: over the past few years. 23 00:01:22,600 --> 00:01:24,760 Speaker 1: So to start with, you might say, well, you know, 24 00:01:24,880 --> 00:01:30,399 Speaker 1: which of these assistants came first? And arguably you could 25 00:01:30,400 --> 00:01:34,120 Speaker 1: say Google actually beat everyone to the punch by a 26 00:01:34,160 --> 00:01:37,840 Speaker 1: couple of months, because on June fourteenth, twenty eleven, Google 27 00:01:37,920 --> 00:01:41,600 Speaker 1: announced at an Inside Google Search event that it was 28 00:01:41,640 --> 00:01:45,039 Speaker 1: going to roll out voice search on Google dot com. 29 00:01:45,840 --> 00:01:48,000 Speaker 2: And the project name at Google was 30 00:01:49,840 --> 00:01:53,040 Speaker 1: Majel or Magel, depending upon how you want to pronounce it, 31 00:01:53,040 --> 00:01:56,360 Speaker 1: but Majel would be the way her name was actually pronounced, 32 00:01:56,640 --> 00:01:59,880 Speaker 1: named after Majel Barrett, who was the wife of Gene Roddenberry, 33 00:01:59,880 --> 00:02:03,200 Speaker 1: the creator of Star Trek. Majel Barrett actually played the 34 00:02:03,320 --> 00:02:06,440 Speaker 1: voice of the computer system, particularly on Star Trek: The 35 00:02:06,440 --> 00:02:08,680 Speaker 1: Next Generation. Whenever you heard the computer speak, that was 36 00:02:08,720 --> 00:02:15,560 Speaker 1: Majel Barrett's voice. She also played Deanna Troi's mother, Lwaxana Troi.
Anyway, 37 00:02:16,400 --> 00:02:21,160 Speaker 1: they named it after her. Internally, it actually doesn't have 38 00:02:21,720 --> 00:02:24,280 Speaker 1: a name name, which kind of sets it apart from 39 00:02:24,280 --> 00:02:27,720 Speaker 1: some of the competitors. So the Voice Command project was 40 00:02:27,840 --> 00:02:32,000 Speaker 1: a tool from Google Labs, so their research and development 41 00:02:32,120 --> 00:02:35,560 Speaker 1: arm and on March twenty fourth, twenty fourteen, this particular 42 00:02:35,600 --> 00:02:40,960 Speaker 1: feature was rolled into the Google Now product and it 43 00:02:41,040 --> 00:02:43,480 Speaker 1: was part of the Android four point one release. 44 00:02:43,520 --> 00:02:45,400 Speaker 2: That was the jelly Bean release. 45 00:02:46,280 --> 00:02:49,280 Speaker 1: Now, at that point, the speech recognition commands had evolved 46 00:02:49,320 --> 00:02:51,639 Speaker 1: a little bit. It had gone beyond some of the 47 00:02:52,040 --> 00:02:54,640 Speaker 1: initial stuff where you could just ask Google to search 48 00:02:54,680 --> 00:02:57,480 Speaker 1: something for you. This was also a feature that was 49 00:02:57,520 --> 00:03:00,239 Speaker 1: worked into Google Glass, so if you had a pair 50 00:03:00,240 --> 00:03:03,600 Speaker 1: of Google Glass, you know that the voice command would 51 00:03:03,600 --> 00:03:08,359 Speaker 1: always start with the phrase okay, followed by Google I'm 52 00:03:08,400 --> 00:03:10,560 Speaker 1: not going to say it together, just in case some 53 00:03:10,600 --> 00:03:13,120 Speaker 1: of you are listening to your devices or listening with 54 00:03:13,160 --> 00:03:15,360 Speaker 1: a device nearby and it's on its home screen, I 55 00:03:15,360 --> 00:03:18,920 Speaker 1: don't want to activate it for whatever reason, but you 56 00:03:18,919 --> 00:03:21,760 Speaker 1: could use that phrase that would end up alerting the 57 00:03:22,280 --> 00:03:24,840 Speaker 1: virtual assistant that you wanted something, and then you would 58 00:03:24,880 --> 00:03:29,160 Speaker 1: speak whatever it was you wanted. And over time functionality increase, 59 00:03:29,240 --> 00:03:32,280 Speaker 1: so it went beyond just searches and into more interactive 60 00:03:32,320 --> 00:03:35,240 Speaker 1: features like with an Android phone, you could set an alarm, 61 00:03:35,440 --> 00:03:39,520 Speaker 1: or you could set a reminder or review your calendar 62 00:03:39,920 --> 00:03:44,520 Speaker 1: and more as time went on. At this point it 63 00:03:44,560 --> 00:03:47,240 Speaker 1: has evolved into something a little bit more robust than that. 64 00:03:47,560 --> 00:03:49,920 Speaker 1: Even you can start to interact with some third party 65 00:03:49,960 --> 00:03:54,400 Speaker 1: stuff as well, and at Google Io twenty sixteen it 66 00:03:54,520 --> 00:03:58,920 Speaker 1: became part of Google Assistant. Now Google Assistant is really 67 00:03:59,000 --> 00:04:04,400 Speaker 1: the intelligent personal assistant product from Google. 68 00:04:04,520 --> 00:04:05,839 Speaker 2: The earlier versions. 69 00:04:05,560 --> 00:04:09,120 Speaker 1: You could think of as sort of a rudimentary form 70 00:04:09,560 --> 00:04:12,600 Speaker 1: or perhaps a prototype, or maybe just like these are 71 00:04:12,600 --> 00:04:16,719 Speaker 1: features that would eventually be rolled all into one finished product, 72 00:04:17,080 --> 00:04:20,600 Speaker 1: being Google Assistant. 
So by that argument, if you say 73 00:04:20,600 --> 00:04:23,920 Speaker 1: Google Assistant, you know, if you mark the Google I/O 74 00:04:24,000 --> 00:04:27,880 Speaker 1: twenty sixteen event as its premiere, then it's not the oldest, 75 00:04:28,120 --> 00:04:33,080 Speaker 1: but it dates back to June fourteenth, twenty eleven, when 76 00:04:33,120 --> 00:04:38,839 Speaker 1: Google announced this initial voice search ability. So that 77 00:04:39,000 --> 00:04:43,560 Speaker 1: same year, in October, on October fourteenth, in fact, Apple 78 00:04:43,720 --> 00:04:44,800 Speaker 1: introduced Siri. 79 00:04:46,120 --> 00:04:47,240 Speaker 2: And I'm sure you all know what 80 00:04:47,279 --> 00:04:49,719 Speaker 1: Siri is, but just in case you don't, it's billed 81 00:04:49,760 --> 00:04:52,960 Speaker 1: as an intelligent personal assistant and it was introduced as 82 00:04:53,000 --> 00:04:56,000 Speaker 1: a feature with the iPhone four S and it's been 83 00:04:56,040 --> 00:05:00,240 Speaker 1: part of the iPhone iOS ecosystem ever since. It uses 84 00:05:00,240 --> 00:05:03,320 Speaker 1: speech recognition to interpret user requests and responds with what 85 00:05:03,440 --> 00:05:08,839 Speaker 1: is hopefully an appropriate action. According to Siri's creators, Apple actually 86 00:05:08,839 --> 00:05:11,920 Speaker 1: scaled back what Siri was supposed to be able to do. 87 00:05:12,600 --> 00:05:15,360 Speaker 1: They said that they had arranged for Siri to work 88 00:05:15,560 --> 00:05:20,080 Speaker 1: with about forty to forty five different apps that Apple had, 89 00:05:20,800 --> 00:05:24,919 Speaker 1: and then the company scaled that back significantly. So the 90 00:05:25,120 --> 00:05:29,520 Speaker 1: Siri creators essentially sold the product to Apple. Then they 91 00:05:29,520 --> 00:05:33,560 Speaker 1: went on to create a different intelligent assistant called Viv, 92 00:05:33,839 --> 00:05:38,520 Speaker 1: and Viv is currently unaffiliated with any other big names, 93 00:05:38,520 --> 00:05:41,119 Speaker 1: but it has received funding from some very wealthy folks 94 00:05:41,200 --> 00:05:44,960 Speaker 1: in the tech sphere, like Mark Zuckerberg, for example. And 95 00:05:45,480 --> 00:05:48,320 Speaker 1: Viv, the creators say, is 96 00:05:48,320 --> 00:05:50,479 Speaker 1: what Siri was supposed to be from the get-go, 97 00:05:50,640 --> 00:05:54,040 Speaker 1: and essentially they're saying that Siri was kind of hampered, hamstrung, 98 00:05:54,080 --> 00:05:57,880 Speaker 1: if you will, by Apple, and we'll get into more 99 00:05:57,880 --> 00:06:00,000 Speaker 1: about why that may be in a little bit. 100 00:06:01,480 --> 00:06:02,760 Speaker 2: So Siri actually came 101 00:06:02,720 --> 00:06:07,159 Speaker 1: second after Google had announced their voice search, keeping in 102 00:06:07,200 --> 00:06:09,920 Speaker 1: mind that Siri was a different presentation. So you could 103 00:06:10,000 --> 00:06:12,800 Speaker 1: argue that Siri was really more of the first assistant, 104 00:06:13,760 --> 00:06:18,039 Speaker 1: and that the Google approach eventually evolved into an assistant, 105 00:06:18,440 --> 00:06:22,040 Speaker 1: but wasn't really at that same level back in twenty eleven.
106 00:06:23,520 --> 00:06:26,840 Speaker 1: Moving forward, in spring twenty fourteen, that's when Microsoft got 107 00:06:26,880 --> 00:06:30,760 Speaker 1: into the game by unveiling Cortana, which is their intelligent 108 00:06:30,800 --> 00:06:34,640 Speaker 1: assistant for the Windows Phone platform, and in twenty fifteen, 109 00:06:34,720 --> 00:06:37,960 Speaker 1: Microsoft included Cortana with Windows ten. So if you have 110 00:06:38,000 --> 00:06:40,080 Speaker 1: a Windows ten machine, Cortana 111 00:06:39,720 --> 00:06:40,279 Speaker 2: is part of that. 112 00:06:40,880 --> 00:06:43,279 Speaker 1: And if you have a microphone you can actually give 113 00:06:43,360 --> 00:06:46,200 Speaker 1: voice commands to Cortana. You can also interact via text. 114 00:06:47,520 --> 00:06:50,640 Speaker 1: Cortana is named after the AI in the Halo franchise 115 00:06:50,680 --> 00:06:53,440 Speaker 1: and is voiced by the same actress who provided the 116 00:06:53,520 --> 00:06:55,520 Speaker 1: voice of Cortana in the games. So you can ask 117 00:06:55,600 --> 00:06:57,880 Speaker 1: fun things about Master Chief and she always has an 118 00:06:58,760 --> 00:07:02,320 Speaker 1: interesting answer for those. All of these, by the way, 119 00:07:03,200 --> 00:07:06,040 Speaker 1: tend to have some sort of fun element to them, 120 00:07:06,040 --> 00:07:11,080 Speaker 1: where the developers clearly thought of ridiculous things you could 121 00:07:11,120 --> 00:07:16,160 Speaker 1: ask the digital assistants and built in responses that were humorous. 122 00:07:16,480 --> 00:07:17,320 Speaker 2: For example, the 123 00:07:17,280 --> 00:07:19,720 Speaker 1: big one that everyone talked about with Siri was where 124 00:07:19,760 --> 00:07:22,040 Speaker 1: can I hide a body? And Siri would come back 125 00:07:22,080 --> 00:07:25,600 Speaker 1: with nearby quarries and cave 126 00:07:25,400 --> 00:07:26,880 Speaker 2: systems and things of that nature. 127 00:07:28,200 --> 00:07:32,320 Speaker 1: Now, in November twenty fourteen, we get our final big 128 00:07:32,440 --> 00:07:37,240 Speaker 1: name in this battle, Amazon. That's when Amazon unveiled the Echo, 129 00:07:37,480 --> 00:07:41,720 Speaker 1: which is that sort of standalone speaker system that has 130 00:07:41,800 --> 00:07:46,120 Speaker 1: the intelligent assistant Alexa incorporated into it, and like the 131 00:07:46,160 --> 00:07:48,560 Speaker 1: other ones I've mentioned so far, Alexa can follow your 132 00:07:48,640 --> 00:07:51,280 Speaker 1: voice commands and interact with the Internet as well as 133 00:07:51,320 --> 00:07:55,080 Speaker 1: with other Internet connected devices. That list of Internet connected 134 00:07:55,080 --> 00:07:58,440 Speaker 1: devices Alexa can work with is growing day by day, 135 00:07:59,040 --> 00:08:01,880 Speaker 1: and Amazon's actually trying to build out the capabilities further, 136 00:08:02,000 --> 00:08:04,000 Speaker 1: and as such has hired a team to create a 137 00:08:04,080 --> 00:08:08,000 Speaker 1: guide on how to develop for Alexa. I'm going to 138 00:08:08,040 --> 00:08:10,520 Speaker 1: interview one of the developers on that team in a 139 00:08:10,560 --> 00:08:15,120 Speaker 1: later episode.
We actually have that scheduled for later this summer, 140 00:08:15,600 --> 00:08:18,080 Speaker 1: and we'll talk more about what it's like to develop 141 00:08:18,320 --> 00:08:21,640 Speaker 1: for this platform and the potential of using such a 142 00:08:21,680 --> 00:08:25,800 Speaker 1: platform in new and creative ways. So we have four 143 00:08:25,960 --> 00:08:29,440 Speaker 1: really big players in the space. We've got Apple and 144 00:08:29,560 --> 00:08:34,840 Speaker 1: Google and Microsoft and Amazon already vuying to be the 145 00:08:34,880 --> 00:08:38,600 Speaker 1: big digital assistant provider. Then we have the other names, 146 00:08:38,640 --> 00:08:42,000 Speaker 1: like we've got the team behind viv and other apps 147 00:08:42,000 --> 00:08:44,520 Speaker 1: as well that are in this space that are trying 148 00:08:44,559 --> 00:08:47,679 Speaker 1: to kind of become the voice that you interact with 149 00:08:48,920 --> 00:08:50,800 Speaker 1: so that it can do all the things you needed 150 00:08:50,880 --> 00:08:53,720 Speaker 1: to do in a as seamless a way as possible. 151 00:08:55,080 --> 00:08:57,800 Speaker 1: So one of the things we need to also look 152 00:08:57,800 --> 00:09:00,720 Speaker 1: at is how does this differentiate, How do these different players. 153 00:09:01,040 --> 00:09:04,400 Speaker 1: How are they different from one another? If they're exactly 154 00:09:04,440 --> 00:09:06,840 Speaker 1: the same as each other, then it really doesn't matter 155 00:09:06,880 --> 00:09:09,480 Speaker 1: which one you pick, right, I mean, it kind of 156 00:09:09,520 --> 00:09:11,320 Speaker 1: depends just which platform. 157 00:09:10,880 --> 00:09:11,640 Speaker 2: You have available. 158 00:09:11,640 --> 00:09:15,400 Speaker 1: If you have all iOS devices, then Siri is pretty 159 00:09:15,600 --> 00:09:17,040 Speaker 1: much going to be the one you're going to depend 160 00:09:17,120 --> 00:09:22,240 Speaker 1: upon the most, most likely at any rate. So Cortana, Siri, 161 00:09:22,360 --> 00:09:26,280 Speaker 1: and Google Assistant are all part of existing platforms like 162 00:09:26,400 --> 00:09:32,079 Speaker 1: smartphones and computers, So they are incorporated into things that 163 00:09:32,400 --> 00:09:37,520 Speaker 1: we already have. You you probably already have a smartphone 164 00:09:37,720 --> 00:09:40,800 Speaker 1: or a computer or both, and so it makes sense 165 00:09:40,880 --> 00:09:43,880 Speaker 1: that you would incorporate your digital assistant into that. You 166 00:09:43,880 --> 00:09:47,160 Speaker 1: don't have to buy anything else, it's right there, and 167 00:09:47,200 --> 00:09:50,400 Speaker 1: you can incorporate that into other systems that are connected 168 00:09:50,400 --> 00:09:54,160 Speaker 1: to a personal network or a home network. Then you've 169 00:09:54,200 --> 00:10:00,520 Speaker 1: got Alexa, which debuted on a standalone device called the Echo, 170 00:10:01,400 --> 00:10:03,880 Speaker 1: which again is just this sort of intelligent speaker, a 171 00:10:03,920 --> 00:10:07,760 Speaker 1: smart speaker with a built in microphone. 
Google Assistant is 172 00:10:07,760 --> 00:10:10,360 Speaker 1: actually following suit with that with Google Home that was 173 00:10:10,400 --> 00:10:13,480 Speaker 1: announced at Google I/O twenty sixteen, and Google Home is 174 00:10:13,520 --> 00:10:16,559 Speaker 1: also a smart speaker with a microphone that's going to 175 00:10:16,559 --> 00:10:19,160 Speaker 1: be available sometime later in twenty sixteen, and as of 176 00:10:19,200 --> 00:10:21,680 Speaker 1: the recording of this podcast, I don't have a date 177 00:10:21,840 --> 00:10:24,880 Speaker 1: or a price on that, so it's hard to say 178 00:10:24,920 --> 00:10:27,480 Speaker 1: whether or not it will be competitively priced against the 179 00:10:27,520 --> 00:10:28,599 Speaker 1: Amazon Echo. 180 00:10:29,120 --> 00:10:30,160 Speaker 2: It does look like it's going 181 00:10:30,080 --> 00:10:34,280 Speaker 1: to be a particularly powerful version of this personal assistant. 182 00:10:35,080 --> 00:10:37,600 Speaker 1: And there are also rumors emerging that Apple is also 183 00:10:37,640 --> 00:10:42,439 Speaker 1: working on Siri hardware, so it'd be another standalone speaker 184 00:10:42,520 --> 00:10:47,760 Speaker 1: microphone system of some sort, and that Apple's Siri platform 185 00:10:47,800 --> 00:10:51,400 Speaker 1: would exist on that. Now, as of the recording of 186 00:10:51,400 --> 00:10:53,720 Speaker 1: this podcast, we don't have confirmation on that, so there's 187 00:10:53,760 --> 00:10:57,680 Speaker 1: no timetable associated with such a thing, or a price. 188 00:10:58,160 --> 00:11:01,320 Speaker 1: I would expect that any announcement of such a device 189 00:11:01,360 --> 00:11:04,240 Speaker 1: would come at one of Apple's big events. So probably, 190 00:11:04,920 --> 00:11:09,640 Speaker 1: if I had to guess, I'd say September twenty sixteen 191 00:11:09,760 --> 00:11:12,200 Speaker 1: is when they would announce it. That's typically when they 192 00:11:12,200 --> 00:11:17,439 Speaker 1: announce all the big iPhone changes. But that's just a guess. 193 00:11:17,600 --> 00:11:20,600 Speaker 1: They might hold a single event for this particular thing, 194 00:11:20,679 --> 00:11:22,000 Speaker 1: or they might not hold an event at all. They 195 00:11:22,000 --> 00:11:25,200 Speaker 1: may just release it. That doesn't seem particularly Apple like, 196 00:11:25,360 --> 00:11:29,280 Speaker 1: but it's a possibility. So what's the big deal with 197 00:11:29,320 --> 00:11:32,600 Speaker 1: this technology in the first place? Why should we care? Well, 198 00:11:32,600 --> 00:11:35,120 Speaker 1: for one thing, it represents huge leaps forward in the 199 00:11:35,120 --> 00:11:38,880 Speaker 1: field of artificial intelligence. So in one way, it's a 200 00:11:38,960 --> 00:11:44,600 Speaker 1: really cool glimpse at the state of the art in AI, 201 00:11:45,880 --> 00:11:50,280 Speaker 1: specifically in stuff like speech recognition, which is pretty hard stuff. 202 00:11:50,440 --> 00:11:54,200 Speaker 1: I mean, we all have different ways of pronouncing words, 203 00:11:54,679 --> 00:11:58,400 Speaker 1: and depending upon your region, you might have an accent 204 00:11:58,679 --> 00:12:02,760 Speaker 1: that has a different way of pronouncing words. For example, 205 00:12:03,240 --> 00:12:06,559 Speaker 1: you know the Brits say aluminium and we say aluminum 206 00:12:06,559 --> 00:12:10,280 Speaker 1: here in the United States.
Then even within a single country, 207 00:12:10,320 --> 00:12:14,160 Speaker 1: you have different ways of pronouncing things. And when Google 208 00:12:14,240 --> 00:12:18,880 Speaker 1: first began translating speech to text in voice messages, I 209 00:12:19,000 --> 00:12:23,480 Speaker 1: noticed that it was having a real hard time interpreting 210 00:12:23,520 --> 00:12:26,520 Speaker 1: the words of some of my friends and family. Now, 211 00:12:26,559 --> 00:12:31,640 Speaker 1: keep in mind, I am in the Southeast United States, Georgia, 212 00:12:32,120 --> 00:12:35,480 Speaker 1: and that is, we have a lot of people here 213 00:12:35,280 --> 00:12:39,400 Speaker 2: with Southern accents. I have a tiny bit of one. 214 00:12:39,440 --> 00:12:43,240 Speaker 1: My parents have a slightly stronger Southern accent. Some of 215 00:12:43,280 --> 00:12:48,000 Speaker 1: my extended relatives have an even stronger Southern accent, and so 216 00:12:48,200 --> 00:12:51,320 Speaker 1: when they would call and leave a voicemail, Google had 217 00:12:51,360 --> 00:12:54,440 Speaker 1: to guess at what they were saying, and was not 218 00:12:54,679 --> 00:12:58,040 Speaker 1: always correct. I would have to go and listen to 219 00:12:58,160 --> 00:13:03,040 Speaker 1: the voicemail because the transcript would be completely indecipherable. Now, 220 00:13:03,080 --> 00:13:08,080 Speaker 1: over time this has improved. The speech recognition software has improved, 221 00:13:08,679 --> 00:13:13,319 Speaker 1: where it can adjust for things like different accents and 222 00:13:13,920 --> 00:13:17,080 Speaker 1: the different ways that people speak, using a lot of 223 00:13:17,080 --> 00:13:20,480 Speaker 1: different algorithms that have been based in machine learning to 224 00:13:20,960 --> 00:13:24,160 Speaker 1: kind of get a grip on what is being said 225 00:13:24,200 --> 00:13:29,080 Speaker 1: and even anticipate what the next thing will be in 226 00:13:29,120 --> 00:13:32,680 Speaker 1: any line of thought. Obviously, for someone like me who 227 00:13:33,440 --> 00:13:36,880 Speaker 1: stumbles over words occasionally, that's a real challenge, because sometimes 228 00:13:36,960 --> 00:13:38,599 Speaker 1: I don't even know what's next to come out of 229 00:13:38,600 --> 00:13:42,720 Speaker 1: my mouth. But that's really where that power comes in. Now, 230 00:13:42,760 --> 00:13:46,839 Speaker 1: over time, not just speech recognition has improved; we've also 231 00:13:46,920 --> 00:13:50,760 Speaker 1: had to look at the problem of natural language. Now, 232 00:13:50,840 --> 00:13:53,680 Speaker 1: natural language is how you and I communicate with one another. 233 00:13:53,840 --> 00:13:57,440 Speaker 1: Unless it's like a really formal setting, we usually are 234 00:13:57,480 --> 00:14:00,240 Speaker 1: pretty casual with our language, and we can make use 235 00:14:00,280 --> 00:14:03,760 Speaker 1: of lots of different linguistic flourishes and tools, things like 236 00:14:03,800 --> 00:14:08,760 Speaker 1: figures of speech, metaphors, similes, puns, references, and lots of 237 00:14:08,760 --> 00:14:11,520 Speaker 1: other stuff that gives meaning to what we say. But 238 00:14:11,679 --> 00:14:14,760 Speaker 1: only if the other person also understands what's going on. 239 00:14:15,559 --> 00:14:19,400 Speaker 1: They also have to have that benefit, otherwise it just 240 00:14:19,440 --> 00:14:22,960 Speaker 1: becomes a jumble of nonsense.
I'm reminded of a Star 241 00:14:23,000 --> 00:14:26,760 Speaker 1: Trek: The Next Generation episode where characters only spoke 242 00:14:27,120 --> 00:14:32,520 Speaker 1: in allegory, and if you didn't have that cultural background, 243 00:14:32,560 --> 00:14:35,960 Speaker 1: if you didn't understand the references, you didn't understand what 244 00:14:36,000 --> 00:14:42,320 Speaker 1: the communication meant. Similar problem with machines. 245 00:14:42,360 --> 00:14:45,600 Speaker 1: They don't necessarily know what we're saying all the time. 246 00:14:45,680 --> 00:14:47,800 Speaker 1: A lot of machines are not very good at doing this. 247 00:14:48,160 --> 00:14:51,800 Speaker 1: But natural language familiarity has been a huge challenge in 248 00:14:51,800 --> 00:14:55,680 Speaker 1: AI, and we're getting better at overcoming that challenge. 249 00:14:55,760 --> 00:14:59,640 Speaker 1: So at that same I/O event where Google announced Google Home, 250 00:15:00,600 --> 00:15:02,840 Speaker 1: they demonstrated that you could start a conversation with your 251 00:15:02,880 --> 00:15:07,080 Speaker 1: personal assistant asking something fairly specific, such as, we're going 252 00:15:07,160 --> 00:15:12,000 Speaker 1: to go with a local reference for yours truly, how 253 00:15:12,000 --> 00:15:15,240 Speaker 1: are the Atlanta Braves doing this season? Then the assistant 254 00:15:15,280 --> 00:15:17,760 Speaker 1: would actually break your heart by telling you how poorly 255 00:15:17,840 --> 00:15:20,720 Speaker 1: the Braves are doing this season and it is abysmal. 256 00:15:21,200 --> 00:15:23,920 Speaker 1: And you could follow that up with when do they 257 00:15:23,920 --> 00:15:27,480 Speaker 1: play at home next? And the assistant would understand that 258 00:15:27,560 --> 00:15:30,760 Speaker 1: when you say they, you mean the Atlanta Braves, and 259 00:15:30,800 --> 00:15:35,440 Speaker 1: when you say at home, it would understand you meant Atlanta, Georgia, 260 00:15:35,520 --> 00:15:38,560 Speaker 1: so it would be able to figure out the context 261 00:15:38,600 --> 00:15:41,920 Speaker 1: of what you said without you having to restate when 262 00:15:41,960 --> 00:15:45,840 Speaker 1: do the Atlanta Braves play in Atlanta next? You could 263 00:15:45,840 --> 00:15:48,760 Speaker 1: take these little linguistic shortcuts that we would normally do 264 00:15:48,840 --> 00:15:53,640 Speaker 1: in natural conversation. But typically machines are not great at that. 265 00:15:53,720 --> 00:15:59,360 Speaker 1: They don't have the capacity to understand how one sentence 266 00:15:59,440 --> 00:16:03,200 Speaker 1: can follow another. But this is an example of how 267 00:16:03,240 --> 00:16:06,920 Speaker 1: that's changed through machine learning. We'll be right back to 268 00:16:07,000 --> 00:16:13,920 Speaker 1: talk more about AI assistants and you after this quick break. 269 00:16:19,840 --> 00:16:23,600 Speaker 1: So you've got this new approach where you can continue 270 00:16:24,040 --> 00:16:27,680 Speaker 1: a series of questions that build on previous questions and answers, 271 00:16:28,640 --> 00:16:32,880 Speaker 1: and the Google Assistant can continue to give you relevant information, 272 00:16:33,000 --> 00:16:37,840 Speaker 1: which is a pretty powerful statement in AI.
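To make that context-carryover idea a little more concrete, here is a minimal, purely illustrative sketch of resolving a follow-up question against remembered context. This is a hypothetical toy, not how Google Assistant is actually implemented; the team lookup table and the function name below are invented for the example.

```python
# Minimal, hypothetical sketch of conversational context carryover:
# remember the last entity and its home city, then resolve "they" and
# "at home" in a follow-up question. Not Google Assistant's real code.

# Invented lookup table standing in for a real sports/knowledge source.
TEAM_HOMES = {"atlanta braves": "Atlanta, Georgia"}

context = {"entity": None, "home": None}

def ask(question: str) -> str:
    q = question.lower()

    # If the question names a team explicitly, store it as the context.
    for team, home in TEAM_HOMES.items():
        if team in q:
            context["entity"], context["home"] = team, home
            return f"Looking up how the {team.title()} are doing this season..."

    # Otherwise resolve "they" / "at home" against the remembered context.
    if "they" in q.split() and context["entity"]:
        if "at home" in q:
            return f"The next {context['entity'].title()} home game is in {context['home']}."
        return f"Here's more about the {context['entity'].title()}."

    return "I don't have enough context to answer that."

print(ask("How are the Atlanta Braves doing this season?"))
print(ask("When do they play at home next?"))
```

Real assistants do this with learned models rather than lookup tables, but the principle is the same: state carried between turns is what lets "they" and "at home" mean something.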
Also, you 273 00:16:37,880 --> 00:16:41,000 Speaker 1: might have heard that funny story that Google fed romance 274 00:16:41,120 --> 00:16:43,920 Speaker 1: novels to its AI to make it better at understanding 275 00:16:44,000 --> 00:16:47,680 Speaker 1: natural language. And to be fair, that's just part of 276 00:16:47,720 --> 00:16:50,600 Speaker 1: the story. Google actually fed lots of different types of 277 00:16:50,720 --> 00:16:54,600 Speaker 1: unpublished literature to its AI, all with the goal of 278 00:16:54,640 --> 00:16:57,400 Speaker 1: teaching the AI that there are many different ways to 279 00:16:57,440 --> 00:17:00,800 Speaker 1: say the same thing. So here's an example. I could 280 00:17:00,840 --> 00:17:04,240 Speaker 1: say it's raining pretty hard today, or it's really coming 281 00:17:04,280 --> 00:17:07,280 Speaker 1: down out there, or it's raining cats and dogs, or 282 00:17:07,320 --> 00:17:10,720 Speaker 1: it's pouring outside, and all of those mean the same thing. 283 00:17:10,800 --> 00:17:13,520 Speaker 1: But they're all different ways of saying that it's raining 284 00:17:13,920 --> 00:17:18,399 Speaker 1: really hard. And there are a lot of other ways 285 00:17:18,440 --> 00:17:21,480 Speaker 1: I could say the same, you know, to express the 286 00:17:21,480 --> 00:17:23,160 Speaker 1: same thought using different words. 287 00:17:23,800 --> 00:17:24,920 Speaker 2: And that's a challenge for 288 00:17:24,960 --> 00:17:30,120 Speaker 1: machines, because we as humans understand that you can say 289 00:17:30,160 --> 00:17:32,880 Speaker 1: all these different things and that all means the same thing. 290 00:17:33,440 --> 00:17:34,959 Speaker 2: But machines have to be taught that. 291 00:17:36,640 --> 00:17:39,080 Speaker 1: So romance novels, as it turns out, are a good 292 00:17:39,160 --> 00:17:42,760 Speaker 1: way to teach an AI how to interpret different things, 293 00:17:43,840 --> 00:17:48,040 Speaker 1: because romance novels are incredibly formulaic. If you were to 294 00:17:48,080 --> 00:17:52,639 Speaker 1: break down a romance novel and you outlined it scene 295 00:17:52,760 --> 00:17:55,440 Speaker 1: by scene so that you understood where the beats in 296 00:17:55,480 --> 00:17:58,120 Speaker 1: the story were, and who the characters were and their 297 00:17:58,119 --> 00:18:01,000 Speaker 1: relationships to one another, you would see that a lot 298 00:18:01,000 --> 00:18:05,040 Speaker 1: of romance novels follow the exact same structure, exact same 299 00:18:05,080 --> 00:18:08,960 Speaker 1: plot structure. But because they're written by different people, because 300 00:18:09,080 --> 00:18:12,920 Speaker 1: the character names and places are often changed from book 301 00:18:12,920 --> 00:18:14,760 Speaker 1: to book, I mean, obviously you wouldn't want to write 302 00:18:14,760 --> 00:18:18,359 Speaker 1: the same novel forty times, it means that you have 303 00:18:18,400 --> 00:18:22,560 Speaker 1: a lot of different ways to express the same ideas. 304 00:18:23,359 --> 00:18:26,240 Speaker 1: So if you feed a whole bunch of formulaic novels 305 00:18:26,280 --> 00:18:30,159 Speaker 1: into an AI to teach it that humans have lots of 306 00:18:30,160 --> 00:18:33,760 Speaker 1: different ways to express the same thoughts, that's a pretty 307 00:18:33,800 --> 00:18:37,760 Speaker 1: powerful tool. And again, it wasn't the only type of 308 00:18:37,960 --> 00:18:41,399 Speaker 1: story that was being fed to Google's AI.
It's just 309 00:18:41,440 --> 00:18:43,680 Speaker 1: the one that caught a lot of people's attention, because 310 00:18:43,720 --> 00:18:47,920 Speaker 1: the headlines write themselves at that point. So one 311 00:18:47,920 --> 00:18:50,480 Speaker 1: thing that is really, you know, funny about that is 312 00:18:50,480 --> 00:18:53,720 Speaker 1: a lot of people made jokes about Google AI suggesting 313 00:18:53,760 --> 00:18:56,040 Speaker 1: different ways to rip a bodice or to make a 314 00:18:56,080 --> 00:18:59,639 Speaker 1: bosom heave from the whole romance novel thing. But as 315 00:18:59,680 --> 00:19:03,240 Speaker 1: it turns out, there was some real thought given to using 316 00:19:03,240 --> 00:19:08,280 Speaker 1: this approach. Now, one of the ways that these assistants 317 00:19:08,320 --> 00:19:10,479 Speaker 1: work so well is to tap into information about you 318 00:19:11,400 --> 00:19:14,080 Speaker 1: and to store all of that off of the hardware, 319 00:19:14,880 --> 00:19:17,880 Speaker 1: so that it can anticipate what you want and what 320 00:19:17,920 --> 00:19:22,320 Speaker 1: you need and how to fulfill that. So, for example, 321 00:19:22,400 --> 00:19:26,679 Speaker 1: if I'm using Amazon and I'm using the Echo and 322 00:19:27,000 --> 00:19:31,600 Speaker 1: I'm using Alexa to purchase certain things off the Amazon Store, 323 00:19:32,359 --> 00:19:37,320 Speaker 1: this ends up tapping into that algorithm that tells Amazon 324 00:19:37,400 --> 00:19:41,399 Speaker 1: what I've bought and what I have browsed, and the 325 00:19:41,440 --> 00:19:43,480 Speaker 1: sort of stuff I'm interested in, so it can suggest 326 00:19:43,600 --> 00:19:45,800 Speaker 1: new things that I might be interested in but didn't 327 00:19:45,800 --> 00:19:46,240 Speaker 1: know about. 328 00:19:47,080 --> 00:19:48,520 Speaker 2: All of that is a very powerful tool. 329 00:19:49,600 --> 00:19:55,320 Speaker 1: One of the exceptions here is Apple's Siri. So Apple 330 00:19:55,440 --> 00:19:59,040 Speaker 1: pretty much locks everything down into the hardware as opposed 331 00:19:59,080 --> 00:20:01,960 Speaker 1: to sharing it with third parties or putting it in 332 00:20:02,000 --> 00:20:07,439 Speaker 1: the cloud. That's because Apple's revenue source is selling that 333 00:20:07,560 --> 00:20:12,679 Speaker 1: hardware and related services like support plans, like product support 334 00:20:13,280 --> 00:20:16,840 Speaker 1: or protection plans for your hardware. That's how Apple makes 335 00:20:16,880 --> 00:20:20,040 Speaker 1: its money. It's making it through selling this hardware that 336 00:20:20,119 --> 00:20:23,560 Speaker 1: it is producing, as opposed to something like Google, which, 337 00:20:23,720 --> 00:20:28,159 Speaker 1: until Google Home comes out, is selling an idea to 338 00:20:28,280 --> 00:20:34,960 Speaker 1: you and then selling you to advertisers. So this benefits 339 00:20:35,000 --> 00:20:36,840 Speaker 1: Apple in some ways because it means that you can 340 00:20:36,880 --> 00:20:40,040 Speaker 1: trust Siri a little more than you could some of 341 00:20:40,080 --> 00:20:44,400 Speaker 1: the other assistants, because it's mostly contained to your device.
342 00:20:45,280 --> 00:20:47,639 Speaker 1: On the flip side, it makes the actual service a 343 00:20:47,680 --> 00:20:50,959 Speaker 1: little less useful, because it cannot tap into the massive 344 00:20:51,000 --> 00:20:53,840 Speaker 1: resources of the Internet the way some of these other 345 00:20:53,920 --> 00:20:58,800 Speaker 1: assistants can, because again it's all pretty much contained to 346 00:20:58,800 --> 00:21:01,040 Speaker 1: your device. Now, it can access, it can pull stuff 347 00:21:01,040 --> 00:21:04,560 Speaker 1: from the Internet for you, but it's not as interactive as 348 00:21:04,400 --> 00:21:05,760 Speaker 2: some of these other assistants are. 349 00:21:07,560 --> 00:21:12,840 Speaker 1: So with the possibility of advertising, or things like Google 350 00:21:13,000 --> 00:21:18,240 Speaker 1: or Amazon, rather, Amazon's integrated shopping services, you start to 351 00:21:18,280 --> 00:21:22,479 Speaker 1: see some real potential for revenue generation on the back end. 352 00:21:22,520 --> 00:21:26,959 Speaker 1: But it also brings up some questions about privacy and security. 353 00:21:28,119 --> 00:21:28,280 Speaker 3: Now. 354 00:21:28,280 --> 00:21:30,920 Speaker 1: To look into that matter further, I spoke with an 355 00:21:31,000 --> 00:21:34,120 Speaker 1: expert on the subject, the founder of a company called 356 00:21:34,160 --> 00:21:37,840 Speaker 1: BigID, Dimitry Sirota, and here's what he 357 00:21:37,760 --> 00:21:38,480 Speaker 2: had to say. 358 00:21:39,640 --> 00:21:42,880 Speaker 3: Well, I think that clearly there's a certain degree of 359 00:21:43,040 --> 00:21:46,160 Speaker 3: inevitability around this. I think we've moved from an age 360 00:21:46,440 --> 00:21:49,000 Speaker 3: of having these technologies, and you can almost think of 361 00:21:49,040 --> 00:21:51,440 Speaker 3: this as kind of Web 1.0, in terms 362 00:21:51,520 --> 00:21:55,960 Speaker 3: of being responsive to the user, and personalization really being 363 00:21:56,000 --> 00:21:59,760 Speaker 3: about kind of targeting you. I think we're now shifting 364 00:21:59,800 --> 00:22:03,280 Speaker 3: to an era of anticipation. You know, the technologies are 365 00:22:03,320 --> 00:22:06,680 Speaker 3: becoming smarter and they know more about you, because they 366 00:22:06,720 --> 00:22:09,720 Speaker 3: touch you on so many levels, whether you're on the web, 367 00:22:09,760 --> 00:22:12,400 Speaker 3: whether you're on your mobile, whether you're at the office, 368 00:22:12,440 --> 00:22:16,000 Speaker 3: whether you're in the car, whether you're at home. As 369 00:22:16,000 --> 00:22:19,280 Speaker 3: is the case with Google Home and Amazon Echo and 370 00:22:19,320 --> 00:22:23,439 Speaker 3: similar technologies, they're no longer just about kind of 371 00:22:24,320 --> 00:22:28,120 Speaker 3: responding to a particular action. They're now trying to anticipate 372 00:22:28,720 --> 00:22:30,920 Speaker 3: what you'd want. And to some degree, you know, they're 373 00:22:30,920 --> 00:22:34,440 Speaker 3: becoming more like your mother or your parent, where they 374 00:22:34,480 --> 00:22:37,640 Speaker 3: know so much about you that they anticipate your needs. 375 00:22:38,840 --> 00:22:40,040 Speaker 3: And there's a good and a bad to.
376 00:22:39,960 --> 00:22:43,800 Speaker 1: That, right, So just the actions that we would take 377 00:22:43,840 --> 00:22:48,439 Speaker 1: in our homes can start to set up these expectations, 378 00:22:48,520 --> 00:22:50,640 Speaker 1: for lack of a better word, that our technology will 379 00:22:50,680 --> 00:22:54,720 Speaker 1: have about us. For example, the easiest way I think 380 00:22:54,760 --> 00:22:57,520 Speaker 1: to illustrate this to today's audience is to talk about 381 00:22:57,560 --> 00:23:01,040 Speaker 1: something like the Nest thermostat, where you have said it 382 00:23:01,080 --> 00:23:03,640 Speaker 1: a certain way, and it starts to learn what your 383 00:23:03,640 --> 00:23:06,720 Speaker 1: preferences are over time, and then it begins to automatically 384 00:23:06,720 --> 00:23:09,439 Speaker 1: adjust without you ever having to touch it, to the 385 00:23:09,440 --> 00:23:12,520 Speaker 1: point where it's even seeing quote unquote seeing when you 386 00:23:12,520 --> 00:23:15,199 Speaker 1: are home versus not home. This is the sort of 387 00:23:15,200 --> 00:23:18,160 Speaker 1: stuff that when incorporated into a device like Google Home, 388 00:23:18,359 --> 00:23:22,440 Speaker 1: can become very powerful. But also, like you said, has 389 00:23:22,200 --> 00:23:26,119 Speaker 1: this other side to it, this side that if we 390 00:23:26,200 --> 00:23:31,960 Speaker 1: don't pay attention to it, it could become potentially harmful 391 00:23:32,000 --> 00:23:35,080 Speaker 1: to us, or at least, at the very least anyway 392 00:23:36,040 --> 00:23:40,960 Speaker 1: inconvenient to us. So, for example, with our setting up 393 00:23:41,000 --> 00:23:43,560 Speaker 1: Google Home so that we would be able to control 394 00:23:43,760 --> 00:23:48,439 Speaker 1: lighting and security systems and thermostats, not only would we 395 00:23:48,920 --> 00:23:51,240 Speaker 1: have it set up so that it's to our preferences, 396 00:23:51,240 --> 00:23:54,160 Speaker 1: but it actually has learned when we're at home versus 397 00:23:54,200 --> 00:23:57,359 Speaker 1: when we're not at home, and what that information means 398 00:23:57,440 --> 00:24:02,159 Speaker 1: could be potentially very harmful to us. So in your mind, 399 00:24:03,119 --> 00:24:07,879 Speaker 1: where does accountability lie? Is this something that we ourselves 400 00:24:07,880 --> 00:24:11,119 Speaker 1: are are at least partly accountable for that kind of information? 401 00:24:11,400 --> 00:24:14,959 Speaker 1: Are the companies that create this technology? Are they accountable? 402 00:24:15,840 --> 00:24:18,600 Speaker 1: It's such a cloudy area. Where do you see that? 403 00:24:19,960 --> 00:24:23,040 Speaker 3: So it's a mix now clearly, and I think I 404 00:24:23,080 --> 00:24:25,360 Speaker 3: want to kind of emphasize this. You mentioned earlier how 405 00:24:25,440 --> 00:24:29,320 Speaker 3: this could become inconvenient. The reality is is that we 406 00:24:29,359 --> 00:24:32,439 Speaker 3: as the consumer want this because we want it because 407 00:24:32,440 --> 00:24:36,600 Speaker 3: it is convenient. We want technologies that are passive, We 408 00:24:36,600 --> 00:24:40,199 Speaker 3: don't necessarily want to click buttons. We want technologies that 409 00:24:40,240 --> 00:24:43,160 Speaker 3: are intelligent enough to be able to help us make 410 00:24:43,200 --> 00:24:47,560 Speaker 3: decisions right. 
I mentioned earlier about anticipation, the challenge with 411 00:24:47,760 --> 00:24:52,520 Speaker 3: convenience and convenience typically goes against the grain of security 412 00:24:52,600 --> 00:24:56,080 Speaker 3: and privacy to some degree. If we really want a 413 00:24:56,200 --> 00:25:00,600 Speaker 3: mother kind of anticipating our needs, what we want for lunch, 414 00:25:01,320 --> 00:25:04,320 Speaker 3: where we want to go for travel, you get The 415 00:25:04,320 --> 00:25:07,200 Speaker 3: negative of having your parent with you after you've kind 416 00:25:07,200 --> 00:25:10,080 Speaker 3: of left for university or left for the office, is 417 00:25:10,080 --> 00:25:12,800 Speaker 3: that you don't want them intruding in too many places 418 00:25:12,880 --> 00:25:15,160 Speaker 3: or knowing too much about you. You want to keep 419 00:25:15,200 --> 00:25:18,680 Speaker 3: certain parts of your life separate. And the reality is 420 00:25:18,680 --> 00:25:21,480 Speaker 3: is that there's a trade off around here. So you know, 421 00:25:21,520 --> 00:25:24,159 Speaker 3: I do think that there's a consumer drive towards this. 422 00:25:24,680 --> 00:25:28,280 Speaker 3: It's just that we are not necessarily always prepared because 423 00:25:28,280 --> 00:25:31,000 Speaker 3: there's a bit of a lag or a delay before 424 00:25:31,040 --> 00:25:34,639 Speaker 3: the consequences of having this convenience are fully made aware 425 00:25:34,920 --> 00:25:38,880 Speaker 3: aware to us. So in terms of the responsibility of 426 00:25:39,359 --> 00:25:42,239 Speaker 3: who cares about this, we as consumers obviously care about it. 427 00:25:42,600 --> 00:25:46,280 Speaker 3: You know. The companies like Google and Amazon, obviously you know, 428 00:25:46,320 --> 00:25:50,160 Speaker 3: they would argue that by personalizing service to you, they 429 00:25:50,200 --> 00:25:53,200 Speaker 3: are giving you this convenience. But the reality is it's 430 00:25:53,200 --> 00:25:55,639 Speaker 3: really up to their best efforts or what they think 431 00:25:55,760 --> 00:25:59,639 Speaker 3: is the right combination of privacy and security for now. 432 00:25:59,800 --> 00:26:03,000 Speaker 3: The reason for that is the regulators take time to 433 00:26:03,040 --> 00:26:06,440 Speaker 3: catch up, they don't necessarily know the latest, they didn't 434 00:26:06,440 --> 00:26:10,320 Speaker 3: attend Google Io, and they don't necessarily know how to 435 00:26:10,359 --> 00:26:13,159 Speaker 3: react or respond. So there's always going to be this 436 00:26:13,760 --> 00:26:18,199 Speaker 3: lag between what the consumers want, what the companies are 437 00:26:18,240 --> 00:26:21,000 Speaker 3: able to deliver in response to that need, and then 438 00:26:21,000 --> 00:26:23,560 Speaker 3: what the regulators are able to introduce in terms of 439 00:26:23,920 --> 00:26:27,760 Speaker 3: a balance in terms of rules and regulations, and in 440 00:26:27,760 --> 00:26:29,920 Speaker 3: this particular case, in around privacy and security. 441 00:26:30,560 --> 00:26:33,040 Speaker 1: And I would argue that the companies have it in 442 00:26:33,080 --> 00:26:36,680 Speaker 1: their best interest to handle this as carefully as possible 443 00:26:36,760 --> 00:26:39,960 Speaker 1: for multiple reasons. 
One, like you've just pointed out, if 444 00:26:39,960 --> 00:26:43,159 Speaker 1: they do not, then that means that you're going to 445 00:26:43,160 --> 00:26:46,000 Speaker 1: get that sort of tick-tock effect, the tick being 446 00:26:46,400 --> 00:26:48,479 Speaker 1: that they take a certain approach, the tock being that 447 00:26:48,560 --> 00:26:55,240 Speaker 1: regulations are following, because if there's any mishandling, especially on 448 00:26:55,280 --> 00:26:58,600 Speaker 1: a chaotic scale, then there's going to be a harsh 449 00:26:58,760 --> 00:27:01,919 Speaker 1: response down the line, and it doesn't behoove the 450 00:27:01,920 --> 00:27:08,040 Speaker 1: companies to invite that. Also, obviously, if they do 451 00:27:08,160 --> 00:27:11,120 Speaker 1: not prove to be responsible with that data, that reflects 452 00:27:11,119 --> 00:27:14,200 Speaker 1: poorly on them from a consumer standpoint as well; they'll 453 00:27:14,200 --> 00:27:18,000 Speaker 1: lose customers. So it's not as if there's no incentive 454 00:27:18,480 --> 00:27:20,960 Speaker 1: on the companies' part to be careful, but at the 455 00:27:20,960 --> 00:27:22,800 Speaker 1: same time they want to be able to leverage that 456 00:27:22,920 --> 00:27:27,360 Speaker 1: data to make as good use of it as possible. 457 00:27:28,440 --> 00:27:31,440 Speaker 1: We're going to wrap up our discussion about AI assistants 458 00:27:31,480 --> 00:27:34,439 Speaker 1: and you, at least the twenty sixteen version, after we 459 00:27:34,520 --> 00:27:46,639 Speaker 1: come back from these messages. I've often said on this 460 00:27:46,760 --> 00:27:50,520 Speaker 1: show that if you look around and you realize that 461 00:27:50,560 --> 00:27:53,920 Speaker 1: the service you are using doesn't cost you anything, then 462 00:27:54,040 --> 00:27:57,240 Speaker 1: essentially that means that you yourself are the product and 463 00:27:57,280 --> 00:28:00,480 Speaker 1: that what you are doing is generating value for another 464 00:28:00,640 --> 00:28:03,200 Speaker 1: entity out there. For example, like Google, where you're using 465 00:28:03,200 --> 00:28:06,719 Speaker 1: Google Search, and that in turn is generating value for Google. 466 00:28:06,760 --> 00:28:09,320 Speaker 1: You yourself are the product being sold to other companies. 467 00:28:10,040 --> 00:28:12,720 Speaker 1: So it's one of those things where it's the balance 468 00:28:12,800 --> 00:28:17,040 Speaker 1: between the desire to provide this service and to make 469 00:28:18,440 --> 00:28:22,520 Speaker 1: revenue off of something beyond just selling a device 470 00:28:22,760 --> 00:28:25,840 Speaker 1: like the Google Home device, and making sure that 471 00:28:25,920 --> 00:28:30,480 Speaker 1: you don't alienate your consumer base or invite particularly restrictive 472 00:28:30,520 --> 00:28:34,439 Speaker 1: regulations to that end. Of course, in the United States, 473 00:28:34,440 --> 00:28:37,280 Speaker 1: it's one story. In other parts of the world, there 474 00:28:37,320 --> 00:28:41,160 Speaker 1: are different views of privacy and security, some of which 475 00:28:41,200 --> 00:28:44,560 Speaker 1: go well beyond what is typically seen here in the US.
476 00:28:44,720 --> 00:28:46,800 Speaker 1: Do you think the device, do you think Google Home 477 00:28:46,880 --> 00:28:49,520 Speaker 1: and things like Amazon's Echo, do you think those are 478 00:28:49,520 --> 00:28:53,920 Speaker 1: going to have different levels of acceptance in different parts 479 00:28:53,760 --> 00:28:54,240 Speaker 2: of the world? 480 00:28:54,360 --> 00:28:56,840 Speaker 1: And where do you think might be a case where 481 00:28:58,040 --> 00:29:00,320 Speaker 1: this is probably going to be a big success in 482 00:29:00,320 --> 00:29:02,280 Speaker 1: one place? We've heard that the Amazon Echo has been a 483 00:29:02,280 --> 00:29:05,560 Speaker 1: pretty big success so far, versus a market where it 484 00:29:05,600 --> 00:29:06,280 Speaker 1: may not be. 485 00:29:08,120 --> 00:29:11,280 Speaker 3: Yeah, well, you've seen even things like credit card adoption 486 00:29:12,040 --> 00:29:16,080 Speaker 3: differ from country to country just because there are different 487 00:29:16,240 --> 00:29:21,840 Speaker 3: kind of cultural kind of mores around credit, around potential 488 00:29:21,960 --> 00:29:26,000 Speaker 3: privacy implications in terms of knowing kind of a transaction 489 00:29:26,200 --> 00:29:29,080 Speaker 3: and kind of the origins and so forth. And so 490 00:29:29,080 --> 00:29:31,000 Speaker 3: you've seen this in Europe in particular, right? So not 491 00:29:31,040 --> 00:29:34,920 Speaker 3: all countries in Europe are equally predisposed to using credit 492 00:29:35,000 --> 00:29:37,400 Speaker 3: cards as we are in the US. So yeah, I 493 00:29:37,400 --> 00:29:43,320 Speaker 3: think there definitely will be different cultural adoption. But you know, 494 00:29:43,320 --> 00:29:45,560 Speaker 3: at the end of the day, like you mentioned rightly, 495 00:29:45,600 --> 00:29:47,640 Speaker 3: a lot of these companies, it's in their interest to 496 00:29:48,480 --> 00:29:51,560 Speaker 3: do a good job, because we as consumers only tend 497 00:29:51,560 --> 00:29:55,200 Speaker 3: to shop from people we trust. The challenge, of course, 498 00:29:55,440 --> 00:29:58,400 Speaker 3: is that we will sometimes wonder, just like you are 499 00:29:58,480 --> 00:30:00,480 Speaker 3: right here in terms of this interview, you know, what 500 00:30:00,560 --> 00:30:02,680 Speaker 3: are the implications. You know, you could very quickly go 501 00:30:02,720 --> 00:30:05,880 Speaker 3: from a situation that appears like having your mother around, 502 00:30:05,920 --> 00:30:08,920 Speaker 3: always around you, to having a situation where 503 00:30:09,000 --> 00:30:12,320 Speaker 3: you have Big Brother around you in the nineteen eighty four 504 00:30:12,640 --> 00:30:14,960 Speaker 3: kind of sense, in the Orwellian sense, where something is 505 00:30:15,000 --> 00:30:18,160 Speaker 3: so aware of every part of your life that maybe 506 00:30:18,160 --> 00:30:20,640 Speaker 3: they just know a little bit too much. And so, 507 00:30:20,800 --> 00:30:23,520 Speaker 3: you know, we're kind of entering that phase. Right. We've 508 00:30:23,720 --> 00:30:29,880 Speaker 3: historically had a few places that we weren't necessarily connected to, right, 509 00:30:29,960 --> 00:30:32,760 Speaker 3: and our home, with the exception obviously of our PCs 510 00:30:32,840 --> 00:30:36,920 Speaker 3: and our phones, have not been connected.
They've been into 511 00:30:36,920 --> 00:30:39,160 Speaker 3: some degree of thanks Grup, we sit down for dinner, 512 00:30:40,200 --> 00:30:42,920 Speaker 3: we're not connected to the net. And I think what 513 00:30:43,480 --> 00:30:47,560 Speaker 3: this revelation is making people aware of is that kind 514 00:30:47,560 --> 00:30:49,880 Speaker 3: of in the future, there'll be very very few places 515 00:30:49,960 --> 00:30:54,280 Speaker 3: left that are not networked, where our activities are not 516 00:30:54,960 --> 00:31:00,720 Speaker 3: kind of transponding or transmitting or telegraphing kind of our activities, 517 00:31:01,600 --> 00:31:03,920 Speaker 3: and you know, it will take time for people to adjust, 518 00:31:03,960 --> 00:31:05,960 Speaker 3: and as I mentioned earlier, it will it won't just 519 00:31:06,000 --> 00:31:09,760 Speaker 3: be about consumers and kind of buyers, but also you know, 520 00:31:09,840 --> 00:31:12,560 Speaker 3: the governments will have a say, and as you pointed out, 521 00:31:13,000 --> 00:31:15,840 Speaker 3: you know, in certain places the governments have already had 522 00:31:15,880 --> 00:31:19,440 Speaker 3: a say around privacy, like in Europe with the introduction 523 00:31:19,640 --> 00:31:24,640 Speaker 3: of the General Data Protection Regulation to better protect consumers. 524 00:31:24,880 --> 00:31:26,640 Speaker 3: And I think we're all going to become a lot 525 00:31:26,680 --> 00:31:31,280 Speaker 3: more sensitive to the privacy implications of always being online. 526 00:31:32,400 --> 00:31:35,200 Speaker 1: And I think that we're seeing that as well, just 527 00:31:35,640 --> 00:31:40,480 Speaker 1: you know, in other areas of technology. Just recently there 528 00:31:40,600 --> 00:31:44,480 Speaker 1: was these reports coming out about the FBI's database of 529 00:31:45,040 --> 00:31:49,320 Speaker 1: biometric data and the concerns people have about that, and 530 00:31:49,400 --> 00:31:52,240 Speaker 1: even interesting questions. 531 00:31:51,680 --> 00:31:54,920 Speaker 2: Like do do I own my own face? Should I? 532 00:31:54,960 --> 00:31:59,880 Speaker 1: Shouldn't I have access to data about me? And the 533 00:32:00,160 --> 00:32:02,920 Speaker 1: this Again, you know, we're in a world where our 534 00:32:02,960 --> 00:32:07,520 Speaker 1: technology is pervasive, and in many ways that is amazing. 535 00:32:07,560 --> 00:32:08,560 Speaker 2: It is giving us. 536 00:32:08,920 --> 00:32:13,880 Speaker 1: An almost seamless experience of having our desires catered to 537 00:32:14,000 --> 00:32:16,040 Speaker 1: before we can even give thought to them. That is 538 00:32:16,080 --> 00:32:18,160 Speaker 1: the big promise of the Internet of things and I 539 00:32:18,200 --> 00:32:21,400 Speaker 1: love that idea. It is something that really appeals to me. 540 00:32:21,760 --> 00:32:24,840 Speaker 1: On the flip side, you start to realize that your 541 00:32:25,000 --> 00:32:28,880 Speaker 1: regular actions are creating data, and that data does in 542 00:32:28,920 --> 00:32:32,720 Speaker 1: fact have value, different value to different entities out there, 543 00:32:33,480 --> 00:32:36,080 Speaker 1: and so having these sort of technologies and fining them in. 544 00:32:37,080 --> 00:32:39,360 Speaker 2: Once you have reconciled. 
545 00:32:38,680 --> 00:32:42,560 Speaker 1: this idea, and you realize that this is going on, 546 00:32:42,720 --> 00:32:44,720 Speaker 1: then you can start to make those strategies: what 547 00:32:45,080 --> 00:32:47,160 Speaker 1: is the best way of handling that, both on the 548 00:32:48,080 --> 00:32:50,640 Speaker 1: end user side and on the back end side, so 549 00:32:50,760 --> 00:32:56,120 Speaker 1: that it is a responsible approach. That's really what your 550 00:32:56,160 --> 00:33:00,400 Speaker 1: company is looking into, right, the idea of security 551 00:33:00,920 --> 00:33:06,920 Speaker 1: and helping companies protect customer data. 552 00:33:07,160 --> 00:33:09,720 Speaker 3: Yeah, so that's actually kind of very very kind of 553 00:33:09,720 --> 00:33:10,920 Speaker 3: similar to this. So I think, you know, one of 554 00:33:11,000 --> 00:33:14,160 Speaker 3: the things you were kind of touching upon is this 555 00:33:14,360 --> 00:33:17,760 Speaker 3: kind of expectation of organizations to do a better job 556 00:33:17,840 --> 00:33:23,520 Speaker 3: of safeguarding your information, essentially being responsible custodians of your data. 557 00:33:24,160 --> 00:33:28,000 Speaker 3: The challenge for most companies, you know, maybe with less 558 00:33:28,000 --> 00:33:31,400 Speaker 3: sophistication than at Google, but maybe even Google, is that 559 00:33:31,440 --> 00:33:35,200 Speaker 3: they collect so much information about you, and they collect 560 00:33:35,280 --> 00:33:38,800 Speaker 3: it in so many different places and so many applications. 561 00:33:39,240 --> 00:33:42,240 Speaker 3: It doesn't necessarily mean that all that information is tied together, 562 00:33:42,760 --> 00:33:47,080 Speaker 3: but you are leaving digital footprints across organizations, and so 563 00:33:47,160 --> 00:33:53,040 Speaker 3: these companies are essentially becoming large data collection points, and 564 00:33:53,080 --> 00:33:55,560 Speaker 3: it's hard for you as a consumer to know exactly 565 00:33:55,640 --> 00:33:58,680 Speaker 3: what digital footprints you've left. You want to know what 566 00:33:58,840 --> 00:34:01,280 Speaker 3: assets you've left with them, and believe it or not, 567 00:34:01,880 --> 00:34:05,440 Speaker 3: you know, if you think about accounting and how companies 568 00:34:05,480 --> 00:34:11,480 Speaker 3: are expected to have responsible tools in place to track 569 00:34:11,560 --> 00:34:14,400 Speaker 3: how much revenue comes in, how that money is 570 00:34:14,440 --> 00:34:18,920 Speaker 3: getting disbursed, who it's paying. So that's all about accounting 571 00:34:19,000 --> 00:34:23,840 Speaker 3: and financial responsibility. On the digital side, there's very little 572 00:34:23,840 --> 00:34:25,880 Speaker 3: of that today, and that's kind of the origin of 573 00:34:25,920 --> 00:34:28,440 Speaker 3: BigID.
You could think of BigID as a 574 00:34:28,480 --> 00:34:32,760 Speaker 3: tool set to help big companies understand where their customer 575 00:34:32,840 --> 00:34:37,680 Speaker 3: information is, what's at risk or potentially at risk, either 576 00:34:37,680 --> 00:34:41,520 Speaker 3: in terms of breach or in terms of misuse, and 577 00:34:41,600 --> 00:34:44,480 Speaker 3: then how to better understand how that information is getting 578 00:34:44,560 --> 00:34:48,719 Speaker 3: used in the organization, either to help ensure that it's 579 00:34:48,719 --> 00:34:52,319 Speaker 3: compliant with regulations or secondly that it complies with their 580 00:34:52,360 --> 00:34:56,440 Speaker 3: own kind of privacy rules, their own consent agreements that 581 00:34:56,440 --> 00:35:00,480 Speaker 3: they've created between themselves and their consumers. I think that 582 00:35:00,880 --> 00:35:03,920 Speaker 3: this idea of a ledger or accounting software for privacy 583 00:35:03,960 --> 00:35:07,719 Speaker 3: information doesn't as yet exist, and I think increasingly, just 584 00:35:07,840 --> 00:35:10,000 Speaker 3: given the number of digital touch points that we have 585 00:35:10,239 --> 00:35:14,319 Speaker 3: with the companies we interact with, that it's going to 586 00:35:14,320 --> 00:35:16,279 Speaker 3: be certainly a future requirement. 587 00:35:17,040 --> 00:35:20,480 Speaker 1: I think that's really interesting, and I'm thankful that there 588 00:35:20,520 --> 00:35:23,640 Speaker 1: are organizations like yours that are looking into this to 589 00:35:23,680 --> 00:35:27,600 Speaker 1: try and create those best practices, because as we've seen recently, 590 00:35:28,680 --> 00:35:32,000 Speaker 1: as the scholarship has shown, it takes very few data points 591 00:35:32,360 --> 00:35:36,040 Speaker 1: to be able to link some information to a specific person, 592 00:35:36,480 --> 00:35:39,239 Speaker 1: and I think a lot of companies out there may 593 00:35:39,280 --> 00:35:42,880 Speaker 1: not even be aware of the implications of some of 594 00:35:42,880 --> 00:35:46,880 Speaker 1: the data they're collecting, not through any sort of maliciousness. 595 00:35:46,960 --> 00:35:50,080 Speaker 1: It simply is, as you point out, that there are so many 596 00:35:50,280 --> 00:35:53,960 Speaker 1: of these little digital touch points that you cannot necessarily 597 00:35:54,000 --> 00:35:58,319 Speaker 1: anticipate what the consequences are from the very beginning. And 598 00:35:59,280 --> 00:36:02,600 Speaker 1: it's amazing to me to think that this is going on. 599 00:36:02,760 --> 00:36:03,719 Speaker 2: Everywhere, and. 600 00:36:05,400 --> 00:36:08,799 Speaker 1: It's a snowball that's already going down the hill. 601 00:36:08,840 --> 00:36:10,080 Speaker 2: It's just going to keep on going. 602 00:36:10,520 --> 00:36:13,720 Speaker 1: It's very reassuring to hear that there are people actively 603 00:36:13,760 --> 00:36:16,600 Speaker 1: thinking about these issues and trying to 604 00:36:16,600 --> 00:36:19,240 Speaker 1: find the best ways of handling that kind of information 605 00:36:19,880 --> 00:36:24,759 Speaker 1: so that we can avoid as 606 00:36:24,800 --> 00:36:30,719 Speaker 1: many chaotic moments of absolute failure as possible.
Again, I 607 00:36:30,760 --> 00:36:35,480 Speaker 1: think a lot of people assume that big companies are 608 00:36:36,719 --> 00:36:42,600 Speaker 1: actively pursuing the collection and selling of all of the data, 609 00:36:42,640 --> 00:36:43,920 Speaker 1: and that's not the case. 610 00:36:44,080 --> 00:36:47,120 Speaker 2: Across the board. There are companies that are. 611 00:36:46,920 --> 00:36:49,360 Speaker 1: Collecting a great deal of data in the pursuit of 612 00:36:49,440 --> 00:36:53,160 Speaker 1: whatever business they do, but it's 613 00:36:53,200 --> 00:36:56,719 Speaker 1: not necessarily with an intent to do anything you know, 614 00:36:57,760 --> 00:37:01,840 Speaker 1: commercial with that information. But knowing this makes it easier 615 00:37:01,840 --> 00:37:05,719 Speaker 1: for those companies to be more responsible and also to 616 00:37:06,080 --> 00:37:07,880 Speaker 1: maybe even get to a point where they change up 617 00:37:07,920 --> 00:37:10,320 Speaker 1: their practices so that they're only collecting the points of 618 00:37:10,400 --> 00:37:12,120 Speaker 1: data that are relevant to their business. 619 00:37:13,280 --> 00:37:15,960 Speaker 3: Yeah. Well, look, certainly that's the intent of BigID, 620 00:37:16,080 --> 00:37:20,640 Speaker 3: to help companies be more responsible around their digital assets, 621 00:37:20,640 --> 00:37:24,200 Speaker 3: their customer assets, which you could argue are probably their 622 00:37:24,200 --> 00:37:27,319 Speaker 3: most important assets. You know, sometimes you hear people talk 623 00:37:27,360 --> 00:37:30,480 Speaker 3: about employees, and you know, your most important assets are 624 00:37:30,480 --> 00:37:33,919 Speaker 3: your employees and they walk out the door every night. Well, 625 00:37:33,960 --> 00:37:36,839 Speaker 3: your customers are pretty valuable too, because if they 626 00:37:36,880 --> 00:37:42,200 Speaker 3: stop patronizing you, your business suffers, and their loyalty increasingly 627 00:37:42,320 --> 00:37:45,520 Speaker 3: is very fickle. So if they don't have confidence that 628 00:37:45,880 --> 00:37:51,160 Speaker 3: you are protecting their personal information, their kind of 629 00:37:51,320 --> 00:37:54,880 Speaker 3: digital footprints, they'll go somewhere else. They'll go to somebody 630 00:37:54,920 --> 00:37:58,120 Speaker 3: that does take better care of that, which again is 631 00:37:58,560 --> 00:38:02,040 Speaker 3: why it kind of makes sense to have technology that 632 00:38:02,320 --> 00:38:09,360 Speaker 3: gives organizations better tooling to track, manage, protect those digital 633 00:38:09,400 --> 00:38:12,839 Speaker 3: personal assets, the digital information that represents kind of who 634 00:38:12,880 --> 00:38:15,400 Speaker 3: you are, where you live, where you've been, what you 635 00:38:15,600 --> 00:38:17,880 Speaker 3: like, when you're going on vacation, et cetera. 636 00:38:18,719 --> 00:38:21,839 Speaker 1: Now, I've got a question for you personally, which is, 637 00:38:21,880 --> 00:38:26,080 Speaker 1: are you at a point where you would adopt 638 00:38:26,120 --> 00:38:29,640 Speaker 1: a technology such as Amazon Echo or Google Home or 639 00:38:29,680 --> 00:38:33,880 Speaker 1: would you personally wait a little longer or you know, 640 00:38:33,960 --> 00:38:36,600 Speaker 1: where do you stand on that?
Because I can tell 641 00:38:36,600 --> 00:38:39,319 Speaker 1: you, being aware of these issues, I guess it's only 642 00:38:39,320 --> 00:38:41,560 Speaker 1: fair that I answer my own question. Being aware of 643 00:38:41,600 --> 00:38:45,600 Speaker 1: these issues and being cognizant of them, I'm still leaning 644 00:38:45,640 --> 00:38:49,680 Speaker 1: toward getting one, knowing what I know, and taking the 645 00:38:49,760 --> 00:38:53,640 Speaker 1: risk in order to have the benefit. My wife feels 646 00:38:53,760 --> 00:38:56,000 Speaker 1: very differently about it. So that's why I do not 647 00:38:56,080 --> 00:38:59,759 Speaker 1: have one. But I'm curious, as an 648 00:38:59,800 --> 00:39:02,759 Speaker 1: expert on this subject matter, how you feel about that. 649 00:39:04,520 --> 00:39:07,400 Speaker 3: Yeah, so I think there's two things that come into play. Obviously, 650 00:39:08,360 --> 00:39:10,759 Speaker 3: I'm a fifteen year veteran of the security industry with 651 00:39:10,800 --> 00:39:14,160 Speaker 3: a company focused on enterprise privacy management now, so I 652 00:39:14,239 --> 00:39:17,360 Speaker 3: understand some of the consequences and repercussions. But I'm also 653 00:39:17,480 --> 00:39:19,960 Speaker 3: at heart a person that likes technology. I was a 654 00:39:20,000 --> 00:39:23,120 Speaker 3: reader of Isaac Asimov as a kid, Heinlein, all the 655 00:39:23,160 --> 00:39:25,680 Speaker 3: kind of great science fiction writers. And I realize the 656 00:39:25,719 --> 00:39:29,120 Speaker 3: future is coming towards us and we could either try 657 00:39:29,160 --> 00:39:31,640 Speaker 3: and hide or duck, or we could try and embrace 658 00:39:31,680 --> 00:39:35,960 Speaker 3: it and understand the consequences. So for me personally, I 659 00:39:36,080 --> 00:39:38,879 Speaker 3: look at this and try to understand how this technology 660 00:39:39,719 --> 00:39:42,520 Speaker 3: will impact our lives going forward. So I will be 661 00:39:42,560 --> 00:39:45,239 Speaker 3: an embracer of the technology because I think, as I 662 00:39:45,239 --> 00:39:47,200 Speaker 3: said at the very beginning, there's a lot of good, 663 00:39:47,239 --> 00:39:49,520 Speaker 3: there's a lot of convenience that comes with it, but 664 00:39:49,560 --> 00:39:52,040 Speaker 3: it's also important for me to understand some of the 665 00:39:52,040 --> 00:39:55,480 Speaker 3: consequences by going through it firsthand, because at the end 666 00:39:55,520 --> 00:39:58,239 Speaker 3: of the day, you know, I'm part of 667 00:39:58,280 --> 00:40:02,719 Speaker 3: a team building technology to better help protect customer information, 668 00:40:03,200 --> 00:40:08,000 Speaker 3: so it's important that I understand the implications of these new home automation 669 00:40:08,320 --> 00:40:09,879 Speaker 3: and car automation technologies. 670 00:40:10,840 --> 00:40:15,760 Speaker 1: Excellent, Dimitri Sirota, thank you so much, founder and CEO 671 00:40:16,000 --> 00:40:19,200 Speaker 1: of BigID. You really helped me and I hope 672 00:40:19,239 --> 00:40:22,600 Speaker 1: my listeners understand a bit more of the implications of this.
673 00:40:22,760 --> 00:40:25,680 Speaker 1: I realize that this sort of technology that has this 674 00:40:25,840 --> 00:40:29,920 Speaker 1: incredible connection to our personal lives, really a level of 675 00:40:29,960 --> 00:40:34,120 Speaker 1: intimacy that most technology does not have, carries with it 676 00:40:34,239 --> 00:40:36,400 Speaker 1: some things that can be a little worrisome. But I 677 00:40:36,480 --> 00:40:39,719 Speaker 1: agree with you. I think if we enter into it 678 00:40:39,760 --> 00:40:43,680 Speaker 1: with open eyes and we are aware of the challenges, 679 00:40:43,719 --> 00:40:46,560 Speaker 1: We're not denying that challenges exist, but we are aware 680 00:40:46,560 --> 00:40:49,719 Speaker 1: of them. That allows us to actually overcome those challenges 681 00:40:49,760 --> 00:40:53,439 Speaker 1: and reap the benefits of this really powerful tool. Thank 682 00:40:53,480 --> 00:40:55,440 Speaker 1: you so much for coming on the show and talking 683 00:40:55,440 --> 00:40:56,600 Speaker 1: with us. 684 00:40:56,640 --> 00:40:58,160 Speaker 3: My pleasure. Take care, bye bye. 685 00:40:58,440 --> 00:41:01,080 Speaker 1: I think it's really important to remember that mister Sirota 686 00:41:01,160 --> 00:41:05,040 Speaker 1: actually said we should embrace technology, but do so. 687 00:41:05,840 --> 00:41:07,960 Speaker 2: In a way where we're aware of the. 688 00:41:07,920 --> 00:41:11,600 Speaker 1: Consequences and we are doing our best to mitigate any 689 00:41:11,680 --> 00:41:14,600 Speaker 1: negative fallout from this technology. 690 00:41:14,640 --> 00:41:15,400 Speaker 2: Moving forward. 691 00:41:16,000 --> 00:41:19,200 Speaker 1: We shouldn't deny it, we shouldn't try to stop it, 692 00:41:19,280 --> 00:41:21,680 Speaker 1: but we should definitely be responsible with the way we 693 00:41:21,760 --> 00:41:25,120 Speaker 1: develop it and the way that we use it. Potentially, 694 00:41:25,200 --> 00:41:29,319 Speaker 1: it has the capacity to make our lives easier. I mean, 695 00:41:29,360 --> 00:41:34,960 Speaker 1: imagine being able to handle everything by just shifting it 696 00:41:35,040 --> 00:41:39,439 Speaker 1: over to your personal assistant who lives everywhere. You can 697 00:41:39,520 --> 00:41:42,760 Speaker 1: access that personal assistant wherever you might be through whatever 698 00:41:43,160 --> 00:41:46,640 Speaker 1: computer or smartphone or standalone device you happen to have 699 00:41:46,680 --> 00:41:50,400 Speaker 1: at your disposal at that place, and access all of 700 00:41:50,440 --> 00:41:56,120 Speaker 1: those features, everything from entertainment to handling travel and 701 00:41:56,760 --> 00:42:00,880 Speaker 1: stuff that you want taken care of but don't necessarily 702 00:42:00,920 --> 00:42:03,200 Speaker 1: want to attend to yourself, so you can save that 703 00:42:03,280 --> 00:42:07,000 Speaker 1: time to do something else. That's a really cool idea, 704 00:42:07,040 --> 00:42:10,920 Speaker 1: and I love the promise of digital assistants, the idea 705 00:42:10,960 --> 00:42:14,759 Speaker 1: that we will slowly get toward this future where the 706 00:42:14,760 --> 00:42:17,960 Speaker 1: technology around us anticipates what we need before we can 707 00:42:18,000 --> 00:42:21,319 Speaker 1: give voice to it.
I love that thought and the 708 00:42:21,400 --> 00:42:24,399 Speaker 1: idea that my life just becomes sort of magical as 709 00:42:24,400 --> 00:42:28,759 Speaker 1: a result, because the technology is shifting things to my 710 00:42:28,920 --> 00:42:32,080 Speaker 1: whim before I can even voice what that whim is, 711 00:42:32,280 --> 00:42:34,799 Speaker 1: before I might even be aware there's a whim I 712 00:42:34,800 --> 00:42:35,919 Speaker 1: could be whimless. 713 00:42:36,640 --> 00:42:37,680 Speaker 2: I'm done saying whim. 714 00:42:38,920 --> 00:42:42,480 Speaker 1: Well, that was the twenty sixteen version of AI Assistants 715 00:42:42,560 --> 00:42:45,319 Speaker 1: and You. Obviously there's a lot more to say now. 716 00:42:45,400 --> 00:42:48,719 Speaker 1: I mean, some AI assistants have been abandoned, like Cortana, 717 00:42:49,640 --> 00:42:54,800 Speaker 1: no longer really a thing. Also, Amazon has been cutting 718 00:42:54,960 --> 00:43:00,200 Speaker 1: way back on its division for its personal assistant that 719 00:43:00,239 --> 00:43:04,279 Speaker 1: I will not name at this point, but yeah, there 720 00:43:04,280 --> 00:43:07,360 Speaker 1: have been companies that have been taking massive cuts in 721 00:43:07,400 --> 00:43:10,240 Speaker 1: those departments, at least as I'm recording these intros and outros, 722 00:43:10,280 --> 00:43:12,640 Speaker 1: which by the way, was way back in January of 723 00:43:12,640 --> 00:43:17,000 Speaker 1: twenty twenty three. This should be publishing many months after that. 724 00:43:17,160 --> 00:43:21,800 Speaker 1: But I'm currently living in a time where those divisions 725 00:43:21,800 --> 00:43:25,680 Speaker 1: are getting massive cuts because it turns out that these 726 00:43:25,960 --> 00:43:31,200 Speaker 1: assistants have not been particularly valuable as far as revenue generation, 727 00:43:32,120 --> 00:43:35,400 Speaker 1: and if you can't generate revenue from a product, eventually 728 00:43:35,400 --> 00:43:38,080 Speaker 1: you start to see cutbacks for those products because it 729 00:43:38,120 --> 00:43:40,560 Speaker 1: doesn't make sense to keep supporting them if they're just 730 00:43:40,760 --> 00:43:44,440 Speaker 1: draining resources and not contributing to the overall health of 731 00:43:44,480 --> 00:43:48,400 Speaker 1: the company. So it's been one of those things where 732 00:43:48,760 --> 00:43:54,239 Speaker 1: companies have found it difficult to leverage these AI assistants 733 00:43:54,280 --> 00:43:58,239 Speaker 1: in a way to generate revenue. And yeah, it may 734 00:43:58,320 --> 00:44:01,280 Speaker 1: be that AI assistants are one of those things 735 00:44:01,320 --> 00:44:06,439 Speaker 1: that ultimately kind of fade away, unless that changes. Maybe 736 00:44:06,520 --> 00:44:09,000 Speaker 1: by the time you're listening to this, that has changed 737 00:44:09,200 --> 00:44:12,640 Speaker 1: and I'll need to do an update on this episode. Anyway, 738 00:44:13,080 --> 00:44:15,160 Speaker 1: if you have suggestions for topics I should cover in 739 00:44:15,200 --> 00:44:17,600 Speaker 1: future episodes of tech Stuff, please reach out to me and 740 00:44:17,840 --> 00:44:19,680 Speaker 1: let me know. One way to do that is to 741 00:44:19,719 --> 00:44:23,000 Speaker 1: download the iHeartRadio app. It's free to download, free to use. 742 00:44:23,360 --> 00:44:25,359 Speaker 1: You can navigate over to tech Stuff.
Just put tech 743 00:44:25,400 --> 00:44:28,160 Speaker 1: Stuff in the search field. It'll pop up. You go 744 00:44:28,200 --> 00:44:31,839 Speaker 1: into the podcast page. There's a little microphone icon. If 745 00:44:31,840 --> 00:44:33,680 Speaker 1: you click on that, you can leave a voice message 746 00:44:33,719 --> 00:44:36,040 Speaker 1: up to thirty seconds in length. If you would prefer 747 00:44:36,160 --> 00:44:38,600 Speaker 1: not to do a voice message, you can reach out 748 00:44:38,680 --> 00:44:41,720 Speaker 1: via Twitter. The handle for the show is tech Stuff 749 00:44:41,920 --> 00:44:52,000 Speaker 1: HSW and I'll talk to you again really soon. Tech 750 00:44:52,040 --> 00:44:56,440 Speaker 1: Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 751 00:44:56,760 --> 00:45:00,479 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 752 00:45:00,520 --> 00:45:01,600 Speaker 1: to your favorite shows.