Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Tuesday, April 13, 2021. Let's get started.

The Guardian has a pretty incredible piece about a former Facebook data scientist named Sophie Zhang and her attempts to bring attention to state-backed misinformation campaigns that used Facebook as a platform. She was fired from Facebook, allegedly due to poor performance, though Zhang says it was because she was spending so much time trying to show how organized groups were creating propaganda campaigns on Facebook that she wasn't able to spend as much time on her normal duties. However you want to frame it, that was why she was fired. She wrote a farewell post, which is common practice at Facebook, and it was nearly eight thousand words long, which is not common practice at Facebook; it was, in part, her way of making her case. She also created a separate website and published her farewell post there, because the original one was on an internal Facebook forum, and she wanted to make sure it existed in a separate location because she suspected the company might delete her post. It did, in fact, delete the post, at least temporarily, and then Facebook allegedly pressured the hosting service Zhang was using for her website to take that post offline too, which is a big yikes from me. But now Zhang is coming forward and supplying various news organizations with information about how state-backed efforts to push propaganda were relying on Facebook, and how Facebook leadership allowed it to happen and did not make any significant moves to stop it until it became, you know, politically necessary. Her argument is that these actions, or inactions, from Facebook management strengthened authoritarianism in multiple parts of the globe.
Zhang says the incident that kind of started this all off and alerted her to the problem involved Facebook pages that were supporting the President of Honduras, Juan Orlando Hernández. She said that over the course of six months, his posts had received more than fifty-nine thousand likes, but upon closer inspection, she saw that nearly eighty percent of those likes were from accounts that didn't represent real people. Instead, somebody, presumably on Hernández's team, had created hundreds of Facebook page accounts. Now, Facebook pages are meant for organizations and businesses, right? They don't represent a person so much as an entity. And a single administrator was overseeing hundreds of these pages and using them to boost likes on posts, because Facebook pages can interact with posts the same way normal Facebook accounts can. They would also post things praising Hernández on one Facebook page and have other Facebook pages within that big network boost that particular post, giving it lots of likes and thus elevating it. The effect was kind of a rising tide that lifts all disinformation, as Facebook's algorithm concludes that this is material that's driving engagement: you see a lot of likes and shares and stuff that signals engagement. Now, granted, that engagement is coming from a bunch of Facebook pages run by the same person, but on the surface it looks legit. That meant Facebook would then serve that content up to an even wider audience. While she initially faced reluctance from Facebook's management about taking any action on this, ultimately the company would remove lots of accounts and admit that Zhang was correct to push for it. But despite that progress, she often encountered opposition, or at the very least apathy, when it came to addressing similar problems in lots of other countries. She saw that this was not unique to Honduras. It was happening in lots of places, particularly places with more authoritarian leaders in power.
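To make that like-boosting mechanic concrete, here's a minimal, hypothetical Python sketch of an engagement counter being gamed by a single operator's page network, along with the obvious dedupe fix. All the names, numbers, and the fix itself are invented for illustration; this is not Facebook's actual ranking code.

```python
# Toy model of engagement-counting being gamed by one operator's
# network of pages. Everything here (names, the dedupe fix) is
# invented for illustration; it is not Facebook's actual algorithm.
from collections import Counter

def rank_score(likes_by_account: Counter) -> int:
    """Naive ranker: every like counts, no matter who sent it."""
    return sum(likes_by_account.values())

def rank_score_deduped(likes_by_account: Counter, controller: dict) -> int:
    """Same signal, but collapsed to one like per controlling admin,
    which blunts a sock-puppet network of pages."""
    owners = {controller.get(acct, acct) for acct in likes_by_account}
    return len(owners)

# One admin runs 300 pages that all like the same post...
likes = Counter({f"page_{i}": 1 for i in range(300)})
# ...plus two real people.
likes["real_user_1"] = 1
likes["real_user_2"] = 1
controller = {f"page_{i}": "operator_x" for i in range(300)}

print(rank_score(likes))                      # 302: looks hugely engaging
print(rank_score_deduped(likes, controller))  # 3: the honest signal
```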
So she found instances in which these various regimes were using Facebook pages to heap criticism and abuse on political opponents, not just to boost their own message, but to degrade anyone who opposed them. She advocated for a change in the company's rules to close the sort of loophole that Facebook pages created for these operators. As it stands, you can still make as many Facebook pages as you like, and they can still interact with other posts just like a normal Facebook account would. She noted that for poorer countries with smaller numbers of Facebook users, she rarely got much support in doing anything about the problem. She said it was pretty clear that if a country was wealthy, the company was more likely to do something about it. If the country had a larger number of people using Facebook, it was more likely to do something about it. It wasn't the same across the board. I really recommend reading the article. It is titled "How Facebook let fake engagement distort global politics: a whistleblower's account." It's by Julia Carrie Wong, and, like I said, it's at the Guardian. It's a very good read. I've only given a tiny summation of some of the stuff in that article, and it is fascinating and infuriating.

In similar news, the Guardian also reports that Facebook, after a great deal of pressure and more than a year of delays, removed more than sixteen thousand groups that were posting fake reviews for products and services. Essentially, it sounds like these groups were outright buying positive reviews for their own products and sometimes buying negative reviews for competitors, so paying someone to leave various reviews to try and boost numbers. It's pretty skeevy stuff. This all came out of the UK, which should be obvious with the Guardian, but the UK's Competition and Markets Authority pushed Facebook to act on this way back in January.
But clearly, between that initial report and now, April of 2021, you could say that Facebook failed to act in a timely manner. The CMA released a statement that said, quote, "After we intervened again, the company made significant changes, but it is disappointing it has taken them over a year to fix these issues," end quote. I might have to do a full episode about fake reviews and followers and such in a future TechStuff, so stay tuned for that.

You know, back in 2010, the Natanz nuclear facility in Iran was hit with a specific type of malware called Stuxnet. We actually did a TechStuff episode about it. This malware had a very specific goal: to change the rotational speeds of centrifuges at the nuclear facility, and the purpose of those centrifuges was to refine uranium. Refined uranium is the fuel for nuclear power plants, but if you refine it enough, you could potentially use that uranium to make a nuclear weapon. It turned out that the CIA and Israel had collaborated on creating and deploying this malware, finding a way to get it installed in the facility despite the fact that the facility had an air gap; that is, the facility's critical systems did not connect out to the Internet. Stuxnet would spread far beyond Iran, though it mostly just sat dormant everywhere else because it had that specific purpose, and it set the Iranian nuclear program back a few years. Okay, but what does this have to do with today? Well, flash forward, and it sounds like Israel has had a hand in another tech attack designed to disrupt the Natanz nuclear facility. This time, the Iranians report that someone carried out a cyberattack that sent the Natanz facility into a blackout, a move that Iranian officials say also caused extensive damage to the electrical grid. As I record this, Israel hasn't denied involvement in what Iranians are calling sabotage.
Israel and Iran have been hostile to one another for a very long time, with both countries striking out against the other for several years. Speaking of nuclear power: all the practical nuclear plants operating as commercial facilities today run on nuclear fission. That is the process we rely on to generate electricity. It boils down to this: heavy atoms get split apart and release a lot of energy, which is then used to generate electricity. I won't go into full detail, as that's an episode all by itself, but there are scientists working hard to create practical nuclear fusion reactors, which would take two light atoms, as in ones at the lighter end of the periodic table, and fuse those two light atoms together, which also releases a lot of energy. And what's more, you're not talking about the dangerous radiation that you get with fission reactors. The material you're using isn't radioactive in that sense, so you're not generating radioactive waste. It could lead to some transformational changes in the way countries meet their energy requirements. With working fusion reactors, a country could conceivably go entirely independent for its energy needs, which in turn really boosts national security: you're not dependent on some other nation for, say, oil. But fusion reactors require a lot of energy just to get started, and you're typically talking about pumping in energy equivalent to millions of degrees of heat. That means that while scientists have created fusion reactors that can produce energy from fusion reactions, the big barrier is that it took as much, or nearly as much, energy to start the reaction as you get out of that reaction. That sort of ratio clearly doesn't work, not if you want to produce electricity at any real scale. If you're pouring in as much energy as you're getting out, you're not really achieving anything.
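To put a number on that barrier, fusion researchers talk about the energy gain factor Q, the ratio of energy out to energy in. Here's a trivial Python illustration; the figures are made up purely for the example and aren't from any particular reactor.

```python
# The fusion break-even problem as a single ratio, often written
# Q = E_out / E_in. Q > 1 means more energy came out than was
# pumped in. The figures below are invented for illustration.

def gain_factor(energy_out_mj: float, energy_in_mj: float) -> float:
    """Return Q, the fusion energy gain factor."""
    return energy_out_mj / energy_in_mj

# Near break-even: roughly as much energy in as out, so no net power.
print(gain_factor(energy_out_mj=10.0, energy_in_mj=10.5))   # ~0.95

# What a commercial plant needs: a healthy multiple of the input.
print(gain_factor(energy_out_mj=100.0, energy_in_mj=10.0))  # 10.0
```

Anything at or below a Q of one is a science experiment, not a power plant.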
But a company called TAE Technologies says that it anticipates a working commercial fusion reactor by the end of the decade. That is exciting stuff. And yeah, it's still several years away, but considering the amount of energy and computing power needed to get this just right, it's still really impressive. My hope is that TAE Technologies is absolutely right about this, and that by then we see fusion reactors transform how we get our electricity. It could make an enormous contribution for countries trying to leave fossil fuels behind, and it could power all those companies that plan to go carbon neutral or carbon negative in the near future.

Microsoft is buying a company called Nuance, which is in the AI and speech recognition industry. Nuance produces a type of software called Dragon, which uses machine learning to tweak itself as it interacts with a particular speaker, as in a person who is speaking. In other words, this is a speech recognition approach that improves the more someone works with it. The system comes to recognize that person's accent, their speech patterns, pronunciation, et cetera (there's a small sketch of that idea below). Apple's Siri relies in part on the Dragon software, though to what degree remains Apple's secret sauce. Or secret applesauce. Secret Apple Sauce. I'm getting off track, sorry. Microsoft is paying nearly twenty billion dollars, a princely sum, for Nuance. The only acquisition Microsoft has paid more for was LinkedIn back in 2016, and that set them back around twenty-six billion bucks or so. What does Microsoft plan to do with Nuance? Well, I imagine we'll see some tight integrations of Dragon software with other Microsoft products, such as Office 365. Perhaps we'll see more speech-to-text applications. If those are good enough, maybe I'll just dictate my notes to my computer instead of, you know, typing them all out. But then I'd have to say the same stuff twice, with the second time being, you know, on mic for recording. Maybe I should think that through again.
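As a side note, that "gets better the more you use it" idea can be shown with a toy re-ranking loop: remember what a given user actually confirms, and nudge future recognition toward it. This Python sketch is a stand-in for the general concept, not Nuance's actual method; the scoring bonus and candidate scores are invented.

```python
# Toy illustration of per-speaker adaptation: re-rank a base model's
# candidate transcripts using corrections this user made before.
# A stand-in for the general idea only, not Nuance's real approach;
# the 0.1 bonus and the example scores are made up.
from collections import defaultdict

class AdaptiveRecognizer:
    def __init__(self):
        # How many times this user confirmed a given transcript.
        self.confirmed = defaultdict(int)

    def pick(self, candidates: list) -> str:
        """candidates: (transcript, base_model_score) pairs."""
        def adjusted(item):
            text, score = item
            return score + 0.1 * self.confirmed[text]
        return max(candidates, key=adjusted)[0]

    def learn(self, chosen: str):
        """Call whenever the user confirms or corrects a transcript."""
        self.confirmed[chosen] += 1

rec = AdaptiveRecognizer()
hyps = [("recognize speech", 0.60), ("wreck a nice beach", 0.62)]
print(rec.pick(hyps))          # base model narrowly prefers the wrong one
rec.learn("recognize speech")  # the user fixes it once
print(rec.pick(hyps))          # the ranking now flips for this speaker
```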
But seriously, this could be an evolutionary step in accessibility, and it could really bolster the features found in other Microsoft products, so I'm very curious to see how it gets integrated moving forward.

Mark Gurman of Bloomberg reports that Apple is working on a smart speaker product that will have an iPad as a display. So I guess it's a HomePod with a view. HomePod, in case you don't know, is Apple's smart speaker. They have since discontinued the original HomePod, though the HomePod mini still exists, I think. Gurman says that at least one of the concepts for the new speaker included a way for the screen, the little iPad on it, to rotate around the top of the speaker so the screen can always face you. So imagine a round speaker with a screen attached to it, and as you move from one side of the room to the other, the screen actually rotates around the speaker to keep facing you (a toy sketch of that tracking geometry follows below). Sounds kind of creepy, really. It also sounds very similar to the Amazon Echo Show 10, which I admit is a product I didn't even know existed until I was looking into this. Gurman says Apple is also working on a sort of Apple TV slash soundbar type device with a camera incorporated into it. So it makes me think of a combination of things: part streaming media device, kind of like a Roku; part sound system, like your traditional soundbar; and part video interface, like the old Microsoft Kinect, though reportedly this camera would be used for stuff like making video calls, not necessarily gesture controls. I'm not super deep in the Apple ecosystem, so while I think these technologies are kind of cool, I'm not sure I have a place for them in my life. But they do sound really neat. What I'm really waiting to learn more about is Apple's augmented reality technology, which has been in development for a while. I'm waiting to see more about that. That's what has me excited.
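About that rotating screen: once a camera has located the person, pointing the display at them is simple geometry. Here's a hypothetical Python sketch; the coordinate setup is assumed for illustration and has nothing to do with Apple's actual design.

```python
# Toy version of "the screen rotates to face you": once the camera
# locates a person, the turntable angle is just trigonometry.
# Coordinates are assumed for illustration only.
import math

def face_user(person_x: float, person_y: float) -> float:
    """Angle in degrees the screen should turn to, with the speaker
    at the origin and 0 degrees along the +x axis."""
    return math.degrees(math.atan2(person_y, person_x))

print(face_user(1.0, 0.0))    # 0.0: user straight ahead
print(face_user(0.0, 2.0))    # 90.0: user walked to the side
print(face_user(-1.0, -1.0))  # -135.0: user behind and to the left
```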
If you're keeping track of all the services and apps that Google has introduced and then subsequently abandoned, get ready to add another one. The company has confirmed it will shut down the Shopping app on iOS and Android devices in June, with much of the app's functionality closing down in the weeks leading up to that. If you've never used the Shopping app, it's an app that tells you where you can shop for specific products. The app leveraged Google's search to comb through online stores, pulling up results that hopefully are relevant to whatever it is you're searching for. Google will continue to support the Google Shopping website, which, you know, does the same thing, and you can just access that using a browser on your smartphone and bypass the app entirely. We're also likely to see the shopping features folded into other Google products, so if you were to just search for a specific item on your phone, for example, you might get a shopping link popping up as one of your search results. But yeah, the iOS and Android Shopping app from Google is going out of business.

If you live in Houston, Texas, and if you order a pizza from Domino's, there's a chance your pizza will arrive at your front door courtesy of a robot. Domino's has partnered with a startup company called Nuro, that's N-U-R-O, to test this out in a pilot, or I guess technically a pilotless, program. The robot looks kind of like a miniature van or bus. It is autonomous. It's called R2, which is cute. It has cameras, radar, and thermal imaging to help it detect its environment and get around, and the robot's precious cargo is protected by an electronic lock, which can be disengaged if you type in the proper PIN. Which is good, because I know that here in America, robots traveling by themselves don't always do so well. Rest in peace, hitchBOT. Gone, but not forgotten.
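For what it's worth, a PIN-gated lock like that boils down to a careful string comparison and a retry limit. Here's a minimal sketch under assumed details; the PIN length and attempt cap are mine, not anything Nuro or Domino's has published.

```python
# Sketch of a PIN-gated cargo lock like the one described: compare
# the typed code to the order's code in constant time and cap the
# number of tries. The retry limit and PIN format are assumptions.
import hmac

MAX_ATTEMPTS = 5

def try_unlock(entered_pin: str, correct_pin: str, attempts_used: int) -> bool:
    if attempts_used >= MAX_ATTEMPTS:
        return False  # too many tries; keep the hatch shut
    # compare_digest avoids bailing out at the first wrong digit,
    # so response timing leaks nothing about the code.
    return hmac.compare_digest(entered_pin.encode(), correct_pin.encode())

print(try_unlock("1234", "4321", attempts_used=0))  # False
print(try_unlock("4321", "4321", attempts_used=0))  # True: pizza time
```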
And this is not going to be a citywide program either. It's not like anyone who lives in Houston is going to be able to access this. It's actually a very small experiment. Only the Domino's pizza joint in Woodland Heights, on Houston Avenue, is participating in this experiment so far. And even then, the robot will only be delivering during certain times on certain days. But on those days and during those times, residents of Woodland Heights in Houston might be able to get a piping hot 'za delivered courtesy of a robot that almost certainly doesn't wish to conquer the planet, assuming, of course, that the Noid hasn't found a way to compromise the robot.

And that wraps up the news for Tuesday, April 13, 2021. If you have suggestions or topics I should cover in TechStuff, let me know. Reach out via Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.