Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Thursday, May one, and we're gonna concentrate a lot on Google for much of this episode. So, last year, Google canceled its annual I/O event for developers because of the pandemic. This year, the company held a virtual event in which presenters appeared on camera in front of a small audience of Googlers at the Googleplex in Mountain View, California. So we're going to go over some of the announcements from that event. And first up is Android twelve. There were a lot of updates to Android's operating system that were announced, and it is now in public beta, but it will launch for realsies this fall. Some of the changes are to the user interface, and they're largely cosmetic.
Speaker 1: For example, the color palette of the operating system is more customizable in Android twelve, and you can set the OS so that it will choose a color palette sourced from whatever image you choose as your background wallpaper. So on my phone, I've got a picture of my doggie Tibolt as my wallpaper, and presumably if I run Android twelve on my phone, then my notifications and stuff will have a color scheme that reflects the colors in that photo, which is kind of neat. There are tons of other updates besides the cosmetic, however. One is that there's a new privacy dashboard in Android twelve, which lets you see which apps are accessing your phone's various features and how frequently they do so. So you can check to see which apps are checking on your location, you know, really frequently, or making use of your camera or your microphone or whatever. And you can also use the dashboard to revoke access to those features should you wish. So if you look and say, oh, this app is pinging my location, you know, twenty times a day, you can revoke access from that dashboard.
Speaker 1: It's Google's way to give users more insight into how their phones are harvesting data with these various apps. And don't get me wrong, Google does this as much or more than anyone else, but I do think this is a really nice feature. Other updates to the OS include a feature in which the clock on the lock screen will change size depending upon how many notifications you have, so as you clear out notifications, the clock gets bigger. That's a pretty handy way to see if you've got messages waiting on you on your phone. One of the cool technologies Google showed off at the event was Project Starline, which is a sort of 3D video conferencing technology. In the version we saw at the event, there were examples of two people each sitting down in front of a large screen, and there wasn't really any indication as to how far apart these people were in real space.
Speaker 1: I mean, maybe they were across the country from each other, or maybe they were in different parts of the same building, but either way, the setup included depth-sensing cameras that could capture light fields and then send that data over to the other screen while doing the same thing for the other side. So the effect was that the image you saw on screen appeared to have real depth to it, almost as if the person you're talking to on screen is actually sitting right in front of you in person. Which is pretty nifty stuff all by itself. But what I found particularly fascinating is that Google developed a compression algorithm that can handle this kind of application in the first place. A high-definition camera with depth-sensing capabilities is going to generate a lot of data, and transmitting that data in real time so that you can have a video chat with no perceptible lag time? That's a big challenge. Google says it developed a way to compress the data stream down by a factor of one hundred, so one hundred times smaller than the uncompressed size. The demonstration was cool.
Speaker 1: It was also prerecorded, I should add, but there's no word on whether Google is going to roll this out on any sort of meaningful scale, or if this will remain more as a demonstration of really neat technologies that might find their way into other applications in the future. Something that is rolling out is another collaborative work tool called Smart Canvas. Google's demo showed team members working on a project together in real time on a Google platform that incorporates stuff like Google Docs and Google Sheets. It also allowed users to have a video chat going at the same time, so that people can talk with one another while collaborating in real time, and the tool incorporates some of Google's AI features as well, to help communicate clearly and avoid pitfalls. So the example they had in their video was using the word chairman in part of a presentation, and the AI suggested changing it to chairperson to avoid using a gender-specific noun. So that was one example they gave. Google isn't a stranger to creating collaborative tools.
Speaker 1: I still remember the old Google Wave product, which was interesting but somewhat confusing way back in the day, and that would allow multiple people to work in the same virtual workspace simultaneously. At the time, it was kind of hard to imagine use cases for Google Wave beyond how I was using it, which was to build out a rundown for live shows. But Smart Canvas has a more straightforward approach to collaboration that I think is pretty easy to understand. I wasn't super impressed with the interface when I saw it, but um, it was, you know, kind of stripped down. It wasn't like flashy or anything, so maybe simpler is better. I don't know. I'm also notoriously bad about using collaborative tools, so maybe I'm not the right one to comment on it. In addition, Google showed off an AI app called LaMDA that was odd. So this AI uses things like voice recognition and natural language processing to interpret language.
Speaker 1: Then it generates responses and makes it very conversational. And moreover, it can generate responses as if it were something else, as opposed to, you know, the Google Assistant, which is just this kind of ephemeral AI assistant. In the demos that Google showed off, they had LaMDA answering questions as if LaMDA were a paper airplane or the dwarf planet Pluto. And this makes me wonder, if you had LaMDA pose as a chicken, would it finally explain why it crossed the road? Because a lot of people have been asking about that. It really was interesting to think of an app that could presumably put itself in the role of different things and then answer questions about it, including bits in which the AI would make the equivalent of small talk, only as if the AI were, you know, the dwarf planet Pluto or whatever. It was neat, and it really shows how far we've come with natural language processing and the ability to create AI programs that can seemingly converse fairly well. Though Google does say that this is far from perfect; it's still a pretty early build.
Speaker 1: It feels like another big step forward from the demo we saw a couple of years ago, in which a Google Assistant program made reservations on behalf of a user, and it seemed to be an actual assistant. Like, it called up a restaurant to make reservations, and it was impossible to tell that it was an automated program. It sounded like it was a person, which is pretty nifty. Google also had some announcements about Wear OS, that's W-E-A-R. Now it's just known as Wear. And Google didn't really talk about the Pixel Watch. In fact, Google also didn't talk about Pixel phones, really. There was a distinct lack of talk about hardware at all. They didn't mention anything about the rumored Google-designed chips that are codenamed Whitechapel. You know, like Apple, Google is apparently moving toward developing its own processors rather than relying on stuff that's made by other companies. None of that was brought up in the I/O event, at least not as of the recording of this episode.
Speaker 1: Other stuff Google showed off included phone apps that are better at working with different skin tones for stuff like white balance. That has traditionally been one of the big problems with Google photo apps: they simply didn't work as well for people of color, which is one of the many ways tech can turn out to have a bias against certain groups. And Google demonstrated a cool technology that could create an animation from two different photos of the same scene. So let's say you take a picture of someone who's standing in a particular pose and they're standing in front of, like, a landmark of some sort, and then they make a different pose and you take a second photo. With this new feature, Google could create interpolated frames between those two photos and create a short animation moving from one pose to the other. It's kind of creepy in a way, because Google is literally creating new images based off the input of the two reference frames and then joining them together. So it's creepy and cool at the same time, kinda like most of my friends.
Speaker 1: And Google showed off updates to Maps, which will now include more relevant information as you move through areas. For example, the new Maps will show you restaurants that happen to be open as you pass through town, so you're not getting a notification about, you know, Billy Bob's Best BBQ when Billy Bob's happens to be closed. That wouldn't do you much good. And Maps will alert you when areas are getting busy and help you plan routes better. Maybe it's best to stay at the office for another twenty minutes and that will end up saving you forty-five minutes of sitting in traffic, for example. And there are some cool AR features in Google Maps that they showed off, like virtual signs that can pop up in your view that tell you where certain landmarks are in relation to where you are. Oh, and they will also help drivers pick the most eco-friendly routes, routes that will have fewer stops and starts to them, or fewer locations where, you know, you might have to brake suddenly or drive up a steep hill, which is kind of a neat feature.
Speaker 1: So it may not be the fastest route to your destination, but it would be, at least in theory, a route that would generate fewer carbon emissions because of the nature of the drive, which sounds kind of chill to me. Oh, and if you have a BMW, you might be able to unlock and start your car using your phone as a digital key. Android twelve will support digital key operations over UWB and NFC protocols, but so far, BMW is the only automaker confirmed to offer compatible car models with this technology. It is another example of how smart devices are replacing various other things we typically use, from payments to car keys. But in non-Google news, the cryptocurrency world has seen a dramatic drop in value over the last two weeks. Since May twelve, the overall market, which includes cryptocurrencies like Bitcoin and Ethereum, as well as Dogecoin, the joke that really got out of hand, and numerous others, has dropped in value to the tune of around eight hundred billion dollars as I record this.
Speaker 1: Many cryptocurrencies are experiencing a brief climb in value as I record this episode, so it could be that we just saw the bottom of the drop and now the market is recovering. Or it could be that what I'm seeing is just a blip and we're going to have a bear market in which valuation will continue to drop further. No one is really sure, and a big part of that is that the value of many of these currencies largely depends on whether people are buying into them or selling them off. Buying cryptocurrencies decreases the supply, at least for most cryptocurrencies, which in turn, you know, drives up the value of the remaining cryptocurrencies that are in circulation. Selling off cryptocurrencies to convert them into cash increases the supply, and thus the value decreases. Now, it's a bit more complicated than that, but those are the basics. So does this mean you should rush in and buy up cryptocurrencies while they are still relatively low, or at least, you know, lower than they were two weeks ago? I have no idea. We might see the market plunge again.
Speaker 1: Heck, this could be the beginning of a relatively long bear market where values drop week after week before they recover. Or we might already be in another upswing. It's really hard to tell, and chances are anyone who is giving you hard and firm advice has a vested interest in the outcome. So if someone you know is stressing that, hey, now is the time to buy, a good follow-up question for that person is: how much money do you have wrapped up in cryptocurrency? Because it could be that the person telling you to buy is desperate to see those values go up so that they can recapture some of the wealth they had been accumulating prior to the market downturn. Just be careful, is all I'm saying. I mentioned in the Google stories how Google's photo app is getting tweaks to address negative biases the app has with regard to people of color. Twitter is doing something similar.
Speaker 1: Twitter had been using a tool to auto-crop photos, using AI to identify the subject of a photograph and then to crop the image so that it would better fit within Twitter feeds, particularly on mobile devices. But in use, it was shown to work best with pictures of white people, and it would crop those images fairly effectively, but it worked much less well for people who weren't white. Not great. So now Twitter is abandoning the auto-crop feature. The company tested the algorithm for gender and race biases and concluded that, in fact, the algorithm does have problems. Now Twitter's mobile app will show full images in the Twitter feed and will only crop photos if the pictures are just too tall or too wide to fit on screen, and then only doing that in a pretty standard way as opposed to trying to auto-identify the subject of the photo. Once again we see how bias can work its way into code, even unintentionally, and I think Twitter made the right move here.
Speaker 1: Perhaps in the future the company will have an improved auto-cropping tool that won't show the same tendencies toward bias, but in the meantime, it's better to have nothing than to have a tool that works really well for one demographic and not so well for all the others. Finally, ByteDance, the parent company that owns TikTok, has announced a change in leadership that I find interesting and fairly refreshing. Co-founder Zhang Yiming says he will step down as CEO to take on some other role in the company. So in a message to employees, he explained that he felt his strengths were in areas outside of managing people, including ideation and data analysis and that kind of thing. He also said he's not a very social person, and that he felt that ByteDance needed a leader who would be able to take the company to new places. So, in other words, he was saying: I might have been the right person to get things moving to where they are, but I'm not the right person to evolve the company from here.
Speaker 1: Co-founder Liang Rubo will take over as CEO, and I've seen this sort of thing happen with a few companies here in the States. There are some entrepreneurs who absolutely love the experience of launching a new company and then having it grow to a certain point, but beyond that they tend to lose interest. Their strengths lie in those early phases of getting a company off the ground and establishing a presence. But once these companies reach a certain scale, the entrepreneurs find it less interesting or outside of their wheelhouse of skills, and so they look for ways to move into other roles. And I think that's a healthy thing. Not everyone is made to grow into these leadership positions while the company itself is also scaling up, and a transparent transition of leadership tends to be a sign of a healthy company, as opposed to some of the revolving-door situations we've seen with companies in the past. Analysts don't expect this change in leadership to have massive consequences for the daily operations of either ByteDance or TikTok. And that's it. That's the tech news for Thursday, May one.
Speaker 1: I hope all of you are well. If you have any suggestions for topics I should cover in TechStuff, reach out to me on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.