Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, and this is the tech news for Tuesday, February twenty-third, twenty twenty-one. Yesterday, Spotify announced a ton of stuff that I would like to touch on. Before I get to that, I should give you all a disclaimer: I'm an employee of iHeartMedia, and Spotify is a competing company in a lot of ways. Also, what I say here is from my own perspective. It doesn't reflect any sort of official iHeart stance, mostly because I'm not important enough to even know if there is an official stance, let alone whatever it might be. So let's get to those announcements. One of them is that Spotify is launching a HiFi, or high fidelity, premium subscription service later this year. The service will give subscribers access to lossless streaming audio quality. And a quick word on that.
Speaker 1: Back when data throughput had a lot less behind it, it was really important to compress file sizes so that downloads wouldn't take so darn long. But many of those compression strategies meant that this process would also give the boot to some of the file's data, stuff that was deemed unnecessary. When it came to audio, this meant that you would end up with lossy audio files like MP3s. These were ones that had shed some of the information, and thus some of the depth and vibrancy, of the audio recording. Now, Spotify is not the first to go with a lossless audio streaming service. Deezer, Tidal, and Amazon Music all have lossless options for premium subscribers. Now, I honestly don't know how big this is in general. I don't have a good finger on the pulse of the average consumer. My sense is that most people really like the convenience of streaming services and portable music options in general, and that they don't really care quite as much about the actual fidelity of the music they listen to. But I could be totally in the wrong here. That's just from my own, you know, feeling.
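For the curious, the lossless-versus-lossy distinction described above can be demonstrated in a few lines of Python. This is purely a toy sketch, not a real audio codec: `zlib` stands in for lossless compression, and the "lossy" step just throws away the low bits of each byte, the way a lossy codec discards detail it deems unnecessary.

```python
import random
import zlib

# Toy stand-in for audio samples: deterministic pseudo-random bytes.
random.seed(0)
samples = bytes(random.randrange(256) for _ in range(16_384))

# Lossless: compression round-trips to the exact original bytes.
lossless = zlib.compress(samples, level=9)
assert zlib.decompress(lossless) == samples

# "Lossy" (illustrative only, not a real codec): discard the low
# 4 bits of every byte before compressing. The file gets smaller,
# but the discarded detail is gone for good.
quantized = bytes(b & 0xF0 for b in samples)
lossy = zlib.compress(quantized, level=9)

assert len(lossy) < len(lossless)           # smaller file...
assert zlib.decompress(lossy) != samples    # ...but not the original
```

The point of the sketch: a lossless file can always be reconstructed bit-for-bit, while a lossy one trades fidelity for size and can never fully recover what was shed.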
Speaker 1: I still like listening to music in high quality now and again, but that tends to be on a home system using outdated media. At this point, most of the time I'm actually just listening to stuff without worrying about the fidelity as much. I'm a monster, I know. Spotify has also been making some really big moves in podcasting, and has been for a little while now. They have launched several podcasts that are exclusive to the platform, and they've brought on others and made them exclusive to Spotify. That exclusivity approach is a tough one. You're trying to lure people over to using a specific platform with content that's unique to that platform, which is a different approach from how we do things over at iHeart. For example, our shows appear on all platforms, from iTunes to Spotify to the iHeart app and everything. Now, there's nothing necessarily wrong with launching an exclusive show for your platform, except that you run into the danger of limiting that show's audience.
Speaker 1: We learned through the experience of Microsoft's Mixer, for example, that it doesn't always work to try and bring over heavy hitters to create exclusive content and thus grow your audience that way. If the creator of the show is getting paid truckloads of cash in return for their agreement, they might not really care so much if their audience isn't as huge. But for smaller creators, exclusivity can sometimes mean that you never get your feet under you. Still, Spotify does have some truly heavy hitters in the space. The company has signed a deal with DC Comics, and there will be numerous podcasts set in the DC Comics universe. The first of those will be Batman Unburied. And Spotify also signed a deal with AGBO, the production company founded by the Russo brothers. That's the pair behind Avengers: Infinity War and Avengers: Endgame, so they've got some serious star power behind them. It remains to be seen if that is going to pull significantly more users over onto that platform.
Speaker 1: Not long ago, Spotify acquired the podcast hosting platform Megaphone, which includes a feature called Streaming Ad Insertion. Now, if you've ever listened to an old episode of TechStuff, or Stuff You Should Know, or Stuff You Missed in History Class, or something like that, you might have noticed that the ads that run on any show, whether it's from the deep archives or not, are recent ads. So the episode could be several years old, but the ads are new, and that's because of tools like Streaming Ad Insertion. Essentially, these tools let podcasters designate specific tagged segments, like time codes, within a podcast, and ads can swap out dynamically in those spots. This gives podcasters the chance to leverage their whole back catalog for ad deals, not just the most recent episodes. It also means you're not going to hear a five-year-old advertisement for a toothbrush or a mattress that has a code that doesn't work anymore. Spotify is building out its ad capabilities for podcasters with something called the Spotify Audience Network.
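The tagged-segment idea behind dynamic ad insertion can be sketched roughly like this. To be clear, this is a hypothetical illustration of the general technique, not Spotify's or Megaphone's actual implementation; the segment names and the marker format are made up.

```python
# Hypothetical episode: content segments with tagged ad-break slots.
# In a real system the markers would be time codes, not strings.
episode = ["intro", "<AD_BREAK>", "main story", "<AD_BREAK>", "outro"]

def render(episode, current_ads):
    """Fill each tagged slot with whatever ad is current right now.

    The episode itself never changes; only the ad inventory does,
    which is why a years-old episode can carry this week's ads.
    """
    ads = iter(current_ads)
    return [next(ads, "(house ad)") if seg == "<AD_BREAK>" else seg
            for seg in episode]

# The same old episode, served with today's inventory:
playback = render(episode, ["mattress ad (Feb 2021)",
                            "toothbrush ad (Feb 2021)"])
# -> ['intro', 'mattress ad (Feb 2021)', 'main story',
#     'toothbrush ad (Feb 2021)', 'outro']
```

Swap the `current_ads` list next month and the exact same episode file plays back with fresh ads, which is the whole trick.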
Speaker 1: Details are a little scarce, but the company says that the service will help podcasters monetize their shows more effectively. The plan is to first roll out the service to Spotify's exclusive podcasts, and then, sometime further down the line, they will roll it out to third-party podcasts that are on the Spotify platform. The company also announced new creator tools for musical artists during its live event. These tools are meant to help with discovery and making sales to new customers. Artists will be able to select which tracks they want to prioritize for discovery purposes, and Spotify will use a pop-up notification called Marquee to get attention from users. Marquee was actually introduced last year, in twenty twenty; it's just that now it's going to have some new features and get rolled out to a larger number of artists, and that means we're going to see a lot more artists potentially get discovered through this tool. I think that's really cool. I think discovery remains one of the toughest challenges for any content creator, regardless of what medium they're working within and what platform they're using.
Speaker 1: Discovery is just tough. We know that more people than ever are creating awesome stuff and putting it up online, and it is really challenging to find the really good stuff among everything else that's being uploaded. So I'm all for tools that allow creators to get more eyeballs, or ears as the case may be, on their work. And in non-Spotify news, because believe me, we actually have some: Google is lifting its ban on political ads starting tomorrow. Google first put a ban on political advertising following the insurrection at the nation's Capitol on January sixth, though Google only started the ban later in January. That particular ban expires tomorrow, that being Wednesday, and it will mean that Google will accept political ads, including those that focus on specific people, whether those are candidates or elected officials, as well as ballot measures. So if you're in the United States, you might be groaning a little bit at the thought of all this, because we just went through a truly tumultuous election, and the thought of having another round of political ads hit us so soon afterward is a bit disheartening, I would say.
Speaker 1: Facebook, for the record, also has a ban on political ads in place, and at least as of the time of this recording, that ban is still in place. It will be interesting to see if Google's decision will affect Facebook, or if this will just be a diverging path, with Google accepting ads and Facebook continuing to ban them. By the time this episode goes out we might know more; maybe Facebook will have already changed its own path. I don't know. If you've got Netflix running on an Android device, you might have some new stuff to watch. The company has a feature called Downloads for You, and the idea is that Netflix will select content based on your viewing habits and pre-download stuff you haven't watched yet, so you can watch it whenever, even if you don't have a current Internet connection. You know, like we used to have to do on airplanes. I vaguely remember flying on airplanes. Anyway, you get to choose the data limit for this feature, if you even want it on at all. It's opt-in, so you can go and turn on this option if you do want it.
Speaker 1: You can choose to limit the data to one gigabyte, three gigabytes, or five gigabytes. Also, the service will only download content when it's over a WiFi connection, so you don't have to worry about it eating up all of the data on your cellular plan. I don't think I would opt into this, but only because I tend to watch really, really bad movies on Netflix so that I can talk about them with my buddies, and I don't need Netflix judging me more than it already does, or providing me the next Sharknado movie before I'm really emotionally prepared for it. We do know that this same service is coming to the iOS versions of Netflix later on. I'm curious to see how many people actually use it. Again, this gets into discovery: the idea that Netflix is looking at the sort of stuff you tend to view, drawing conclusions, and then selecting stuff for you that it thinks you're going to like based on your previous behaviors. Another version of AI knowing ourselves better than we do.
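For a rough sense of how a feature like this might decide what to pre-download under the user's chosen cap, here's a hypothetical sketch. This is not Netflix's actual logic; the titles, the recommendation scores, and the greedy pick-highest-score-that-fits strategy are all made-up assumptions for illustration.

```python
GIB = 1024 ** 3  # one gibibyte in bytes

def pick_downloads(recommendations, limit_bytes):
    """Greedily pre-download the highest-scored titles that fit
    under the user's chosen storage cap (hypothetical logic)."""
    chosen, used = [], 0
    for title, score, size in sorted(recommendations,
                                     key=lambda r: r[1], reverse=True):
        if used + size <= limit_bytes:
            chosen.append(title)
            used += size
    return chosen

# Made-up recommendations: (title, predicted-liking score, file size).
recs = [("Sharknado",  0.9, 2 * GIB),
        ("Slow Doc",   0.4, 1 * GIB),
        ("Big Series", 0.8, 4 * GIB)]

# User opted in with the three-gigabyte cap:
pick_downloads(recs, 3 * GIB)   # -> ['Sharknado', 'Slow Doc']
```

A real system would weigh much more (watch history, expiring licenses, partial downloads), but the core shape, ranked recommendations filtered through a user-set byte budget, is the same.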
Speaker 1: A study funded by the Jane and Aatos Erkko Foundation and the Academy of Finland found that the more human a robot appears, the more harshly we judge that robot's actions, at least under a very specific hypothetical situation. So the team gave this hypothetical situation to people, with different versions of the scenario, and asked them to kind of grade the morality of a person's or robot's decision. And the problem was a classic one: the trolley problem. You probably know this one. Generally it's presented like this: there's a trolley hurtling down a track, and it's about to collide with a group of five people, and if it does, it will kill all five. You happen to be standing next to a rail switch, and if you throw the switch, the trolley will change to a new track of rails, and it will definitely kill one person who's on that track, but the five people on the other track will all be spared. So would you do nothing, and thus allow five people to die through your inaction, or would you throw the switch, thus dooming that one person on the other track to die?
Speaker 1: I mean, effectively you're killing someone, right? You're either killing five people by not acting at all, or you're actively killing someone by throwing a switch. It's almost like having a loaded gun and pulling the trigger, in that case. So my point is, there's not necessarily a right answer to this question, or maybe not a single right answer. It all depends upon your perspective, and whether you think that saving five people is worth dooming someone else to death, or if you think that somehow not acting absolves you of the responsibility. Well, the study found that when people were presented with this scenario, in which either a human had to make the decision or a very utilitarian robot had to make the decision, the responses to those decisions were essentially that both that human and that robot were making morally sound choices. However, the more human the robot was supposed to be, the more harshly people judged it.
Speaker 1: The more they said, oh, that's not a moral choice. Which is weird, right? Because they would give a pass to the regular robot and they would give a pass to the human, but a human-looking robot suddenly seemed like that was different somehow, even though it's the same situation. Now, this is just one study, mind you, but it does show how the field of robotics hinges not just on technology but also human psychology, and the field of human and robot interactions is one I really find fascinating. And our last story is: after twenty-eight years, Daft Punk has split up and retired. They were incredibly influential in the house and techno music genres. They were known for not just their music but their live performances, which became increasingly rare over the years. I think Harder, Better, Faster, Stronger gets rediscovered every few years, which just shows how amazing their ability was to craft music. They also created the incredible soundtrack to Tron: Legacy, and I maintain that that score and Michael Sheen's performance were really the two standouts of that movie.
Speaker 1: Their last album came out in twenty thirteen. So long, Daft Punk, and thanks for all the tunes. And that wraps up the headlines for today, Tuesday, February twenty-third. We'll be back tomorrow with a full episode of TechStuff. If you guys have suggestions for things I should cover in future episodes of TechStuff, give me a shout and let me know on Twitter. The handle we use is TechStuffHSW, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.