Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, April sixth, twenty twenty three. Earlier this week, I talked about how Germany is considering a ban on OpenAI's ChatGPT out of privacy concerns, following Italy's move to do just that very thing. Italy has issued essentially a ban, and Germany is considering the same sort of thing. Well, today OpenAI is scheduled to present Italian regulators with a plan to address their concerns and to essentially patch some vulnerabilities, in an effort to get that ban lifted and perhaps prevent a domino effect throughout the rest of the European Union. As I mentioned in Tuesday's episode, the concerns here really have nothing to do with ChatGPT's reliability as an information source, so this isn't a concern about misinformation or anything like that, or plagiarism. It's rather about how OpenAI and Microsoft handle user data and privacy, and it's also meant to ensure that underage users are unable to access the tool, again due to the sensitive nature of their personal information. So it's more about how companies like OpenAI and Microsoft are getting EU citizens' information and how they're using it, and regulators want to make sure that all of that is by the regulations of the EU.

Speaker 1: We've known that Meta, the parent company of Facebook and Instagram, has been working on its own AI. The company has already made its large language model, called LLaMA, available to academics, and at least one of those academics subsequently leaked the code to GitHub. Now the financial newspaper Nikkei Asia reports that Meta is preparing to launch a commercial AI product using generative language features by the end of this year, so something similar to Google Bard or ChatGPT. This doesn't necessarily mean that the average Meta user will have direct interactions with this generative AI.
Speaker 1: Rather, one potential use could be totally behind the scenes, with the AI helping Meta's advertising division work with clients to create more effective ad campaigns. Maybe you've seen what amounts to the exact same ad, but the ad itself is clearly geared for different demographics. I'm reminded of seeing the same commercial in English and Spanish. It was the exact same commercial with the same beats and everything, but it did have a totally different cast for the English version than for the Spanish one; it's just that they were both working from the exact same script. That's the sort of stuff that AI might help with. It might help craft approaches that it calculates will be most effective depending upon the audience. So an AI version of this might actually not use the exact same script; it might use a different script that the AI has calculated will be more effective for the intended viewer.

Speaker 1: Further off, a potential use of this AI might come in handy for Meta's version of the metaverse. I think this one is probably a bit more blue sky as it stands, but the article mentions the example of describing a virtual landscape that you have in mind and having the AI help craft it for you. So instead of you having to go through the trouble of learning how to use the graphics tools or programming or whatever the UI is for that meta space, you just tell the AI what you want and it makes it, kind of like your own genie. Of course, that's only gonna matter if anyone actually wants to go to the metaverse. We'll chat more about that later in this episode. Anyway, it will be interesting to see if Meta does indeed launch these or other commercialized applications involving AI, particularly because the conversation around AI has been getting a little bit spicy lately. People have started to say we should pump the brakes on AI, and I'm wondering how that is going to impact initiatives like this one. It might not.
Speaker 1: The initial idea of using it to help with advertising seems like it would be less obvious to the average person and therefore might slip under the radar and be considered perfectly fine. We'll have to wait and see.

Speaker 1: Speaking of Meta and AI, yesterday the company showed off its Segment Anything Model, or SAM. This model can apparently identify objects in videos and still images, even if the model had never been trained on those particular objects. I'm not sure how the heck that works, but that's how Reuters reported it. The Reuters report also mentions that there are a few ways to interact with the model. So, for example, you might have a picture opened up and you've got the SAM AI agent active at the same time, and you could ask it to outline something specific in the picture. Maybe it's a college dorm room and you tell SAM, hey, can you outline the pizza in this picture, which, of course, as we all know, will be stuck to the ceiling or whatever. Anyway, SAM would then draw a little rectangle around the item in question once it identified it in the picture. This technology is in some ways an extension of stuff that Meta has been doing for years, like identifying specific people in photos. Admittedly, it has been a while since I've been on Facebook, but I do remember uploading images and then immediately seeing suggestions for tagging people in my pictures, because Facebook was helpfully letting me know that it knows what all of my friends look like. Well, now Facebook is essentially telling me that it knows what everything looks like. I bet nothing can go wrong there.
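For anyone who wants to poke at this themselves, Meta published the model code alongside the announcement. Below is a minimal sketch of the point-prompt workflow, assuming you have installed the open-source segment_anything package from Meta's GitHub release and downloaded one of its model checkpoints. As far as I can tell, the released interface is prompted with points or boxes rather than the free-text "outline the pizza" request described above, and the checkpoint path, image path, and prompt coordinates here are all placeholders.

```python
# Minimal sketch: prompt SAM with a single point and recover a bounding box.
# Assumes the open-source segment_anything package plus a downloaded checkpoint.
import numpy as np
import cv2
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")  # placeholder checkpoint path
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("dorm_room.jpg"), cv2.COLOR_BGR2RGB)  # placeholder image
predictor.set_image(image)

# One foreground point on the object of interest (label 1 means "this point is on the object").
masks, scores, _ = predictor.predict(
    point_coords=np.array([[450, 120]]),
    point_labels=np.array([1]),
)

# Convert the best-scoring mask into the kind of rectangle described above.
best = masks[np.argmax(scores)]
ys, xs = np.nonzero(best)
print("Bounding box around the prompted object:", (xs.min(), ys.min(), xs.max(), ys.max()))
```

By default the predictor returns a few candidate masks with confidence scores, which is why the sketch keeps only the highest-scoring one before turning it into a rectangle.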
Speaker 1: Binance, the world's largest cryptocurrency exchange, continues to be at the center of scrutiny. Most recently, the company has shut down its derivatives business in Australia after the country essentially revoked Binance's financial services license there. This is due to an ongoing probe in which Binance apparently miscategorized some retail investors as wholesale investors, and you might be left saying, well, wait, so they lost their financial services license over what could have been a clerical error? Here's the rub. Australian law provides greater protections for retail investors. For the average person who's investing, the government provides more protection because they don't have the same resources that a corporate investor would have, right? So the government steps in and says the investor deserves some protection. The country's regulators are pointing out that Binance was possibly categorizing investors as wholesale in an effort to bypass that higher level of protection. Australians will still be able to engage with the exchange, so people who have money in crypto and live in Australia will still be able to use Binance for that purpose. However, Binance will not be allowed to operate a derivatives business in Australia, at least for the near term.

Speaker 1: The collapse of FTX continues to have a massive impact beyond the crypto world. You might remember FTX was the second-largest crypto exchange before things went pear-shaped toward the end of last year, and one of the things it did early last year, before everything went totally bonkers, was establish a philanthropic agency called the FTX Foundation. This foundation would issue grants to various applicants for all sorts of stuff. Well, since FTX went belly up last year, that funding source has been cut off. And worse than that, some folks who received grants are now scrambling to try and pay that money back before it becomes legally mandated that they do so. As you might imagine, this has had a huge impact on various research projects as well as independent researchers. This includes students who have since had to drop out of school because the foundation was funding their education and that money is gone. Reuters has a whole piece about this.
Speaker 1: It's a pretty sad story if you want to read more about it. And it really isn't the fault of these researchers and students. It's not the fault of the people who received the grants, and it's not the fault of the people who gave the grants. The group that was put in charge of the FTX Foundation appeared to have been sincere in their efforts to help fund these projects. The problem was that FTX was engaged in shenanigans. And while you would say the people who received the grants aren't at fault and should not be penalized, you also have to think about the customers of FTX who saw their investments mishandled and essentially stripped away from them, and that they deserve to have as much of their money returned to them as possible. Unfortunately, that also means the money that had been directed toward foundation efforts. It is an ugly, ugly situation. Well, before things get uglier, let's take a quick break.

Speaker 1: We're back. The Verge's Jay Peters, who actually wrote a couple of the articles we'll be talking about today, has an article that argues it might just be the absolute worst time for Apple to jump into the mixed reality space, and I'm inclined to agree. You might remember that everyone expects Apple to announce its mixed reality headset. It's a VR slash AR headset, a far cry from the original concept of a pair of eyeglasses capable of handling AR functionality. This is more of your standard, you know, headset with a full display in front of it. It's, yeah, it's rough times. Peters points out that the market for high-end VR headsets isn't exactly taking off like a rocket right now; sales are pretty bad, whether it's Meta's Quest Pro, which originally had the staggering price tag of fifteen hundred dollars (which, to be fair, is only half of what we expect Apple's headset to cost, and Meta has since slashed that price down to a thousand dollars), all the way down to Sony's VR setup for the PS5.
Speaker 1: That demand just hasn't pulled through either; it hasn't been there. There have been modest sales of lower-priced VR headsets, but they have been really modest. On top of that, CNBC reports that research firm Piper Sandler surveyed teens who had VR headsets and found that only four percent of them would use their headsets daily, and fourteen percent would use them once a week. Those numbers are not great. You're talking about people who have already bought a headset, and they're not using it that frequently. That's bad news for a lot of companies right now. And worse than that, both of those numbers are taken from the teens who actually have a VR headset, and they only make up twenty-nine percent of all the teens that were surveyed for this research. So the other way to think about this is that less than a third of all the teens surveyed in this project own a VR headset, and only four percent of that twenty-nine percent use it every day, which works out to roughly one percent of all the teens surveyed. Yikes. I don't find this really surprising personally, but that's due to two really big factors. One is that VR tends to be expensive. There are lower-cost headsets out there, and some of them are pretty good, although you often need a really decent computer to run the software, so even if the headset isn't that expensive, you might need an expensive computer to really make the most of it. And you're still talking about hundreds of dollars even for a cheap headset. And while a teen might save up for a phone (you know, a lot of smartphones cost around a thousand dollars, and that's really expensive), a phone is something you can carry with you wherever you go and use pretty much anytime, whereas a VR headset is only a sometimes technology, to paraphrase Cookie Monster. So it's a really big thing to ask a consumer to hand over several hundred bucks or more for some technology they're only occasionally going to use, despite the efforts of companies like Microsoft and Meta trying to convince us that the future is virtual.
Speaker 1: By the way, I find it really fascinating that you have companies pushing for this vision of the future where we're going to interact with one another in a virtual realm, while those very same companies are requiring employees to come back into the office because they don't want them working remotely. That seems like a mixed message to me, does it not? It's like they're saying, hey, the virtual world, that's gonna be amazing, everyone's gonna want to work there, you're gonna want these devices because they're going to make work better, but you need to come into the office. There's a disconnect there. Anyway, that's one big factor: just the cost of these things. I think that's a big barrier to entry. But the other one is content. Now, there are some truly great VR experiences out there, but there's not an overabundance of them. The library of really good VR experiences is fairly thin, and this actually creates a catch-22 situation. Developers are not eager to jump into creating really good VR experiences because they're not likely to make back their investment, because the installed base for VR is pretty small. But the installed base is small in part because there's a lack of content, so this sort of perpetuates itself. Anyway, long story short, I remain skeptical that VR is going to emerge from being a niche technology and become mainstream tech for the common consumer, at least in the near term. If it does happen, it's going to take a lot more time, and that's probably going to take even more steam out of a lot of metaverse projects that are also connected to this concept of VR and AR. We've already been seeing cutbacks to metaverse divisions as companies look to control costs, so I think the metaverse quest in general is taking a really massive hit right now.
Speaker 1: Not all metaverse projects involve VR or AR. A lot of them do, but not all of them, and I think pretty much all of them have had the rug pulled out from under them.

Speaker 1: Google has updated its developer policy for Android. It will require developers to make it easier for users to delete all of their personal data from the respective apps, companies, and services, should they choose to do so. Before, Google required developers to make it possible to delete an account; this was a prerequisite for having an app on the Google Play Store. So beyond uninstalling the app, which removes the app from your phone but doesn't magically delete your account with that developer, you would be able to actually go and say, hey, take out my account entirely, delete the account, delete all the information you have on me. That was something Google had already demanded that developers supply. However, Google didn't have any rules in place about how developers could implement that policy, which meant a developer could have made it possible, but really bloody annoying, to delete an account. You could make it standard practice that if you wanted to remove the app, sure, you could uninstall it, but if you wanted to delete your account, you had to contact the developer and request a manual deletion, and a lot of people just don't go to that step. Now Google is saying that developers will have to include a way for users to initiate account and data deletion online or from within the app itself. That way, if you find out, say, that the recipe app you've been using has been secretly selling your information to China, you could hit self-destruct on your account and your data while also uninstalling the app, and hopefully that's exactly what would then happen. It's a good step toward consumer protection, but it is unclear how Google plans to actually enforce these policies and make sure that developers actually follow the rules. I mean, you might have a button that says delete my account, and the developer says your account has been deleted, and nothing actually happened. So how does Google make certain that the developer is following through on the policy? Hopefully this will get sorted out and it doesn't just become something that Google can point to and say, but we told them it's against the rules, so it's not our fault. You can tell that I don't have a whole lot of faith in this kind of stuff.
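Google's announcement describes this in policy terms rather than code, but just to make the "initiate account and data deletion online" idea concrete, here is a small, purely hypothetical sketch of the kind of web-reachable deletion endpoint a developer might stand up. The framework choice (Flask), the route name, the table names, and the skipped authentication step are all illustrative assumptions on my part, not anything Google specifies.

```python
# Hypothetical sketch of a web-accessible "delete my account and data" endpoint.
# Framework, route, and table names are illustrative assumptions, not Google requirements.
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "users.db"  # placeholder database with accounts and activity_log tables


@app.route("/account/delete", methods=["POST"])
def delete_account():
    # A real app would authenticate the requester (session cookie, OAuth token, etc.)
    # before honoring the deletion request.
    payload = request.get_json(silent=True) or {}
    user_id = payload.get("user_id")
    if user_id is None:
        return jsonify({"error": "user_id required"}), 400

    con = sqlite3.connect(DB_PATH)
    with con:
        # Remove the account row and every table holding personal data tied to it.
        con.execute("DELETE FROM accounts WHERE user_id = ?", (user_id,))
        con.execute("DELETE FROM activity_log WHERE user_id = ?", (user_id,))
    con.close()
    return jsonify({"status": "account and associated data deleted"})


if __name__ == "__main__":
    app.run()
```

The in-app path would be the same idea: the app calls an endpoint like this one (or a platform equivalent) instead of telling the user to email support.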
Speaker 1: Earlier I mentioned an article by Jay Peters of The Verge and said he was going to come up again, and here it is. Mister Peters has been a busy little bee this week. Yesterday The Verge published a piece written by Peters about his experiences at fashion week, or, to be more specific, Metaverse Fashion Week, held in Decentraland. And if you want a scathing indictment of the metaverse concept in general and Decentraland's history in particular, you have got to watch the video "The Future Is a Dead Mall: Decentraland and the Metaverse" by Folding Ideas. I've recommended the Folding Ideas channel before. He did a really fascinating piece about NFTs, and he's talked about the crypto world. It's funny, because he started off as a channel that did deep analysis of film and television and narratives, and now he's tackling everything, and his work is truly phenomenal. Peters of The Verge also mentions this specific video in his own coverage, and his experience and the video from Folding Ideas both paint a fairly bleak, or at least bland, image of the metaverse. Rather than a virtual world teeming with cool avatars doing sick stunts while Daft Punk plays in the background, it sounds like most of the experience is pretty empty, like there are virtual landmarks and stuff but not that many people populating it, and the avatars you do see can often just be standing idle.
Speaker 1: Maybe they're logged in just for the purposes of being logged in, possibly to accrue some sort of credit in the process, kind of like those time-wasting games where you make a turn and then it says you have to wait five minutes for your next turn. Maybe that's kind of what's happening. Peters says that Metaverse Fashion Week quote "felt kind of like a county fair" end quote with its layout, so you could actually wander from booth to booth and check things out. Peters said most of the spaces weren't very interesting, and the bits that were meant to be interactive frequently didn't work well, if at all, so it could be a frustrating experience. And yeah, you could look at virtual fashion and potentially even purchase some with in-world currency so that your avatar could wear a bespoke virtual fur coat or something, but it didn't come across as a ringing endorsement of the metaverse in general, or at least Decentraland's version of it. Folding Ideas found similar issues when exploring Decentraland in general, including issues with collision detection, which made it all but impossible to, you know, climb a staircase to get to a virtual building's second floor, or first floor if you happen to be in England or something. Peters said the only time he really saw users interacting with one another was at the closing party for Fashion Week, when quote "dozens of avatars grooved on a virtual dance floor as a video of a human DJ played on a big screen" end quote. Based on Peters's article and the Folding Ideas video, which again Peters also references in the article, the current incarnation of the metaverse, or at least Decentraland's version of it, is pretty boring.
Speaker 1: Decentraland, by the way, is one of the groups that has abandoned plans to incorporate VR into its version of the metaverse. So instead of going with a virtual reality approach, which is sort of what Meta has been hinting at, they're looking at going with a third-person avatar approach, similar to what you would see in a massively multiplayer online role-playing game. So maybe it's unfair to say that the metaverse is kind of pointless and boring. Maybe you could argue we're still a year or two out from being able to see stuff that's really compelling beyond narrow use cases. That's entirely possible. But even if that is the case, there is a lot of work to be done to make the metaverse something interesting enough for me to want to check it out beyond, say, checking out a curiosity. Like, if one of my favorite bands were going to do a virtual concert, I might check that out, because I like the band and I might enjoy being able to hear them play a set even though I wouldn't actually be there in person. But I can't imagine going to the metaverse just to go, and until that changes, I think the metaverse is kind of a nonstarter for me. Okay, we've got a few more stories to cover. Before we get to those, let's take one more break.

Speaker 1: We're back. Vietnam has joined the list of countries currently investigating TikTok. In this case, Vietnam's government says the investigation is to ensure that TikTok is complying with laws and regulations, and also that, you know, it's paying its proper taxes. Plus there are some concerns about content, with government official Le Quang Tu Do saying quote "the platform needs to abide by local regulations on both content and tax obligations" end quote.
Speaker 1: He also said that some content on TikTok was quote "toxic, offensive, false, and superstitious" end quote, which, yeah, I mean, that's true, but then there's also a lot of that kind of stuff on pretty much every single platform that allows for user-generated content. But I do get it. TikTok is built in such a way that bad content can go viral awfully fast, and next thing you know, people are watching videos that convince them to try stupid, dangerous stuff, or they're watching videos that contain blatant lies being passed off as fact. And if you trust TikTok more than, you know, a credible source, you're bound to get into trouble. Previously, upon the request of Vietnam's government, TikTok removed one point seven million videos from the platform; this was just toward the end of last year. So we'll see if there's going to be a similar purge in the wake of this new investigation.

Speaker 1: Toyota appears to be ready to dive into electric vehicle development with real fervor after years of resisting it. According to Reuters, the company has long leaned on alternatives to pure electric vehicles, including fuel cell vehicles and hybrids. The general perception in the car world that I have picked up on is that Toyota has really dragged its feet when it comes to electric vehicles and has pushed hard for different approaches. But after a massive change in leadership, including a new CEO, the company now appears to be repositioning to focus seriously on electric vehicles. Whether or not Toyota can make up for lost time remains to be seen. The company is definitely behind its competitors when it comes to creating the infrastructure needed to build electric vehicles rapidly at scale, but better late than never.

Speaker 1: Sometimes international law enforcement activities get badass mission names. Such is the case with a recent operation that seized the Genesis Market, an online market that catered to hackers and data thieves and other ne'er-do-wells. So what was the operation's name?
Speaker 1: Operation Cookie Monster. That's right, I referenced Cookie Monster twice in this news episode. Apparently, the sting operation included two hundred eight property searches and one hundred nineteen arrests. Yowza. Ars Technica reports that while the public-facing web page was taken down, the dark web version is still up. If you were to use Tor to navigate to this marketplace's onion-based dark web site, you could actually still go there. This suggests that the international law enforcement group has not seized all of the assets held by this particular organization, so there's still work to be done. Apparently most of the activity on this hacker market involved the buying and selling of private information. There were around fifty-nine thousand registered users on the site, yikes, and they were using tools that did things like create a simulation of your browser. So if your machine had been one of the ones compromised, someone could potentially be looking at what you're looking at in real time as you use the browser, and be able to do things like log your passwords and things of that nature. Real fun stuff. So yeah, it's a good thing the market got seized, but obviously, with the dark web assets not yet taken down, you know there's still more to be done.

Speaker 1: Finally, are you lonely? And also, are you flush with cash? And are you tired of dating poor people? Well, I have potentially good news for you. Tinder is contemplating a service currently referenced as Tinder Vault, and this would let lonesome folks who have deep pockets find others who are like themselves. According to Yahoo Life, Tinder Vault would cost five hundred dollars a month, though you could pay up front for a year and save yourself some dough; then it would just be five grand for the whole year, a thousand dollars less than paying monthly. In return, you would have your Tinder profile boosted, plus you'd be able to look for other folks who had also joined Tinder Vault.
Speaker 1: So, in other words, you could sort your searches so that you're only looking at other people who also spent five hundred bucks a month to be part of this group. That way, all the snooty rich people can make certain they avoid all the NOKDs (that stands for "not our kind, dear"), and they could avoid the rest of us, you know, the unwashed peasants just trying to find love or a hookup. Personally, I wonder if those boosted profiles will be viewable by the general public, and if people will be able to see that a profile belongs to a Tinder Vault user, because I could definitely see a rise in searches for a brand new sugar daddy or sugar mommy or whatever. Ain't love grand? Or maybe five grand a year.

Speaker 1: Okay, that's it for this episode of TechStuff. Hope you are all well. If you'd like to reach out to me, you can do so by sending me a message on Twitter. The handle for the show is TechStuffHSW, or you can drop me a line by using the little microphone icon on the TechStuff page in the iHeartRadio app. It's free to download and free to use. Just tap that little microphone and you can leave a voice message up to thirty seconds in length, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.