Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? This is the tech news for Thursday, January twentieth, twenty twenty-two. And if you are a Roku user, you may have experienced a loss in service last night, that being Wednesday, January nineteenth, in case you missed when I had just said the date. Apparently something on the back end, in technical terms, done messed up, as several owners of Roku streaming devices and Roku televisions saw their devices freeze or get caught in a reboot loop. People took to Twitter to express their frustration as their respective idiot boxes appeared to wave the white flag. "Idiot boxes" are what some hoity-toity folks in the nineteen fifties used as slang for televisions. Apparently, some users weren't even able to use non-Roku services on their Roku televisions, including stuff like video game consoles, media players like DVD and Blu-ray, set-top boxes, or cable boxes, so the television was just defunct.
Speaker 1: Roku was able to restore services later that night. But it's another reminder that a lot of the technology we seek out today relies heavily on a back end somewhere else, and when something goes wrong in the back end, our tech stops working, which is kind of like what happens to me when my back end starts to act up. But that's probably TMI. Anyway, when we think of this, along with stuff like the recent outage of a large part of Amazon Web Services, we really see how vulnerable some of our technology is to failure, and it's beyond our control. We can't do anything about it. We have to wait for someone else to fix something somewhere else. Now, all that being said, most of the time our technology works just fine, because most of the companies running those back-end systems are spending a lot of resources building out redundancy to minimize the chances of outages. I mean, outages are just bad for business, so this makes sense. Now, outages still occasionally happen, but they haven't been too frequent, and in most cases companies have been able to resolve the issues within a few hours.
Speaker 1: So while I think it's good to reflect on how much of our tech relies on centralized systems that are beyond our control, and it's good to be aware of that, I wouldn't go so far as to say we're headed toward catastrophe because of that model. Also, the specifics of what actually went wrong with Roku, at least as I record this, haven't been disclosed. Roku hasn't said what happened as I record this episode. I'm sure by the time this goes live we'll know more. The International Committee of the Red Cross has been hit with a cyber attack. More specifically, a firm based in Switzerland that provides data storage services to the Red Cross was the actual target. The Red Cross has yet to name what that firm is, which I think is an incredible courtesy to the company. The hackers gained access to personal information of more than half a million people in the process, and that includes people who are in truly dire situations, like people who are separated from their families in the wake of a natural disaster or military conflict.
Speaker 1: In other words, people who have already lost an incredible amount, in some cases everything but their lives, have now had their data compromised through this attack. At the time of recording, I don't have information on who was responsible for the attack, nor what their intentions were in the first place. The Red Cross says that ransomware was not a factor. The Red Cross released a statement saying the organization is committed to contacting affected individuals and informing them of potential risks they face as a consequence of this breach. So, do you remember on Tuesday when we had that breaking news, at the time anyway, that Microsoft was shelling out more than sixty-eight billion dollars in cold hard cash to purchase the video game company Activision Blizzard? If you didn't listen to that episode and you've somehow avoided that news, well, surprise! Anyway, Microsoft has released a press statement that said the acquisition would propel the company into third place in the largest-video-game-companies-by-revenue category, behind Sony and Tencent. And then Bloomberg reported that Sony's stock price fell on Wednesday.
Speaker 1: That represents some twenty billion dollars of market value, which is a big old yowza. Now, I am no financial expert by any stretch of the imagination, but I have a gut feeling that this drop in value is strictly temporary and largely reactionary. I've seen some analysts suggest that Microsoft will turn some franchises, like the popular Call of Duty series, which is one of Activision Blizzard's big titles, into Xbox and PC exclusive titles, or that they will at least contain a lot of content that's exclusive to those platforms and won't be allowed on other platforms, and that this could in turn really hurt Sony because, you know, PlayStation owners wouldn't have access to that content. It would give them an incentive to switch platforms. But then, you know, I hear these arguments, but Sony has long leveraged exclusive content on its consoles, either with exclusive titles, where the only place you can play those titles is on a PlayStation, or with exclusive levels or other content. That's been part and parcel of Sony's strategy for years, and in fact, it's done that way more frequently than Microsoft.
Speaker 1: Not that I think Microsoft wouldn't have loved to do the same thing; they just didn't have the same opportunities in many cases. And so far, at least, Microsoft is sending out the message that they don't intend to limit the platforms upon which games appear, so that seems to be saying, we're not going to suddenly turn off these titles for PlayStation. Of course, that message could just have the unspoken "for now," you know, at the end of it. I don't know. It's also possible, and in fact folks at Microsoft have said as much, that the company really just wants to get a version of Xbox Game Pass on PlayStation, which would be a heck of a thing. But you know, it's a subscription-based service that lets people access different video games for a monthly fee instead of buying titles individually. And of course, titles can disappear from the service after a while, kind of like how some content can appear on Netflix and then suddenly be inaccessible.
Speaker 1: Cough, That Mitchell and Webb Look, cough. Anyway, it would be really interesting to see if Microsoft could get Game Pass onto PlayStation. It would be a way for Microsoft to make money off of a competitor's platform. That would be incredible. But I suspect, at any rate, that we're going to see Sony's market value recover before too long. In fact, this acquisition isn't even scheduled to close until sometime in Microsoft's twenty twenty-three fiscal year. Now, I should point out that that fiscal year actually starts July first of this year, twenty twenty-two. Microsoft's fiscal year goes from July first to June thirtieth each year, so fiscal year twenty twenty-two will end June thirtieth, twenty twenty-two, and fiscal year twenty twenty-three will start July first, twenty twenty-two. Because fiscal years and calendar years frequently don't line up. I mean, it's different for every company, but it is not unusual to see a company have its fiscal year significantly offset from the calendar year.
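A fiscal year that starts July first is labeled by the calendar year in which it ends, which is why July 2022 already counts as fiscal year 2023. The date-to-fiscal-year conversion described above is a one-liner; here's a minimal sketch in Python (the July start month is Microsoft's convention, and the function name is just illustrative, since other companies pick other start months):

```python
from datetime import date

def fiscal_year(d: date, start_month: int = 7) -> int:
    """Return the fiscal year label for date d, where the fiscal year
    begins on the first day of start_month and is named for the
    calendar year in which it ends (Microsoft-style)."""
    return d.year + 1 if d.month >= start_month else d.year

# This episode aired in January 2022, which is still inside FY2022;
# FY2023 begins July 1, 2022.
print(fiscal_year(date(2022, 1, 20)))  # 2022
print(fiscal_year(date(2022, 7, 1)))   # 2023
```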
Speaker 1: One of the things some folks have commented on with these Microsoft acquisitions, not just in gaming but in other fields as well, is that the company is hitching up its britches to wade on into the metaverse competitive landscape, as nebulous and vague as that is. Meanwhile, Ken Kutaragi, who invented the PlayStation, a former Sony executive, has expressed some skepticism about the metaverse concept, saying, "I can't see the point of doing it." And y'all know me. Y'all know I side with Kutaragi on this one. But I also admit that my lack of enthusiasm about the metaverse could totally be because I have blinders on and I'm just not seeing the big picture. I totally confess that could be the case. I don't see it, but that doesn't mean that there's not something to be seen. In fact, there's a lot of stuff out there that gains a lot of traction that just baffles me, and it proves to me that I'm one of those folks who gets more and more out of touch as I get older.
Speaker 1: I'm aware of it, and I cannot stop it, because I see stuff go popular and I think, I don't get it. But this isn't about me. Anyway, Kutaragi likened the metaverse, in which people will, you know, presumably inhabit virtual spaces and be represented by an avatar that stands in for their physical body, to anonymous message board sites. Kutaragi appeared to be saying that, you know, it's just a technological way to pretend to be someone who you are not. And when you put it that way, I can actually totally see how the metaverse could potentially get super popular, because we've already seen social networking sites flourish as people use those sites to create an image of their own lives that isn't necessarily a reflection of reality. You know, it's kind of a presentation of their idealized life, and it's leaving out all the messy stuff and all the mundane stuff.
Speaker 1: You know, so many Instagrammers out there have kind of cultivated this aura of being special and constantly blessed with good fortune, you know, taking photos of themselves in exotic locations with, like, expensive surroundings and, you know, in great lighting, and they look perfect, and they're using filters to make themselves look even more perfect. It's become, you know, a whole trope. And in fact, it's also a concern for folks who worry about young kids, especially girls and young women, who might experience mental health problems as they encounter this presentation of a seemingly perfect life and compare their own lives negatively to that fake reality. So yeah, in that respect, I could see how the metaverse could take off, right? Where everyone says, finally I can present to the world a version of myself that I wish I were, as opposed to this sack of meat and bones that I am. And meanwhile, I'm sitting here thinking, like, I'm okay being a sack of meat and bones. I'll just be a grumpy hermit in the Luddite forest, I suppose, because this is not a selling point for me.
Speaker 1: But yeah, that seems to be what Kutaragi is saying. I could be reading way too much into that. I admit I've got a lot of baggage when it comes to the concept of the metaverse. Okay, we've got lots of other stories to cover, but before we get to those, let's take a quick break.

Speaker 1: So before the break, you know, we were talking about the metaverse, and of course we're not done yet, because that is the topic in early twenty twenty-two. I sure hope by the end of this year people aren't talking about it as much, but I think that might be a hope in vain. But PC Gamer and several other outlets have reported that Meta slash Facebook holds a bunch of patents that may point toward how the company will leverage certain technologies in the metaverse to make truckloads more money, mainly through advertising. So those patents include a lot of stuff about haptic feedback and tracking user movement, so a lot of stuff about how a user will interact with a virtual environment.
Speaker 1: The virtual environment will need to detect what the user is doing and render it so that it makes sense within the virtual environment. And beyond that, the tech is supposed to, you know, kind of help the virtual environment understand where someone is looking. And you can see why that would be important, right? Because it needs to present to the viewer an appropriate image. If I'm, you know, standing around and I'm turning my head left and right, well, the virtual environment better reflect that, or else it's going to be a terrible experience, probably make me sick, and I'll never want to do it again. But some of the tech that is covered by the patents includes eye-tracking stuff, so that the system running the show knows not just where your head is pointed, but exactly what you're looking at. Like, what are you looking at within your field of view, not just, you know, where you're pointed? And there are a lot of legit reasons to want that tech in a virtual environment.
Speaker 1: For one thing, that can really teach developers what people are focusing on, which can lead to improvements behind the scenes. You know, you can look at those analytics and say, all right, well, what's working and what's not working? Maybe developers have created something really cool in that virtual environment, but it's being overlooked. It's not getting any attention. And it's not that the thing they created was bad; it's just that, for whatever reason, the way it's positioned or whatever, it's not getting people's attention. So maybe that means developers should either not dedicate their resources to building out similar stuff in the virtual world, that it would just be a waste of time, or maybe it would let them know that, hey, we need to do something different with this asset, because it could have an impact. It's just not having an impact here. So those are, like, legit reasons to use eye tracking. But of course, another major application of eye-tracking tech is to figure out where you might best serve ads to inhabitants in the metaverse.
Speaker 1: You might say, where are people looking a lot? Let's make sure we put ads where people are looking, because then we can use the data, the eye-tracking data, and go to advertisers and say, look, if you put an ad up in this spot in the metaverse, X number of people are guaranteed to see it every month, and thus we can charge beaucoups of money for you to have an ad placed there. And this kind of makes me think of films like Minority Report. There are little sequences in Minority Report where we see people encounter digital ads all around them as they're just navigating through their day. And I can see this tech being used for stuff like augmented reality headsets, not just virtual reality, which, you know, depending on the application of augmented reality, using eye-tracking tech could be really cool and useful.
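The gaze-analytics idea described above, counting where eyes land in order to decide which spots an ad would actually be seen in, can be sketched in a few lines of Python. Everything here is hypothetical illustration (the sample points, the four-by-four grid, the function name); it's just one simple way to bucket normalized gaze samples into a coarse attention heatmap:

```python
from collections import Counter

# Hypothetical gaze samples: (x, y) points in a normalized 1x1 field of view,
# as an eye tracker might report them over a session.
samples = [(0.12, 0.80), (0.15, 0.78), (0.90, 0.10), (0.14, 0.82), (0.50, 0.50)]

def region(x: float, y: float, grid: int = 4) -> tuple[int, int]:
    """Bucket a normalized gaze point into a (row, col) cell of a grid x grid overlay."""
    col = min(int(x * grid), grid - 1)
    row = min(int(y * grid), grid - 1)
    return (row, col)

# Count gaze hits per cell; the hottest cell is where an ad would be seen most.
heatmap = Counter(region(x, y) for x, y in samples)
hottest = heatmap.most_common(1)[0]
print(hottest)  # ((3, 0), 3): the lower-left cell drew three of five samples
```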
Speaker 1: But I suspect a lot of focus, no pun intended, will be on how to monetize the capabilities. And, you know, companies like Meta already kind of know what you look at anyway, right? So yeah, this definitely sounds creepy, because the idea of them analyzing where you're looking sounds really invasive. But let me let you in on a little secret: based on your browsing habits and how much time you spend on social networks, Meta already knows what you're looking at. Really, they already know what you're interested in. They already know what you're not interested in, based upon your behaviors. Like, that kind of data can be analyzed, and they know how to target you specifically. The browsing data and habits give pretty much all the information needed, so the eye-tracking thing just becomes a more overt way of doing that and provides a more precise way of implementing ads. But let me tell you, the creepy surveillance stuff already exists out there. That is the Facebook of today, and you don't need eye-tracking software or hardware to make that happen.
Speaker 1: Happy Thursday, y'all. On Monday this week, hackers compromised a cryptocurrency trading platform called Crypto.com, and they were able to make off with around thirty-four million dollars' worth of cryptocurrency. That is a heck of a bank heist. The company recently copped to the hack in a blog post. It had halted withdrawals for about fourteen hours for all users and sent out a message that essentially said, hey, y'all, it's totes okay, your money is safe. But for four hundred eighty-three customers, that wasn't really the case, at least temporarily, because it was their accounts that got hit, and the hackers were able to drain some of those accounts. Crypto.com forced all users to reset their two-factor authentication methods before they could access their accounts. I've talked about two-factor authentication lots of times in the past. For those who are not really familiar with what that means, it means you need two different ways of authenticating who you are in order to access something, and it falls into different buckets.
Speaker 1: And typically we think of those buckets as being something you possess, like, say, your phone; something you know, like a password; and something you are, like biometric data that's related to some unique physical feature that you have. And you need to provide one factor from each of at least two of those categories to be able to access whatever it is. So commonly this is something where it's, like, your password, and then you get a text message on your phone with, like, a six-digit key, and it's only when you provide both of those pieces of information that you can gain access to your account. Two-factor authentication, or multi-factor authentication, is very smart. You want to use it. If it's available, you want it turned on. I know it can be a hassle, but it is superior to just relying on a single factor. However, as interesting as it is that, you know, Crypto.com told all its users you need to reset your two-factor authentication, it appears that the hackers were able to access those four hundred eighty-three accounts without using two-factor authentication in the first place.
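Those six-digit keys from an authenticator app are typically time-based one-time passwords (TOTP, standardized in RFC 6238): the code is derived from a shared secret plus the current thirty-second time window, so a stolen password alone isn't enough. A minimal sketch in Python using only the standard library (the secret shown is the RFC's published test key, not anything real):

```python
import base64
import hmac
import struct

def totp(secret_b32: str, at: float, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238) over HMAC-SHA1.

    secret_b32: shared secret, base32-encoded (as authenticator apps store it).
    at: Unix time to generate the code for.
    """
    key = base64.b32decode(secret_b32.upper())
    counter = int(at // step)                       # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test vector: the ASCII secret "12345678901234567890",
# base32-encoded, at T=59 seconds yields the 8-digit code 94287082.
rfc_secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(rfc_secret, at=59, digits=8))  # 94287082
```

Both sides compute the same code independently, which is why the server can check it without the code ever being stored, and why the code goes stale after the time step elapses.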
Speaker 1: So I don't know all the details about the hack, but if that's the case, if the hackers were getting access to stuff and they weren't having to use two-factor authentication, that's kind of like saying, hey, you know, I know your house was robbed the other night, so we've replaced all the locks on your front door, and it turns out that the thieves came in through the window. Like, okay, well, I'm glad the door is more secure, but if that's not how the information, or in this case the money, was accessed, then that's not really a solution. However, to the company's credit, it did cover the losses experienced by each of those accounts. It restored the money, in whatever cryptocurrency form it was in, to all four hundred eighty-three accounts, presumably at Crypto.com's own expense, and the hackers are reportedly in the middle of using various crypto apps to launder the money they've stolen. What a world. Okay, we're going to take another quick break. When we come back, we're going to have a buttload of Google news, as well as some other stuff.
Speaker 1: Okay, time to transition into the Google-heavy portion of the news for today. First up, Alphabet CEO Sundar Pichai and Apple CEO Tim Cook have reportedly been making the rounds with lawmakers in the US, arranging private phone calls to talk with senators about stuff. And you might wonder why. Well, it's because of a piece of proposed legislation called the American Innovation and Choice Online Act that could potentially really shake things up for some big tech companies like Apple and Google and Amazon. Part of that act focuses on the practice of a company promoting its own products over competitors' in the company's various marketplaces. So let's stick with Google. This would mean that Google would not be allowed to give its own products and services a promotional advantage in Google ads on Google platforms like Search. Apple wouldn't be able to do the same thing with Apple products and services on Apple platforms. And of course, Amazon has been dinged for this multiple times, with the company being accused of promoting its own products over those of competitors on the Amazon marketplace.
Speaker 1: So if this legislation were to pass, that would be a huge threat to the current business models that these big tech companies rely upon. It's one of the big ways these companies became so gargantuan in the first place, and competitors say the practice is, you know, completely unfair: even if a competitor were to offer a superior product at a superior price, they are at a promotional disadvantage. Whether the tech CEOs are making headway with the senators, that's hard to say. I mean, money talks, especially in politics, so it's entirely possible that these CEOs are persuading enough lawmakers to oppose or amend the legislation to the point that it doesn't really do anything. And of course, again, this is proposed legislation. It's not like it's been voted on yet. But there's definitely a growing resentment toward big tech from multiple sectors, including the common populace, and it has been growing for some time now. So there might be some pressure from constituents for some of these senators to stick to their guns and pass the legislation. We'll have to wait and see if that comes to pass.
Speaker 1: In other Google news, the company is reportedly establishing a new business unit, this one focused on blockchain and related distributed computing and data storage technologies. This is according to Bloomberg, which reported that it had received a leaked email from within the company revealing that Google has established this blockchain business unit. Now, does this mean that Google is going to launch its own cryptocurrency in the future, or that maybe Android users or Chrome users are going to be flooded with ads for Google NFTs? I don't know. I hope not, but I can't say for certain. But it is at least an early indicator that Google is experimenting with more technologies associated with stuff like the concept of Web3 and the metaverse. Google has already, obviously, been working with virtual reality and augmented reality applications. I mean, Google Glass was a very early commercial AR product. It had a very limited commercial run, but it was something like that.
Speaker 1: So Google certainly has been dipping its toe in various disciplines that are converging toward this metaverse concept, and I'm thinking this might just be a way for Google to stay in the game. I don't see Google as a company that's going to lead the way in that space, but I also see it as a company that doesn't want to find itself completely left out. Google has a dominant position, some might call it a monopolistic position, when it comes to certain things like search and ad revenue, or ad services, and I'm sure the company is thinking, well, we want to future-proof that. We want to make sure that the next stage of the web, whatever it may be, doesn't cut us out of it. So that's what I see this as. I don't see it necessarily as an indication that we're going to see a Google version of crypto anytime soon. Although that could happen, I'm just not seeing it right now.
Speaker 1: And for a long time, Google has offered a service in its G Suite product that gives you the chance to create a custom domain built on top of Google, and that gives you a lot of different options, including a Gmail address that isn't a Gmail address. I mean, it is Gmail, but the domain can be a custom one. Like, I could have a Gmail address with the domain jonathanstricklandisawesome dot com instead of gmail dot com. Not that I have an account with that domain, but I could if I wanted to, and if it were available. Anyway, from the moment Google offered that service up through 2012, there was a free basic version of the product. You could enroll, you could get that custom domain, and you didn't have to spend any money. It was the bare-bones version, but it existed, and if you ponied up the cash, then you could get access to other extra features. Yeah, that basic service was free of charge.
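For context on how custom-domain email like that works under the hood: pointing a domain's mail at Google means publishing MX records in the domain's DNS zone, so other mail servers know where to deliver. Here's a rough sketch of such a zone-file fragment; the host names below are the ones Google has long documented for Workspace mail, but treat the exact values, priorities, and TTLs as assumptions to verify against current Google documentation (and the domain itself is, of course, hypothetical):

```dns
; Hypothetical zone-file fragment routing mail for a custom domain
; to Google's servers. Lower preference number = tried first.
jonathanstricklandisawesome.com.  3600  IN  MX  1   aspmx.l.google.com.
jonathanstricklandisawesome.com.  3600  IN  MX  5   alt1.aspmx.l.google.com.
jonathanstricklandisawesome.com.  3600  IN  MX  5   alt2.aspmx.l.google.com.
jonathanstricklandisawesome.com.  3600  IN  MX  10  alt3.aspmx.l.google.com.
jonathanstricklandisawesome.com.  3600  IN  MX  10  alt4.aspmx.l.google.com.
```

With records like these in place, mail sent to any address at the custom domain lands in Google's infrastructure and shows up in a plain Gmail inbox.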
Speaker 1: But then in 2012, Google stopped offering it for free, though it also allowed those who had established a free account to keep using it. Now, a decade later, the company has made a decision about each of those users who still has an established free account, and has had one for years. Now Google is going to say: you must pay tribute, or we will burn your domain to the ground and sow salt into the earth. Okay, not that exactly, but that's the gist of what they're saying. So for the folks who did get in early, well, those free days are coming to an end, and they will have to either pay or see their accounts suspended starting on May 1. Anyone with a free account who has billing information attached to their Google account will then be quote unquote upgraded to the paid service automatically, which is fun times, and starting in July, those accounts will start to get billed every month for continued use of the service.
Speaker 1: If an account does not have billing information attached to it, and if the user hasn't updated their account to have billing info attached by July 1, their account will be suspended. This is a pretty tough pill to swallow, to have a free service suddenly switch up on you like that. That's never fun.

Speaker 1: On a slightly more fun note, Google has launched a beta program in South Korea, Taiwan, and Hong Kong, and people in those regions can get beta access to a Google Play Games app on Windows 11. Google had previously committed to bringing Android games to Windows 11 back in December, and Windows 11 includes an Android app layer that's supposed to make this seamless. There have been some ways to use Google apps on Windows platforms in the past, but this is supposed to make it really work. At first, I was like, this doesn't interest me. And then I thought, hmm, if I have some Windows 11 machines, that means I could play Marvel Puzzle Quest on a PC and have my progress synced across all devices.
Speaker 1: And I started to come around to it, because that's actually the only Android game I currently have installed on my phone. And I know people will tell me that's not even a good one. I don't care. I play it pretty much every day. Google has said it will expand this to other regions throughout 2022, but it has yet to release a timeline for that rollout.

Speaker 1: Amazon is continuing to build on its presence in the brick-and-mortar world. The company had already opened up a few of its famous grocery stores, as well as purchased the Whole Foods chain, and it had a couple of brick-and-mortar bookstores. Now Amazon has announced that it will open a clothing store in Glendale, California, later this year called Amazon Style, and the store will cover clothing from a number of different designers at different price points, everything from budget to high-end designers. The company says that the stores will incorporate technology on a pretty fundamental level, and customers will interact with that technology using their smartphones.
Speaker 1: So Amazon says it will reserve floor space to display items, so there won't be any racks of clothes there. Instead, you'll see an example of an outfit and, presumably, a way to scan that outfit. So you use your phone, you scan the outfit, and then you can use the app to select, you know, the size you want, the color you want, the style you want of whatever it is, and have it sent to a fitting room or even to a pickup counter. Anyway, then you could go to the fitting room and try on the items. And in the fitting rooms they're gonna have touch screens. So let's say you get in there and you try something on. Like, I don't know, you want to get those pleather chaps you see on the show floor, and you have some sent to the fitting room, and you go back there and you say, yeah, these are just a little too loose, they don't quite give me the fit I was hoping for. And you could use the touch screen to request a smaller size of pleather chaps to be sent to the fitting room, so you can really get the fit that you wanted.
Speaker 1: You might wonder why I picked pleather chaps. So back in, oh, I don't know, more than twenty years ago, I was in a play where I had to wear pleather chaps. It's just one of those things that pops up in your head every now and then. Anyway, Amazon also plans to use the palm-scanning technology from its other stores to link purchases to specific customer accounts. So you could, you know, pick out the stuff that you want, go to the pickup counter, scan your palm, and you would automatically have the price of your purchases deducted from your account. Now, as someone who hates to shop for clothing at all, none of this appeals to me, but I could easily imagine it being exciting to folks who aren't, you know, grumpy chaps like myself.

Speaker 1: Twitch, another Amazon property, published an open letter from the company's Vice President of Global Trust and Safety, Angela Hession, to lay out how the platform has recently been tackling issues of abuse on Twitch. One of the measures was that Twitch deleted millions of automated bots in order to curtail what are called hate raids.
Speaker 1: Now, if you're not familiar with Twitch, you might not be aware of raids in general. Twitch is a live streaming platform mainly geared toward video games, though there are other types of content on Twitch as well, and raids don't necessarily have to be negative. In fact, a raid can be really uplifting and awesome. Essentially, a raid is when someone directs their audience to another streamer. So you might have popular streamers who have thousands and thousands of people watching at a time, and sometimes they like to do this to give a kind of crazy short-term boost to lesser-known, or sometimes outright obscure, streamers on the platform. So you could have someone who's really influential on Twitch come across a person that they like, but that person has very few viewers of their own, and then the popular person will say, hey, let's go show them some love, and open up a link to that other person's stream and send floods of enthusiastic fans over to do just that. Well, a hate raid is obviously a much nastier beast, as the name implies.
Speaker 1: This is when a streamer gets hit with a flood of abusive accounts hurling awful trash in chat and generally just being terrible people. And of course, the people who are disproportionately the targets of these kinds of attacks tend to be people of color, or people who are in the LGBTQ+ community, or women. Those tend to be the most frequent targets of this kind of abuse. So Twitch has deleted millions of bots that were essentially programmed to just spread misery and abuse, which I think is definitely a step in the right direction. Now, it's upsetting that anyone would go to the trouble to set a bot loose with this kind of intent in the first place, but those are much deeper social problems that I don't expect a company like Twitch to solve, because it can't. But seeing Twitch try to mitigate the, you know, manifestation of that problem is a good thing. Not that the problem is going to go away or even necessarily get better. People who are determined to be hateful and nasty and harmful will always find ways to do that.
Speaker 1: Now, hopefully Twitch can make that hard enough and frustrating enough that most evil trolls out there won't stick it out to see it through. I mean, come on, there are better things for you to do with your time than try to torture someone else.

Speaker 1: And finally, in news that, for me, falls into the I'll-believe-it-when-I-see-it category, the young company Space Entertainment Enterprise, or S.E.E., has announced plans to build and operate a space station module that will include a content studio as well as a sports arena by December of 2024. Now, S.E.E. is the company that's working with Tom Cruise on an upcoming space movie with scenes that will actually be shot in space, because apparently that will add authenticity to something, and apparently Tom Cruise won't be satisfied with, you know, the crazy stuff he's done in order to make a movie in the past. He has to top it every time, so instead of hanging outside an airplane, he has to go into space. Anyway, this is all going to be part of Axiom's space station modules.
Speaker 1: Those, in turn, will be connected to the International Space Station on a temporary basis. Axiom secured permission from NASA back in 2020 to build out modules for the ISS that will have a private commercial purpose, and the eventual plan is that the Axiom modules will detach from the International Space Station, and then Axiom will have its own independent space station down the line, when it's time to sunset the old International Space Station, which is well past its expiration date, its original expiration date anyway, for some of the modules. But, you know, it's been in operation for quite some time and is expected to continue for the next few years at least. Now, maybe I'm being way too pessimistic here. It's entirely possible that we will see this module dedicated to content creation and, uh, space sports by the end of 2024. I don't know what kind of space sports. I do know they will definitely be the kind I'll never be able to do, for lots of reasons, among which are my lack of athletic ability and my suspicion that going up there would cost more than I will ever make in my lifetime.
Speaker 1: More fun times. But it is kind of cool. And you know what? It's Axiom that's actually building out this module, which I think is going to be called SEE-1. And because Axiom is already in the process of building out modules, and these are inflatable modules as well, they use air to hold a firm shape in space, it's actually maybe not that far-fetched to think that it will be ready by 2024. It just seems like an incredibly aggressive plan. But then, these plans could have been in the works for ages and we're just now being let in on it. Whenever anything involves space, I automatically think any deadline you hear is more of a suggestion. Other evidence to point to: the James Webb Space Telescope, which was supposed to launch, like, more than twenty years ago, I think, and finally did. It's always good to put an asterisk next to those dates.

Speaker 1: That's it for the news for Thursday, January 20, 2022. If you have suggestions for topics I should cover on tech Stuff, feel free to reach out to me on Twitter.
Speaker 1: The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an I Heart Radio production. For more podcasts from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.