1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,800 --> 00:00:14,360 Speaker 1: Hey there, and welcome to TechStuff. I'm your host 3 00:00:14,440 --> 00:00:17,799 Speaker 1: Jonathan Strickland, an executive producer with iHeartRadio, and 4 00:00:17,840 --> 00:00:20,239 Speaker 1: how the tech are you. It's time for the tech 5 00:00:20,280 --> 00:00:24,919 Speaker 1: news for Tuesday, October eleventh, two thousand twenty two. Let's 6 00:00:24,920 --> 00:00:29,600 Speaker 1: get to it. Today, Meta will hold its Connect conference, 7 00:00:30,080 --> 00:00:32,800 Speaker 1: and folks are expecting that Mark Zuckerberg will give an 8 00:00:32,880 --> 00:00:36,560 Speaker 1: update as to the whole journey-to-the-metaverse thing, 9 00:00:37,159 --> 00:00:40,479 Speaker 1: and part of that will be the unveiling of the high 10 00:00:40,760 --> 00:00:45,000 Speaker 1: end VR gear that has been called Project Cambria 11 00:00:45,280 --> 00:00:48,800 Speaker 1: internally. Anyway, we expect it will probably be called something 12 00:00:48,800 --> 00:00:51,879 Speaker 1: else when it's released. The going bet now is that it's 13 00:00:51,920 --> 00:00:53,720 Speaker 1: going to be the Quest Pro. By the time you 14 00:00:53,720 --> 00:00:57,200 Speaker 1: hear this, you might very well know that already, because 15 00:00:57,200 --> 00:00:59,720 Speaker 1: I'm recording this before the conference happens. I'm not sure 16 00:00:59,760 --> 00:01:03,800 Speaker 1: what features this new VR device is going to have, 17 00:01:03,960 --> 00:01:06,200 Speaker 1: but the guess is that it's going to have higher 18 00:01:06,240 --> 00:01:10,840 Speaker 1: resolution experiences.
It's gonna incorporate eye tracking that can go 19 00:01:10,959 --> 00:01:15,480 Speaker 1: from interesting to creepy real quick, and it might have 20 00:01:15,520 --> 00:01:19,840 Speaker 1: some other mixed reality features built into it, as in, like, hey, 21 00:01:19,880 --> 00:01:22,840 Speaker 1: you've got augmented reality in my virtual reality. Hey, you 22 00:01:22,920 --> 00:01:27,440 Speaker 1: got virtual reality in my augmented reality. That kind 23 00:01:27,440 --> 00:01:31,440 Speaker 1: of thing. Anyway, we don't know the details, we don't 24 00:01:31,440 --> 00:01:34,120 Speaker 1: know how expensive this thing is going to be. But 25 00:01:34,240 --> 00:01:37,600 Speaker 1: since Meta not too long ago hiked the price of 26 00:01:37,640 --> 00:01:41,240 Speaker 1: the two Quest 2 models, there's a base model and 27 00:01:41,240 --> 00:01:45,120 Speaker 1: a slightly better model, those prices went up by a 28 00:01:45,200 --> 00:01:50,360 Speaker 1: hundred dollars each. They currently cost three ninety nine and four ninety nine, respectively. 29 00:01:51,040 --> 00:01:55,320 Speaker 1: And my guess is that Project Cambria will be fairly expensive. 30 00:01:55,680 --> 00:01:58,920 Speaker 1: Considering that Meta has had a really bad year this year, 31 00:01:59,280 --> 00:02:01,320 Speaker 1: I can't imagine that they're going to try and 32 00:02:01,360 --> 00:02:04,160 Speaker 1: sell this at a massive loss. The company has been 33 00:02:04,200 --> 00:02:08,560 Speaker 1: dealing with scandals, it's seen a decline in its user base. 34 00:02:09,280 --> 00:02:12,639 Speaker 1: It's been trying, and largely failing, to fight off TikTok's 35 00:02:12,680 --> 00:02:17,160 Speaker 1: appeal to younger users. It's been reeling from Apple's move 36 00:02:17,240 --> 00:02:21,520 Speaker 1: to give iOS users the ability to cut off app tracking, 37 00:02:21,560 --> 00:02:25,200 Speaker 1: which has really hurt their advertising model.
It's seen 38 00:02:25,240 --> 00:02:28,040 Speaker 1: its stock price go down over the year by more 39 00:02:28,080 --> 00:02:31,040 Speaker 1: than sixty percent, like more than a hundred dollars per share. 40 00:02:31,680 --> 00:02:33,280 Speaker 1: And I'm sure there are a lot of folks over 41 00:02:33,320 --> 00:02:36,919 Speaker 1: at Meta who feel the company really needs a win, 42 00:02:37,440 --> 00:02:40,840 Speaker 1: so maybe that's what we'll get later today. The conference 43 00:02:40,880 --> 00:02:43,960 Speaker 1: begins at one pm Eastern. As I'm recording this, it's 44 00:02:43,960 --> 00:02:46,040 Speaker 1: about an hour away, so I'm sure I'll have a 45 00:02:46,080 --> 00:02:49,240 Speaker 1: lot more to talk about on Thursday's episode. The New 46 00:02:49,280 --> 00:02:55,120 Speaker 1: York Times published an article titled quote, Skepticism, Confusion, Frustration: 47 00:02:55,560 --> 00:02:59,720 Speaker 1: Inside Mark Zuckerberg's Metaverse Struggles end quote, and this piece 48 00:03:00,000 --> 00:03:03,919 Speaker 1: shed some light on how folks inside the company are 49 00:03:04,080 --> 00:03:08,240 Speaker 1: frustrated and confused by Zuckerberg's push toward all things metaverse. 50 00:03:08,919 --> 00:03:12,120 Speaker 1: The Times quotes several folks, many of them unnamed because 51 00:03:12,120 --> 00:03:13,839 Speaker 1: they didn't want to put their jobs at risk, which 52 00:03:13,880 --> 00:03:18,440 Speaker 1: is totally understandable. One investor that the Times quoted was 53 00:03:18,480 --> 00:03:22,120 Speaker 1: a guy named Matthew Ball, who I think rightfully pointed 54 00:03:22,160 --> 00:03:25,480 Speaker 1: out that while Zuckerberg is focusing on the metaverse, the 55 00:03:25,520 --> 00:03:29,399 Speaker 1: problems that Meta is facing right now are not necessarily 56 00:03:29,480 --> 00:03:32,840 Speaker 1: directly metaverse related.
Apart from the fact that the company 57 00:03:32,880 --> 00:03:36,400 Speaker 1: is investing billions into something that's still many years away, 58 00:03:36,920 --> 00:03:40,720 Speaker 1: I think Ball's point is that Zuckerberg is failing 59 00:03:40,760 --> 00:03:44,480 Speaker 1: to address the real world problems that Meta has encountered 60 00:03:44,480 --> 00:03:47,480 Speaker 1: this year. Which means not only is Zuckerberg looking at 61 00:03:47,560 --> 00:03:52,440 Speaker 1: something that doesn't fix the issues that the company currently faces, 62 00:03:53,200 --> 00:03:56,960 Speaker 1: those issues are also getting worse because the attention is 63 00:03:56,960 --> 00:04:00,360 Speaker 1: on this metaverse stuff. The piece also reports that several 64 00:04:00,360 --> 00:04:03,400 Speaker 1: employees at Meta are frustrated with Zuckerberg changing his mind 65 00:04:03,440 --> 00:04:08,280 Speaker 1: about projects and rapidly demanding changes. They say that the 66 00:04:08,320 --> 00:04:12,760 Speaker 1: most important project is MMH, which is Make Mark Happy. Obviously, 67 00:04:12,800 --> 00:04:18,080 Speaker 1: this whole approach disrupts processes and makes it really hard for them 68 00:04:18,080 --> 00:04:20,640 Speaker 1: to make a lot of progress. The piece also mentions 69 00:04:20,640 --> 00:04:23,360 Speaker 1: that Zuckerberg has been urging Meta teams to hold virtual 70 00:04:23,440 --> 00:04:27,000 Speaker 1: meetings within the company's Horizon Workrooms app, and that was 71 00:04:27,040 --> 00:04:30,240 Speaker 1: a huge inconvenience for a lot of employees because they 72 00:04:30,279 --> 00:04:33,720 Speaker 1: had not purchased a VR headset or had one set up, 73 00:04:34,240 --> 00:04:36,920 Speaker 1: so it sent lots of folks scrambling in order to 74 00:04:36,960 --> 00:04:40,359 Speaker 1: catch up. Also, by the way, just on a side note, yuck.
75 00:04:40,640 --> 00:04:43,120 Speaker 1: I mean, if my boss told me that I 76 00:04:43,160 --> 00:04:45,800 Speaker 1: needed to buy a four hundred or five hundred dollar piece 77 00:04:45,800 --> 00:04:49,240 Speaker 1: of equipment just so that I could attend work meetings, 78 00:04:50,040 --> 00:04:53,240 Speaker 1: I'd be pretty angry about that. Hopefully those employees will 79 00:04:53,279 --> 00:04:55,919 Speaker 1: get reimbursed for having to purchase VR equipment just to 80 00:04:55,960 --> 00:04:58,920 Speaker 1: attend meetings or something. The Times piece also says that 81 00:04:59,640 --> 00:05:03,920 Speaker 1: an internal poll of one thousand Meta employees revealed 82 00:05:03,920 --> 00:05:07,040 Speaker 1: that less than sixty percent of them felt they really understood 83 00:05:07,040 --> 00:05:10,479 Speaker 1: the company's metaverse strategy. So still more than half say 84 00:05:10,480 --> 00:05:15,200 Speaker 1: they understand it, but like more than forty percent say, I don't 85 00:05:15,360 --> 00:05:19,520 Speaker 1: really get it. Now, at the very least, I would think 86 00:05:19,520 --> 00:05:22,520 Speaker 1: that reveals a problem with internal communications. It's very hard 87 00:05:22,560 --> 00:05:25,360 Speaker 1: to be a trailblazer if the folks on your team 88 00:05:25,400 --> 00:05:29,080 Speaker 1: don't understand where they're trying to go. Again, maybe 89 00:05:29,160 --> 00:05:31,880 Speaker 1: the conference later today will indicate that the company has 90 00:05:31,880 --> 00:05:34,359 Speaker 1: addressed some of these issues. We'll have to wait and see, 91 00:05:35,080 --> 00:05:37,120 Speaker 1: or you won't. You probably have already heard about it. 92 00:05:38,320 --> 00:05:42,200 Speaker 1: Let's pop on over to YouTube next. The platform is 93 00:05:42,240 --> 00:05:44,920 Speaker 1: making yet another change when it comes to users and 94 00:05:45,080 --> 00:05:49,000 Speaker 1: their accounts.
YouTube sent out emails to YouTube account holders 95 00:05:49,040 --> 00:05:52,320 Speaker 1: and released an announcement that says the platform will soon 96 00:05:52,360 --> 00:05:57,040 Speaker 1: switch to at-name style handles, much like other social 97 00:05:57,040 --> 00:06:01,520 Speaker 1: platforms such as Twitter. So users who have a channel 98 00:06:01,560 --> 00:06:04,920 Speaker 1: will be able to secure a unique handle that applies 99 00:06:04,960 --> 00:06:08,200 Speaker 1: across the platform. So this handle feature will allow users 100 00:06:08,240 --> 00:06:11,560 Speaker 1: to mention others in stuff like video titles and comments. 101 00:06:12,120 --> 00:06:15,120 Speaker 1: So let's say you decided to do a YouTube video 102 00:06:15,160 --> 00:06:18,919 Speaker 1: about how awesome I am. You could tag me in 103 00:06:18,960 --> 00:06:22,360 Speaker 1: the title of your video, and I would be aware 104 00:06:22,400 --> 00:06:25,200 Speaker 1: that it existed, and because you're appealing to my ego, 105 00:06:25,839 --> 00:06:29,000 Speaker 1: I would likely promote that video because I want people 106 00:06:29,000 --> 00:06:31,440 Speaker 1: to know how awesome I am. Now, I'm not saying 107 00:06:31,440 --> 00:06:33,600 Speaker 1: you should do this, by the way. Specifically, don't do 108 00:06:33,640 --> 00:06:36,960 Speaker 1: this with me, because my social reach is pathetically small 109 00:06:37,360 --> 00:06:39,839 Speaker 1: because I'm almost never on social these days, so it 110 00:06:39,880 --> 00:06:42,200 Speaker 1: won't do you any good. But if you try it 111 00:06:42,240 --> 00:06:45,160 Speaker 1: with folks who have a large, active following, I don't know. 112 00:06:45,440 --> 00:06:47,640 Speaker 1: Maybe it could work. Maybe you just irritate them. You 113 00:06:47,720 --> 00:06:51,640 Speaker 1: just don't know.
Anyway, the at feature can be useful 114 00:06:51,680 --> 00:06:55,040 Speaker 1: for community engagement, though it can also be irritating as 115 00:06:55,080 --> 00:06:57,839 Speaker 1: heck if it's misused, but that is the way of things. 116 00:06:58,440 --> 00:07:01,599 Speaker 1: I find this change interesting because back when Google 117 00:07:01,640 --> 00:07:05,359 Speaker 1: Plus was still a thing, Google was really pushing to 118 00:07:05,440 --> 00:07:10,840 Speaker 1: require everyone to use their real legal name across Google platforms, 119 00:07:11,240 --> 00:07:15,600 Speaker 1: and that included Google Plus and YouTube. So the motivation 120 00:07:15,760 --> 00:07:19,400 Speaker 1: for this push came from a relatively good place, like 121 00:07:19,480 --> 00:07:21,120 Speaker 1: it wasn't just that they wanted to force you to 122 00:07:21,240 --> 00:07:24,080 Speaker 1: use your real name. The idea was that if you 123 00:07:24,120 --> 00:07:28,600 Speaker 1: hold people accountable by making them use their real names, 124 00:07:29,160 --> 00:07:31,600 Speaker 1: then they're going to be more careful about the stuff 125 00:07:31,640 --> 00:07:35,080 Speaker 1: they say online, and you would see a big decrease 126 00:07:35,080 --> 00:07:39,000 Speaker 1: in stuff like abuse and trolling. But it was very 127 00:07:39,000 --> 00:07:41,760 Speaker 1: clear that the company had not considered a lot of 128 00:07:41,840 --> 00:07:44,520 Speaker 1: other issues that come along with forcing people to use 129 00:07:44,520 --> 00:07:49,720 Speaker 1: their real names. Here's a relatively benign example.
There are 130 00:07:49,800 --> 00:07:53,960 Speaker 1: a lot of popular YouTube personalities who are known by 131 00:07:54,080 --> 00:07:57,280 Speaker 1: a sort of stage name or a handle, and if 132 00:07:57,320 --> 00:08:00,560 Speaker 1: you forced them to change to their real name, that 133 00:08:00,640 --> 00:08:03,400 Speaker 1: kind of messes with branding. It also poses a 134 00:08:03,440 --> 00:08:07,880 Speaker 1: potential privacy and security risk. Anyway, YouTube, throughout its history, 135 00:08:07,960 --> 00:08:10,600 Speaker 1: has really followed a rocky course when it comes to 136 00:08:11,160 --> 00:08:15,119 Speaker 1: how we identify users and channels on the site. They've 137 00:08:15,120 --> 00:08:18,120 Speaker 1: gone back and forth between being able to use handles 138 00:08:18,120 --> 00:08:21,600 Speaker 1: and being required to use real names. And these new 139 00:08:21,680 --> 00:08:25,040 Speaker 1: handles aren't actually going to rename anything. Instead, in theory, 140 00:08:25,080 --> 00:08:27,440 Speaker 1: it will make it easier to interact with other users 141 00:08:27,440 --> 00:08:31,880 Speaker 1: on the platform because you can tag them. Securing handles 142 00:08:31,880 --> 00:08:34,120 Speaker 1: looks like it's going to be a first come, first 143 00:08:34,160 --> 00:08:37,800 Speaker 1: served approach once it opens up. But YouTube is also 144 00:08:37,840 --> 00:08:41,080 Speaker 1: going to reach out to some channels in advance to 145 00:08:41,160 --> 00:08:44,680 Speaker 1: give them the opportunity to secure their handle before the 146 00:08:44,679 --> 00:08:46,720 Speaker 1: general public can rush in and grab them up, which 147 00:08:46,760 --> 00:08:50,400 Speaker 1: makes sense. You don't want someone rushing in to 148 00:08:50,480 --> 00:08:54,080 Speaker 1: grab Jacksepticeye before Seán gets a chance to 149 00:08:54,120 --> 00:08:56,920 Speaker 1: do it.
I call him Seán even though I've never 150 00:08:56,920 --> 00:08:59,079 Speaker 1: talked to him and he doesn't know who I am. 151 00:08:59,120 --> 00:09:02,440 Speaker 1: But anyway, that's an example, right? You wouldn't want someone 152 00:09:02,480 --> 00:09:07,320 Speaker 1: to do the equivalent of website squatting or 153 00:09:07,400 --> 00:09:11,400 Speaker 1: URL squatting on these handles. So those channels are 154 00:09:11,400 --> 00:09:13,520 Speaker 1: going to get the first chance to do it. I 155 00:09:13,800 --> 00:09:15,719 Speaker 1: don't know what the criteria are going to be for that. 156 00:09:15,920 --> 00:09:18,120 Speaker 1: I assume it's going to be like the check-marked 157 00:09:18,120 --> 00:09:21,920 Speaker 1: folks first and then maybe opened up to others after that. 158 00:09:23,000 --> 00:09:26,920 Speaker 1: We'll have to see. This past weekend was TwitchCon, 159 00:09:27,000 --> 00:09:30,319 Speaker 1: which was an in-person convention relating to Twitch streamers 160 00:09:30,360 --> 00:09:32,760 Speaker 1: and their fan bases. I've got a couple of different 161 00:09:32,760 --> 00:09:35,240 Speaker 1: things to say about this. One is that Twitch has 162 00:09:35,280 --> 00:09:38,840 Speaker 1: been in the hot seat of criticism from creators recently, 163 00:09:39,360 --> 00:09:42,320 Speaker 1: primarily because of the company's revenue strategy. There have been 164 00:09:42,360 --> 00:09:45,360 Speaker 1: a lot of times where streamers have criticized Twitch for 165 00:09:45,480 --> 00:09:51,560 Speaker 1: lots of different things. For example, for somewhat vague 166 00:09:51,679 --> 00:09:56,800 Speaker 1: and uneven application of content standards, where some people say 167 00:09:57,280 --> 00:10:02,360 Speaker 1: it appears that certain folks get a pass for publishing 168 00:10:02,440 --> 00:10:07,200 Speaker 1: content that other folks get reprimanded for. That's one example.
169 00:10:07,200 --> 00:10:10,120 Speaker 1: But the one that has been in the focus 170 00:10:10,120 --> 00:10:15,000 Speaker 1: more recently has to do with revenue split. So until 171 00:10:15,320 --> 00:10:18,000 Speaker 1: very recently, Twitch had a fifty-fifty revenue split for almost all 172 00:10:18,040 --> 00:10:23,400 Speaker 1: Twitch streamers. That means that streamers would get 173 00:10:23,400 --> 00:10:27,560 Speaker 1: fifty percent of revenue coming from subscriptions, and Twitch would get 174 00:10:27,600 --> 00:10:30,240 Speaker 1: the rest. There are other ways that Twitch makes revenue 175 00:10:30,440 --> 00:10:33,079 Speaker 1: from these things, and that streamers make revenue from these 176 00:10:33,080 --> 00:10:36,440 Speaker 1: things as well, but that's really kind of the focus. Now, 177 00:10:36,520 --> 00:10:39,800 Speaker 1: certain streamers who were kind of viewed as top performers, 178 00:10:39,840 --> 00:10:44,000 Speaker 1: and thus likely to attract more people to joining Twitch, 179 00:10:44,520 --> 00:10:47,760 Speaker 1: they got special treatment. They got a seventy thirty split, 180 00:10:48,120 --> 00:10:50,560 Speaker 1: so Twitch would only take thirty percent of their revenue 181 00:10:50,600 --> 00:10:53,160 Speaker 1: and the streamer would get to keep seventy percent. But that 182 00:10:53,320 --> 00:10:58,760 Speaker 1: was like this small elite group, and that's actually much 183 00:10:58,760 --> 00:11:01,160 Speaker 1: closer to the kind of revenue split strategies we 184 00:11:01,200 --> 00:11:05,880 Speaker 1: see in other online platforms. But late last month, Twitch 185 00:11:05,960 --> 00:11:09,720 Speaker 1: said that any remaining seventy-thirty streamers out there will also 186 00:11:09,760 --> 00:11:13,320 Speaker 1: get switched to a fifty fifty arrangement once they earn 187 00:11:13,600 --> 00:11:18,120 Speaker 1: one hundred thousand dollars as a threshold.
And for those elite streamers, 188 00:11:18,160 --> 00:11:20,800 Speaker 1: that's going to be a big setback. Going from a 189 00:11:21,600 --> 00:11:25,800 Speaker 1: seventy percent take to a fifty percent take is tough, and while we have yet 190 00:11:25,840 --> 00:11:27,640 Speaker 1: to see how this is all going to play out, 191 00:11:28,200 --> 00:11:31,400 Speaker 1: it's safe to say there are some Twitch streamers upset 192 00:11:31,480 --> 00:11:34,400 Speaker 1: with this change. Now, maybe they'll try and offset it 193 00:11:34,640 --> 00:11:38,599 Speaker 1: by incorporating more advertising in their streams, because if you 194 00:11:38,640 --> 00:11:40,960 Speaker 1: have more ads, then you generate more revenue, so you 195 00:11:40,960 --> 00:11:42,880 Speaker 1: can make up for the fact that you are getting 196 00:11:43,160 --> 00:11:46,520 Speaker 1: a smaller cut of the revenue you generate by putting 197 00:11:46,520 --> 00:11:48,360 Speaker 1: in more ads. But if you put in more ads, 198 00:11:49,000 --> 00:11:53,120 Speaker 1: then you might push away Twitch viewers. They might get 199 00:11:53,480 --> 00:11:56,080 Speaker 1: fed up with how many ads there are compared to 200 00:11:56,520 --> 00:12:00,000 Speaker 1: the amount of content they're getting.
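To put some rough numbers on why those elite streamers are upset, here's a minimal sketch of the split math. This assumes the premium seventy-thirty terms apply up to a revenue threshold and the standard fifty-fifty split applies beyond it; the function name, the threshold figure, and the example revenue amounts are illustrative assumptions, not Twitch's exact payout rules.

```python
def streamer_payout(sub_revenue, threshold=100_000, top_rate=0.70, base_rate=0.50):
    """Estimate a streamer's cut of subscription revenue if premium
    (70/30) terms cover revenue up to a threshold and the standard
    50/50 split kicks in above it. Purely illustrative math."""
    premium_part = min(sub_revenue, threshold) * top_rate
    standard_part = max(sub_revenue - threshold, 0) * base_rate
    return premium_part + standard_part

# A hypothetical streamer with $250,000 in annual sub revenue:
old_deal = 250_000 * 0.70          # flat 70/30: $175,000 to the streamer
new_deal = streamer_payout(250_000)  # blended deal: $145,000 to the streamer
print(old_deal, new_deal)
```

On these assumed numbers, that's a thirty-thousand-dollar haircut for the same audience, which is why adding more ad revenue looks tempting as an offset.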
But anyway, let's talk 201 00:12:00,000 --> 00:12:03,080 Speaker 1: about TwitchCon, particularly because the big story coming out 202 00:12:03,120 --> 00:12:06,920 Speaker 1: of TwitchCon isn't about Twitch's policies. It had to do 203 00:12:07,000 --> 00:12:10,800 Speaker 1: with some really unfortunate injuries that were incurred by, let's 204 00:12:10,800 --> 00:12:15,720 Speaker 1: call them, enthusiastic streamers who leapt into a foam pit, 205 00:12:16,160 --> 00:12:18,880 Speaker 1: that is, a pit that had foam blocks, like the 206 00:12:18,960 --> 00:12:21,760 Speaker 1: kind of blocks you would use for packing material, right, 207 00:12:21,840 --> 00:12:25,880 Speaker 1: those dark, charcoal gray foam blocks, that kind of 208 00:12:25,880 --> 00:12:30,640 Speaker 1: thing. And that pit proved to be far too shallow, 209 00:12:30,800 --> 00:12:32,680 Speaker 1: and since this was all done in a venue with 210 00:12:32,880 --> 00:12:36,400 Speaker 1: a concrete floor, some folks got hurt, a 211 00:12:36,440 --> 00:12:40,640 Speaker 1: few of them seriously injured. One of those was streamer 212 00:12:40,679 --> 00:12:44,440 Speaker 1: Adriana Chechik, who, after landing in the pit, said she 213 00:12:44,520 --> 00:12:47,480 Speaker 1: was unable to get up, and she ultimately got some 214 00:12:47,559 --> 00:12:50,559 Speaker 1: medical attention, and she has since said that she broke 215 00:12:50,640 --> 00:12:53,320 Speaker 1: her back in two places and is going to undergo 216 00:12:53,360 --> 00:12:55,959 Speaker 1: surgery to insert a metal rod into her spine to 217 00:12:56,000 --> 00:12:58,640 Speaker 1: correct for the damage. The foam pit was part of 218 00:12:58,640 --> 00:13:02,600 Speaker 1: an American Gladiators-type attraction. It was sponsored by Lenovo 219 00:13:02,720 --> 00:13:07,280 Speaker 1: and Intel, and it was subsequently shut down after numerous injuries.
220 00:13:07,880 --> 00:13:09,640 Speaker 1: And on a personal note, I just want to say, 221 00:13:09,679 --> 00:13:12,719 Speaker 1: I hope the various folks who got injured in that 222 00:13:12,800 --> 00:13:16,480 Speaker 1: pit have a swift recovery. And to all those websites 223 00:13:16,520 --> 00:13:20,079 Speaker 1: out there that used animated GIFs to show the moments 224 00:13:20,120 --> 00:13:23,440 Speaker 1: where folks got injured: gross. Next time, just make it 225 00:13:23,440 --> 00:13:26,240 Speaker 1: a playable video. I did not want to actually see 226 00:13:26,280 --> 00:13:30,000 Speaker 1: people get hurt. That kind of stinks, and I 227 00:13:30,120 --> 00:13:32,719 Speaker 1: really hope that the hurt people get well soon. That's 228 00:13:32,720 --> 00:13:37,400 Speaker 1: obviously the most important part. All right, speaking of advertising, 229 00:13:37,440 --> 00:13:39,319 Speaker 1: we're gonna take a quick break. When we come back, 230 00:13:39,360 --> 00:13:51,319 Speaker 1: we've got some more news items. You know, I've talked 231 00:13:51,360 --> 00:13:54,439 Speaker 1: a lot on this show about how remote work has 232 00:13:54,440 --> 00:13:58,600 Speaker 1: really changed things for employees, how they are reluctant to 233 00:13:58,640 --> 00:14:01,880 Speaker 1: go back to working in the office, which is totally understandable. Like, 234 00:14:01,920 --> 00:14:04,280 Speaker 1: if you are able to do your job effectively at home, 235 00:14:05,000 --> 00:14:06,360 Speaker 1: there are a lot of good reasons to do it 236 00:14:06,400 --> 00:14:09,320 Speaker 1: that way. Right, you're not driving all the time or 237 00:14:09,360 --> 00:14:13,239 Speaker 1: otherwise commuting, you know, you can really be efficient, 238 00:14:13,480 --> 00:14:15,240 Speaker 1: and you can stay in the comfort of your home. 239 00:14:15,320 --> 00:14:17,280 Speaker 1: You can help look after your family.
There are a 240 00:14:17,320 --> 00:14:19,560 Speaker 1: lot of good reasons. But I've also talked about how 241 00:14:19,560 --> 00:14:22,720 Speaker 1: some companies and managers are not crazy about remote work, 242 00:14:22,800 --> 00:14:26,960 Speaker 1: partly because it makes it harder to monitor employees. It's 243 00:14:26,960 --> 00:14:29,720 Speaker 1: weird because, like, every study you see about this kind 244 00:14:29,720 --> 00:14:34,400 Speaker 1: of stuff shows how that approach demoralizes employees, which ultimately 245 00:14:34,440 --> 00:14:37,760 Speaker 1: means you hurt the bottom line. Employees don't do as 246 00:14:37,760 --> 00:14:40,320 Speaker 1: good a job if they're not happy. And yet we 247 00:14:40,440 --> 00:14:43,320 Speaker 1: still see this happening. It's like a 248 00:14:43,360 --> 00:14:45,520 Speaker 1: teacher having to watch over a classroom to make sure 249 00:14:45,520 --> 00:14:48,080 Speaker 1: no one's cheating during a pop quiz. You don't feel 250 00:14:48,080 --> 00:14:51,520 Speaker 1: great if you feel like your boss is treating you 251 00:14:51,640 --> 00:14:56,280 Speaker 1: like a misbehaving child. Anyway, there's one recent story that 252 00:14:56,400 --> 00:14:58,760 Speaker 1: really caught my eye that kind of turned things around 253 00:14:58,760 --> 00:15:00,520 Speaker 1: on a company that was doing this sort of thing. 254 00:15:00,920 --> 00:15:05,640 Speaker 1: A Florida-based company called Chetu hired a remote telemarketer. 255 00:15:06,000 --> 00:15:10,240 Speaker 1: This telemarketer was based in the Netherlands. Now, according to 256 00:15:10,360 --> 00:15:14,480 Speaker 1: a subsequent lawsuit that the telemarketer brought against Chetu, 257 00:15:15,280 --> 00:15:19,960 Speaker 1: the company managers ordered him to activate his webcam and 258 00:15:20,000 --> 00:15:23,560 Speaker 1: to keep it on throughout the entirety of his work day.
259 00:15:23,600 --> 00:15:27,000 Speaker 1: They also demanded that he screen share his screen the 260 00:15:27,200 --> 00:15:31,360 Speaker 1: entire time, and the telemarketer politely declined to acquiesce to 261 00:15:31,400 --> 00:15:36,000 Speaker 1: the company's request, and chatt To then fired this telemarketer. 262 00:15:36,200 --> 00:15:39,840 Speaker 1: So then he sued the company, And like I said, 263 00:15:39,880 --> 00:15:43,680 Speaker 1: this guy is based in the Netherlands. So a Dutch 264 00:15:43,720 --> 00:15:47,880 Speaker 1: court took up the case, and y'all, we don't treat 265 00:15:48,040 --> 00:15:50,720 Speaker 1: things like privacy that seriously here in the United States, 266 00:15:50,840 --> 00:15:53,400 Speaker 1: but they sure as heck do in Europe and the 267 00:15:53,440 --> 00:15:56,960 Speaker 1: Netherlands in particular. So the court found that the company 268 00:15:57,120 --> 00:16:00,720 Speaker 1: was being unreasonable and that the company demands on the 269 00:16:00,760 --> 00:16:04,680 Speaker 1: employee were quote in conflict with the respect for the 270 00:16:04,720 --> 00:16:08,280 Speaker 1: privacy of the workers end quote which it would be 271 00:16:08,360 --> 00:16:11,880 Speaker 1: very hard to agree to argue against. Rather, and that 272 00:16:11,880 --> 00:16:16,320 Speaker 1: that the company far overstepped the boundaries that an employer 273 00:16:16,400 --> 00:16:19,280 Speaker 1: has when it comes to its employees. So this Dutch 274 00:16:19,280 --> 00:16:22,160 Speaker 1: court found in favor of the telemarketer. 
They said this 275 00:16:22,320 --> 00:16:25,520 Speaker 1: was a case of unfair dismissal, and as a result, 276 00:16:25,600 --> 00:16:28,479 Speaker 1: the court has ordered Chetu to pay the telemarketer's 277 00:16:28,520 --> 00:16:31,920 Speaker 1: court costs, to pay his back wages, to pay a 278 00:16:32,000 --> 00:16:35,160 Speaker 1: fine of fifty thousand dollars, and it has to drop 279 00:16:35,240 --> 00:16:39,400 Speaker 1: the non-compete clause from the telemarketer's employment agreement. And 280 00:16:39,480 --> 00:16:41,520 Speaker 1: whether any of this happens or not, I don't know. 281 00:16:41,720 --> 00:16:44,920 Speaker 1: Chetu did not show up for the court case. Again, 282 00:16:44,960 --> 00:16:48,760 Speaker 1: I'm not shocked, but I imagine this will at least 283 00:16:48,800 --> 00:16:53,000 Speaker 1: make companies a little less eager to hire telemarketers in 284 00:16:53,040 --> 00:16:56,320 Speaker 1: the Netherlands, right, or teleworkers in general, as well as in 285 00:16:56,440 --> 00:16:59,240 Speaker 1: other places like the EU, for example, 286 00:16:59,240 --> 00:17:01,720 Speaker 1: where you have governments and regulatory agencies that take 287 00:17:01,760 --> 00:17:07,560 Speaker 1: stuff like privacy seriously. And honestly, I think that 288 00:17:07,600 --> 00:17:10,800 Speaker 1: the requirement of having the webcam turned on was unreasonable 289 00:17:10,880 --> 00:17:13,000 Speaker 1: from the get-go. I think it's unreasonable for any 290 00:17:13,000 --> 00:17:16,480 Speaker 1: company to demand that, at least as a general rule. 291 00:17:16,560 --> 00:17:20,960 Speaker 1: I think it's unreasonable. So hopefully we see stories like 292 00:17:21,040 --> 00:17:28,440 Speaker 1: this that convince companies not to follow those kinds of tactics.
293 00:17:28,760 --> 00:17:30,840 Speaker 1: It might take a while for that to really filter 294 00:17:31,000 --> 00:17:33,760 Speaker 1: through to all the different companies in the United States, 295 00:17:33,760 --> 00:17:37,160 Speaker 1: because again, we're in a country that does not traditionally 296 00:17:37,800 --> 00:17:43,520 Speaker 1: value or protect privacy to any great extent. Yesterday, CNN 297 00:17:43,560 --> 00:17:48,600 Speaker 1: decided to sunset its Vault project. So this was an 298 00:17:48,720 --> 00:17:53,200 Speaker 1: NFT initiative that CNN launched last year, and 299 00:17:53,200 --> 00:17:56,879 Speaker 1: the news company was minting NFTs 300 00:17:57,240 --> 00:18:00,960 Speaker 1: that were tied to real world events and headlines. So 301 00:18:01,320 --> 00:18:05,440 Speaker 1: the pitch that CNN gave to investors was that you 302 00:18:05,880 --> 00:18:08,720 Speaker 1: could purchase an NFT and own a piece 303 00:18:08,720 --> 00:18:13,040 Speaker 1: of history. You could own a headline, a moment of 304 00:18:13,160 --> 00:18:19,359 Speaker 1: newsworthy activity, which I find absolutely laughable. I find it 305 00:18:19,640 --> 00:18:23,000 Speaker 1: ludicrous because really, what you end up owning is a 306 00:18:23,040 --> 00:18:27,760 Speaker 1: digital token that, at least in theory, represents this thing. 307 00:18:27,840 --> 00:18:30,119 Speaker 1: But it's just a digital token. It's a token that 308 00:18:30,119 --> 00:18:34,720 Speaker 1: can be traded or sold on a digital marketplace. But 309 00:18:34,800 --> 00:18:38,320 Speaker 1: it's not like you actually own a moment. Like, 310 00:18:38,520 --> 00:18:41,479 Speaker 1: let's say there's a moment where Celebrity A totally dissed 311 00:18:41,480 --> 00:18:44,919 Speaker 1: Celebrity B at a public event or whatever, and you're like, done, 312 00:18:45,320 --> 00:18:47,800 Speaker 1: I need to own that.
I need to own the 313 00:18:47,800 --> 00:18:51,240 Speaker 1: moment where Harry Styles spit on Chris Pine or whatever 314 00:18:51,280 --> 00:18:53,159 Speaker 1: it was. And so you buy an NFT 315 00:18:53,320 --> 00:18:57,360 Speaker 1: that represents that headline. You don't actually own anything other 316 00:18:57,440 --> 00:19:01,160 Speaker 1: than a token. Anyway, none of that stopped people from 317 00:19:01,160 --> 00:19:03,880 Speaker 1: buying NFTs because, as we have established 318 00:19:03,920 --> 00:19:08,240 Speaker 1: in other episodes, NFTs, especially early on, were 319 00:19:08,280 --> 00:19:11,080 Speaker 1: really seen as a potential cash grab, both for the 320 00:19:11,160 --> 00:19:13,639 Speaker 1: markets that were minting NFTs and for 321 00:19:13,720 --> 00:19:15,919 Speaker 1: the people who were buying them. It was like a 322 00:19:16,000 --> 00:19:19,600 Speaker 1: speculative market, and you can think of it as like 323 00:19:19,600 --> 00:19:21,840 Speaker 1: a long-shot bet. You might think, all right, well, 324 00:19:21,880 --> 00:19:25,160 Speaker 1: each individual NFT is not that expensive, maybe 325 00:19:25,160 --> 00:19:27,840 Speaker 1: it's twenty bucks or something for the cheap ones. That's 326 00:19:27,880 --> 00:19:30,640 Speaker 1: not terrible. And if it turns out that this one 327 00:19:30,720 --> 00:19:34,639 Speaker 1: is gonna end up being a seven-digit item on 328 00:19:34,680 --> 00:19:38,640 Speaker 1: a marketplace, then I'm gonna make a huge profit from 329 00:19:38,680 --> 00:19:41,159 Speaker 1: a small investment. Of course, I'm gonna be tempted to 330 00:19:41,200 --> 00:19:45,919 Speaker 1: do that. But since those days, obviously the crypto market 331 00:19:45,960 --> 00:19:51,080 Speaker 1: has taken a real dive across multiple blockchains. We have 332 00:19:51,160 --> 00:19:56,880 Speaker 1: seen cryptocurrencies and NFT values plummet.
333 00:19:57,000 --> 00:20:00,240 Speaker 1: More and more people have expressed skepticism or just 334 00:20:00,320 --> 00:20:04,040 Speaker 1: outright dismissal of NFTs, and some folks 335 00:20:04,040 --> 00:20:07,560 Speaker 1: who were invested in the Vault are now 336 00:20:08,000 --> 00:20:12,600 Speaker 1: talking with lawyers about potentially bringing lawsuits against CNN, accusing 337 00:20:12,600 --> 00:20:16,159 Speaker 1: the company of pulling the rug out from under investors, a 338 00:20:16,480 --> 00:20:20,800 Speaker 1: so-called rug pull. So a rug pull is when someone 339 00:20:20,880 --> 00:20:24,679 Speaker 1: sets up a project and gets investors involved in the 340 00:20:24,680 --> 00:20:28,440 Speaker 1: project and then subsequently cancels the project and then runs 341 00:20:28,480 --> 00:20:30,680 Speaker 1: off with the cash. That's a rug pull, right, where 342 00:20:30,680 --> 00:20:33,680 Speaker 1: you've set up the scheme where you can convince a 343 00:20:33,760 --> 00:20:36,040 Speaker 1: lot of people to pour money in early on, and 344 00:20:36,040 --> 00:20:39,000 Speaker 1: then you take the money and run. Well, CNN has 345 00:20:39,080 --> 00:20:42,119 Speaker 1: said that's not the case, and it's planning to compensate 346 00:20:42,200 --> 00:20:44,840 Speaker 1: investors by paying out the value of the NFTs 347 00:20:44,880 --> 00:20:49,000 Speaker 1: that investors were holding as captured on October sixth. 348 00:20:49,280 --> 00:20:51,199 Speaker 1: So if you bought an NFT like a 349 00:20:51,280 --> 00:20:54,399 Speaker 1: year ago, you don't get the value of the 350 00:20:54,440 --> 00:20:57,080 Speaker 1: NFT that you paid for at the point of purchase. 351 00:20:57,080 --> 00:20:59,720 Speaker 1: You get whatever it was worth on October sixth 352 00:20:59,840 --> 00:21:03,240 Speaker 1: of this year.
So CNN estimates that means folks are 353 00:21:03,280 --> 00:21:06,639 Speaker 1: going to recapture around twenty percent of what they invested, because, 354 00:21:06,680 --> 00:21:09,840 Speaker 1: as I mentioned earlier, crypto had a really bad time 355 00:21:09,840 --> 00:21:11,639 Speaker 1: of it this year, and the value of stuff like 356 00:21:11,760 --> 00:21:14,520 Speaker 1: NFTs has taken a nosedive. It 357 00:21:14,560 --> 00:21:20,040 Speaker 1: also gets worse. Marketplaces have processing fees, so if you 358 00:21:20,080 --> 00:21:24,320 Speaker 1: want to extract your money from an investment on these marketplaces, 359 00:21:24,600 --> 00:21:27,480 Speaker 1: you have to pay a processing fee as 360 00:21:27,480 --> 00:21:30,679 Speaker 1: part of that. Well, the marketplace that CNN used was 361 00:21:30,760 --> 00:21:34,119 Speaker 1: connected to the Flow blockchain, and the Flow blockchain has 362 00:21:34,160 --> 00:21:37,680 Speaker 1: a four-dollar processing fee for transactions. So let's say 363 00:21:38,040 --> 00:21:41,000 Speaker 1: you bought one NFT from CNN. Let's say 364 00:21:41,000 --> 00:21:43,480 Speaker 1: it was a twenty-dollar NFT. You're only 365 00:21:43,480 --> 00:21:49,159 Speaker 1: gonna get twenty percent of that value back, right? That's all you're 366 00:21:49,160 --> 00:21:53,639 Speaker 1: gonna get. So one fifth of what you paid, and 367 00:21:53,680 --> 00:21:56,000 Speaker 1: you paid twenty bucks, well, one fifth means you would 368 00:21:56,080 --> 00:21:58,840 Speaker 1: get four dollars back, but you also have to pay 369 00:21:58,840 --> 00:22:01,200 Speaker 1: a processing fee if you want to withdraw your money 370 00:22:01,240 --> 00:22:04,760 Speaker 1: from the marketplace. That processing fee is also four dollars, 371 00:22:04,800 --> 00:22:08,760 Speaker 1: which means you get nothing.
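[Editor's note: the payout math just described can be sketched in a few lines of Python. This is my own illustration with a hypothetical function name; the roughly twenty percent payout rate and the four-dollar Flow processing fee are the figures from the story.]

```python
# Toy sketch of the CNN Vault payout math described above (hypothetical
# function name; rate and fee are the figures mentioned in the episode):
# the refund is ~20% of what you paid, and withdrawing it costs a flat
# $4 processing fee on the Flow-connected marketplace.

def refund_after_fees(purchase_price: float,
                      refund_rate: float = 0.20,
                      processing_fee: float = 4.00) -> float:
    """What a holder actually walks away with, floored at zero."""
    gross_refund = purchase_price * refund_rate
    return max(gross_refund - processing_fee, 0.0)

print(refund_after_fees(20.00))   # $20 NFT: $4 refund minus $4 fee = $0
print(refund_after_fees(100.00))  # $100 NFT: $20 refund minus $4 fee = $16
```

So the break-even point is a twenty-dollar refund on the fee alone: anyone whose holdings were bought for twenty dollars or less walks away with nothing.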
You get no money back 372 00:22:09,440 --> 00:22:13,239 Speaker 1: because of the reduction in the purchase price and that 373 00:22:13,320 --> 00:22:15,919 Speaker 1: processing fee on top of it. This is another reason 374 00:22:16,000 --> 00:22:19,639 Speaker 1: why lots of critics of NFTs pointed 375 00:22:19,680 --> 00:22:22,760 Speaker 1: to NFTs as being dangerous and not a 376 00:22:22,760 --> 00:22:25,760 Speaker 1: good investment, because on top of everything else, you've got 377 00:22:25,840 --> 00:22:29,400 Speaker 1: these processing fees that eat into any profit you might make, 378 00:22:30,000 --> 00:22:32,800 Speaker 1: and you are not guaranteed, by any stretch of the imagination, 379 00:22:32,800 --> 00:22:35,000 Speaker 1: to make a profit in the first place. So it 380 00:22:35,080 --> 00:22:38,520 Speaker 1: may mean that you get even less back out if 381 00:22:38,520 --> 00:22:41,199 Speaker 1: you decide that you don't want to be invested in 382 00:22:41,200 --> 00:22:47,040 Speaker 1: the market anymore. Last week, six robotics companies, including Boston Dynamics, 383 00:22:47,200 --> 00:22:50,600 Speaker 1: signed a pledge stating that these companies are not going 384 00:22:50,640 --> 00:22:54,159 Speaker 1: to develop robots that can harm humans, and the companies 385 00:22:54,160 --> 00:22:58,120 Speaker 1: have also urged other robotics companies to follow suit. And 386 00:22:58,240 --> 00:23:01,760 Speaker 1: there's been a lot of talk about robotics and how 387 00:23:01,760 --> 00:23:04,760 Speaker 1: they could be put to military use.
A lot of 388 00:23:04,880 --> 00:23:08,159 Speaker 1: early ones, including at Boston Dynamics, were talked about as 389 00:23:08,240 --> 00:23:12,200 Speaker 1: sort of a load-bearing device that could help carry 390 00:23:12,240 --> 00:23:15,719 Speaker 1: heavier equipment in the field for the military, so kind of 391 00:23:15,720 --> 00:23:18,600 Speaker 1: like a mule that would be carrying this stuff. 392 00:23:18,720 --> 00:23:21,959 Speaker 1: Not necessarily weaponized itself, but it might carry weapons that, 393 00:23:22,280 --> 00:23:25,360 Speaker 1: you know, other soldiers can take off of it 394 00:23:26,200 --> 00:23:31,080 Speaker 1: and use. But there have also been conversations about weaponized platforms, 395 00:23:31,640 --> 00:23:35,919 Speaker 1: whether paired with a human operator who works remotely to 396 00:23:36,160 --> 00:23:42,520 Speaker 1: actually fire the weapons, or even an autonomous system that 397 00:23:42,560 --> 00:23:46,360 Speaker 1: would have stuff like image recognition capabilities that would be 398 00:23:46,400 --> 00:23:50,080 Speaker 1: capable of engaging in combat. So you could have a 399 00:23:50,200 --> 00:23:54,879 Speaker 1: combat-ready weaponized robotics platform. And that thought is really scary, 400 00:23:55,000 --> 00:23:59,440 Speaker 1: particularly because we know that computer vision isn't even close 401 00:23:59,480 --> 00:24:02,399 Speaker 1: to perfect. You know, we've seen so many stories of 402 00:24:02,440 --> 00:24:07,960 Speaker 1: computer systems misidentifying people or having a bias against specific 403 00:24:08,040 --> 00:24:12,320 Speaker 1: populations because the computer models themselves are faulty. We've seen 404 00:24:12,359 --> 00:24:15,919 Speaker 1: that over and over again.
Now imagine such an imperfect 405 00:24:16,119 --> 00:24:20,560 Speaker 1: system armed with a gun, so that not only is 406 00:24:20,600 --> 00:24:25,400 Speaker 1: this a deadly weapon, it's one that could misidentify targets. Uh, 407 00:24:25,440 --> 00:24:27,719 Speaker 1: it could identify something as a target that's not a 408 00:24:27,720 --> 00:24:32,520 Speaker 1: target at all, and that leads to tragedy. Also, there's 409 00:24:32,560 --> 00:24:36,000 Speaker 1: a fear that if you have robot soldiers, which is 410 00:24:36,040 --> 00:24:38,919 Speaker 1: what it really amounts to, then you would be less 411 00:24:40,440 --> 00:24:44,080 Speaker 1: reluctant to engage in combat because obviously you're not putting 412 00:24:44,160 --> 00:24:48,960 Speaker 1: your people in harm's way, you're using equipment. So 413 00:24:49,119 --> 00:24:52,359 Speaker 1: there's a fear that this could lead to increased aggression 414 00:24:52,520 --> 00:24:57,520 Speaker 1: around the world. It's a very sobering idea. So these 415 00:24:57,560 --> 00:25:01,720 Speaker 1: companies stressed that their robotic platforms have significant positive 416 00:25:01,840 --> 00:25:05,879 Speaker 1: use cases, like they could really help contribute to lots 417 00:25:05,880 --> 00:25:11,320 Speaker 1: of human endeavors. But if robotics companies agree to weaponize 418 00:25:11,640 --> 00:25:15,560 Speaker 1: their platforms, that will in turn undermine any efforts to 419 00:25:15,920 --> 00:25:19,960 Speaker 1: make robots for peaceful applications because it erodes trust and 420 00:25:20,000 --> 00:25:23,919 Speaker 1: it encourages fear.
So people will be less likely to 421 00:25:24,000 --> 00:25:27,040 Speaker 1: invest in robotics companies that could be making stuff that 422 00:25:27,119 --> 00:25:32,560 Speaker 1: could really make positive changes in peaceful ways because there 423 00:25:32,560 --> 00:25:35,800 Speaker 1: are these other companies that are making kill bots, essentially. 424 00:25:36,560 --> 00:25:39,480 Speaker 1: I'm really on the same page as these companies. I 425 00:25:39,520 --> 00:25:41,679 Speaker 1: think that this is the responsible thing to do. I 426 00:25:41,680 --> 00:25:44,199 Speaker 1: think it's the right thing to do. But whether we 427 00:25:44,280 --> 00:25:46,919 Speaker 1: see this spread throughout the entire robotics industry or not, 428 00:25:47,200 --> 00:25:51,600 Speaker 1: I just don't know. Russian-speaking hackers managed to 429 00:25:51,680 --> 00:25:55,119 Speaker 1: take down more than a dozen airport websites yesterday morning, 430 00:25:55,680 --> 00:25:59,560 Speaker 1: including the site for my home airport, Atlanta's Hartsfield-Jackson 431 00:25:59,640 --> 00:26:03,560 Speaker 1: International Airport. Now, the hackers appear to be aligned 432 00:26:03,880 --> 00:26:08,800 Speaker 1: with the Russian government, meaning their motivations appear to be 433 00:26:08,920 --> 00:26:13,480 Speaker 1: based off what the Russian government is doing, but it's 434 00:26:13,600 --> 00:26:16,040 Speaker 1: unknown whether or not they have any sort of state-sponsored 435 00:26:16,200 --> 00:26:20,719 Speaker 1: relationship with the government, and it may be that they're acting 436 00:26:20,760 --> 00:26:25,040 Speaker 1: completely independently. In that case, they would be acting more 437 00:26:25,160 --> 00:26:28,480 Speaker 1: like Anonymous does.
It could be a loose group of 438 00:26:28,560 --> 00:26:31,600 Speaker 1: hackers that choose their own targets, and they don't have 439 00:26:31,640 --> 00:26:35,600 Speaker 1: any actual connection with any authority figure or agency or 440 00:26:35,600 --> 00:26:38,600 Speaker 1: anything like that. So no one's going so far as 441 00:26:38,640 --> 00:26:43,320 Speaker 1: to say that Putin, for example, ordered hackers to take 442 00:26:43,320 --> 00:26:47,479 Speaker 1: down airport web pages, and I'm guessing that that's not 443 00:26:47,520 --> 00:26:49,600 Speaker 1: the case. I don't think Putin would have done that, 444 00:26:49,800 --> 00:26:54,240 Speaker 1: simply because it's not an effective way to really disrupt things. 445 00:26:54,280 --> 00:26:58,720 Speaker 1: In fact, the airport websites that were affected had no 446 00:26:58,800 --> 00:27:03,160 Speaker 1: impact on actual travel whatsoever, which isn't a surprise. I mean, 447 00:27:03,200 --> 00:27:06,400 Speaker 1: I don't think I have ever pulled up the Atlanta 448 00:27:06,440 --> 00:27:11,520 Speaker 1: airport's website. Instead, I rely upon whichever air carrier I'm using. 449 00:27:12,080 --> 00:27:14,920 Speaker 1: I use their web page or their app. I don't 450 00:27:14,960 --> 00:27:18,399 Speaker 1: go to the airport's web page itself. So this sounds 451 00:27:18,440 --> 00:27:21,919 Speaker 1: like it was, you know, barely an inconvenience in some ways. 452 00:27:22,800 --> 00:27:25,520 Speaker 1: Many of the sites were back up by late morning. 453 00:27:25,600 --> 00:27:27,959 Speaker 1: That really shows that it wasn't a very effective attack. 454 00:27:28,600 --> 00:27:31,679 Speaker 1: It's just a temporary disruption. The hackers reportedly used a 455 00:27:31,800 --> 00:27:35,760 Speaker 1: DDoS, or distributed denial of service, attack on these sites.
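[Editor's note: a DDoS attack works by crowding a server with so much junk traffic that legitimate requests can't get through. Here's a minimal conceptual sketch of that crowding-out effect, not attack code; the function name and all the numbers are hypothetical illustrations, not details from the incident.]

```python
# Conceptual sketch only: why a flood of junk requests makes a site
# unreachable without breaching anything. A server that can handle
# `capacity` requests per interval and shares it proportionally between
# legitimate and flood traffic serves an ever-smaller slice of the
# real requests as the flood grows. (Hypothetical numbers throughout.)

def served_fraction(legit: int, flood: int, capacity: int) -> float:
    """Fraction of legitimate requests served under proportional sharing."""
    total = legit + flood
    if total <= capacity:
        return 1.0  # no congestion: everyone gets through
    return capacity / total  # each request's chance of being served

print(served_fraction(100, 0, 1000))       # normal day: all requests served
print(served_fraction(100, 99_900, 1000))  # flooded: ~1% get through
```

The point the sketch makes is the same one in the story: nothing is stolen or breached, and the moment the flood stops (or a mitigation service starts filtering it), the site comes right back.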
456 00:27:36,560 --> 00:27:39,840 Speaker 1: Those tend to be fairly superficial sorts of attacks that 457 00:27:39,920 --> 00:27:44,639 Speaker 1: can be effective, um, particularly if you have your website 458 00:27:44,640 --> 00:27:47,480 Speaker 1: hosted on a service that doesn't have good DDoS 459 00:27:47,520 --> 00:27:51,640 Speaker 1: protection on it. But it's a temporary effect. It's 460 00:27:51,680 --> 00:27:57,040 Speaker 1: not like something more insidious like breaching a system. It's 461 00:27:57,119 --> 00:28:00,560 Speaker 1: nothing like that. It's just really shutting down traffic 462 00:28:00,560 --> 00:28:03,840 Speaker 1: to that website temporarily. So I'll be chatting more about 463 00:28:03,920 --> 00:28:06,880 Speaker 1: DDoS attacks a little bit later this week. That's 464 00:28:07,240 --> 00:28:10,040 Speaker 1: a preview. I've talked about them before, but I have 465 00:28:10,040 --> 00:28:13,199 Speaker 1: an idea for a themed set of episodes, so that 466 00:28:13,240 --> 00:28:16,000 Speaker 1: will be coming a little bit later. Also coming later 467 00:28:16,320 --> 00:28:19,080 Speaker 1: will be another couple of news stories, but before we 468 00:28:19,119 --> 00:28:31,600 Speaker 1: get to that, let's take another quick break. Delta Air Lines 469 00:28:31,640 --> 00:28:36,119 Speaker 1: has pledged to invest sixty million dollars in startup 470 00:28:36,080 --> 00:28:40,360 Speaker 1: Joby Aviation, which is in the electric air taxi business. 471 00:28:40,680 --> 00:28:43,280 Speaker 1: That's still a pretty far ways away. It's not really a 472 00:28:43,320 --> 00:28:47,160 Speaker 1: business yet, but that's what they're developing: electric 473 00:28:47,680 --> 00:28:52,400 Speaker 1: air taxis.
Delta also said that should the project hit 474 00:28:52,480 --> 00:28:56,120 Speaker 1: certain milestones in a certain amount of time, 475 00:28:56,640 --> 00:28:59,680 Speaker 1: the company might up that investment as high as two 476 00:28:59,760 --> 00:29:04,720 Speaker 1: hundred million dollars. Joby has created electric vertical takeoff 477 00:29:04,840 --> 00:29:08,720 Speaker 1: and landing vehicles, that is, VTOL 478 00:29:08,840 --> 00:29:12,240 Speaker 1: vehicles, or eVTOL in this case, 479 00:29:12,240 --> 00:29:16,920 Speaker 1: because they are electric. They look like really large drones, 480 00:29:17,360 --> 00:29:20,520 Speaker 1: kind of like quadcopter drones, but they're 481 00:29:20,560 --> 00:29:23,640 Speaker 1: big enough to hold people and luggage in them. The 482 00:29:23,720 --> 00:29:27,520 Speaker 1: model mentioned in Andrew J. Hawkins' article about this, which 483 00:29:27,560 --> 00:29:30,560 Speaker 1: was published on The Verge, can hold up to five 484 00:29:30,680 --> 00:29:33,400 Speaker 1: people, and the idea is to create a home-to-airport 485 00:29:33,520 --> 00:29:38,680 Speaker 1: service. But I maintain that's actually a misleading phrase 486 00:29:38,760 --> 00:29:42,360 Speaker 1: because these eVTOL vehicles are gonna 487 00:29:42,400 --> 00:29:44,520 Speaker 1: need a lot of clear space in order to land 488 00:29:44,520 --> 00:29:46,400 Speaker 1: and take off. I mean, they're still gonna 489 00:29:46,400 --> 00:29:49,320 Speaker 1: be able to do this vertically, so not as much 490 00:29:49,360 --> 00:29:52,240 Speaker 1: space as like an airplane would need, but essentially the 491 00:29:52,280 --> 00:29:55,360 Speaker 1: same kind of space that a helicopter would need in 492 00:29:55,480 --> 00:29:57,840 Speaker 1: order to land and take off.
So, you know, they're 493 00:29:57,840 --> 00:30:03,000 Speaker 1: gonna need what folks are calling vertiports, 494 00:30:03,160 --> 00:30:06,400 Speaker 1: these small cleared areas where these vehicles can operate, 495 00:30:06,640 --> 00:30:11,200 Speaker 1: very similar to helicopter takeoff pads. Right, it's gonna be 496 00:30:11,280 --> 00:30:12,560 Speaker 1: the same sort of thing as that. It might be 497 00:30:12,600 --> 00:30:15,240 Speaker 1: on the top of a building, it might be in 498 00:30:15,480 --> 00:30:19,160 Speaker 1: an otherwise cleared area, but it can't just be anywhere. 499 00:30:19,240 --> 00:30:21,400 Speaker 1: You're not going to see one of these things pull 500 00:30:21,520 --> 00:30:23,600 Speaker 1: up right outside your home. So it's not like you 501 00:30:23,640 --> 00:30:26,160 Speaker 1: step out the front door, walk down the driveway and 502 00:30:26,280 --> 00:30:29,320 Speaker 1: jump into one of these flying air taxis. That's not 503 00:30:29,400 --> 00:30:32,240 Speaker 1: how it's gonna work. Instead, you're gonna have to head 504 00:30:32,240 --> 00:30:36,240 Speaker 1: to the nearest vertiport to catch your ride to the airport. 505 00:30:36,960 --> 00:30:41,080 Speaker 1: So I think calling it home-to-airport is misleading. 506 00:30:41,400 --> 00:30:43,720 Speaker 1: I just don't think that's a good way of describing this, 507 00:30:43,960 --> 00:30:45,960 Speaker 1: because you do still have to go somewhere else, and 508 00:30:46,000 --> 00:30:50,479 Speaker 1: depending upon the distribution of vertiports, it might not really 509 00:30:50,840 --> 00:30:53,920 Speaker 1: be that effective of a way to reduce the amount 510 00:30:53,920 --> 00:30:56,760 Speaker 1: of time it takes for you to get to an airport. 511 00:30:57,320 --> 00:30:59,560 Speaker 1: Maybe it will.
Also, we don't know how much it's 512 00:30:59,560 --> 00:31:02,600 Speaker 1: gonna cost, we don't know where it's going to be available, 513 00:31:02,720 --> 00:31:05,080 Speaker 1: though Los Angeles and New York are likely to be 514 00:31:05,160 --> 00:31:08,680 Speaker 1: the first cities to use these kinds of things. More 515 00:31:08,720 --> 00:31:11,520 Speaker 1: than that, you know, we've got that sort of 516 00:31:12,000 --> 00:31:14,680 Speaker 1: element that we have to factor in. But one that's 517 00:31:14,720 --> 00:31:21,040 Speaker 1: even more important is that federal agencies have to create 518 00:31:21,080 --> 00:31:25,640 Speaker 1: certifications and approval processes for these kinds of vehicles, and 519 00:31:25,640 --> 00:31:28,800 Speaker 1: that has not happened yet. Right? You're not going to 520 00:31:28,920 --> 00:31:33,480 Speaker 1: have the FAA just allow unlicensed or uncertified 521 00:31:33,480 --> 00:31:38,200 Speaker 1: electric vehicles to fly through airspace. That's not going to happen. 522 00:31:38,720 --> 00:31:44,160 Speaker 1: And no company so far has had its air 523 00:31:44,240 --> 00:31:49,040 Speaker 1: taxi kind of vehicle receive any kind of certification from 524 00:31:49,080 --> 00:31:53,000 Speaker 1: these agencies. So there's a lot of legal process to 525 00:31:53,040 --> 00:31:55,800 Speaker 1: get through. It's not just technical, although there are other 526 00:31:55,800 --> 00:31:58,680 Speaker 1: technical challenges too, like how do you scale up production 527 00:31:59,000 --> 00:32:01,480 Speaker 1: so that you meet demand? If it turns out there's 528 00:32:01,600 --> 00:32:04,680 Speaker 1: enough demand where this company is going to need to 529 00:32:05,160 --> 00:32:08,960 Speaker 1: mass-produce these vehicles, how does it do that? It's 530 00:32:09,080 --> 00:32:12,480 Speaker 1: very different from producing a single one.
So there are 531 00:32:12,480 --> 00:32:14,240 Speaker 1: a lot of unanswered questions, and there are a lot 532 00:32:14,240 --> 00:32:16,479 Speaker 1: of things that have to fall into place before we 533 00:32:16,560 --> 00:32:21,120 Speaker 1: ever see air taxis being used, um, you know, in 534 00:32:21,160 --> 00:32:26,400 Speaker 1: an official consumer way. So Delta is pledging 535 00:32:26,400 --> 00:32:30,320 Speaker 1: money into something that could eventually mature into a useful service. 536 00:32:30,640 --> 00:32:32,720 Speaker 1: But we're still a good ways out from that because 537 00:32:32,760 --> 00:32:34,720 Speaker 1: we have all these other steps that have to take 538 00:32:34,800 --> 00:32:40,480 Speaker 1: place first. CNBC reports that the Biden Labor Department has 539 00:32:40,560 --> 00:32:47,520 Speaker 1: released a proposal today that could require regulators and courts 540 00:32:47,560 --> 00:32:54,640 Speaker 1: to reclassify gig workers as employees. Right now, the going 541 00:32:54,760 --> 00:32:57,600 Speaker 1: standard in the United States is to classify folks who 542 00:32:57,600 --> 00:33:02,200 Speaker 1: work in the gig economy as independent contractors, and that 543 00:33:02,280 --> 00:33:05,480 Speaker 1: has pros and cons. On the workers' side, one of 544 00:33:05,480 --> 00:33:08,880 Speaker 1: the pros is that independent contractors have a lot more 545 00:33:08,960 --> 00:33:12,520 Speaker 1: freedom to define their own hours, so they can hop 546 00:33:12,520 --> 00:33:16,840 Speaker 1: on and hop off of work very flexibly, 547 00:33:16,880 --> 00:33:20,840 Speaker 1: at their own schedule. But on the flip side, employers 548 00:33:21,000 --> 00:33:24,640 Speaker 1: can treat independent contractors very differently from employees. They don't 549 00:33:24,680 --> 00:33:31,000 Speaker 1: have to, for example, provide health benefits to independent contractors.
550 00:33:31,120 --> 00:33:35,400 Speaker 1: So that's one of the big downsides for people who 551 00:33:35,400 --> 00:33:39,000 Speaker 1: work in the gig economy. They can be heavily exploited 552 00:33:39,440 --> 00:33:42,520 Speaker 1: and receive very little protection in return. They might not 553 00:33:42,600 --> 00:33:46,280 Speaker 1: get overtime pay, they might not be allowed to form 554 00:33:46,320 --> 00:33:49,040 Speaker 1: a union. There are a lot of downsides from the 555 00:33:49,080 --> 00:33:53,320 Speaker 1: employee side. So there's a possibility that in the future 556 00:33:53,320 --> 00:33:57,880 Speaker 1: we'll see a federal-level rule put into place 557 00:33:58,600 --> 00:34:03,480 Speaker 1: where companies like DoorDash and Lyft and Uber will 558 00:34:03,520 --> 00:34:09,680 Speaker 1: not be able to classify their workers as independent contractors. 559 00:34:09,719 --> 00:34:12,400 Speaker 1: They'll have to treat them as employees. That would be 560 00:34:12,440 --> 00:34:16,759 Speaker 1: a massive change to their business model. It would 561 00:34:17,000 --> 00:34:19,879 Speaker 1: likely mean we'd see prices go up for a lot 562 00:34:19,920 --> 00:34:23,480 Speaker 1: of these services because the companies are gonna pass that 563 00:34:23,640 --> 00:34:27,840 Speaker 1: cost on to us, the end consumer. And honestly, we 564 00:34:27,880 --> 00:34:30,240 Speaker 1: don't know how many of these would even just fold 565 00:34:30,320 --> 00:34:32,879 Speaker 1: up and go away as opposed to trying to work 566 00:34:32,920 --> 00:34:37,080 Speaker 1: within that system. I think, ultimately, from a personal 567 00:34:37,160 --> 00:34:40,799 Speaker 1: opinion standpoint, this is a good move. I 568 00:34:40,800 --> 00:34:42,719 Speaker 1: always want to see more protections for the 569 00:34:42,760 --> 00:34:46,200 Speaker 1: people who are doing the work.
I'm not as concerned 570 00:34:46,239 --> 00:34:48,600 Speaker 1: for the people at the top who are benefiting from 571 00:34:48,640 --> 00:34:53,880 Speaker 1: the exploitation of their workforce, not really losing sleep about 572 00:34:54,160 --> 00:34:57,520 Speaker 1: how much take-home pay they get at the 573 00:34:57,600 --> 00:34:59,239 Speaker 1: end of the day. I really worry more about the 574 00:34:59,239 --> 00:35:03,480 Speaker 1: people who are actually doing all the labor. So I 575 00:35:03,600 --> 00:35:07,080 Speaker 1: just don't know what happens if this rule goes into place. 576 00:35:07,120 --> 00:35:11,520 Speaker 1: Like, I don't know if those models hold up under 577 00:35:11,840 --> 00:35:14,839 Speaker 1: those sets of rules, or if we'll actually see them 578 00:35:14,840 --> 00:35:17,320 Speaker 1: all kind of collapse in on themselves because the only 579 00:35:17,360 --> 00:35:22,320 Speaker 1: way they can exist as a business is through being 580 00:35:22,360 --> 00:35:26,160 Speaker 1: able to exploit the labor force the way they do. 581 00:35:26,239 --> 00:35:28,360 Speaker 1: If it turns out that's the case, I would argue 582 00:35:28,480 --> 00:35:31,080 Speaker 1: that says that business is not a great one for 583 00:35:31,200 --> 00:35:34,759 Speaker 1: us to support, because we shouldn't be exploiting people to 584 00:35:34,840 --> 00:35:37,560 Speaker 1: that level. But I'm also in a position of privilege 585 00:35:37,560 --> 00:35:40,840 Speaker 1: and I understand that, so, you know, I'm not suggesting 586 00:35:40,840 --> 00:35:42,759 Speaker 1: that my point of view is the right one or 587 00:35:42,800 --> 00:35:45,200 Speaker 1: the only one or anything like that. It's just kind 588 00:35:45,200 --> 00:35:48,360 Speaker 1: of how I come at things. Anyway, these are just 589 00:35:48,440 --> 00:35:52,839 Speaker 1: proposed sets of rules.
Nothing has been enacted, nothing has 590 00:35:52,840 --> 00:35:56,239 Speaker 1: been voted on, and it may never come to that. 591 00:35:56,640 --> 00:36:01,280 Speaker 1: But it is interesting that we're seeing movement on that front. Finally, 592 00:36:01,360 --> 00:36:07,240 Speaker 1: I wanted to mention that artist Ashley Zelinskie has created 593 00:36:07,440 --> 00:36:11,799 Speaker 1: a gallery of art that is based on and inspired 594 00:36:11,840 --> 00:36:16,479 Speaker 1: by images taken by the James Webb Space Telescope. These 595 00:36:16,520 --> 00:36:23,080 Speaker 1: include three-dimensional physical sculptures, things like lasers that 596 00:36:23,160 --> 00:36:28,320 Speaker 1: create different images within a foggy environment, and some exhibits 597 00:36:28,360 --> 00:36:32,759 Speaker 1: that require visitors to don a virtual reality headset 598 00:36:33,280 --> 00:36:37,479 Speaker 1: in order to see virtual exhibits, all of which 599 00:36:37,520 --> 00:36:40,760 Speaker 1: are again inspired by these images created by the James 600 00:36:40,760 --> 00:36:45,319 Speaker 1: Webb Space Telescope. And I just think that's a really interesting idea. 601 00:36:45,560 --> 00:36:49,520 Speaker 1: I think that creates this super cool experience to 602 00:36:50,520 --> 00:36:55,520 Speaker 1: kind of encounter the scientific data, but in a completely 603 00:36:55,560 --> 00:36:58,960 Speaker 1: different context. Right? You're not looking at it as just 604 00:36:59,040 --> 00:37:02,720 Speaker 1: a flat image. You're not looking at it as numbers 605 00:37:02,719 --> 00:37:05,600 Speaker 1: on a chart, you're not looking at it as statistics. 606 00:37:05,640 --> 00:37:09,920 Speaker 1: You're looking at it as an interpretation of that data. 607 00:37:10,120 --> 00:37:15,319 Speaker 1: And I think that's a super cool idea.
It's in 608 00:37:15,400 --> 00:37:20,040 Speaker 1: New York City's ONX Studio, and 609 00:37:20,160 --> 00:37:22,239 Speaker 1: I have not seen it. I don't live in New York, 610 00:37:22,280 --> 00:37:24,960 Speaker 1: so it's not convenient to me, but I 611 00:37:25,000 --> 00:37:27,440 Speaker 1: would love to visit it because it just sounds like 612 00:37:27,480 --> 00:37:33,319 Speaker 1: such an interesting approach to taking something that is more 613 00:37:33,360 --> 00:37:37,920 Speaker 1: abstract and not as accessible for most of us and 614 00:37:37,960 --> 00:37:41,560 Speaker 1: turning it into something that could potentially have a really 615 00:37:41,600 --> 00:37:45,840 Speaker 1: emotional impact on you. I just love these ideas 616 00:37:45,920 --> 00:37:51,040 Speaker 1: of artists taking inspiration from scientific endeavors. Of course, 617 00:37:51,040 --> 00:37:52,880 Speaker 1: I say that because I have a tattoo on my 618 00:37:52,920 --> 00:37:55,880 Speaker 1: back that was part of an art project for NASA, 619 00:37:56,040 --> 00:38:00,080 Speaker 1: so I am a canvas for that 620 00:38:00,239 --> 00:38:03,400 Speaker 1: kind of an art project. In fact, we shot a 621 00:38:03,520 --> 00:38:06,600 Speaker 1: video of Forward Thinking where I was getting this tattoo 622 00:38:07,120 --> 00:38:11,719 Speaker 1: as I was delivering the episode on camera. So 623 00:38:12,400 --> 00:38:14,359 Speaker 1: that's an experience I never thought I would have when 624 00:38:14,360 --> 00:38:16,759 Speaker 1: I was a kid, but I got it, so that 625 00:38:16,840 --> 00:38:19,799 Speaker 1: was kind of cool. Anyway, I just thought that it 626 00:38:19,920 --> 00:38:22,879 Speaker 1: was an interesting concept.
If you are in New York 627 00:38:22,920 --> 00:38:24,800 Speaker 1: City and you can go to ONX Studio 628 00:38:24,880 --> 00:38:27,479 Speaker 1: and check this out, let me know what you think, 629 00:38:27,600 --> 00:38:30,680 Speaker 1: because I'm curious. I would love it to be like 630 00:38:30,719 --> 00:38:36,080 Speaker 1: a really awesome, impactful experience. Obviously, with art being such a 631 00:38:36,120 --> 00:38:39,520 Speaker 1: personal thing, one person might walk out saying my life 632 00:38:39,520 --> 00:38:42,120 Speaker 1: has been changed forever, and another person could walk 633 00:38:42,160 --> 00:38:45,000 Speaker 1: out saying I could have done something else with those 634 00:38:45,040 --> 00:38:48,760 Speaker 1: two hours of my time. It's very individual in that way, 635 00:38:49,000 --> 00:38:51,480 Speaker 1: but I am very curious about it. So that's how 636 00:38:51,520 --> 00:38:53,960 Speaker 1: I wanted to end this episode, was talking about that. If 637 00:38:54,000 --> 00:38:56,200 Speaker 1: you have suggestions for topics I should cover in future 638 00:38:56,200 --> 00:38:58,920 Speaker 1: episodes of tech Stuff, reach out. You can download the 639 00:38:58,920 --> 00:39:01,800 Speaker 1: I Heart Radio app, navigate over to tech Stuff, 640 00:39:02,120 --> 00:39:04,520 Speaker 1: use that microphone icon, leave a voice message up to 641 00:39:04,560 --> 00:39:07,640 Speaker 1: thirty seconds in length, or reach out on Twitter. The 642 00:39:07,640 --> 00:39:10,640 Speaker 1: handle for the show is TechStuffHSW, and 643 00:39:10,719 --> 00:39:20,000 Speaker 1: I'll talk to you again really soon. Tech Stuff 644 00:39:20,080 --> 00:39:23,279 Speaker 1: is an I Heart Radio production. For more podcasts from 645 00:39:23,280 --> 00:39:27,040 Speaker 1: I Heart Radio, visit the I Heart Radio app, Apple Podcasts, 646 00:39:27,160 --> 00:39:29,160 Speaker 1: or wherever you listen to your favorite shows.