Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, March sixteenth, twenty twenty three, and we're back on the big topic of twenty twenty three, AI, although artificial intelligence should really watch its back, because if more tech related financial institutions fail this year, I think there's going to be a new contender for the big topic of twenty twenty three.

But this week OpenAI announced the release of GPT four. This is the large language model that is now multimodal. All right, let's break this down. So first of all, GPT stands for generative pre-trained transformer. The word transformer in this case doesn't refer to either a robot in disguise or the device that can step voltage up or down with alternating current. Instead, this kind of transformer is a type of machine learning model that Google first introduced back in twenty seventeen. The word generative indicates that it's capable of creating something, generating something. As for multimodal, that means this new version of GPT can accept different kinds of input prompts to create output. So previously GPT could only accept text, but now it can also accept images, and it can give analysis of those images or presumably incorporate those images into responses. Like maybe you could ask GPT, who is the person in this picture? Or how many elements are in this photograph? Or can you tell me a story about what is in this picture? That kind of stuff.

GPT four also seriously ups the word count on its responses. So the previous version, GPT three point five, that's the version that actually powers ChatGPT, it could create a response of around three thousand words max. But GPT four can get positively loquacious and provide deep responses to queries that could go up to twenty five thousand words.
So now we're getting into, like, Jonathan-in-a-casual-conversation, TechStuff-episode level here. In other words, it's also apparently better at playing by the rules that OpenAI has set. That is, it is less likely than ChatGPT three point five to respond to requests that would break policy, you know, stuff like generating hate speech or responding to requests to cause problems within a community. It is less likely to do that, not completely impervious to it, but it's better than the previous generations were.

Developers will also have a lot more options when they use GPT as part of their apps. They can even shape how the AI will respond to a user. They can adjust stuff like the tone and the style of responses. So if you have something that's, like, say, a fun and silly app that for some reason needs to tap into the power of GPT, you could adjust the tone to be more lighthearted and playful rather than stick with the standard tone and style of the normal response, which is what previous generations of GPT were stuck with.

And in a demonstration, OpenAI showed that GPT four can even do some pretty astonishing stuff, some advanced tasks. So, for example, you could write an idea for a basic website, right, like you're using a notepad and you're writing out your concept for your website. You can then feed that concept to GPT and it could actually create the website, complete with basic functionality. So the example I saw had GPT produce a website based around jokes, so you would get the setup of the joke appearing as text, and then to get the punchline, you would have to click on a little box that would reveal what the punchline was to the joke. And this was all prompted by some pretty simple notes about the parameters for this website. Like it wasn't copying the notes, it was taking the notes as directions and then making the website based on those directions. That's pretty cool.
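For developers, that tone and style shaping mostly happens through the instructions sent along with each request. Here is a minimal sketch of what it might look like, assuming the OpenAI Python client and GPT four API access; the model name, prompts, and settings below are illustrative placeholders, not anything specific from this episode.

```python
# Minimal sketch: steering tone and style with a system message.
# Assumes the OpenAI Python client (pip install openai) and an API key
# in the OPENAI_API_KEY environment variable; prompts are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-4",  # assumes your account has GPT-4 API access
    messages=[
        # The system message is where a developer sets the persona.
        {
            "role": "system",
            "content": (
                "You are the assistant inside a fun, silly trivia app. "
                "Keep answers lighthearted and playful, two sentences max."
            ),
        },
        {"role": "user", "content": "What is a large language model?"},
    ],
    temperature=0.8,  # a little more variety for a playful voice
    max_tokens=150,
)

print(response["choices"][0]["message"]["content"])
```

Swap that system message for a more formal persona and the same user prompt comes back in a completely different register, which is the kind of control earlier GPT releases didn't really expose to app developers.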
Now, it still has a lot of the problems of earlier generations of GPT. Its responses can still be inaccurate and misleading. The common term I've seen for this is hallucinations, where the response ends up not being an accurate representation. It's a hallucination. I don't really like that term personally. I just think it ends up kind of sugarcoating what we're actually talking about here, which is trustworthiness and accuracy, and hallucination, I don't know, it seems like it's letting GPT off the hook a little too much. That's my own personal opinion. It's kind of similar to how I feel about Tesla naming its driver assist feature Full Self-Driving. I don't think that is an accurate representation of what it actually does. Same sort of thing, right? Like names and words mean things, and sometimes you have to ask why are these words being used to describe this? Is it in an attempt to kind of take some pressure off? And that's just my own personal take on this. I could be way, way off base.

Anyway, it's still pretty cool. But obviously, if GPT produces information that is misleading or inaccurate, but the presentation is put across in such a way as to seem authentic, that's a real problem. And it could also still be used to do stuff like promote ideologies and create propaganda in ways that are perhaps not entirely ethical. So while the new features are really impressive, the challenges that we face with these kinds of AI models haven't magically been dismissed. They still very much exist. However, one issue that apparently we do not need to worry about is GPT four's ability to destroy us all.
So I'm being a bit glib here, but OpenAI actually did allow an AI research group called the Alignment Research Center, or ARC, to test the large language model to make sure it doesn't show signs of being capable of starting the robot uprising and turning all of us into batteries or something. Specifically, the AI group was looking into possibilities like the model's capacity to copy itself or to acquire resources, whether digital or real world or whatever, or maybe its ability to manipulate people with stuff like phishing attacks and that kind of thing. Also whether or not it can make high level plans and then maybe follow through on those plans.

So this kind of gets into some of the scary stuff around AI that we typically think of as being science fiction, right, like Terminator style stuff. The thought is that a sufficiently intelligent AI could be capable of escaping whatever constraints we try to put on it, because by definition, if it's superintelligent, then it's smarter than we are, right? So if it's smarter than we are, then it stands to reason it would figure out a way to get out of whatever box we try to put it in, including the possibility of manipulating people to essentially set it free. There have been thought experiments that have shown this is entirely possible.

The AI researchers concluded that GPT four doesn't measure up to this kind of dangerous AI. But then, as Ars Technica reports, there are scarce details on the nature of the tests that the research group actually conducted, so it's hard to judge whether or not their findings are trustworthy themselves. Also, I think we need to remind ourselves that whether or not AI poses an existential threat to humanity, we can acknowledge that AI contributes to real problems in the here and now.
Usually this comes as an unintended consequence, such as the inclusion of bias within a system that leads to one group of people being disproportionately affected by negative outcomes, facial recognition technologies being a great example of that. But it could also be included by design, right? Someone could make an AI application designed to create harm in some way. The alignment part of Alignment Research Center refers to the goal of making sure that AI is in alignment, or agreement, with basic principles that are meant to benefit humanity. But here's the thing. Bias can come into that sort of decision making process too, and you have to ask tough questions of who decides that criteria. Who is it that's defining what is beneficial? Because believe me, there is not a universal or objective truth to what that actually even means. And even an effort to create a quote unquote aligned AI could result in one that benefits a small group at the expense of everyone else.

Earlier this year, I mentioned that the Chinese megacompany Baidu was hard at work on its own AI-powered chatbot that it had called Ernie Bot. Well, earlier today, Baidu showed off Ernie Bot, but did so with prerecorded videos showing off its capabilities. Obviously, prerecorded videos can be fudged a little bit, so that brought some concern. It also didn't launch any sort of public access to the tool. It's keeping access very much limited. And dissatisfaction followed, and Baidu's stock shares dropped as much as ten percent initially, though that leveled off a little bit later, so the company only lost around three billion dollars in value. Yikes.

Baidu seems to be taking a more measured approach to deploying its AI chatbot, which I actually argue is the responsible thing to do. We have seen numerous stories of how GPT has caused problems with premature deployments, such as homophobic jokes on the endless computer-generated Seinfeld episode that's on Twitch,
all the way to creating concern among teachers that their students could be using a chatbot to cheat on assignments, and more. So a more deliberate rollout seems like it's a good idea to me. But then I guess there's this perception that Baidu is trailing behind companies like Microsoft and Google, and there's this fear that it's not being competitive. I think not being competitive in this case also means you're sidestepping some potentially really troubling problems down the line. But I'm not an investor, so there you go. Okay, we've got a lot more stories to cover, but before we get to that, let's take a quick break.

We're back, so let's talk about TikTok. We've got a couple of stories about that. Again, the government of the United Kingdom is following the latest trend of banning TikTok on government owned devices, or at least it's considering that option. By the time you hear this podcast, the matter may have officially been decided, but as I was recording it, it was still something that was being considered, but we were expecting an announcement sometime today. The UK would follow in the footsteps of the United States, Canada, the Netherlands, the European Union in general, and several other countries if it were to do this. And yes, the heart of the matter is the security concern of data potentially filtering to TikTok's Chinese-owned parent company, ByteDance, and by extension, to the Chinese government.

Yesterday, The Wall Street Journal reported that the United States could send TikTok and ByteDance an ultimatum, which is essentially, you must sever all ties between TikTok and ByteDance. ByteDance would have to spin off TikTok as an entirely independent company and not be the parent company anymore, or TikTok risks being banned nationwide. This echoes what former President Donald Trump tried to do a few years ago. He tried to force TikTok to divorce itself from its parent company and become a truly US organization.
During Trump's presidency, the push was for TikTok to sell itself to some larger American company, but this never came to fruition, partly because TikTok argued in court that this directive from the president would violate a law called the Berman amendments, which prevents presidents from using economic pressure to force international communications platforms to do stuff. So essentially, they were saying this is against the law, and now we're seeing a similar push from the Biden administration. I'm not exactly sure how that same law wouldn't apply in this case. Like, I don't know that there's a new argument to be made if it's coming straight from the president.

Anyway, TikTok reps argue that a massive one and a half billion dollar program, meant to secure data in the US and to allow third party security companies a chance to monitor TikTok's operations to look for anything hinky going on, they're saying that's more than enough to solve the worries that TikTok is serving as a threat to national security. And then further, reps for TikTok have said that if ByteDance did divest itself of TikTok, that wouldn't actually prevent the transmission of data from TikTok to China. I'm not entirely sure how all this tracks, but then I haven't been given a glimpse of all the details, so I doubt I would even understand all of it if I did, because this is wrapping into things like finance and trade more than the technology itself. Anyway, it looks like we're heading to an impasse, and I'm not entirely sure where it's going to go from there. But my guess is we're not going to see the pressure on TikTok going away anytime soon.

Hey, y'all remember FTX? You know, the company that used to be the second largest crypto exchange in the world before word got out that the team behind it was practicing some creative accounting that, you know, you might be able to look at from another point of view and call, you know, fraud.
But more bad news has emerged about the company and its high-profile co-founder, Sam Bankman-Fried, aka SBF. So the new folks in charge of FTX, who are mostly going through the process of selling the company off for parts in order to return as much value to investors and customers as possible, have said that they have uncovered transfers that collectively amounted to around two point two billion dollars, and that this money was sent to SBF through various means, primarily through Alameda Research, the hedge fund company that was, you know, the yin to FTX's yang. An additional billion seemed to be pulled from FTX to go to other key members of FTX, and yeah, Alameda Research handled those transfers, according to the new owners of FTX, and that, you know, it's been going on for a while. And I personally consider the report shocking but not surprising. By that, I mean I'm not at all surprised that SBF was trying to pull as much cash out as he could before FTX came crashing down. It is shocking exactly how much money that was, right? Like, to pull out two point two billion dollars through various means as you're trying to salvage as much as you can from a ship that's sinking. Incredible. Anyway, SBF is currently out on bail awaiting his trial, which, according to the current court schedule, will not begin until October this year, because justice is super swift.

The Verge has an article titled The Silicon Valley Bank fallout is just beginning. It's a great article. You should go read it on The Verge, and the article's mostly about how tech companies are trying to figure out where to go forward from here, what's the safest place to bank now that SVB has gone under, because if you remember, SVB was like a financial pillar for the tech industry. So this piece lays out the challenges that tech companies currently face, including how to mitigate risk while ensuring that tasks like making payroll aren't interrupted.
Like you might argue, hey, it'd be really good to make sure you're depositing in different banks, so that way you're not having all of your eggs in one basket, and if that basket goes belly up, you're still all right because you've got eggs in other baskets, right? Except that if you're trying to do things like pay out a large group of employees, having that money dispersed makes it much more difficult to do those kinds of tasks. So that's the sort of challenge that tech companies are looking at.

The piece also mentions that we're probably going to see a change in how startups are funded and how they operate, and part of me thinks that a reevaluation of startup culture is long overdue. I have often worried that the tendency for startups to launch without first creating a firm business plan, instead essentially having a desire to get acquired by some other company, has really led to irresponsible behavior, and that this has gone unchecked for too long. This created too many companies that ultimately produced very little value. Right? You'll hear about a company like Google or Meta or Amazon scooping up some of these companies, and sometimes nothing really seems to come of it, right, and you're just like, well, what good did that startup do apart from make the founders rich? And then you see the founders go off and do the same thing again, where they'll start up a new company, perhaps one that also has very little business plan element to it, also with the hope that they are going to get scooped up by another company, and they do it again and again and again, because coming up with an idea that sounds attractive is way easier than making that idea work.
And if you are good at selling ideas, you can just take a serial approach: create a cool-sounding idea, sell it off to someone else, rinse and repeat, and buy an island somewhere. I'm hoping that this represents a reality check moment where we will see a more thoughtful and careful approach to funding startups. That might be a vain hope, because, you know, historically investors have shown a really strong desire to get in on the thing that's possibly going to take the world by storm, so that you get so rich that you make Scrooge McDuck look like a pauper.

Anyway, the article in The Verge is a good one. Again, it's called The Silicon Valley Bank fallout is just beginning. It also takes time to point out that the collapse of SVB was in large part the fault of some of the venture capitalist investors themselves. Like, you had the leaders of big investment fund groups saying, hey, you should probably pull your money out of SVB, because if you don't, someone else is going to do it, and then you won't be able to get your money, and so they ended up creating the very crisis that they were warning people about. So I think a lot of financial institutions may still view loans to the tech sector as being risky, because the venture capital community have proven themselves to be self destructive. It's hard to trust an industry when you see the tendency to act in self interest to the point where you harm everybody else. Very difficult to put your trust into that group. Okay, I still have a few more stories to cover. Before I get to those, let's take another quick break.

Okay. The Federal Trade Commission, or FTC, has finalized its judgment against Epic Games regarding the company's practice of enticing, or perhaps even fooling, Fortnite players into making in-game purchases through what the FTC calls dark patterns.
So essentially, dark patterns refers to creating an interface that makes it really easy, far too easy most would argue, for a player to make an in-game purchase, perhaps without even knowing or understanding that it is a financial transaction. One of the ways the FTC argued Epic Games did this was that it ended up playing fast and loose with button inputs, so that you might have a button that normally has you back out of a menu option, right? Like, maybe you've had your menu UI set up so that when you hit B, you're backing out of that part of the menu and you go to a larger part. But then in a transaction, maybe you suddenly make B confirm instead of back out, and so people who are used to using B to back out hit B, but now they've just made a payment. And that might not be something you could easily reverse or cancel out of, like maybe there's no confirm step. You just, boom, you've done it.

Another part of the problem is that a lot of the folks who are playing Fortnite are kids, so the FTC argued that Epic Games didn't include enough protections to prevent kids from making in-game purchases without parental consent, so they just started racking up enormous charges on parents' credit cards that were associated with the account. And if a player did go so far as to contact their credit card company to dispute charges, Epic Games would then lock that person's player account so they couldn't play the game anymore. So now the FTC is telling Epic Games to cough up two hundred forty five million dollars as punishment. The FTC plans to use that money to provide refunds to affected players. Plus, Epic will not be allowed to block people who dispute credit card charges anymore, per the terms of this agreement.
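To make that button swap concrete, here is a hypothetical sketch, not Epic's actual code, of the general shape of the dark pattern described above: the same physical button means back out everywhere except on the purchase screen.

```python
# Hypothetical illustration of the button-remapping dark pattern described
# above. This is not Epic's code; the screens and mappings are made up.

def handle_button(screen: str, button: str) -> str:
    # On ordinary menu screens, A confirms and B backs out.
    if screen != "purchase":
        return {"A": "confirm", "B": "back"}.get(button, "ignore")
    # On the purchase screen the mapping quietly flips, so a player
    # mashing B to leave the menu completes the transaction instead.
    return {"A": "back", "B": "confirm"}.get(button, "ignore")

assert handle_button("inventory", "B") == "back"
assert handle_button("purchase", "B") == "confirm"  # the trap
```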
If you think you were affected by Epic Games' practices, in other words, if you feel you were tricked into making a purchase on Fortnite, and you are a US citizen, you can go to the website FTC dot gov slash fortnite, that's f o r t n i t e, and you can fill in a little online form to start the refund process.

South Korea and Samsung announced plans to build a truly enormous semiconductor manufacturing campus this week. It's one that Samsung is going to invest two hundred thirty billion dollars into. The effort will make South Korea much more competitive with Taiwan's TSMC, the semiconductor manufacturing company that's responsible for more than half of all the semiconductors used in advanced electronics today. It's also a move to create a better supply in a world that not so long ago was in a serious crunch due to many factors, the big one being the COVID nineteen pandemic. Samsung was one of several companies hit hard by supply chain issues, and by investing in semiconductor manufacturing facilities within South Korea, there are hopes to head off future problems like what we saw in the recent past. This is not that different from what we're seeing here in the United States. The US government has poured billions of dollars, or at least earmarked billions of dollars, for similar investments here in the US, again to try and alleviate some of these bottleneck points for supply chain issues in the future.

Over in Japan, the US startup Lift Aircraft Incorporated held the first test flight of its Hexa flying vehicle with an actual human being piloting it. So it was the first time it had a piloted test flight. It looks like an oversized quadcopter that has a cockpit in it that can hold a single rider. Now, the plan is to have flying vehicles give rides to passengers in time for the twenty twenty five Osaka Kansai Expo. And from what I read, it is a single-seater vehicle, so I'm guessing that for passenger rides it must be either controlled remotely or possibly autonomous.
I think remote control is far more realistic, but there's no way you would just hand over control of a flying car to some rando and then just say good luck out there. Anyway, this particular vehicle isn't meant for the long haul, because according to Asahi dot com, it can fly for about fifteen minutes, so you don't even get twenty minutes of travel out of this thing. So it's meant for very short distances. It travels at top speeds of around one hundred kilometers per hour, that's about sixty two miles per hour. Japan's Transport Ministry is now hashing out the regulations for Lift Aircraft and the other vendors that are also planning similar services for this particular expo. These are going to be the rules that they'll have to follow, which is good, because this is really one area where I don't want to see regulation trail too far behind the technological innovation. People's lives are involved here, right? Their lives and their safety and their health. I think that it's very good that there is serious work being done on the regulation side.

Richard Branson's satellite launch company Virgin Orbit has effectively closed up shop, at least for now. According to BBC News, Virgin Orbit has stopped all operations and put staff on furlough. This is in the wake of a failed launch attempt that happened off the coast of Ireland earlier this year. And by a launch attempt, I mean an actual attempt to launch a payload into orbit. I believe it was nine satellites in total that were part of this payload. So in early January, Virgin Orbit attempted to launch a LauncherOne rocket, which rides on a specially fitted seven forty seven aircraft. The seven forty seven reaches a certain altitude, then it deploys the launch vehicle, which ignites its engines, and then it goes from there. But something somewhere went wrong.
So at the time, Virgin Orbit said the first stage rocket fired exactly as it was supposed to, it shut down exactly as it was supposed to, and the second stage ignited just like it was meant to. In fact, initially Virgin Orbit said that the payload had even achieved orbit. But by the time you got to about half an hour after launch, they changed their tune. They said the payload failed to reach orbit, and there wasn't much information about why. Later on we heard that apparently an engine had overheated due to a filter becoming dislodged, so that was possibly the reason for the failure to reach orbit.

Now, it's not unusual for young rocket companies to have problems like this, and we indicate that something isn't that difficult by saying, well, it's not rocket science. And the reason we do that is because rocket science is bloody difficult, so it should not come as a surprise that there will be failures. But whether you think of rocket science as being difficult or not, investors want results when you're talking about companies, and if you fail to provide good results, then the investors are more likely to bail on you, and that seems to be what happened. Virgin Orbit needed a real win back in January, and the launch failure hit the company really hard. Ars Technica predicted back in January that Virgin Orbit would face a potential existential crisis as a result of it, and now that crisis seems to have hit. And again, this did not come as a surprise, because it wasn't just Ars Technica predicting that Virgin Orbit was going to be in real trouble as a result of this failure. Multiple analysts had projected that at the rate the company was burning through money, it would be out of business this month.
And while we can't say that Virgin Orbit is really most sincerely dead, it certainly has a rough road ahead of it if it wants to return to solvency.

Finally, NASA and a company called Axiom showed off the stylish new moon suits yesterday. I've done full episodes about the evolution of the space suit, including discussions about the differences between astronaut and cosmonaut suits. While space suits have evolved over time, for the most part changes have been small iterations. But these new suits have a lot of improvements and more significant changes than earlier models. For one thing, they've got a lot more joints in them, as in joints that allow for movement, not, you know, four twenty just blaze, and these allow for greater freedom of movement when you're galumphing across the surface of the Moon. So you can do things like crouch and squat, you know, important stuff when you're playing a first-person shooter. Also important stuff if you're doing things like doing science on the Moon. So yeah, astronauts are going to have a lot more freedom of movement while they're wearing these suits.

Another big change is that previous suits required you to get into a lower half first, so it's like you're putting on a pair of space pants. Then you had the upper half attached to you, and it would attach to the lower half and create a seal. But yeah, the suits were in two pieces, right, a lower and an upper half. These new suits don't do that. Instead, they have an entrance through the back, so you open up the back, you get in, you get zipped up, essentially, like you're secured in by your team, and then you're wearing your own little space onesie that way. It's not little. I don't know why I said little. These are big suits. They also come equipped with some cool stuff like HD cameras, so you can get that awesome high definition point of view shot while someone's walking across the Moon.
They have also improved thermal insulation, so astronauts are going to be able to wander the chilly regions of the Moon's south pole. That's the area that NASA has identified as the potential site for a long term space colony type thing, or a moon base, and it's important to have that kind of insulation if you don't want to freeze your tootsies off. They are officially called the Axiom Extravehicular Mobility Unit, or AxEMU for short. The versions NASA showed off had this dark gray cover on them, but when used by the astronauts, they will be white space suits. And interestingly, NASA will not own these suits. They're not purchasing the suits. Instead, it's kind of like a lunar tux rental. NASA will go to Axiom and rent space suits for missions, starting with the planned lunar landing that right now aims to return people to the Moon's surface by twenty twenty five.

Side note, I remain a little pessimistic about that timeline. I doubt we're going to make twenty twenty five, but I hope I'm wrong. It would be great if I am. I will not be upset at all if we do manage to get back to the Moon by twenty twenty five. I think that's a very exciting prospect. I think it's a very inspiring thing. It gets people, especially kids, really excited about space and science and engineering, and that's always wonderful. Not to mention, we stand a chance to learn more, which is always cool. And to this day, this is true, I find when I look up at the Moon, inevitably I just sit there and marvel at the fact that we put people up there. We put humans on the surface of the Moon. They walked around on the Moon, they played golf on the Moon. Then we were able to get those same people back home to Earth safely. And then Noel Brown goes and says, yeah, but Stanley Kubrick shot the whole thing on a sound stage, and I lose my ever-loving mind, y'all.
This sounds like a joke, but it happened repeatedly this past weekend when I was in Austin, Texas for South by Southwest. I would tell this story about finding the Moon really inspirational, and they'd say, yeah, I was just talking to Noel about conspiracy theories, and it took everything in my body not to lay into Noel. Of course, Noel's the sweetest person in the world, but man, I was going bonkers by the end. All right, enough of all that.

That's the news for Thursday, March sixteenth, twenty twenty three. I hope you are all well. If you have any suggestions for future topics of TechStuff, please reach out to me. You can do that by going over onto Twitter and tweeting at TechStuff HSW. Let me know what you'd like to hear. Or you can download the iHeartRadio app. It's free to download, it's free to use. Navigate over to TechStuff, put that into the little search bar, it'll take you to the TechStuff page. You can click on the little microphone icon and leave a voice message, up to thirty seconds in length, and let me know what you'd like to hear, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.