1 00:00:01,080 --> 00:00:03,400 Speaker 1: My name is Lille Madden and I'm a proud Arrernte 2 00:00:03,640 --> 00:00:08,400 Speaker 1: Bundjalung Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges 3 00:00:08,480 --> 00:00:10,639 Speaker 1: that this podcast is recorded on the lands of the 4 00:00:10,680 --> 00:00:14,240 Speaker 1: Gadigal people and pays respect to all Aboriginal and Torres 5 00:00:14,240 --> 00:00:17,159 Speaker 1: Strait Islander nations. We pay our respects to the 6 00:00:17,200 --> 00:00:19,960 Speaker 1: first peoples of these countries, both past and present. 7 00:00:25,280 --> 00:00:27,920 Speaker 2: Good morning and welcome to The Daily Aus. It's Tuesday, 8 00:00:27,960 --> 00:00:30,680 Speaker 2: the twenty-first of November. I'm Sam. I'm Zara. It's 9 00:00:30,760 --> 00:00:32,720 Speaker 2: nice to be back on the podcast, Zara. It is 10 00:00:32,800 --> 00:00:35,600 Speaker 2: nice to be back after a week away. You and I are traversing 11 00:00:35,640 --> 00:00:36,800 Speaker 2: across the world doing some... 12 00:00:36,800 --> 00:00:39,480 Speaker 3: Interesting things. I look forward to this month being our next 13 00:00:39,479 --> 00:00:42,480 Speaker 3: best month of podcast listeners, now that we also haven't 14 00:00:42,479 --> 00:00:43,199 Speaker 3: been here for a week. 15 00:00:43,680 --> 00:00:46,720 Speaker 2: Yes, the myth of listens going up when you and 16 00:00:46,760 --> 00:00:50,360 Speaker 2: I are not here. Okay, 17 00:00:50,479 --> 00:00:51,840 Speaker 2: what are we going to talk about today? 18 00:00:52,000 --> 00:00:55,880 Speaker 3: So last week one of the biggest names in artificial intelligence, 19 00:00:56,200 --> 00:00:59,880 Speaker 3: arguably the only name I know in artificial intelligence, Sam Altman. 20 00:01:00,360 --> 00:01:03,600 Speaker 3: He was removed as CEO of OpenAI, the tech 21 00:01:03,640 --> 00:01:05,080 Speaker 3: company behind ChatGPT. 22 00:01:05,440 --> 00:01:08,399 Speaker 2: If this technology goes wrong, it can go quite wrong, 23 00:01:09,120 --> 00:01:11,640 Speaker 2: and we want to be vocal about that. We want 24 00:01:11,640 --> 00:01:14,720 Speaker 2: to work with the government to prevent that from happening. Now, 25 00:01:14,720 --> 00:01:17,920 Speaker 2: Altman co-founded the company with Elon Musk. You might have 26 00:01:18,000 --> 00:01:20,560 Speaker 2: heard of him. And news of the board's decision to 27 00:01:20,640 --> 00:01:23,200 Speaker 2: dump him as CEO has taken many in tech and 28 00:01:23,240 --> 00:01:24,440 Speaker 2: business by surprise. 29 00:01:24,840 --> 00:01:28,840 Speaker 3: There are plenty of questions surrounding his departure. So in 30 00:01:28,920 --> 00:01:30,959 Speaker 3: the Deep Dive today, I want to take you through 31 00:01:31,000 --> 00:01:34,440 Speaker 3: what we know about why Altman was ousted. Before we 32 00:01:34,480 --> 00:01:36,839 Speaker 3: get there, though, Sam, what is making headlines? 33 00:01:39,319 --> 00:01:43,039 Speaker 2: The CEO of Optus, Kelly Bayer Rosmarin, has resigned after 34 00:01:43,080 --> 00:01:46,039 Speaker 2: a mass network outage earlier this month, which left ten 35 00:01:46,240 --> 00:01:50,200 Speaker 2: million Aussies without phone services. The outage occurred due to 36 00:01:50,400 --> 00:01:53,480 Speaker 2: a software upgrade, and over two hundred and twenty calls 37 00:01:53,480 --> 00:01:56,800 Speaker 2: to Triple Zero failed to get through. 
During a Senate 38 00:01:56,840 --> 00:02:00,600 Speaker 2: hearing on Friday, Rosmarin denied rumors that she was stepping down, 39 00:02:00,840 --> 00:02:04,760 Speaker 2: but in a statement from Singtel, which is Optus's parent company, yesterday, 40 00:02:05,120 --> 00:02:07,880 Speaker 2: she said, having now had time for some personal reflection, 41 00:02:08,240 --> 00:02:10,600 Speaker 2: I have come to the decision that my resignation is 42 00:02:10,600 --> 00:02:12,760 Speaker 2: in the best interests of Optus moving 43 00:02:12,520 --> 00:02:18,400 Speaker 3: forward. Gender-affirming procedures could be subsidized by Medicare. The 44 00:02:18,440 --> 00:02:22,440 Speaker 3: Medical Services Advisory Committee is considering an application that would 45 00:02:22,480 --> 00:02:28,000 Speaker 3: subsidize gender-affirming surgery procedures, including facial and genital procedures, 46 00:02:28,360 --> 00:02:31,480 Speaker 3: chest surgery, and voice surgery. That's in a bid to 47 00:02:31,520 --> 00:02:35,200 Speaker 3: improve the quality of life of transgender people. The Committee 48 00:02:35,240 --> 00:02:38,280 Speaker 3: is an independent body that assesses and advises the government 49 00:02:38,360 --> 00:02:41,320 Speaker 3: on whether a new medical service should be publicly funded. 50 00:02:42,760 --> 00:02:45,800 Speaker 2: Over three thousand fires have broken out in Brazil this month, 51 00:02:45,919 --> 00:02:49,600 Speaker 2: destroying nearly seven hundred and seventy thousand hectares of the 52 00:02:49,680 --> 00:02:52,840 Speaker 2: largest tropical wetlands in the world. In the same period 53 00:02:52,960 --> 00:02:56,040 Speaker 2: last year, there were fewer than seventy fires in the area. 54 00:02:56,560 --> 00:02:59,079 Speaker 2: Endangered jaguars and more than one hundred and fifty other 55 00:02:59,080 --> 00:03:02,240 Speaker 2: mammal species are at risk from the destruction of habitat 56 00:03:02,280 --> 00:03:03,800 Speaker 2: caused by the blazes. 57 00:03:04,919 --> 00:03:07,880 Speaker 3: And the good news: gases produced by landfill in the 58 00:03:07,919 --> 00:03:11,600 Speaker 3: ACT will be used to power over ten thousand homes. 59 00:03:12,200 --> 00:03:15,600 Speaker 3: The new gas expansion project at the Canberra Landfill Facility 60 00:03:15,639 --> 00:03:19,560 Speaker 3: will reduce carbon emissions by using methane from landfill to 61 00:03:19,600 --> 00:03:22,919 Speaker 3: then generate power. Once it's completed, the site should be 62 00:03:23,000 --> 00:03:30,480 Speaker 3: able to generate fifty thousand megawatt hours of energy. All right, Sam, 63 00:03:31,280 --> 00:03:36,279 Speaker 3: tell me, are you planning to set up a globally 64 00:03:36,440 --> 00:03:39,280 Speaker 3: defining tech company in the future? 65 00:03:39,440 --> 00:03:43,280 Speaker 2: No, a globally defining media company would be nice. But it's 66 00:03:43,360 --> 00:03:45,960 Speaker 2: nice to be talking about Sam again. However, I am 67 00:03:46,080 --> 00:03:48,360 Speaker 2: noticing that every time there is a Sam on the podcast, 68 00:03:48,400 --> 00:03:50,760 Speaker 2: it's not for great reasons. 
69 00:03:51,120 --> 00:03:53,040 Speaker 3: No. And I mean, if this is the first episode 70 00:03:53,080 --> 00:03:56,440 Speaker 3: you've listened to, we did an episode a week before 71 00:03:56,520 --> 00:04:01,000 Speaker 3: last about Sam Bankman-Fried, who was the founder of FTX, 72 00:04:01,040 --> 00:04:04,120 Speaker 3: and he was a tech founder who is now in jail. 73 00:04:05,120 --> 00:04:08,720 Speaker 3: This story is not following the same trajectory, but indeed 74 00:04:08,880 --> 00:04:12,560 Speaker 3: is another story, I guess, of the uber powerful, like 75 00:04:12,880 --> 00:04:17,240 Speaker 3: very well known, very successful founders named Sam, who are 76 00:04:17,279 --> 00:04:18,560 Speaker 3: having a bit of a fall from grace. 77 00:04:18,760 --> 00:04:20,160 Speaker 2: So why don't we start off with a bit more 78 00:04:20,160 --> 00:04:23,120 Speaker 2: information about Sam Altman? What do we need to know 79 00:04:23,160 --> 00:04:24,280 Speaker 2: about him as a founder? 80 00:04:24,600 --> 00:04:27,680 Speaker 3: So, Sam Altman is a thirty-eight-year-old co 81 00:04:27,760 --> 00:04:31,039 Speaker 3: founder of the company that owns ChatGPT. Now that 82 00:04:31,120 --> 00:04:34,599 Speaker 3: parent company is called OpenAI. And though we've only 83 00:04:34,720 --> 00:04:37,560 Speaker 3: really been speaking about it recently, it actually was founded 84 00:04:37,640 --> 00:04:41,200 Speaker 3: in twenty fifteen. I thought I would ask the product 85 00:04:41,240 --> 00:04:45,360 Speaker 3: of Sam Altman's brain, ChatGPT, exactly who Sam Altman is, 86 00:04:45,480 --> 00:04:48,760 Speaker 3: and here's what it said. Sam Altman is a well 87 00:04:48,800 --> 00:04:51,960 Speaker 3: known entrepreneur and investor and has been involved in various 88 00:04:51,960 --> 00:04:56,320 Speaker 3: projects and organizations, including co-chairing OpenAI. So that's 89 00:04:56,400 --> 00:04:59,479 Speaker 3: what the free version of ChatGPT told me. But 90 00:04:59,600 --> 00:05:01,720 Speaker 3: because we don't pay for it, we don't have any 91 00:05:01,760 --> 00:05:05,599 Speaker 3: info past January twenty twenty two, so it doesn't actually 92 00:05:05,640 --> 00:05:09,760 Speaker 3: involve any news of his sacking in that summary. But 93 00:05:10,000 --> 00:05:13,520 Speaker 3: his backstory is, I guess, exactly what we've come to 94 00:05:13,600 --> 00:05:16,320 Speaker 3: expect from a tech whiz. In an interview with The 95 00:05:16,400 --> 00:05:19,840 Speaker 3: New Yorker, he said he disassembled a Macintosh computer when 96 00:05:19,880 --> 00:05:23,160 Speaker 3: he was eight years old. Another Sam trait. The most 97 00:05:23,200 --> 00:05:25,320 Speaker 3: terrifying thing to me is, I actually could see you doing that. 98 00:05:25,440 --> 00:05:27,120 Speaker 2: I would love that. I think of it as like 99 00:05:27,200 --> 00:05:28,159 Speaker 2: really complex Lego. 100 00:05:28,560 --> 00:05:31,520 Speaker 3: Yeah right, what all eight-year-olds are doing there, definitely. 101 00:05:32,320 --> 00:05:34,120 Speaker 3: And that was actually around the same time that he 102 00:05:34,160 --> 00:05:36,000 Speaker 3: started to get into computer programming. 103 00:05:36,240 --> 00:05:36,679 Speaker 2: Very cool. 
104 00:05:36,760 --> 00:05:39,640 Speaker 3: In two thousand and five, when he was at Stanford University, 105 00:05:39,760 --> 00:05:42,320 Speaker 3: Altman turned to his friend and said, wouldn't it be 106 00:05:42,360 --> 00:05:44,599 Speaker 3: great if I could open my mobile phone and see 107 00:05:44,640 --> 00:05:47,080 Speaker 3: a map of where all my friends are? I know 108 00:05:47,160 --> 00:05:50,320 Speaker 3: that that idea doesn't seem so wild now that iPhones 109 00:05:50,360 --> 00:05:52,960 Speaker 3: and other tech we regularly engage with can do that, 110 00:05:53,440 --> 00:05:55,719 Speaker 3: but in the early two thousands that was still a 111 00:05:55,880 --> 00:05:58,719 Speaker 3: long way off. So Altman became inspired by the idea 112 00:05:58,760 --> 00:06:01,080 Speaker 3: of launching an app with some friends, and he ended up 113 00:06:01,160 --> 00:06:04,400 Speaker 3: dropping out of uni to do just that. Altman has 114 00:06:04,440 --> 00:06:08,000 Speaker 3: also got his own cryptocurrency called Worldcoin, which currently has 115 00:06:08,160 --> 00:06:10,160 Speaker 3: millions of users around the world. 116 00:06:10,400 --> 00:06:12,840 Speaker 2: So it does seem like, even from a tender age, 117 00:06:12,880 --> 00:06:15,800 Speaker 2: he was destined for this career in tech. 118 00:06:16,480 --> 00:06:18,920 Speaker 2: How did he end up, though, not just being another 119 00:06:19,160 --> 00:06:22,680 Speaker 2: tech whiz, but actually a really defining AI whiz? 120 00:06:22,839 --> 00:06:26,640 Speaker 3: Well, Altman was involved in what's now called effective 121 00:06:26,760 --> 00:06:29,760 Speaker 3: altruism for many years before he got into AI. 122 00:06:29,880 --> 00:06:30,920 Speaker 2: What do you mean by that? 123 00:06:31,080 --> 00:06:33,719 Speaker 3: Well, in the business world, it's basically all about building 124 00:06:33,800 --> 00:06:37,520 Speaker 3: a company that's focused on improving the welfare of society. 125 00:06:37,920 --> 00:06:41,280 Speaker 3: So investors like Altman wanted to put money into these 126 00:06:41,320 --> 00:06:44,640 Speaker 3: sorts of businesses. And hang in here, because it actually 127 00:06:44,640 --> 00:06:46,320 Speaker 3: explains how we got to the point of a co 128 00:06:46,400 --> 00:06:48,600 Speaker 3: founder being kicked out of his own company. So just 129 00:06:48,640 --> 00:06:50,039 Speaker 3: put a pin in that; we will come back to 130 00:06:50,080 --> 00:06:52,840 Speaker 3: that later. Pin put in. Yeah, okay. But the whole 131 00:06:52,839 --> 00:06:56,320 Speaker 3: point of effective altruism is to keep focused on promoting 132 00:06:56,360 --> 00:07:00,640 Speaker 3: a vision rather than being purely focused on profit. In 133 00:07:00,680 --> 00:07:05,080 Speaker 3: twenty fifteen, Altman launched OpenAI with Elon Musk, who... Yep, 134 00:07:05,160 --> 00:07:08,279 Speaker 3: no background needed. But the company's mission was to, quote, 135 00:07:08,360 --> 00:07:11,240 Speaker 3: advance digital intelligence in the way that is most likely 136 00:07:11,280 --> 00:07:14,440 Speaker 3: to benefit humanity as a whole, unconstrained by a need 137 00:07:14,520 --> 00:07:18,800 Speaker 3: to generate a financial return. Very lofty ambition there, I'd say. 138 00:07:19,400 --> 00:07:22,320 Speaker 3: And at the outset, which I think is really interesting, 139 00:07:22,480 --> 00:07:25,720 Speaker 3: OpenAI was actually listed as a non-profit organization. 
140 00:07:25,920 --> 00:07:28,320 Speaker 2: Wow, so basically a charity pretty much. 141 00:07:28,400 --> 00:07:31,360 Speaker 3: It was initially relying on donations to do its work 142 00:07:31,440 --> 00:07:34,920 Speaker 3: and was governed by the same laws as charities would be. 143 00:07:35,520 --> 00:07:38,440 Speaker 3: About three years later, though, the company changed its structure. 144 00:07:39,120 --> 00:07:42,080 Speaker 3: It kept the nonprofit side of things and then launched 145 00:07:42,120 --> 00:07:45,480 Speaker 3: a subsidiary company called OpenAI Global, where it could 146 00:07:45,520 --> 00:07:49,800 Speaker 3: raise some money and could hire world-leading talent. Because unfortunately, 147 00:07:49,960 --> 00:07:52,600 Speaker 3: Silicon Valley tech bros don't come for free. 148 00:07:52,480 --> 00:07:54,680 Speaker 2: So they basically split off the company and made a 149 00:07:54,720 --> 00:07:57,200 Speaker 2: nonprofit bit and then a for-profit 150 00:07:56,920 --> 00:07:59,960 Speaker 3: bit. Exactly. And they also got huge backing from investors. 151 00:08:00,360 --> 00:08:03,680 Speaker 3: The biggest one that we know of was from Microsoft, 152 00:08:04,040 --> 00:08:07,680 Speaker 3: obviously a tech giant. They announced a big partnership with Open 153 00:08:07,720 --> 00:08:10,360 Speaker 3: AI earlier this year, and that was estimated to be 154 00:08:10,400 --> 00:08:13,000 Speaker 3: worth around ten billion dollars. So what you end up 155 00:08:13,000 --> 00:08:16,400 Speaker 3: with is quite an interesting company, and also a board 156 00:08:16,400 --> 00:08:19,120 Speaker 3: that's made up of six people, none of whom are 157 00:08:19,120 --> 00:08:22,120 Speaker 3: investors in the company, which I'd say is pretty rare. 158 00:08:22,240 --> 00:08:23,040 Speaker 2: Yeah, very rare. 159 00:08:23,160 --> 00:08:26,320 Speaker 3: Yeah. What was also quite unusual was that Altman actually 160 00:08:26,320 --> 00:08:29,520 Speaker 3: admitted he didn't have any shares in his own company. 161 00:08:29,720 --> 00:08:30,800 Speaker 3: I have no equity in OpenAI. 162 00:08:31,120 --> 00:08:33,560 Speaker 2: You need a lawyer or an agent. I'm doing this 163 00:08:33,600 --> 00:08:36,760 Speaker 2: because I love it. Interesting, because he was really passionate 164 00:08:36,800 --> 00:08:39,000 Speaker 2: about that not-for-profit bit, but it was the 165 00:08:39,040 --> 00:08:41,920 Speaker 2: for-profit bit that had the shareholders, because they were 166 00:08:41,920 --> 00:08:43,319 Speaker 2: the ones that put money into the company. 167 00:08:43,520 --> 00:08:47,520 Speaker 3: It's so interesting, and it's certainly unorthodox. Under this new structure, 168 00:08:47,679 --> 00:08:50,800 Speaker 3: Sam Altman was appointed CEO, and he was also made 169 00:08:50,840 --> 00:08:53,760 Speaker 3: one of the six board members. Three of them worked 170 00:08:53,760 --> 00:08:56,120 Speaker 3: for OpenAI, and three of them were directors of 171 00:08:56,160 --> 00:08:58,920 Speaker 3: other companies. And that brings us up to speed on 172 00:08:59,000 --> 00:09:02,679 Speaker 3: OpenAI. That is, until a board meeting last 173 00:09:02,440 --> 00:09:06,040 Speaker 2: week. Really interesting situation. You have this company that's quite 174 00:09:06,120 --> 00:09:10,040 Speaker 2: literally developing something with the whole world watching, and we're 175 00:09:10,080 --> 00:09:11,240 Speaker 2: all tinkering around with it. 
176 00:09:11,280 --> 00:09:14,200 Speaker 3: The only thing people were talking about for, like, the 177 00:09:14,240 --> 00:09:15,480 Speaker 3: first six months of this year. 178 00:09:15,559 --> 00:09:18,520 Speaker 2: It's very rare that a company comes across the world 179 00:09:18,600 --> 00:09:20,959 Speaker 2: and changes the ways that we could work and we 180 00:09:21,000 --> 00:09:23,520 Speaker 2: could live, right? This one did it. But all of 181 00:09:23,559 --> 00:09:26,640 Speaker 2: this has now, for Sam Altman at least, come crumbling down. 182 00:09:27,080 --> 00:09:31,760 Speaker 3: What's happened, basically, is Sam Altman was fired as CEO by 183 00:09:31,760 --> 00:09:35,319 Speaker 3: his own company. So late last week, OpenAI released 184 00:09:35,320 --> 00:09:38,720 Speaker 3: a statement saying that board members had lost confidence in 185 00:09:38,760 --> 00:09:43,320 Speaker 3: Altman because he was not, quote, consistently candid with them. 186 00:09:43,600 --> 00:09:46,160 Speaker 3: I don't fully know what that means, but it clearly 187 00:09:46,160 --> 00:09:49,439 Speaker 3: has something to do with transparency or honesty or something 188 00:09:49,480 --> 00:09:52,559 Speaker 3: of that nature. A statement from the board of directors 189 00:09:52,559 --> 00:09:56,240 Speaker 3: said that OpenAI was grateful for Altman's contributions to 190 00:09:56,280 --> 00:09:59,320 Speaker 3: the founding and the growth of the company. The chair 191 00:09:59,360 --> 00:10:02,160 Speaker 3: of the board, Greg Brockman, was also removed and said 192 00:10:02,200 --> 00:10:05,679 Speaker 3: he was excluded from the vote to oust Altman as CEO. 193 00:10:06,040 --> 00:10:08,800 Speaker 3: A joint statement from Altman and Brockman said they were 194 00:10:08,960 --> 00:10:11,000 Speaker 3: shocked and saddened by the decisions. 195 00:10:11,400 --> 00:10:13,520 Speaker 2: So I have to reiterate, this is a really strange 196 00:10:13,520 --> 00:10:17,120 Speaker 2: situation, to have a founder being sacked by his own 197 00:10:17,160 --> 00:10:20,079 Speaker 2: company, but he doesn't even have shares in his own company. 198 00:10:20,520 --> 00:10:22,280 Speaker 2: It's just a very odd structure. 199 00:10:22,200 --> 00:10:25,120 Speaker 3: It is. But it's not the first time we've seen 200 00:10:25,160 --> 00:10:25,760 Speaker 3: something like this. 201 00:10:25,840 --> 00:10:28,280 Speaker 2: There have been some other high-profile ones, right? Yeah. 202 00:10:28,200 --> 00:10:33,160 Speaker 3: So Steve Jobs was fired by Apple in nineteen eighty 203 00:10:33,160 --> 00:10:35,800 Speaker 3: five after he had a tiff with the company's board 204 00:10:35,800 --> 00:10:38,199 Speaker 3: of directors. I remember when we were writing our book, 205 00:10:38,240 --> 00:10:40,600 Speaker 3: because we have a section on tech, I went really 206 00:10:40,640 --> 00:10:43,520 Speaker 3: deep into this because I was so intrigued by it, because... 207 00:10:43,240 --> 00:10:45,600 Speaker 2: He returned, he got brought back. Yeah. 208 00:10:45,559 --> 00:10:49,960 Speaker 3: And I just can't imagine the egos and what actually 209 00:10:49,960 --> 00:10:53,600 Speaker 3: happens with, like, personality politics, to be kicked out and 210 00:10:53,600 --> 00:10:57,000 Speaker 3: then re-enter. Anyway, that's an aside. So there was Apple, 211 00:10:57,160 --> 00:10:59,400 Speaker 3: and then there was also Eduardo Saverin, who was a 212 00:10:59,440 --> 00:11:02,600 Speaker 3: co-founder of Facebook. 
He had his shares diluted in 213 00:11:02,640 --> 00:11:04,880 Speaker 3: an effort to reduce his power in the company, which 214 00:11:04,880 --> 00:11:08,040 Speaker 3: you may remember from the pivotal scene in The Social Network. 215 00:11:08,200 --> 00:11:09,760 Speaker 2: Do you think we were going to let you parade around 216 00:11:09,760 --> 00:11:11,520 Speaker 2: in your ridiculous suits, pretending you were running this company? 217 00:11:11,520 --> 00:11:15,199 Speaker 1: Sorry, my Prada's at the cleaners, along with my hoodie 218 00:11:15,280 --> 00:11:16,559 Speaker 1: and my fuck-you flip-flops. 219 00:11:16,840 --> 00:11:20,000 Speaker 2: You pretentious douchebag. Security's here. You'll be leaving now. 220 00:11:20,240 --> 00:11:23,439 Speaker 3: And recently, the board of directors of Uber forced co 221 00:11:23,520 --> 00:11:28,120 Speaker 3: founder Travis Kalanick to resign in twenty seventeen after allegations 222 00:11:28,120 --> 00:11:30,199 Speaker 3: of sexual harassment at the company. 223 00:11:30,400 --> 00:11:32,880 Speaker 2: So that's clearly a different situation to what we have 224 00:11:33,000 --> 00:11:36,080 Speaker 2: on our hands here, but nonetheless the same result. Altman 225 00:11:36,320 --> 00:11:39,040 Speaker 2: is gone. What's going to happen with the company now? 226 00:11:39,320 --> 00:11:42,199 Speaker 3: So the board announced an interim CEO. But there has 227 00:11:42,280 --> 00:11:44,720 Speaker 3: been a lot of speculation over the weekend that Altman 228 00:11:44,880 --> 00:11:47,080 Speaker 3: could actually somehow get 229 00:11:46,880 --> 00:11:48,840 Speaker 2: back into the company, Steve Jobs style. 230 00:11:49,000 --> 00:11:51,200 Speaker 3: He posted on X saying, I love the OpenAI 231 00:11:51,280 --> 00:11:54,400 Speaker 3: team so much, and the new CEO shared it with 232 00:11:54,559 --> 00:11:57,000 Speaker 3: a blue love heart emoji. This is truly a sign 233 00:11:57,040 --> 00:12:01,199 Speaker 3: of the times, when, you know, major, kind of world-defining 234 00:12:01,280 --> 00:12:07,280 Speaker 3: companies are having their tiffs through love heart emojis. But there 235 00:12:07,280 --> 00:12:09,840 Speaker 3: are reports in the media that Altman went to Open 236 00:12:09,840 --> 00:12:12,760 Speaker 3: AI headquarters over the weekend to have talks about returning 237 00:12:12,800 --> 00:12:15,120 Speaker 3: as CEO, and he has the support of some big 238 00:12:15,160 --> 00:12:18,000 Speaker 3: investors and senior figures at the company. We don't know 239 00:12:18,040 --> 00:12:20,400 Speaker 3: for sure what the discussions with the board have been about, 240 00:12:20,440 --> 00:12:22,920 Speaker 3: but Altman did leave us a clue on social media. 241 00:12:23,440 --> 00:12:26,520 Speaker 3: He posted a selfie wearing a guest lanyard that read, 242 00:12:26,840 --> 00:12:29,320 Speaker 3: first and last time I ever wear one of these. 243 00:12:29,440 --> 00:12:32,559 Speaker 2: This is very odd. It's all very odd, this situation. 244 00:12:33,400 --> 00:12:34,959 Speaker 3: It's just like a TV show. 245 00:12:35,240 --> 00:12:36,960 Speaker 2: But I just think it will be a TV show. 246 00:12:37,200 --> 00:12:39,720 Speaker 3: But I think, you know, this is an interesting one 247 00:12:40,120 --> 00:12:42,840 Speaker 3: of, you know, we often speak about whether something is 248 00:12:42,920 --> 00:12:46,600 Speaker 3: news or noise at The Daily Aus. Like, is 249 00:12:47,080 --> 00:12:50,360 Speaker 3: talking about this meaningful? 
Is it important that we 250 00:12:50,360 --> 00:12:52,280 Speaker 3: talk about it, or is it just, you know, big tech 251 00:12:52,360 --> 00:12:55,199 Speaker 3: leaders and their egos? And I do think that this 252 00:12:55,280 --> 00:12:59,640 Speaker 3: one especially is really important to discuss, because at least 253 00:12:59,640 --> 00:13:04,040 Speaker 3: in our industry, something like ChatGPT has the power to 254 00:13:04,320 --> 00:13:08,160 Speaker 3: literally change from the bottom up how this industry works. 255 00:13:08,240 --> 00:13:10,680 Speaker 2: I agree. And I think who is leading ChatGPT 256 00:13:11,080 --> 00:13:11,960 Speaker 2: is news. 257 00:13:11,920 --> 00:13:14,840 Speaker 3: Like, whoever's hands it ends up in. Yeah. Like, you know, 258 00:13:15,160 --> 00:13:18,520 Speaker 3: just as we saw Elon Musk's ownership of X really 259 00:13:18,640 --> 00:13:21,720 Speaker 3: changed the game, there is something to be said about 260 00:13:21,720 --> 00:13:26,360 Speaker 3: who is taking the reins of these platforms, and especially 261 00:13:26,360 --> 00:13:29,040 Speaker 3: with such little regulation, because it's such a new thing. 262 00:13:29,440 --> 00:13:32,200 Speaker 3: I do think this is a really interesting story in 263 00:13:32,240 --> 00:13:34,400 Speaker 3: and of itself, but then also, of course, in the 264 00:13:34,440 --> 00:13:38,160 Speaker 3: broader kind of pattern of these big tech leaders just 265 00:13:38,280 --> 00:13:39,439 Speaker 3: falling like dominoes. 266 00:13:39,640 --> 00:13:41,280 Speaker 2: And just a quick update to this story that 267 00:13:41,400 --> 00:13:45,600 Speaker 2: only broke late last night here in Australia: Microsoft has 268 00:13:45,720 --> 00:13:50,360 Speaker 2: hired Sam Altman to lead their advanced AI research team. 269 00:13:50,559 --> 00:13:54,640 Speaker 2: They've actually hired him and Greg Brockman, and it's come 270 00:13:55,040 --> 00:13:59,679 Speaker 2: almost immediately after the pair left OpenAI. So Microsoft's 271 00:13:59,720 --> 00:14:03,599 Speaker 2: chief executive announced the surprise move and basically said that 272 00:14:03,640 --> 00:14:06,880 Speaker 2: they still believe in the products that Sam Altman and 273 00:14:07,280 --> 00:14:10,439 Speaker 2: Greg Brockman have to make and that they want them 274 00:14:10,480 --> 00:14:14,400 Speaker 2: to do it at Microsoft instead. It has this interesting subplot 275 00:14:14,559 --> 00:14:18,880 Speaker 2: of what is happening with the way that these founders 276 00:14:19,480 --> 00:14:22,600 Speaker 2: are setting up their companies, and how do you end 277 00:14:22,680 --> 00:14:25,280 Speaker 2: up in a position where something like that, being ousted, 278 00:14:25,360 --> 00:14:28,840 Speaker 2: is even possible? And surely, after some of the examples 279 00:14:28,840 --> 00:14:32,200 Speaker 2: that we ran through, someone as smart as, you know, he 280 00:14:32,360 --> 00:14:34,040 Speaker 2: was pulling apart computers at eight years old, someone as 281 00:14:34,120 --> 00:14:37,120 Speaker 2: smart as Sam Altman would have put some safeguards in place 282 00:14:37,160 --> 00:14:39,680 Speaker 2: to not get kicked out. But it turns out he didn't. 283 00:14:40,960 --> 00:14:43,800 Speaker 2: That's all for today's episode of The Daily Aus. We'll 284 00:14:43,800 --> 00:14:46,640 Speaker 2: be back again tomorrow. 
If you enjoyed this episode, we'd 285 00:14:46,680 --> 00:14:48,680 Speaker 2: love you to leave a rating on Spotify or even 286 00:14:48,720 --> 00:14:50,680 Speaker 2: a comment on Apple. It really helps us get in front 287 00:14:50,720 --> 00:14:54,200 Speaker 2: of more people and helps us keep our jobs as founders. 288 00:14:54,520 --> 00:15:05,359 Speaker 2: For now Thinking appealed to Don the peace and comment