Speaker 1: Alrighty, and this is the Daily... This is the Daily OS. Oh, now it makes sense.

Speaker 2: Good morning and welcome to the Daily OS. It's Monday, the sixteenth of March. I'm Sam Koslowski.

Speaker 1: I'm Billi FitzSimons.

Speaker 2: Australia's biggest tech companies have cut thousands of jobs in a matter of weeks, and they're all pointing to the same reason, and that's AI. But experts say the picture is slightly more complicated than the headlines, which often invoke fear in most of us across almost any line of work. So today I thought Billi and I would break down all the latest developments.

Speaker 1: Sam.

Speaker 3: Before we get into the developments, I want to set up that at the Daily OS you are the go-to person when it comes to AI. You obsess over it more than anyone I know, and I'm interested that in the intro you said it invokes fear in most of us, because you tend to have quite a positive view in terms of what it could do for the Daily OS and also other companies.

Speaker 2: I think it has a lot of positives, even for somebody who wants to get on top of life admin, which is everyone. But I have felt over the last twelve months especially that I'm in the minority here. I do feel like, overall, there are more conversations about this as a really scary idea than there are about the opportunities. And I completely understand that. But I think it's a really important part of our media news cycle to at least acknowledge and address, when we're reading stories like the one we're going to talk about today about job cuts, that it's really serious and there's a human face to it. But yes, I'm glad you acknowledge my optimism, because it is important in the framing of all of this.

Speaker 3: Yes, and I think where the fear can come from is just the speed at which everything is changing.

Speaker 2: Totally.
Speaker 3: Let's talk about the job cuts that major tech companies have announced in Australia but also around the world. I want to start with one of the biggest companies in Australia, Atlassian. They had news last week that they were cutting quite a significant portion of their workforce, about ten percent. Is that right?

Speaker 2: Ten percent of the global workforce, but one in three of their Australian workforce, so we're talking here about sixteen hundred people around the world. Atlassian is headquartered in Sydney. It's a global tech company. The two founders, Mike Cannon-Brookes and Scott Farquhar, are the two wealthiest tech entrepreneurs in Australia, maybe tied neck and neck with the folks at Canva. So this is a massive tech company. It's listed in America. And yeah, they sacked sixteen hundred people, four hundred and eighty jobs in Australia. The CEO, Mike Cannon-Brookes, said in a video message to staff, I know there are real consequences for everybody, their families, their career plans, their friendships. But he was very direct about the reason why, and it was the increasing adoption of artificial intelligence across their business.

Speaker 3: One thing, I read his statement, and he did seem to have more nuance in it, in terms of saying that it wasn't that AI was necessarily replacing the jobs, and these are his words, but that the jobs are changing.

Speaker 2: Yeah, I mean, in the same announcement he also acknowledged that they are putting out about one hundred and sixty new jobs on the market.

Speaker 1: Oh, I missed that.

Speaker 2: Yeah, so there's this kind of double-edged sword here, and I wanted to kind of thread this throughout: there are opportunities for job creation, according to experts, as well. Again, that might be my optimism coming through. We're going to try and keep this as balanced as possible for something so close to all of our lives. But yeah, it's a complicated one.
But I mean, reading some of the LinkedIn statements from people who had just lost their jobs, we're talking about one worker who was two hundred days into their maternity leave. Then we've got another Sydney-based software engineer who had been at Atlassian for sixteen years, and he got let go via an email. So there was a lot of pain within the Australian tech community, and a very human cost.

Speaker 3: And that human cost of people losing their jobs as the skills required in jobs change, that's not specific to Atlassian. That is something that we have seen across different companies across Australia and also overseas.

Speaker 2: Yeah, I mean, it's definitely not unique. But what is different about where we are at this point in time is the frequency of these announcements. I mean, two weeks ago WiseTech Global, which is Australia's largest locally listed tech company, announced it would cut around two thousand jobs over the next two years. That's more than a quarter of its global workforce. Block, the company that owns Afterpay, announced it was cutting nearly half its workforce, also citing AI. But it's not just the tech companies. I mean, any company that's using technology, which is pretty much every large corporation, in all of their annual reports they're foreshadowing job losses in the future: a restructured workforce, reskilling, a lot of those sorts of words. So we're talking about banks, we're talking about medical companies, we're talking about construction companies. Everything is being impacted here.

Speaker 3: I feel like just over the past two weeks it has really dawned on me just how much jobs across every industry are changing. And I have an example, because someone brought this up with me the other night at the pub, saying that it's changing for models. And they brought up a very popular fashion brand in Australia, and they showed me a photo of this beautiful new campaign that they have, and they said that model doesn't exist, that model is AI.
And you genuinely never would have been able to tell. And when I looked at the comments, it was somehow known that this was AI. A few people did know, and the fashion brand actually responded to comments saying, yes, this isn't a real model, it's an image we have produced with an AI artist. And that just goes to show how the job market is completely changing, and the skills required for these things, like where once you needed to be a model, you now need an AI artist. That's just one example of one person, but it was so interesting.

Speaker 2: I cannot think of an industry where I have personally had a discussion with somebody in that industry who says, I'm completely unaffected. Yes, I mean, even those intensely service-based industries, people working in hospitals who are hands-on with patients in long recoveries, even they are using AI for writing better patient notes or speeding up the admin that they have to do. So, you know, no one really can say that they are not at all going to be impacted by this.

Speaker 3: And that's the thing: we know it's going to impact everyone, but there are some people who will be impacted more than others. What do we know about who will be impacted the most?

Speaker 2: So I think the most comprehensive analysis of this really big question comes from Jobs and Skills Australia, the federal department, and they released a major report on this late last year. It found that about seventy-nine percent of Australians, so four in five of us, are in jobs with low automation exposure, and that means AI is unlikely to fully replace what they do. But, and this is the important bit, around eighty-seven percent of workers are in jobs where AI could materially reshape how the work gets done. So I think that distinction really matters a lot. There is a distinction here between AI fully replacing your job and AI changing your job. And the language the agency is using is automation and augmentation.
Speaker 2: So automation just means the machine is doing the task instead of you, but augmentation means the machine is helping you do your task better and faster. And so their finding is that the augmentation, AI changing how you work, that's more likely to be felt by everybody, rather than, I don't have a job tomorrow because AI is here today.

Speaker 3: Although that is something that, as we have seen in the past month, is happening to a lot of people.

Speaker 2: Millions of workers are going to have to adapt to that scenario, for sure.

Speaker 3: I mean, just to give an example of the automation, I don't know much about coding, but what I do know is that even at the Daily OS, we used to have someone who would write all of the code for our website. I now know that there is something called vibe coding, where basically you just tell AI what you want to create, in just normal plain English, and it does all of the code for you.

Speaker 2: Yeah, and just to clarify, he left because he got a way better job because of AI. Oh, yes, you're right. I mean, I'm vibe coding a lot. And just this week I looked at a software program that we spend about eight hundred dollars a month on, and I built it, I built it myself. And so the questions that remain about what I built, and this is what the technology community would ask, are: is it secure to put our data through? I have no idea about online cybersecurity.

Speaker 1: Which is why we have not rolled it out, exactly.

Speaker 2: You know, is it sustainable, and will it crash in a week? Eventually, I would say probably yes. And so there is still a long way to go before we just kind of get rid of any sort of software skills. But I'm pretty comfortable with what I've built. So it's a really interesting one. I mean, I'm a nerd in this space, as we have acknowledged, but it's an interesting change to the workforce.
Speaker 3: Sam, we've kind of flagged it, but I do want to have a bit more of a conversation about how the Daily OS is approaching this. But first, here is a quick message from today's sponsor.

Speaker 3: Sam, I do want to get to TDA, but first, before we came in here, you mentioned that the Jobs and Skills Australia report, is that the right name? Yeah, the Jobs and Skills Australia report said that there was a difference between genders in how we will all be impacted. What did it say?

Speaker 2: So when it went through and identified the careers that are most likely to be first replaced by AI, they were talking about things like keyboard operators, telemarketers, call center workers, accounting clerks, receptionists. And because there's an overrepresentation of women in those careers, the report then said that, you know, thirty-six percent of women in the workforce fall into this disrupted category compared to twenty-six percent of men, and so they recommended that policymakers and employers need to pay close attention to this. And I think overlaying some of, you know, the socioeconomic barriers that are in place, language barriers, the ability to access good technology. A good computer means that you're going to be able to build with AI a lot faster than if you have a bad computer, and there's a big difference in price in all of that. So I think overlaying some of those broader factors that exist in society around these findings is really important.

Speaker 3: Can I just pick up on something you said? You said you need a better computer in order to build with AI. Do you need to build with AI in order to implement AI into your workflow?

Speaker 2: No, I think build was probably a bad choice of word there. For me, it's about using AI at a really large scale.
And what I mean by that is, let's say that you're a call center worker and your boss has said, we're going to send you a file via some AI program that has the phone numbers of one hundred thousand people that you need to call. Your computer needs to handle a file with one hundred thousand rows in an Excel or Google spreadsheet; that's a big file. And so because you're expected to get through that work faster in a call center, because of some AI aspect to it all, you have to handle larger packages of information. So it's not necessarily building something. But if you're trying to generate an image because your workplace has told you that they're not going to hire a model, as you said before, or a photographer, then generating an image will take a lot more brainpower from your computer. So there's all of those kinds of things.

Speaker 3: So if someone's listening to this and is thinking about whether or not their job is safe, where do you think this conversation leaves them, or leaves all of us? Because that's something that we've all thought about, right?

Speaker 2: I think we're going to have to keep checking in with each other to see how this conversation evolves. You said earlier that the thing that struck you is just how fast this is moving. It struck me as well. I mean, things are moving at a wild pace. But at the moment, I think there's kind of no core consensus on how safe we all are. There seem to be two big agreed ideas. One is that human skills really matter, and so things like decision-making, empathy, communication, the ability to have good conversations in person, those skills are actually going to get more important. And so one thing you can do to try and make sure that you're a worthwhile employee to keep around in the future is invest in those skills and get better at your human connections.
And the other really agreed-upon core consensus is that the workers who are the most at risk are the ones who are waiting for their jobs to be disrupted, and who are not learning, or at least getting exposed to, the tools that are out there. And there was this fantastic line from an Australian innovation expert who said it only takes thirty hours to become a very good AI operator, and that was really reassuring for me, because I don't have a coding background at all. I've never written any sort of code. I'm not naturally good at using this sort of software. And this isn't about getting a new university degree. This is about watching some tutorials online, and so the barriers to entry are quite low when you think about how you can actually just at least get exposed to these kinds of tools. Again, the optimist, I'm sorry, and I know that I might get some unpleasant comments from listeners about the way that I'm framing things, but I'm trying to balance all of this out. The truth is that this kind of wave is here.

Speaker 3: I think one thing that I want to talk about is, you know, we're talking about other workplaces, but we are currently sitting in a workplace, the Daily OS, and these are conversations that we're obviously having, not in the podcast studio, off mic, about how we approach it. And, you know, the lower barrier to entry is one thing, but there's also the hesitancy around, is this ethical to use? Especially for journalism, where, you know, your voice is really important and there are very clear processes about what you should do in order to produce quality journalism, and the question of whether or not AI can do that. I think there's two things there. It's not just, can I do this or can I not? It's also, do I want to do this, and do I want my work to go down this path? And I mean, we're talking about difficult conversations that other workplaces are having and other companies are having.
But I think it's important to acknowledge that we are also having these conversations internally. We've had many conversations, disagreements, about what we should do, and I think the clear thing is that no one knows the answer.

Speaker 2: No. And I mean, another aspect that you haven't mentioned is how much we've talked about the environmental impact of this changing strategy, of different ways of working. So this is incredibly complex. And I mean, many of the lines that you read from workers before the internet was around have a similar sentiment behind them, which is that as humans, we need stability, we need to be able to see what's coming up, and there are these particular moments in time where it's harder to do so than others, whether that's a pandemic or a technological innovation that we kind of didn't see coming at the pace that it has. I think that it's perfectly reasonable for everybody to be thinking not just, how do I get on top of it, but do I actually want this, and do I actually want to have my work changed in this way? It's a difficult one to answer, because then you have to overlay the fact that you need to make a buck and you need to stay employed. And if your boss Sam says, we'd like you to start using this tool, then it's lucky you have a Billi to say, well, hold on, guys. And that's what makes this podcast great, hopefully, is that we can have some disagreements about all of this. We're going to make some glorious mistakes, I'm sure, but it's here. And I think at one level I'm quite a realist, in that I'm raring to go when I am put in a scenario and told, this is the state of play now.

Speaker 3: And just to be clear, the Daily OS is not looking at replacing anyone or anything like that. That has never been our approach, and we're a small enough company that that is not required. But it's about, like we have talked about, how do we change our skill set?
Speaker 1: It's even something as simple as proofreading.

Speaker 3: Is that something now that, you know, we don't need the editors to be spending time on, and instead they can work with journalists on the best angle and the best way to approach a story? Something like proofreading doesn't need to happen through a person as much.

Speaker 2: I know you would never say something like this, but I've been trying to kind of prick my ears up when I hear people talking about things they don't like doing, and I'm trying to work out, okay, you've said you don't like this particular task, let me figure out with you a way that perhaps AI could be helpful there.

Speaker 1: You're saying I would never say I don't like doing a job.

Speaker 3: Billi, you love your job. Sam, thank you for explaining all of that.

Speaker 1: Thanks, really, for having such an open conversation.

Speaker 3: Of course. And thank you so much for listening to this episode of The Daily OS. We'll be back later today with the afternoon headlines, but until then, we hope you have a good day.

Speaker 2: My name is Lily Maddon and I'm a proud Arrernte, Bundjalung, Kalkadoon woman from Gadigal Country.

Speaker 1: The Daily OS acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander nations. We pay our respects to the first peoples of these countries, both past and present.