Speaker 1: First, I want to tell you about my day, and then I'll get Paul Spain on to explain my day. But my day, it's lovely at the moment doing Heather's job, because normally I do a breakfast show on a rock radio station called Gold. You should try it sometime, it's very good. And so I'm normally up early, I don't get to sleep in. At the moment, this week, I am not doing that show, I'm doing this show, and that means I sleep, I have some breakfast, and that's lovely. I read the newspaper, start thinking about what I'm going to say, go to the gym. So I went to the gym at eight o'clock today and did a good hour and a half workout. And as I'm coming home, hot and sweaty and getting ready to get into it, have a shower and start writing stuff, I get an alert saying Microsoft Outlook is down, it's a global cyber attack, it's involving Microsoft, emails are down, and we are working with Microsoft on the problem. This caused me great consternation, because how I write my stuff is in Microsoft Word through Outlook, and then I share it with work on Google. So if I'm writing at home and I can't email myself and all that sort of thing, and I can't use the Google Home to actually mirror the stuff back, how might I get these bons mots to myself? So I actually told my producer, who phoned up while we were talking about it, I said, I'm going to stick it on my memory stick. You mean you've got a memory stick? And I've got a memory stick. You're not supposed to bring memory sticks in, because you plug them into computers and they bring in viruses, or virae, whatever. Anyway, I got a memory stick and I was thinking about that, and then we figured out how to get service back. So joining me now is Paul Spain, tech commentator and Gorilla Technology CEO. Hello, Paul.

Speaker 2: Hello, how are we doing, Andrew?

Speaker 1: Well, we're good now that we're back online. I'm getting a bit tired of these Microsoft cyber attacks. What's happening here?

Speaker 2: Well, what we know about today is that it was largely New Zealand that was impacted, so this one wasn't so much of a global issue, it was very, very focused on New Zealand. And it appears as though what we're dealing with is something called a distributed denial of service attack, which, if you want to think of it simply, is a little bit like when we have a protest that floods the motorway with tractors or trucks or some other such thing. You've got an adversary who is basically trying to cause problems for Microsoft and Microsoft customers by flooding certain areas of the internet with too much data, or traffic as we call it, and that sometimes brings things to a standstill and sometimes just slows things down. And that's really what's happened today. Now, a lot of people haven't felt the pain, but other people have.

Speaker 1: Like me. Who wants to do this sort of thing to us? I mean, we're New Zealand, we've got no enemies in the world.

Speaker 2: Well, there are always folks out to, you know, have an impact. Sometimes they want to cause a financial impact, and then they can put pressure on an organisation such as Microsoft and say, look, we're going to stop your operation in New Zealand unless you front up and drop X amount of funds in our direction, usually in cryptocurrency. So basically it's used from the perspective of a ransom or bribery type situation. Or it can be politically driven, it can be a country, or a particular cause that someone's trying to push to make a little bit of a point. So, you know, these are the situations we see.

Speaker 1: But today's outage, we don't know who did it.

Speaker 2: No, I haven't seen anything to indicate that at this point. And there are often reasons why an organisation might know but might not disclose that publicly at a given point in time. But, you know, maybe they'll share further information down the track.

Speaker 1: Was there any data taken?

Speaker 2: With this type of attack, generally the attacker has no access to anybody's data. As I say, it's like flooding a motorway. It's not as though they're inside your systems, inside your building and so on, it's more that they're trying to block you from gaining access, rather than them having any level of access themselves.

Speaker 1: And Microsoft, were they helpful, or were they a hindrance?

Speaker 2: Look, I think we expect organisations like Microsoft to be really well geared up to deal with this type of attack, and in this case, yeah, maybe they didn't quite get it right in terms of their ability to block this and deal with it correctly. There's certainly some culpability there, I think, to a degree. But depending on the scale and the size of this type of attack, there's probably a point where a company can completely block it, and there's a point where those sorts of attackers can have a level of impact. We've seen other New Zealand organisations hit with this type of attack, the NZX in the past, and they maybe didn't have some things quite right, so they made some changes, and eventually they got through that and came back to full operation.
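
To make Paul's motorway analogy a bit more concrete, here is a small illustrative sketch, not something from the interview, with made-up capacity and traffic numbers, of how the flood of junk requests in a distributed denial of service attack crowds out legitimate users even though the attacker never gets inside anyone's systems:

```python
import random

# Made-up figure: how many requests the server can handle per second.
SERVER_CAPACITY = 100

def simulate_second(legit_requests: int, attack_requests: int) -> float:
    """Return the fraction of legitimate requests that get served in one second."""
    total = legit_requests + attack_requests
    if total <= SERVER_CAPACITY:
        return 1.0  # no congestion, everyone gets through
    # The server can only serve a random subset of the incoming requests,
    # so real users have to compete with the flood of junk traffic.
    served = random.sample(range(total), SERVER_CAPACITY)
    legit_served = sum(1 for r in served if r < legit_requests)
    return legit_served / legit_requests

if __name__ == "__main__":
    for attack in (0, 500, 5000):
        ok = simulate_second(legit_requests=80, attack_requests=attack)
        print(f"attack traffic {attack:>5} req/s -> {ok:.0%} of real users get through")
```

The data itself is untouched; the attack only consumes the capacity that legitimate traffic needs, which is why recovery is mostly about filtering or absorbing the flood rather than restoring anything.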

Speaker 1: So Paul, one last question. After CrowdStrike, and after realising how many people use Outlook and Teams, are we, in fact, in the business world generally, relying on too small a number of companies to provide our services tech-wise? So it's easy to take out a whole country, it's easy to almost take out a whole world.

Speaker 2: In simple terms, you know, I would say yes. I wish there was more competition to these biggest global providers. And so if you look at Microsoft 365, which is what the very large majority of organisations in New Zealand use for their core technology, email and document editing and storage and spreadsheets and so on, they're very, very dominant. Google have a very small slice of that market here. It would be great if Google and others were more competitive and it was split a little bit more evenly, but at the moment Microsoft's offerings are generally, probably, head and shoulders above the competition, and so therefore they hold the lion's share of the market. It would be much easier if that wasn't the case, but it's where we're at right now.

Speaker 1: Absolutely. Paul, thank you so much. That was Paul Spain from Gorilla Technology, CEO and tech commentator.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.