Speaker 1: Heather du Plessis-Allan Drive.
Speaker 2: More and more companies are using AI for recruiting staff: Spark, Kmart, Woolworths. All of these guys are in retail and hospitality, apart from Spark obviously. They all use software by a company called Sapia.ai.
Speaker 2: The program basically does the chat and video interviews and then scores the candidates' written responses. The results can then be tracked by the hiring team. Woolworths alone has used the system to interview fifty-nine thousand people in the past year. Sapia.ai founder and CEO Barb Hyman is with us now. Hey, Barb.
Speaker 1: Hi.
Speaker 2: How does this work? Does it just ask sort of basic questions of the first-round applicants and then screen out the ones who've got no shot?
Speaker 1: Yeah. So what we do is basically do the interview for you. If you think about hiring, you want to meet someone, get to know them, figure out whether they're the right fit for the job. The technology does it for you. It does it in a way that is friendly.
It's not time pressured, and so you see amazing feedback from candidates. Candidates love it, feel less stressed. It's much more inclusive for those with a disability, and it allows you to get to the best candidates really fast, and often getting to the best candidates first means that you win them.
Speaker 2: How does it do the video interviews? I mean, I can understand if you're doing like a text interview, you can sort of screen out like that. But what about a video one? What are you looking for there?
Speaker 1: So in the video there is no AI at all, and we strongly believe in that. Look, I would love video to disappear as a tool to evaluate people, because I think it's very hard to remove bias. You know, most biases are unconscious. The video is there as what we call another data point, another signal, so someone has to watch it and make a decision. A human sees that data, not the AI.
Speaker 2: Does the AI do a video interview that's recorded and then somebody needs to watch it? Is that how it works?
Speaker 1: Yeah, it's not really an AI doing a video. It's just like any video. You might use Zoom to do a video.
There's no AI in the video at all. The AI comes into the chats. So what we're doing is taking language data: how do you respond to a question about how you work with a team, how you work with customers? And what do we learn from that response? Do we learn that you're someone who's great with people, that you're really humble, that you're a fast learner, that you're a good thinker? That's what we're doing, which is what humans do, but we're doing it without all the bias that we humans bring. And that's using science that has been around for decades now, natural language processing, and that's what provides for the accuracy of hiring, so that you hire people who end up staying, not people who come in and leave one week later.
Speaker 2: So being a quick thinker is a positive?
Speaker 1: Well, it depends on what the role requires. So each role has its own custom requirements. If you think about hiring someone for your team, or you're hiring someone for customer service or cabin crew, it's fundamentally different what's important for the role.
So what's important for the role dictates the traits and the qualities that we're looking for. Some jobs require really high critical thinking and amazing communication skills; others don't. And then we're identifying the questions, we're creating the questions, which are always what we call behavioural: they ask you to draw on what you've done in the past, and on that basis you get everyone to share their story of who they are in a way that is obviously a lot more comfortable than what it often is for people facing someone across the table, which is scary for a lot of candidates out there.
Speaker 2: Barb, if being a quick thinker is something that's a positive, and presumably you are required to answer pretty articulately quite fast, that would surely disadvantage some people.
Speaker 1: You're not, you're not. So what's really important is that the chat is not timed. You could take an hour, you could take twenty minutes, you could come back to it five times. We're not testing for quick thinking. What we're testing for is attributes like critical thinking, or are you someone that's brilliant at customer service?
Are you someone that goes above and beyond? Are you someone that has good communication skills? But speed we don't see as in any way correlated with quality of hire. So it is an untimed interview.
Speaker 2: Right. So how do you, though? I mean, one of the things I imagine you're going to struggle to actually quantify is charisma, personality, enthusiasm, tone, which is all the stuff that you get in actual face-to-face interviews. How can you possibly pick up whether somebody's got that?
Speaker 1: Well, we would say that those are heuristics. They are signs that we look for based on our gut, or what we think drives success in a role. There's a lot of conversation about vibe hiring, but is that actually delivering results for your business? You know, what we're trying to solve for is who actually is a fit for the role, who's going to be great and stick around. I mean, what does vibe hire mean? Vibe hire means basically "I like you" and "I think you're a good fit for the brand." It's what we call mirror hiring, and mirror hiring is a recipe for bias.
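The approach Barb describes, scoring free-text chat answers for traits and weighting those traits differently per role, could be sketched very roughly like this. This is a toy illustration only: Sapia.ai's actual models are proprietary, and every trait lexicon, role name, and weight below is invented for demonstration.

```python
# Toy sketch of trait scoring from language data, weighted per role.
# All lexicons, roles, and weights here are made up for illustration;
# this is not Sapia.ai's method, which is proprietary.
import re

# Hypothetical keyword lexicons that hint at each trait.
TRAIT_LEXICONS = {
    "teamwork": {"team", "together", "helped", "we", "collaborate"},
    "customer_focus": {"customer", "listened", "service", "resolved"},
    "learning": {"learned", "new", "practice", "improved", "feedback"},
}

# Role-specific weights: what matters differs per role,
# as in the cabin crew vs. other roles example in the interview.
ROLE_WEIGHTS = {
    "cabin_crew": {"teamwork": 0.3, "customer_focus": 0.5, "learning": 0.2},
    "warehouse": {"teamwork": 0.5, "customer_focus": 0.1, "learning": 0.4},
}

def trait_scores(answer: str) -> dict:
    """Score each trait as the fraction of its lexicon present in the answer."""
    tokens = set(re.findall(r"[a-z']+", answer.lower()))
    return {
        trait: len(tokens & lexicon) / len(lexicon)
        for trait, lexicon in TRAIT_LEXICONS.items()
    }

def role_fit(answer: str, role: str) -> float:
    """Combine trait scores into a single fit score using the role's weights."""
    scores = trait_scores(answer)
    weights = ROLE_WEIGHTS[role]
    return sum(weights[t] * scores[t] for t in weights)

answer = ("We worked as a team and I listened to each customer, "
          "then improved our service after feedback.")
print(round(role_fit(answer, "cabin_crew"), 3))
```

Note that nothing here is timed: the score depends only on what the candidate wrote, not how fast they wrote it, which matches the untimed-chat point above. A production system would use trained NLP models rather than keyword overlap.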
Speaker 2: But Barb, our guts have been trained for flipping millennia.
Speaker 1: But they're leading us astray. If you think about the two most important decisions we make in our life, who do we partner with and where do we work, we get it wrong a lot of the time. And so, you know, how do you de-bias the gut? The gut, and bias, is a wonderful shortcut to make a fast decision, but it doesn't necessarily make it a good decision.
Speaker 2: Right. Well, are you going to roll out some AI for picking husbands and wives?
Speaker 1: You know what, I would be a millionaire from the number of times people have said that to me.
Speaker 2: Really?
Speaker 1: Yeah. Everyone wants to use AI to try and figure out the right partner, not the partner that you pick because, you know, they meet your parents' expectations, or they're a bit of a spunk. You know, how do you actually choose someone who really aligns with your values? You can do that through science.
It's just that we as humans feel really uncomfortable using science, you know, data, anything but the gut, or the gut of our friends and family, to help us on those decisions. There's a lot of human bias really around using a different approach for these decisions.
Speaker 2: Interesting. Barb, thank you very much for talking us through it. Absolutely fascinating. That is Barb Hyman, founder and CEO of Sapia.ai. Do you know what, I bet you there is something out there where you can find a partner. I mean, that's basically what Tinder is, isn't it, where you sort of put in all your important stuff. But it's not smart enough. You've got to go for more than that. You've got to go for values. Like, values to me would be: how often do you wash the sheets? Do you wipe the bench after you've made a coffee and left the crumbs? You know, that kind of stuff. I'm into that. I think that's really important, most of those things.
Speaker 2: If the AI could just solve that stuff, no lies please. It also has to screen out whether the person is lying just to impress you. If you could fix all of that stuff, how many marriages would be saved by the dishwasher being packed properly?
Speaker 1: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.