Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

This season of Smart Talks with IBM is all about new creators: the developers, data scientists, CTOs, and other visionaries creatively applying technology in business to drive change. They use their knowledge and creativity to develop better ways of working, no matter the industry. Join hosts from your favorite Pushkin Industries podcasts as they use their expertise to deepen these conversations, and of course Malcolm Gladwell will guide you through the season as your host and provide his thoughts and analysis along the way. Look out for new episodes of Smart Talks with IBM on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at ibm.com/smarttalks.

Malcolm Gladwell: Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio, and IBM. I'm Malcolm Gladwell. This season, we're talking to new creators: the developers, data scientists, CTOs, and other visionaries who are creatively applying technology in business to drive change. Channeling their knowledge and expertise, they're developing more creative and effective solutions, no matter the industry.

Our guest today is Angela Hood, founder and CEO of ThisWay Global. Angela's mission is to eliminate discrimination in the hiring process. Angela is a serial entrepreneur who saw the potential to use automation technology as a way to combat the human biases that lead to unfair hiring practices and a less diverse, less competitive workforce. On today's show, you'll hear how automation makes it easier than ever to connect businesses with the right candidates, why automation is such a powerful tool to mitigate bias, and how Angela's own experiences with discriminatory hiring inspired her to take action.

Angela spoke with Jacob Goldstein, host of the Pushkin podcast What's Your Problem and former host of NPR's Planet Money.
Jacob has been a business journalist for over a decade, reporting for NPR, The Wall Street Journal, and the Miami Herald, and he is the author of the book Money: The True Story of a Made-Up Thing. Okay, let's get to the interview.

Jacob Goldstein: Can you tell me, just very briefly (we're going to get into it a lot), what is ThisWay Global?

Angela Hood: Our technology is built specifically to match all people to all jobs without bias. And the last part is the hardest part and also the most important.

Jacob Goldstein: And where did the idea for the company come from?

Angela Hood: So I'm a female engineer, and, you know, I'm going out into the workforce after graduating, and I think that, just like everyone else, I can put my name at the top of my resume, submit it to companies, and they'll, you know, entertain me for an interview. And what I found was that, because of the type of engineering role I was looking for, which is in the construction industry, that was not the case. The recruiters and also the hiring managers at these companies would see my name and think, well, I don't think that we want a woman, or we don't think that she really understands the job because it's out in the field. And so a mentor of mine said, why don't you use your initials, which are conveniently A. L. And so I would submit my resume as A. L. Hood, and people thought I was a man. And so then, at the same company, for the same job, I would get interviews. And that was the first moment where I realized there was a lot of bias in the market. It turns out there are a lot more biases, and we work to correct for all of them.

Jacob Goldstein: It's funny, that kind of story shouldn't be shocking, right? Like, I know that I shouldn't be shocked by it, and yet I still kind of am. So clearly there's a tremendous amount of bias in the world, and bias in recruiting.
And, you know, we're familiar with these kinds of stories of human bias, but now there's this new problem, right, which is algorithmic bias. What is that? Tell me about algorithmic bias.

Angela Hood: A lot of algorithms are underpinned by machine learning, and machine learning, very simply, can happen in kind of two different ways. You study what has happened in the past and you try to duplicate that faster and more efficiently; in this context that would be called supervised learning, and that seemed like the logical place for nearly every recruiting technology to start. The problem with that is that there's been so much historical bias that all you would really be doing is capturing that company's, or that hiring manager's or recruiter's, bias and duplicating it really fast, very efficiently. So you would just be expanding bias much, much faster than a human could. The flip side is something that's called unsupervised. This is where you build a system that's essentially a black box. It's doing all types of calculations and making decisions internally, and it's not biased in theory, but you have no idea what it's basing its opinions on, so it can create bizarre results. Also, it's not explainable. And so then you get caught in this catch-22 of: I don't want to do bias at scale, but I need to be explainable. So what do you do? After thirteen failures, we finally figured out a way to do this. The methodology that we finally found that generated results that weren't biased was never allowing the math model, the computer, to see the information that causes bias. So we could not let gender enter into the system, ethnicity could not enter into the system, things like that. And so then the logical question is, okay, well, if you don't allow those pieces of information to come in, how can you then enable qualified people who are also diverse to surface without bias? And the answer is, when you remove these factors, it happens naturally. And we learned this by testing.
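Angela's core idea, never letting the model see the attributes that carry bias, can be made concrete with a minimal sketch. This is illustrative only, not ThisWay Global's actual pipeline: the field names, the toy scoring function, and the sample records below are all assumptions made for the example.

```python
# Minimal illustrative sketch: strip bias-carrying fields before any matching model
# sees the record. Field names and scoring logic are hypothetical, not ThisWay's code.

PROTECTED_FIELDS = {"name", "gender", "ethnicity", "age", "photo_url"}

def redact_candidate(candidate: dict) -> dict:
    """Return a copy of the candidate record with protected fields removed,
    so the scoring step can never condition on them."""
    return {key: value for key, value in candidate.items() if key not in PROTECTED_FIELDS}

def match_score(candidate: dict, job: dict) -> float:
    """Toy skills-overlap score; a real system would use a trained matching model."""
    skills = set(candidate.get("skills", []))
    required = set(job.get("required_skills", []))
    return len(skills & required) / max(len(required), 1)

raw_candidate = {
    "name": "A. L. Hood",
    "gender": "F",
    "skills": ["structural engineering", "site management", "AutoCAD"],
}
job = {"required_skills": ["structural engineering", "AutoCAD"]}

# Redaction happens *before* scoring, so the model never has the chance
# to learn or apply a preference it cannot see.
print(match_score(redact_candidate(raw_candidate), job))  # 1.0
```

The design choice mirrors what Angela describes: rather than training on past hiring decisions, which would replicate historical bias, or trusting an unexplainable black box, the system is simply never given the inputs that bias could attach to.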
Angela Hood: We've had fifteen and a half trillion matching events go through our system, and it's been almost a decade now, and through this we've learned a lot. People are very diverse. If you will just remove your own bias, you'll start seeing them.

Jacob Goldstein: So it's an automated version of what you, as an individual, did before you started the company, when you switched from putting your full name on your applications to just your initials, effectively hiding your gender.

Angela Hood: Yeah, it started with that. What we also learned, though, is that even if you conceal your name, there are words, maybe someone was a waitress in a previous job, and so then the person's like, ah, that's a female, right? So then we had to go one step further. We had to say, now we have to neutralize these gender-specific words inside resumes so that a person cannot look at the document and still suss out ethnicity, gender, and other biasing attributes.

Malcolm Gladwell: It's remarkable that after hiding prejudicial information from the computer, like a candidate's gender or ethnicity, a qualified, diverse workforce assembled naturally as a result. For the overburdened recruiter, that means there are huge advantages to using intelligent automation. That's a win-win. As the conversation continues, Angela explains how IBM's technology enabled her to simplify her customers' hiring processes, and she also sheds some light on how far intelligent automation has come in the past few years.

Jacob Goldstein: How does intelligent automation look different today than it did, say, five years ago?

Angela Hood: It actually works, that's the first thing. The level of innovation that has taken place is absolutely incredible. And here's the thing about it: people have had some negative interactions with things that said they were automated, and now they're like, I don't want to use it. But the level of innovation that has happened is absolutely incredible.
And for them to not try something because they tried something a decade ago and it didn't work is just completely the wrong approach. We're going to see massive innovation over the next five to ten years too, and you don't want to miss that. You don't want to say, oh, I sat on the sidelines because I had a bad experience a decade ago. So I think, if you're anywhere involved in technology or business growth, you need to be part of this. This is your economy, and you play a role in it.

Jacob Goldstein: So what is a digital employee?

Angela Hood: Right. So our partnership with IBM Watson Orchestrate is around the digey, D-I-G-E-Y, which is a digital employee. And I always think of it, honestly, as more of a concierge. You can have all of your job descriptions living inside of Box, for instance, and so there are all the job descriptions and you're like, oh, I need to find someone for this job. Watson goes into Box, grabs the job description, and then sends that into the ThisWay system, and ThisWay automatically surfaces up to three hundred qualified people from diverse organizations. Right, so now the recruiter has not had to figure out where they're going to source these people from. They haven't had to sort out how they're going to reach out to diverse organizations, because we have partners. And so now that part has been taken care of, and then Watson Orchestrate does the next step, which is to send out communication to the candidates that you are interested in, automatically. And then you get to sit and wait for these people to respond back to you with their interest in discussing something with you. Now, all of this has been automated, and essentially what I just described could easily take a person three weeks, to go and identify all the talent. So you take three weeks and you put this down to roughly three or four minutes.
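The digital employee workflow Angela walks through can be pictured as a short orchestration loop: fetch the job description, surface matches, send outreach. The sketch below is an assumption-laden illustration of that shape; the function names are hypothetical placeholders, not actual Watson Orchestrate or ThisWay Global APIs.

```python
# Hypothetical sketch of the workflow described above: fetch a job description,
# surface matched candidates, and send outreach automatically. All function names
# are illustrative stand-ins, not real Watson Orchestrate or ThisWay APIs.

def fetch_job_description(job_id: str) -> str:
    """Stand-in for pulling a job description from document storage such as Box."""
    return f"Job description text for {job_id}"

def match_candidates(description: str, limit: int = 300) -> list[dict]:
    """Stand-in for the matching step that surfaces up to `limit` qualified people."""
    return [{"candidate_id": f"c{i}", "score": 0.9} for i in range(3)]  # toy data

def send_outreach(candidate: dict, description: str) -> None:
    """Stand-in for the automated message sent to each matched candidate."""
    print(f"Reaching out to {candidate['candidate_id']} about: {description}")

def run_recruiting_workflow(job_id: str) -> None:
    # Chains the steps that once took a recruiter weeks into one automated pass.
    description = fetch_job_description(job_id)
    for candidate in match_candidates(description):
        send_outreach(candidate, description)

run_recruiting_workflow("job-42")
```

Chaining the steps this way is what collapses a sourcing effort of roughly three weeks into a few minutes of automated work, leaving the recruiter free for the conversations that follow.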
Angela Hood: It's absolutely incredible, and I think it gives recruiters the time to do what they really want to do, which is talk to people.

Jacob Goldstein: How did you decide that automation was the right tool to fight bias?

Angela Hood: That was a journey, as I think a lot of entrepreneurship and innovation is. When we hire technology, we're hiring technology to do a job for us. So what is the job to be done here? It is to identify qualified talent without bias. So when you start breaking this down, you realize that if humans could do it, we would have already done it. There's been a desire to have this happen for many, many years, and we were not successful at it. And the reason why is that bias is not discrimination. These things get confused all the time. Bias is a product of our survival mechanism. We are always going to survive as humans, and so we need these survival skills. That's part of bias. So we're not going to get rid of it, and it's not a character flaw. Bias is just inherently human, and we are human. And the best purpose that I think technology can serve is the fact that it can do some things that we can't do. We have to be very careful about how we engineer it. Our own technology was engineered with removing bias as the priority. But we can really have technology make us better humans, because it can do things we can't do.

Malcolm Gladwell: Despite the potential to vastly improve the way we hire, most companies still think automation is inaccessible, perhaps a luxury to aspire to in the future. But we live in a time when companies are hungrier than ever to fill positions quickly. Jacob asked Angela what automation can deliver for businesses today and how a company's creativity is linked with its diversity.

Jacob Goldstein: How prevalent is intelligent automation in talent acquisition workflows today?

Angela Hood: So our data says that in the enterprise, roughly seven percent have adopted some level of truly automated technology.
But when you look at the job market in total, like, you know, if you look at the millions of employers we have, less than three percent have adopted automation. These are companies that have a smaller workforce to do a great amount of work. They're recovering from a pandemic, they need help, and they think that automation is expensive, and it's actually the opposite. It's not expensive at all. And so I would encourage businesses that are mid-market and small businesses to embrace technology in a way that they haven't done.

Jacob Goldstein: So, I mean, there's one more piece of what's going on now that seems really interesting in the context of what you do, and that is the incredible demand for workers right now. Right, there's, I don't know, ten million plus job openings, there's the Great Resignation, and so I'm curious how automation is helping both companies and workers through this process.

Angela Hood: There's never been a job market like the one we are living in right now, and so as employers we have to think about how do I attract this talent. The other thing about the volume of jobs that are open: if you just do the simple math, there are two jobs for every one person looking for a job. Okay, so that is astounding to begin with. But of the jobs that we have available in the market, most people inside the talent pool that is actively looking for a job do not have the skill set required to fill those jobs. So now you have to go out and you need to be looking for passive talent. You need to be cultivating a relationship with the people that do have the skills you need. When you go to them, you need to be able to say two things. You need to be able to say, we use the best technology to identify you because you are special, and we really want you to come to work for us. That's number one.
Two, you need to say, when you get here, we're going to help you automate those parts of your job that you've never really enjoyed before, because we want you to be able to dig in in the areas you're passionate about, because you're going to be happier and you're going to have a better work-life balance. That is how you win talent in this market.

Jacob Goldstein: Yeah. What have you heard back from recruiters about this, you know, increased integration of technology?

Angela Hood: So one of the things that I think has been maybe the most surprising is that it's really opened up the communication between hiring managers and recruiters inside the same company. There has long been a silo of hiring managers putting out job descriptions and saying, recruiters, you know, go find people that match this. And then the recruiter needs additional support, because they're getting questions from the candidates, or there are questions around what the real job-specific requirements are, and they have trouble getting those answers from the hiring manager. Hiring managers are very busy and they have their own job to do. So by making this more efficient, you start getting much better interactions across the entire company. And in this current market, companies are truly desperate to find the talent that they need, the people want to be found, and now the technology is there to help make this seamless.

Jacob Goldstein: So that's the automation piece. Let's talk about the diversity piece, sort of, you know, landing here, right. So on the diversity side, how does a diverse workforce help make a business more creative?

Angela Hood: A lot of the big consulting firms have dug in for the last decade and asked, is there really an ROI around diversity? And uniformly the answer has been yes. There are increased profits; a more consistent workforce, meaning people don't want to leave, there's not the same level of attrition when the workforce is more diverse; and better recruiting numbers. So all of that is the outcome.
But I think the key thing to understand is the why behind this. The why is that when you're diverse, you come to solutions, and you come to questions and challenges, from a different perspective. And when you have a diverse workforce that is collaborating and bringing their creativity to the market, and you are using their insight to develop better solutions, you're going to create better solutions. You're going to get those solutions to market faster. You're going to understand the positioning of your value proposition inside the market. All of these things happen with far more clarity when you have a diverse workforce.

Jacob Goldstein: You mentioned earlier that you failed, was it thirteen times? And I'm curious if getting through those failures and working your way to success was a place where you did some creative problem solving.

Angela Hood: I would say that would be an understatement at moments. There are times where, you know, I just say, like, thirteen failures kind of in passing, but there were times when I felt like I was close to breaking as an innovator, and the fear was that there's just no solution for this. The thirteen failures were incredibly gut-wrenching. But I was fortunate, I had very supportive investors, and so we got through it. And I'm very proud of the company we are today because of those failures.

Jacob Goldstein: So just to wrap up, let's talk a little bit about the future. We've done the past, we've done the present. Let's talk a little bit about the future. I mean, how do you think the hiring process will look in the future, whatever, five years, ten years? And in particular, what role will automation, intelligent automation, augmented intelligence, what role will all that play?

Angela Hood: Well, if you look back decades ago, there were people that would work for the same company for ten or twenty years, and that was, you know, not that unusual. Now it's very uncommon, and in the future, I think it will be absolutely rare.
I think we're looking more likely at people that will work for multiple companies. We're seeing that with the rise of the gig economy, and we obviously are seeing that people love to work remote. I know that when we have an active job that goes out into our marketplace, if it is remote and also prioritizes diversity, you will have twenty to thirty times more applicants. So I think that we're going to start seeing companies really investing in those two attributes, trying to keep as many jobs remote as possible, just because it attracts talent that companies are really struggling to find right now. And I think the level of automation is going to continue to increase; it will continue to increase as an investment over the next five to ten years. In twenty years, I think we will all look back and say, why did we all do these crazy parts of our job? Why didn't we automate those? It's because we were waiting for technology like Orchestrate provides.

Jacob Goldstein: Do you have any specific advice for businesses that want to incorporate technology and automation in their business and their work?

Angela Hood: I would say, realize that you use automation every day, you use AI every day. When you're using Google Maps or something like that, when you're using your smartphone, you're accessing this kind of technology as a consumer, as an individual. There's no reason why you should worry about adopting it as a business, and don't feel intimidated by it. You are absolutely ready to use it, and your business is ready to benefit from it. Just don't have that fear. We certainly, as a company, work with companies of all sizes. We have companies that have only five to ten employees, and we have some that have hundreds of thousands of employees. That's the great thing about automation: it doesn't care about the size of your company. It will work for you.

Jacob Goldstein: Angela, it's been fun to talk to you.
Thank you for your time, and congratulations on making it through thirteen. If you really think about that, that's a really impressive level of persistence. Like, I could imagine failing a few times, but I would have given up at nine or something.

Angela Hood: At seven. At seven, I was like, I'm a crazy person.

Malcolm Gladwell: It is vitally important to get hiring right. What could be more essential to an organization's success than deciding which human beings make up that organization? If we let our biases go unchecked, we end up excluding qualified candidates, leaving our workforces less diverse and therefore less competitive because of it. Angela made an interesting point earlier that I want to go back to. She said that bias is not a character flaw, it's a survival instinct, and that the best purpose technology can serve is to make us better humans by doing things for us that we can't. Bias is in human nature and we'll never truly get rid of it, but the first step to minimizing its impact is to acknowledge it's a problem we need help with. Intelligent automation can make hiring more efficient, and when we allow computers to mitigate our biases, better hiring is the result. Sometimes, to build the best team possible, we have to know when to listen to our human instincts and when to set them aside.

On the next episode of Smart Talks with IBM: how to use data creatively in order to solve novel problems. We talk with YouTube content creator and IBM senior data science and AI technical specialist Nicholas Renaud.

Smart Talks with IBM is produced by Matt Romano, David Jaw, Royston Deserve, and Edith Rousselo, with Jacob Goldstein. We're edited by Sophie Crane. Our engineers are Jason Gambrel, Sarah Bragair, and Ben Tolliday. Theme song by Granmascope. Special thanks to Carlie McGlory, Andy Kelly, Kathy Callaghan, and the Eight Bar and IBM teams, as well as the Pushkin marketing team. Smart Talks with IBM is a production of Pushkin Industries and iHeartMedia.
To find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. I'm Malcolm Gladwell. This is a paid advertisement from IBM.