AI can add bias to hiring practices: One company found another way


After anglicizing his name, the founder of Knockri got a job. So he created a solution to remove bias from artificial intelligence in hiring.

TechRepublic's Karen Roby spoke with Jahanzaib Ansari, co-founder and CEO of Knockri, a behavioral skills assessment platform, about unconscious bias in artificial intelligence. The following is an edited transcript of their conversation.

Karen Roby: I think what makes this really interesting and why I wanted to talk to you, Jahanzaib, is because your desire to create this company was rooted in your own personal story.

SEE: Hiring Kit: Video Game Programmer (TechRepublic Premium)

Jahanzaib Ansari: I was actually applying to jobs and I wouldn't hear back from employers. I have a long, ethnic name, which is Jahanzaib, and so my co-founder, Maaz, said, "Why don't you just anglicize it?" We went through variations of it, Jacob, Jordan, Jason, and literally in four to six weeks, I got a job. And so with that experience, we just felt like there are so many people being overlooked, and that there had to be a better solution.

Karen Roby: Suffice it to say, you certainly have a passion for this work.

Jahanzaib Ansari: Making sure that each and every single person has a fair shot and a fair opportunity is something that deeply resonates with me. Going through that, being broke, and being judged on your name, which has no correlation with success in the job role, I just felt that something had to be done, and asked how we could do this on a massive scale. That is essentially when we created Knockri. We got together with my third co-founder, whose name is Faisal Ahmed, and he's a machine learning scientist. And we got together with an IO psychologist, which is an industrial-organizational psychologist, and we're extremely science- and evidence-based.

Karen Roby: When we talk about the bias that exists, Jahanzaib, how much is there? How big of a problem is this?

Jahanzaib Ansari: Unfortunately, it's been a systemic problem, because it's been going on for so long now. All of us have certain biases that we've grown up with, and those unfortunately come out a lot of the time in interactions such as a job hiring process. For example, if I'm hiring somebody who went to the same school as me, and I favor that over the skills and abilities a candidate can actually bring, that causes a lot of these problems. And that's just one of them: trying to hire somebody who looks like you, who maybe talks like you, and who went to the same school. That's why a lot of organizations unfortunately have this very bro-culture, and it further creates gender and racial disparities in the workforce.

Karen Roby: Expand just a little bit on Knockri, the work that you guys are doing. And how this work is helping to eliminate bias in AI.

SEE: Digital transformation: A CXO's guide (free PDF) (TechRepublic)

Jahanzaib Ansari: Essentially, what we've built is an asynchronous behavioral video assessment that helps organizations reduce bias and shortlist the best-fit candidates to interview. A lot of organizations out there have a problem with gender and racial disparity, and they have a problem screening thousands of candidates effectively and in a scientific manner. So what we've done is build a system that is completely void of human biases and relies solely on a candidate's skills. What this means is that a lot of other vendors out there train their AI models on historical data, on historical learnings from the organization, and that can open a whole can of worms.

I'm not sure if you've heard about the Amazon story, but they had created a resume-screening technology and unfortunately trained it on historical data from the organization, which essentially just kicked out female candidates because a lot of the hiring managers had mostly hired males. So if you train your AI technology on historical data that already reflects bias, you create a perpetual problem. What we've done is create technology that objectively identifies foundational skills in a candidate's interview response, and it's not trained on human interview ratings or on performance metrics. Rather, it is built to only identify specific skills within a candidate's interview transcript by focusing purely on the behavior.

And the way that we do this is by simply extracting the candidate's speech and converting it into text, and that's it. The highest predictor of success in the job role is making sure that the behaviors and skills of the candidate actually align with the key indicators of success for that role. And so we have taken a very scientific approach to this.
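The pipeline Ansari describes, transcribing the spoken answer and then looking for evidence of specific skills in the transcript rather than comparing it to past hiring decisions, can be illustrated with a deliberately simplified sketch. The code below is hypothetical and is not Knockri's system: it assumes the open-source Whisper model for speech-to-text, and the skill indicator phrases and counting rule are placeholder stand-ins for whatever a validated assessment would actually use.

```python
# Illustrative sketch only: a toy version of "transcribe, then score skills
# from the transcript." The skill phrases and scoring below are hypothetical
# placeholders, not a validated behavioral assessment.

import re

import whisper  # open-source speech-to-text (pip install openai-whisper)

# Hypothetical behavioral-skill indicators to look for in the transcript.
SKILL_INDICATORS = {
    "collaboration": ["worked with", "our team", "we decided", "partnered"],
    "problem_solving": ["identified the issue", "root cause", "trade-off"],
    "ownership": ["i was responsible", "i led", "followed through"],
}


def transcribe(audio_path: str) -> str:
    """Convert a recorded interview answer to lowercase text."""
    model = whisper.load_model("base")
    return model.transcribe(audio_path)["text"].lower()


def score_skills(transcript: str) -> dict:
    """Count how often each skill's indicator phrases appear in the transcript."""
    scores = {}
    for skill, phrases in SKILL_INDICATORS.items():
        scores[skill] = sum(len(re.findall(re.escape(p), transcript)) for p in phrases)
    return scores


if __name__ == "__main__":
    text = transcribe("candidate_answer.wav")  # hypothetical file path
    print(score_skills(text))
```

The point of the sketch is the shape of the approach: nothing in it looks at a candidate's name, photo, or the company's past hiring outcomes; only the words of the answer are scored.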

Karen Roby: OK. Just looking down the road, let's say two years from now, where do you see AI and will bias in AI be a thing of the past by then, do you think?

Jahanzaib Ansari: What I'm going to say is that I think we have definitely made tremendous progress. When we initially started off a couple of years ago, we saw organizations move from a state of fear of AI technology, to educating themselves, and now finally to embracing it. What we're seeing now is that there needs to be a standard of regulation. Knockri itself is working with several bodies to make sure that our technology is not biased. We are going through an algorithmic audit at the moment, because we would like to set that gold standard and make sure that each and every company, in addition to the great results we have provided them, has full faith in our technology algorithmically. And I feel like a lot of companies are going to request this. It's going to be similar to an ISO certification, and that is what we're seeing in the market currently.

Karen Roby: How did you guys come up with the name Knockri? What is the meaning behind it?

Jahanzaib Ansari: The word Knockri actually means "job" to about a billion people, across three different languages: Urdu, the language of Pakistan; Hindi, spoken in India; and Punjabi, spoken in both India and Pakistan. It means the word job, and it also evokes knocking on the door of opportunity. That's how we came up with it.
