Dee Davis (00:00)
Hi and welcome to Management Under Construction. I'm Dee Davis.
Brad Wyant (00:04)
and I'm Brad Wyant.
Dee Davis (00:04)
And today we are going to be talking about hiring discrimination. Ooh, that is such an icky, sticky phrase, but there's hiring discrimination going on in business everywhere. And there are even some lawsuits on this topic that we're going to go over today. Since the beginning of humanity, there's been discrimination, whether we intend it or we don't.
This is an everlasting topic for human beings. Today we want to focus on the unintentional kind of discrimination, not the kind where people teach hate and all the terrible things. We don't want to focus on that; that is a very different topic. We're going to talk about unintentional biases. They're based on life experiences, examples, and things we've been taught and experienced throughout our lives.
So why do we have bias? It's actually one of the ways that we as humans minimize conflict. We did an episode not too long ago about resolving conflicts and hard conversations. People don't like conflict. We want to remove conflict as much as possible. It's built into who we are as human beings. And so if everyone around us thinks like us and has similar experiences, there's just less conflict.
That's the basic nature of why we do these kinds of things. But we have to intentionally think about these things and try to remove those biases and not let those biases run our lives, especially in the business world.
The result is that it can look like people of the same color, gender, age group, or whatever are all working in the same company. It's also known as racial, gender, or age discrimination. How have you encountered this in your personal or professional life, Brad?
Brad Wyant (01:52)
Well, I think we've talked about this on other podcasts. It ends up being that more men find themselves drawn to construction than women. And the conversation often revolves around: is that because construction is a more male activity, or is it because there are men hiring men already in construction? Do men create a space where anyone is welcome in this business? And
My answer to most of those questions has been, overwhelmingly, no. Men have a hard time creating a space where anyone, regardless of gender, is welcome in a construction space. Luckily, I've worked for most of my career in California, where racial discrimination is a lot lower than in other parts of the world. And I never felt or saw any racial discrimination in the businesses I've worked in.
I've seen Hispanic business owners, black business owners treated with equal respect and kindness throughout my career, but I've seen a lot of gender discrimination and there's that bias that we all have. And I think trying to find a way to unpack that can be emotionally difficult for a lot of people, but I'm sure we'll get into that a little bit deeper. It's something that's fraught for a lot of us in identity and what it means to be male or female and how we view the world, which gets very personal.
Dee Davis (03:06)
And I think it changes for us over our lives too, because, being a female in a male-dominated industry, and I've spent most of my career in male-dominated industries, I've certainly experienced gender bias. It's expressed itself in attitudes, it's expressed itself in the way I was treated or ignored or not paid equally. It's expressed itself in lots and lots of different ways. And being five foot tall, a tiny person, doesn't really help. I truly believe that is another form of bias. Small people have a tendency to be less heard, treated as less important than tall people. It's a thing.
We're not talking about height bias necessarily, but I think it plays into it. How you physically present yourself matters. If you're a large person and you have a big, deep voice, you have a tendency to command respect in a room, whereas somebody who is a small person with a higher-pitched voice, not as much.
Brad Wyant (03:57)
Absolutely. There's a fun anecdote from a negotiation book that I read called Never Split the Difference by Chris Voss, where he talks about using your late-night DJ radio host voice in certain circumstances. Usually we talk at this kind of register, but when you need to make somebody hear you, you go into this lower register. And he says that works because he's seen research showing that inherently all people respect, and are more fearful of and attentive to, a lower voice. I mean, these inherent biases are built into us all, and the data has been out there for a long time to show it.
Dee Davis (04:36)
It's just another form of bias that we are not consciously aware of. And I'm starting to get to the point in my career where age bias is becoming a thing. I've heard about age bias for a long, long time. Now I'm starting to experience it.
There are still conversations out there like, we don't want to bring somebody in who doesn't have that much time left in their career. When in reality, that is completely irrelevant, because people job hop and move around constantly. Whether they're 25 years old or 50 years old, you're going to get more out of somebody that's 50 years old
Brad Wyant (04:51)
So that's interesting to me.
Dee Davis (05:14)
who's going to work for another five years than you're going to get out of a 25-year-old who's going to be with you for the next three years.
Brad Wyant (05:20)
Very true, very true. It's interesting that you say this is the first time you've encountered age discrimination or bias, though. I think we all encounter it when we're young. It's like, they're young, they don't know anything. There are young people out there who know tons of things. We often don't listen to the young voices in the room because we assume they don't have anything to offer, but sometimes they do.
Dee Davis (05:39)
Well, that's a different color of age bias. It absolutely happens. I've seen it. And I'm 100% sure that I have experienced it. One of the things that's popping into my head is from when I was a carpenter, building custom homes. I was working with this general contractor, and we were going to meet with this guy who was
funding a lot of the projects that we were doing, and he wanted to meet me because I was running some of the work. I was very young. Whether this was age bias or gender bias, hard to say. He wanted to feel my hands. He wanted to see if I had calluses on my hands. I don't know if he just didn't think I was really a carpenter,
or if he just didn't believe that there could be a woman out there swinging a hammer. I'm not really sure what the issue was, but he said, let me see your hands. And I said, what? He said, let me see your hands. Okay. And I put my hands out, palm down, and he flipped them over and felt the palms of my hands. And I said, what are you doing? And he goes, I wanted to see if you had any calluses.
Okay, sure. I've never forgotten that; it was such a strange experience. That might've been age bias, that might've been gender bias, maybe a little of both. I don't know. It presents itself in crazy, crazy ways in all industries. I just read an article right before we started recording about the phenomenon of a woman saying something in a meeting and
basically it just gets ignored, and then a man says it two minutes later and everybody's like, yeah, that's a great idea. Wow. Really? Ladies, anyone out there that that's happened to, I'm sure it's happened to me multiple times. I can't put my finger on one right at the moment, but there are many ways that all these types of discrimination show themselves in business.
And so we thought, great, we're going to eliminate all bias. We're going to create software that will eliminate all of our hiring and interviewing biases, because the humans are flawed. And all these lawsuits have happened over the years of people saying that they've been discriminated against in hiring practices for
age or gender or race. I've read tons and tons of articles on this stuff. So we said, we're going to put the computers in charge and fix it. Well, there are platforms like Workday, for example. They were invented to take the biased humans out of the equation and put the AI bots in charge. So how did that work out?
we're going to pick on Workday because this is the actual lawsuit that's in progress right now today. This is June 22nd of 2025 when we're recording this episode. And this lawsuit is going right now. Workday is used currently in over 11,000 organizations worldwide. And 1.1 billion applicants have been rejected through the Workday platform.
And that's what this class action lawsuit is about. It's been filed for violation of anti-discrimination laws.
So the way the platform works is it sorts, ranks, and rejects applicants without any human intervention.
That's what they're being sued for. The lawsuit specifically alleges age discrimination, with the platform auto-rejecting people who are over 40, and rejections coming through within minutes or hours of applying, even in the middle of the night, when you couldn't plausibly claim that a human had viewed the resume. The class action lawsuit indicates these plaintiffs had this happen hundreds of times
with different potential employers that use this software.
Brad Wyant (09:12)
Let's jump into that. I think that's really interesting. It sounds like the plaintiffs in this case are trying to build a case to prove that, A, the software is making its own choices. That it's not as if the software is just sorting and saying, start with this stack and go here. It's making decisions. So that's one thing they're establishing. They're also establishing, B, that those decisions the software is making, which are having real-world implications, not just
prioritizing somebody's workflow, are discriminatory. And that's a very complicated legal case to build. But it's also an important one in this context. It's not just that Workday evaluates; it also acts. Which is a huge place for our world to have gotten to, where a robot, effectively, or an AI, something someone has trained, is now deciding who to hire and who not to hire. Pretty wild.
Dee Davis (10:06)
Yeah, and who gets human eyes to look at their resume and potentially then gets an interview, and who just gets auto-rejected at 2am.
Brad Wyant (10:15)
Mm-hmm.
Dee Davis (10:16)
We're going to dive a little deeper here. The hiring industry made this problem statement: it's time-consuming to go through hundreds of resumes. We want to do it faster and we want to do it with less bias. We don't want to be sued for discriminating. This was supposed to be the solution. There was a problem statement put out there, and
these software platforms, and there are many, we're just using Workday as one of them, were built to solve it. Interviews take time, we know that. Sorting through resumes takes time, we know that. Employers don't wanna take the time to sort through hundreds of resumes and do potentially dozens of interviews. But here's the reality, here's what's actually happened.
It's all gone online now. Good luck actually handing your resume to a real human anywhere. You can't do it at Taco Bell. You can't do it at a construction company. You have to have a personal in to maybe hand somebody your resume or email it directly to a person. Otherwise you have to go through these online platforms and online job postings. So yes, resume reviews
are faster, because now you have machines doing them. And I see advertisements for AI all the time: in the five minutes you spent reading this article, I could have reviewed this entire contract for you, something like that.
Brad Wyant (11:35)
Hmm
Dee Davis (11:36)
So yes, resume reviews are faster, but what criteria are they using? We'll talk about that in a bit more detail. Hiring processes have in fact slowed down. It is actually taking longer to hire somebody than it was before. The wheels are turning slower, and you have even more applicants than you had before, because it's all online, you're not
physically going to locations, and people can do more virtual work. You have 50 to 100% more applicants than you had before we started using these platforms.
Here's what happened. AI is actually duplicating human bias, not eliminating it.
I'm not surprised, I guess, when I found this out. I looked up these lawsuits. I looked up some of the research that's being used to back these lawsuits and to argue against them.
The argument against it says the platform's not actually making decisions. But it is, because it's auto-rejecting people. So it is making decisions, even at a very elementary level. And it's making a decision that is devastating to the job seeker, because it's actually throwing perfectly good people out of the pool, people who are
perfectly qualified for the job. So what we have to remember is: AI is stupid. It's stupid. We keep thinking AI is smart, it's doing all these things, it's amazing. It's stupid. It's been shown that AI is giving preferential treatment to keywords such as baseball and Canada, and negative
points to words such as softball and Africa, when these words have nothing to do with the job description. And here's why.
They have to train AI because it doesn't know anything. And in order to train AI, what they're doing is taking resumes of existing employees and uploading them. And they're saying, this is what a good resume looks like, because Joe's worked here for 27 years and we like Joe a lot; he's one of our best people.
We like Mindy a lot. She's one of our best people. Well, guess what? If Joe happens to mention in his resume that he likes baseball.
And maybe you have a few people in your organization that happen to mention that they like baseball or race cars or cooking, or it doesn't matter what it is. AI is not smart enough to understand that that's not relevant.
Brad Wyant (14:00)
Well, and I think there's a very important feature of what you just described that we should go back to. If you train AI using your existing pool of employees.
Let's say, like you're saying, that most of the employees you currently have at certain levels in the company, or who've been with the company for a long time, are white and male and happened to have put that they like baseball in the fun-about-you section of the resume. People from other countries who didn't grow up playing baseball, women who didn't play baseball but probably played softball: AI is automatically going to select for those things as opposed to
selecting for the things that actually matter. It's not smart enough to discern between credentials that actually make an employee useful to the firm and credentials that happen to exist among employees who have been successful in the firm. And then you're replicating that bias, like you said earlier. So
AI is as stupid as we use it. Not necessarily stupid in itself, just as stupid as the user. If you drive a sports car into a wall, it's operator error. I think that comes in there.
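To make that bias-replication point concrete, here is a toy sketch, an assumption about how such a system could behave, not Workday's or anyone's actual algorithm. If you train a simple bag-of-words scorer on the resumes of current "good" employees versus past rejects, any token that merely correlates with the existing workforce, like baseball, picks up positive weight, while its counterpart, softball, picks up negative weight, even though neither has anything to do with the job:

```python
import math
from collections import Counter

# Toy training data: resumes of current "good" employees vs. past rejects.
# The labels encode who already works here, so any token that correlates
# with the existing workforce -- relevant or not -- becomes a "good" signal.
good_resumes = [
    "python sql project management baseball",
    "python java leadership baseball",
    "sql reporting excel baseball",
]
rejected_resumes = [
    "python sql project management softball",
    "java leadership softball",
]

def token_weights(good_docs, bad_docs):
    """Laplace-smoothed log-odds of each token appearing in a 'good' resume."""
    g, b = Counter(), Counter()
    for d in good_docs:
        g.update(d.split())
    for d in bad_docs:
        b.update(d.split())
    return {t: math.log((g[t] + 1) / (b[t] + 1)) for t in set(g) | set(b)}

def score(resume, weights):
    """Rank a new resume by summing the learned token weights."""
    return sum(weights.get(t, 0.0) for t in resume.split())

w = token_weights(good_resumes, rejected_resumes)

# Two candidates identical except for one irrelevant hobby word:
print(score("python sql project management baseball", w) >
      score("python sql project management softball", w))  # True
```

Nothing in the model knows that baseball is irrelevant; it only knows the word showed up on winning resumes, which is exactly the failure mode Brad describes.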
Dee Davis (15:03)
Well, and that's exactly it. Everybody wants AI to be this magical thing that instantly, with a keystroke, solves all of our problems. And that's not how it works. Garbage in, garbage out, right? And that's kind of what's happening right now. You have to put meaningful data in to get meaningful data out. And you have to take the time to teach it what's important
and what's not important.
I assumed when I started this research project that what AI was doing, and to some extent it probably is, was taking the job description and saying, okay, your resume matches 30 out of 300 words. And comparing that across candidates: my resume matched 30, yours only matched 25, so
I'm ranked higher than you are. I assumed that's what was happening, because I guess I just couldn't imagine what else might be going on.
Brad Wyant (16:03)
I've been told by the people who worked in the career office at Ross, the MBA program I just finished, that that's exactly what's going on, that a lot of companies are using AI as a first screen to see whether you're even relevant to the job. And just saying, okay, as long as there are enough keywords from your resume that match our job description, you pass to the next level. I think that they probably
didn't want to go as far as this Workday lawsuit is going, because they want us to have a little hope. Maybe they don't want to crush our dreams that it's all AI. And in most instances, I think, especially at certain job levels, you do end up interviewing with somebody real in a post-graduate-degree career, I would think. But
a lot of companies are using it as a screen.
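That keyword-overlap screen can be sketched in a few lines. This is an assumption about how such first-pass filters behave in general, not any vendor's documented algorithm, and it shows the failure mode: a resume that describes the same experience in different words scores near zero.

```python
def keyword_match_score(resume_text: str, job_description: str) -> float:
    """Naive first-pass screen: fraction of distinct job-description words
    that also appear in the resume. No synonyms, no context, just overlap."""
    resume_words = set(resume_text.lower().split())
    jd_words = set(job_description.lower().split())
    return len(jd_words & resume_words) / len(jd_words) if jd_words else 0.0

jd = "project manager construction scheduling budgeting safety"
resume_a = "project manager experienced in construction scheduling and budgeting"
resume_b = "led crews delivering complex capital builds on time and under cost"

# Resume B may describe the same experience, but the screen only sees tokens:
print(keyword_match_score(resume_a, jd))  # well above zero
print(keyword_match_score(resume_b, jd))  # 0.0 despite relevant experience
```

A screen like this is cheap and fast, which is the appeal, but it rewards echoing the job posting's vocabulary rather than having the underlying experience.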
Dee Davis (16:53)
Yeah, and perhaps it's a matter of how you use it. I've actually heard that people are
putting hidden words in the background of the resume. Somebody told me recently that people are putting hidden words in white or clear writing in the background, just a whole bunch of buzzwords, to pass the high-level screener and get by the bots, to at least make it to the next step.
So perhaps there's more than one thing going on here, but in this particular lawsuit, that's not what's happening. It's giving preferential treatment to certain words that have nothing to do with the job description and therefore taking other words that are not on the preferred list and giving negative points for those and discounting people based on that. So job seekers are absolutely feeling
dehumanized and ignored and angry and negative towards employers right now because of all this stuff that is going on. And I don't blame them. I don't blame them. I've heard so many stories from people and as a consultant, I don't have to go through this very much, although I do go through sometimes the automated application process to get my resume out there to certain clients.
So I do have to go through that part of it. And that's been happening for years and years and years. It's gotten worse, because now they're trying to take AI bots and interview people with them. Oy vey.
When I heard about this, I thought, you have got to be kidding me. And I'm reading these articles and listening to people's stories about how they got set up for an interview, and it's an interview like this: a Teams interview. They go in and it's a bot
asking them questions, and you're recording your response. You had a similar experience with this, Brad. Tell us about it.
Brad Wyant (18:47)
Yeah, so when I was applying to business schools, I had an automated interview with Northwestern's program. I submitted my resume, submitted my essays, which I painstakingly wrote without AI. And then they gave me 60 seconds to answer a few questions on video. And I just totally ran out of time. I hadn't rehearsed answering questions
in a video format where there's no other person to interact with. And I think the lack of watching somebody else's emotions play through their face, to show that they're listening or not following what I was describing, was really tough. Because, okay, I'm trying to make somebody else understand something, but I can't see them do that. There's one side of me that's like, well, that's a skill. Actors have to put themselves on tape
all the time to get parts. They read a monologue into a camera, not knowing whether the audience is engaging with them or not. But is that the skill you want to be testing your employees for? If you're going to be doing that kind of thing, acting basically to a camera where nobody gets to react until later, then maybe you can justify it. But if that's not a core skill for your employees in this position, having to put themselves on tape explaining something, then I find that to be
selecting for the wrong skills.
If you want to find great actors, it's probably the best way to do it. I just found it very impersonal. It's also, like we're going to talk about in a little bit: okay, so this is how little you care about the job or the position or the role or the academic opportunity, you don't even want to give me the time. But I've got to empathize, too.
This is happening in colleges too. Like we said earlier, we've seen 50 to 100% more applicants for online job postings. Having to sift through that many applications to get to the right candidate is just more work. The openness of the Internet, making it easier to find a job application to apply to, not necessarily to find a job, just to find an open job application,
puts employers in a tough spot because if you don't consider all of the people that applied, maybe that's discrimination. I'm sure that some lawyer would tell us, of course it is. But it's just, it's a poor response to a challenge, I think.
Dee Davis (21:01)
Did you know that it was going to be an automated interview process before you got on it? Did they tell you, or were you surprised?
Brad Wyant (21:09)
They said it was going to be automated like an hour or two before I had the window within which to record myself, and they gave me like a one-minute rehearsal period during which I could experiment with it. So I didn't have any issues with the interface. But the other interviews I had were all video conference interviews, and I was ready for those because I've been doing that my whole career. It would have taken me a long time to prepare to rehearse a one-minute
answer to a question I didn't have beforehand, and be able to answer it in one minute. Basically what happened was they gave you a window where you were going to be able to log into the system, and you get to read the question for 30 seconds and then answer it for 60 seconds. That format of interviewing was just totally foreign to me. Even if I'd known that was going to be the format
within a couple of days of having the window, I would have had to spend a lot of time rehearsing to be able to do that, I think.
Dee Davis (22:07)
Yeah, brevity is maybe what they were selecting for. I don't know. The whole idea of it is crazy to me. I've got some really bad news for you, because one of the things this brings up is: what is happening to these automated interviews? Where is this information going? How is it being assessed? Does somebody then have to sit and watch your responses?
And if they're doing that, then why didn't they just do the interview? How is it being graded and assessed and compared? Or is it just getting filed, nobody ever even looks at it, and it's a colossal waste of your time? Nobody knows the answer to these questions; it probably varies by organization. I would pose the question to any employer: why bother?
Let's say you have 100 resume responses to your job ad. You're going to narrow it down to maybe 8 or 10 for an interview. You're not going to interview 50. No one's going to do that. You read through them and take the best.
You might narrow it down to 30 or 40 that you're actually sitting down and reviewing, and then you narrow it down to, okay, these are the ones we're going to interview. And then you actually take the time to interview those people. There are all different kinds of ways you can interview people. CNN
did an experiment on these virtual interviews. Some of their investigative reporters used a bunch of these different platforms. They fake-applied for jobs to see what would happen.
They wanted to see what would happen if they deliberately didn't do what they were supposed to do. In one particular case, the gal who was doing it was a native speaker of, I think it was, German. And one of the things she was being rated on was her English proficiency. So she read from a German text,
like a psychology text, just sat there and read from this text 100 % in German for an English language assessment. It came back that she was proficient in English.
She didn't utter one word of English.
to this AI platform and it assessed her as proficient in English. Then she did this. She said, okay, for the next one, I responded with four words repeatedly for every question they asked. Every single question they asked no matter what it was, all I said was, I love teamwork, I love teamwork, I love teamwork, I love teamwork.
She was rated highly.
So what is happening with these platforms in the background? How is this stuff being assessed? You have to ask yourself. Somebody like you pours their guts out trying to figure out how to answer these questions briefly and correctly, and somebody says, I love teamwork, 2,800 times and gets rated highly.
Brad Wyant (24:51)
So what strikes me about hearing that, having moved back to the Bay Area, life update: I'm now in Oakland, California, not in Colorado anymore, not in Ann Arbor anymore. That sounds like some real Silicon Valley crap to me. Somebody said move fast and break things, and just sent it without any idea of the ramifications of their actions, the consequences. In the tech world,
if Google breaks for a couple of minutes or Facebook goes down, fine. But when it's real people's lives and their livelihoods, there are some real consequences. You can't just send it and move fast and break things when you're talking about
something as big as employment discrimination. I want to circle back to something we talked about earlier with the video thing. The more I think about it, the more I think maybe there's an opportunity there for removing discrimination: if there's no human person asking the question, and a panel of real human beings sits and listens to the person answering the question,
maybe there's some kind of opportunity to remove bias there. And maybe some people get stage fright in person but don't get stage fright videotaping themselves, and there's something you're accidentally selecting for that the video removes. So I guess I can't let myself come out totally against the video, one-minute, 60-second, no-human-being interview thing. But
it's just very odd. And it seems like there are a lot of opportunities, like the reading from the German psychology text, or saying I love teamwork over and over, where it can be manipulated. And if that's the state of this tool, that means this tool is not ready. It can't be used. And this lawsuit should go forward, I think.
Dee Davis (26:33)
Yeah, broken. Like really, really broken. Not just a little bit broken, but a lot broken.
So the ACLU, not surprisingly, has warned that AI hiring tools pose an enormous danger of exacerbating discrimination in the workplace. And I don't always agree with the ACLU, but I've got to tell you, I agree with them in this case. I think they are definitely right on this. I am seeing a lot of great people out there out of work for prolonged periods of time because they have not figured out how to hack
the automated hiring system. And I'm sure I would not be able to do it either. In 2018, Amazon actually did away with a candidate-ranking tool they were using when it was shown, I have a feeling through a lawsuit, that it was giving males preferential treatment.
So how does this happen?
Brad Wyant (27:25)
Yeah, I'm sure that it was through a lawsuit. And I
think there's always been a danger of discrimination. I just think that using these AI tools poorly, and leaving them as a straw-man argument of, well, it's not our fault, the AI is all screwed up: it's the kind of thing a dystopian science fiction novel would write about, where people try to sue an AI. Of course, we're not going to end up there. I think that this lawsuit will be a very interesting case, creating a precedent for
not suing the AI, but suing the use of the AI, suing the creators of the AI, and defining clearly that blaming the AI is not a legal fallback. That is not a way you can extricate yourself from responsibility, which is what a lot of these companies are probably trying to do, I think, by using it.
Dee Davis (28:09)
Yeah, I don't know that they're necessarily trying to sue AI. I think they're trying to sue the software makers that created it. Because it sounds to me like what's happening is the vendors are handing over skeletons to these companies, and the companies are supposed to figure out how to train them, or maybe the training is a service the vendor offers, and it's
being trained in a really not-great way that replicates human bias. And even if you manually enter, here are the things that are important, there's a potential for bias there too, because who's doing that? Humans.
Brad Wyant (28:50)
So.
Dee Davis (28:51)
There was an example of using existing employees' resumes where there were a bunch of people at the company named Thomas, so the tool started giving preferential treatment to anyone named Thomas.
These are things that would be so obvious to a human eye. Now, I have read about people who don't want to put their pictures on a resume because they feel that being female, or looking a certain way, or having skin of a certain color will hurt them. I've also heard of people with a very ethnic name
changing the name on the resume, or putting an initial or something like that. And they get better results, which is messed up. That is messed up. And again, I'm not saying those things are not happening. At the end of this, I'm going to give my perspective on where I think we should go from here, but I want to fall back a little bit: the Thomas thing cracked me up. So basically, if you have three people named Thomas in your company, then
we should hire people named Thomas, clearly. That's the answer. The software uses resume parsers to assess the data and try to determine if the candidate is good or bad. And when I'm listening to this CNN interview and explanation of how this all works, my mind immediately went to one of my favorite movies, Charlie and the Chocolate Factory. Do you remember the scene? And I'm talking about the
original Gene Wilder Charlie and the Chocolate Factory, where they're in the room where the geese are laying the golden eggs. The golden egg drops down onto this scale; on one end of the scale is good, and on the other end is bad. The egg drops down and sits there for a second, and the
eggdicator, I think they called it, decides. If it went to bad, then boop, it opened this trash chute and the egg rolled down. And if it said good, then white gloves came out and placed it very carefully in a basket. And so this is our modern version of the eggdicator: these parsers are deciding whether a candidate is good or bad based on what?
I don't know how the eggdicator decided, do you? I don't think we know.
Brad Wyant (31:00)
But then I remember it picks up one of the kids and decides she's bad and drops her in, because she's a spoiled brat of a kid. So some verification there, maybe.
Dee Davis (31:08)
Oh yeah, all of it was 100% right. My very favorite scene in that movie has nothing to do with this, but it's when the girl turns into a blueberry. I don't know why, but I giggle every time. So these parsers, or eggdicators for us, are going to decide that if most of the existing employees in your company are under 40 and brown,
then under 40 and brown is good, and over 40 and not brown is bad.
Putting a very well-qualified candidate in the incinerator: not a good situation for us.
Brad Wyant (31:46)
No, absolutely not. That's pretty clear evidence of bias.
Dee Davis (31:50)
So what do we do, Brad? Should we be using these software applications, AI, and bots to replace humans and take humans out of this? Should we put humans back? Should we do something in the middle? What do you think?
Brad Wyant (32:02)
I'm trying to imagine for myself the conversation that led to these outcomes a long time ago. I'm sure it was something along the lines of a bunch of people sitting around a boardroom and saying, we've had too many lawsuits about discrimination, we need to find a solution. Somebody bright and intelligent and ambitious jumps up and says, I know, let's use AI. And everyone said, yes, of course, AI is the solution to everything. Let's do that. And then a bunch of people who had never used AI in these circumstances said, let's do it. It has to be done by such and such date.
We're going to start using it soon, because we have this critical problem. We need to churn through candidates faster, and this is the cheapest and best solution. And then people went to work on that. People tried to come up with algorithms, tried to come up with solutions that avoid bias, and those people were not allowed to finish their work in time, because no one ever is. Let's be honest, we never get as much time as we want to do anything.
All of this technology is being applied so prematurely that it's not having the intended outcomes. So I think what we do about that is we let those very well-meaning people developing these AI tools keep doing their work, and let them tell us when it's not biased anymore. And not just that, we put people who understand bias from a legal perspective, from an ethnographic standpoint,
to work analyzing these AI outputs, interacting with these data scientists and computer programmers, and coming to a consensus about whether there really is bias there or not. I'm sure part of that process will involve acknowledging that absence of bias is a pinnacle we all strive for as people, and really should strive for, but one that is unachievable, like perfection, like ultimate quality. It's not a realistic goal, but we have to set some threshold
and understand that there's a "good enough" that we're not at yet, and that we need to achieve before this goes back out. Before this makes life-altering decisions for the members of the company, for the company's lawyers who have to deal with lawsuits, and for the people who are applying. It's not that AI has no future in this part of business. AI has a future in every part of business, but we're not there yet, and people need to be given the time they need to work these solutions out.
Dee Davis (34:13)
I don't know if you've ever been in a company that did a software rollout. They decided to change the accounting program, or the purchasing program, or the construction management software, and they rolled it out before it was ready. They announced the change and walked away with a million bugs in the system. And that's where I feel like we are with this kind of stuff.
Just because it exists doesn't mean that it's ready to be used, doesn't mean that it's going to perform up to our level of expectations. We have Star Trek expectations. We want to beam instead of flying or driving hours and hours to get somewhere. If beaming technology is available but it obliterates you in the process, that's not really helping. Self-driving cars are the same thing: they exist.
Are they ready? Not so much. Still lots and lots of problems, right? So with any of this kind of stuff, I would agree that this is a premature rollout, by at least a decade. I think it's going to take years to work this stuff out, if we can ever get there. It is such a pipe dream to eliminate bias completely. I don't know that it's possible. It's like
asking for world peace: it's a lovely idea, but I'm not entirely convinced that it's possible. For the meantime, I'm voting no on using these platforms. I say we go back. It wasn't really broken to begin with, and we tried to fix it. I don't know that it was really all that broken. I think it was inconvenient
and time-consuming, but I'm not sure that it was really that broken. And we rushed to fix something that maybe just needed to be tweaked instead of completely renovated. I think we're going to get better candidates when we have hiring managers who understand the work reviewing the resumes. Now, what I do remember back in the day, when I had a regular W-2 kind of job, was
they would send this stuff through HR. Somebody would write a job description and they would give it to HR and HR would put the ad out and HR would get all the resumes and they would review them all and decide which ones they forwarded to the hiring manager. And at that time, I remember thinking, I didn't think HR was qualified enough. Like what are they looking for? They work in HR, they don't work in engineering.
especially when you're talking about something very technical. What are they searching for when they're looking? Are they doing keyword searches? I don't know. Whenever I would hire, whenever I was looking for somebody, I wanted to see all the resumes. You do several passes. You do the first-round pass:
What kind of education do they have? What kind of experience do they have? Is it at least somewhat suitable for what I'm looking for? You can toss it into the bad-egg incinerator pretty quickly if there's no way this person is going to be right for the role. It doesn't take that much time. If AI is proving to be just as biased as a person, and dumb on top of it all, then I say we spend our time and our energy
on bringing attention to our subconscious biases, working on those, and not falling prey to those biases in the process. Even if we can get to the point where we are hiring people with zero bias, does that still translate to the workplace if we haven't trained people to get rid of their biases?
I feel like we're trying to avoid something important here.
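The first-round pass Dee describes can be written down as rules simple enough for a hiring manager to read and audit, which is the opposite of an opaque parser. A hypothetical sketch, with the candidate fields, skills, and thresholds invented purely for illustration:

```python
# Hypothetical sketch of a human-auditable first-round pass (field names,
# skills, and thresholds are invented, not from any real hiring system).

def first_pass(candidate, required_skills, min_years):
    """Return ('advance' | 'review' | 'reject', human-readable reason)."""
    missing = set(required_skills) - set(candidate["skills"])
    if missing:
        # No way this person is right for the role: into the bad-egg chute.
        return "reject", f"missing required skills: {sorted(missing)}"
    if candidate["years_experience"] < min_years:
        # Borderline: flag for a human instead of silently discarding.
        return "review", "below minimum experience; needs human review"
    return "advance", "meets skills and experience bar"

decision, reason = first_pass(
    {"skills": ["autocad", "scheduling"], "years_experience": 6},
    required_skills=["autocad"],
    min_years=5,
)
print(decision)  # advance
```

The point of the "review" outcome is that borderline candidates get flagged to a person rather than rolling down the trash chute automatically, and every rejection carries a reason someone can check for bias.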
Brad Wyant (37:34)
And I think it's about normalizing the introduction of bias training in hiring and non-hiring work situations.
As a civilization, we should be striving to teach our children to use less bias in their decision making. One of the psychological principles people study is bias as a biological reaction. When it came down to discriminating between that berry and this berry, bias was very important: not eating the berry that people told you not to eat, because it might kill you.
Bias is an important function of the brain, and eliminating it is impossible. There's going to be bias, but
striving to help employees recognize when they're being biased, and how to address the bias in the emotion and the reaction, that is an attainable goal. Getting employees to be less biased than they are now, through training and discipline and better practices, where more people look at candidates, people with different backgrounds and different career experiences within the company, as opposed to it all being,
let's say for the sake of argument, five white guys who all went to the same university looking at a bunch of other white guys and making that decision. That kind of bias removal is an attainable goal that we should all strive for.
Dee Davis (38:48)
I would love to see us bring humanity back into the equation of hiring and selecting people. This is your business. This is your livelihood, your company, your daily environment. We spend more time with people at work generally than we spend with our families at home. Getting the right people in there to operate your business and serve your customers should be our primary focus.
That means making sure we have the right people in the right seats. Delegating that stuff out to software and bots, I just don't understand it. I'm going to pull my old chick card here, I guess, and say: if I had a client that wanted me to do one of those automated interviews, I think I would just walk away.
I need to connect with the people. I have questions. That's the other thing I don't like about that: it gives you, as the interviewee, zero opportunity to ask any questions. How fair is that? They ask canned questions. They may or may not ever look at your responses, or who knows what happens to your responses. You get zero opportunity to ask any questions or interface with the people you're going to be working with.
That's not an environment I want to work in. And we're not sending great vibes out to people as employers when we're doing that kind of stuff. Is it efficient? Is it cheaper? Does it take less of your time? Yeah, but is that really where you want to cut corners?
Brad Wyant (40:11)
It's a terrible situation. I think a lot of companies are being forced to cut corners wherever they can, with pressure from management and the pressure of an ever more competitive environment. But I don't see
using this AI stuff in its current form as anything more than a first screen.
If we're to go to the video thing, I think the best use case for that would be verifying that you're a real person. We're receiving this many applications, we just can't handle the volume, and we think we're being bombarded by bots, so just take these three minutes, tell us about a career experience of yours, and we'll get back to you.
If people pass that test and they're not unpresentable, you know, their background's a mess, they're not wearing professional clothes, they speak in run-on sentences, they're hard to understand, they're rude or coarse, then those are a "got it, we're not moving on." But that screen probably won't screen that many people out, and that's where the efficacy of these solutions breaks down, being
touted as "AI is going to save the world, right now, tomorrow." That's when people get impatient, and impatience, as it ever has, leads to waste, inefficiency, and bad results. If you're in a position of hiring power, don't be hasty. Don't rush into getting everything right now, tomorrow. That's not a good attitude in any part of life or business.
Dee Davis (41:40)
Thanks for joining us. We'll see you next time.