AccelPro | Employment & Labor Law

On the EEOC and AI in the Workplace

With Jim Keaney, Senior Associate at Sandberg Phoenix | Interviewed by Matt Crossman

We hope you’re finding value in our weekly expert interviews and series of events. Your engagement and feedback shape the topics we cover and the career products we offer. Sign up now, and your annual membership will include a free trial and a discount of more than 40%.

Welcome to AccelPro Employment Law, where we provide expert interviews and coaching to accelerate your professional development. Today we’re featuring a conversation about the EEOC and AI in the workplace. Our guest is Jim Keaney, a senior associate at Sandberg Phoenix. 

Keaney walks us through the EEOC’s Artificial Intelligence and Algorithmic Fairness Initiative. He says the EEOC has taken a broad view of technology because it changes so quickly and so often, and the law often has a hard time keeping up.

“The main takeaway is for employers to talk to the software providers, ask questions, ask hard questions.” That way, even if you make missteps along the way, the EEOC appears willing to take your efforts into consideration. “The more of a due diligence background paper trail you can develop, the more likely I think you’re not going to have as much of a headache from the EEOC on the issue,” he says.


AccelPro’s expert interviews and coaching accelerate your professional development. Join AccelPro Employment Law now for a free trial of everything we offer to members.


TRANSCRIPT

I. WHAT THE EEOC GUIDANCE ON AI COVERS

Matt Crossman, Host: The EEOC has launched what it calls the Artificial Intelligence and Algorithmic Fairness Initiative.

As part of this initiative, the EEOC published guidance aimed at preventing workplace discrimination in connection with the use of automated or algorithmic systems that draw upon artificial intelligence technology. Broadly speaking, what does that guidance address that HR execs and employment attorneys need to know?

Jim Keaney: In the big picture, the guidance provides a high-level view of the EEOC’s position on this issue: common questions about the use of AI in the employment context and how that fits into concerns about disparate impact. 

Disparate impact is unintentional discrimination or discriminatory patterns or practices that arise from what otherwise appears to be facially neutral criteria or selection procedures. 

Under the law, disparate impact analysis has three different layers, and it’s important to know what those layers are to understand what this guidance does and does not address.

First: Is there an employment practice that has a disparate impact on one of these protected categories?

The second layer to that is, if there is that disparate impact, is the practice job related and consistent with business necessity?

And the third: even if it is job related and consistent with business necessity, is there another alternative way to reduce or eliminate that discriminatory impact—another practice that could be done? So this guidance does not address those really hard and complicated second and third layers in terms of what’s consistent with business necessity, what’s job related, and how do we figure out alternatives?

This is just a high level, what do we view as far as this first question of figuring out whether there’s a disparate impact in using this technology. So it just scratches the surface, but I think the guidance stands for the proposition that the EEOC sees the use of AI as completely fair game, and employers and business executives need to be aware of their use of it and be proactive in terms of controlling and understanding the impacts of the use of that technology.

MC: One of the challenges here is that AI is really complicated. I was surprised at how wide the EEOC’s definition of the use of AI was. It goes far beyond screening resumes for keywords, which frankly is what most people think of when they think of AI and hiring. What other systems or applications is the EEOC talking about here?

JK: That’s a good point. I think the EEOC is intentionally taking a broad view of this because technology changes fast, and it changes all the time. Law tends to lag behind technology in that regard. But like you said, resume scanners, that’s a big one—scanning just for certain keywords and screening applicants on that basis.

Other technologies include the use of chatbots in sourcing, recruiting, or interviewing, and employee-monitoring software—even after employees are hired—for keystrokes, that type of thing. All these things are falling under this broad umbrella of how the EEOC is viewing the definitions of algorithm, of software, of AI. It is very broad, but I think that’s an intentional thing.

If you’re an employer in Illinois, for example, you probably know that four years ago, Illinois passed a law on the use of video interviewing software in the employment context. There’s certain requirements about how you explain what you’re doing, what you’re looking for, et cetera. Some of these things are new, some of them are old, but I think that broad scope is intentional.

MC: How do these new technologies fit into existing legal frameworks for handling employment related issues and disputes?

JK: I would say it really remains to be seen. I think the EEOC is starting to take a stab at that, and they will continue to promulgate guidance on the second or third layers of the disparate impact analysis and how it plays out there.

I think the devil’s in the details. We’ll just have to wait and see how this technology develops side by side with the law.

MC: One of the things that interested me as I was reading up on this is the four-fifths rule. What is that and how does that apply?

JK: The four-fifths rule is basically a way to compare selection rates across protected categories. You look at the number of applicants in one group and the percentage of that group that was hired, and then you do the same for another group, so you get two percentages. Then you see whether the lower percentage is at least 80 percent of the higher one. And it gives you a rule of thumb overall just to say, “OK, is there some initial indication here of an adverse impact? What’s going on?”

And again, it’s a rule of thumb. It doesn’t apply in all scenarios. When you get into things like statistics, the size of the data set, all these things come into play. And so it’s not a hard-and-fast rule. It’s not the law, but it is a rule of thumb that employers can use in reviewing or auditing their own use of AI technology in employment practices.
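The comparison Keaney describes can be sketched in a few lines. This is a minimal illustration only (the figures below are purely hypothetical, and as he notes, the four-fifths comparison is a screening heuristic, not a legal conclusion):

```python
def four_fifths_check(selected_a, applicants_a, selected_b, applicants_b):
    """Rule-of-thumb screen for adverse impact: compute each group's
    selection rate, then test whether the lower rate is at least
    four-fifths (80%) of the higher rate."""
    rate_a = selected_a / applicants_a
    rate_b = selected_b / applicants_b
    lower, higher = sorted([rate_a, rate_b])
    ratio = lower / higher
    return ratio, ratio >= 0.8  # below 0.8 is an initial flag, not a finding

# Hypothetical numbers: 48 of 80 applicants selected in one group (60%),
# 12 of 40 in another (30%). The ratio is 0.30 / 0.60 = 0.5, under 0.8.
ratio, passes = four_fifths_check(48, 80, 12, 40)
print(round(ratio, 2), passes)  # 0.5 False
```

As the interview stresses, small samples can make this ratio misleading, so it is best treated as one input into a broader self-analysis, not a pass/fail test.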

MC: A related question to that. One of the things that AI allows is testing. You can run the program without actually applying it to see if there is either disparate or adverse impact. How important is that, and what do you see happening along those lines?

JK: In looking at this guidance, the main takeaway is for employers to talk to the software providers, ask questions, ask hard questions. “Have you tested this software? What’s your experience in your customer use of this software? Has it been effective? How have you ensured it’s been effective?”

In an ideal world, sure, you would get black-and-white answers at the outset of buying a software package for use in employment. But I think as a practical matter and as a reality here, the EEOC is mindful that these things have effects over time. 

They may need to be tweaked and reviewed, and so I think the guidance isn’t you have to test for this, that, and the other thing at the outset. It’s more, look at what you’re doing, conduct what they call a self-analysis, which is just a watered-down term for audit. Being proactive is the principle here. If that means testing before, great, if you can do that. But continuing to monitor it seems to be a particular focus.

II. THE EMPLOYER IS RESPONSIBLE FOR WHAT THE VENDOR CREATES

MC: One of the many challenges that AI presents is if you are an HR executive, or a hiring manager, you’re experienced at hiring people. It’s not reasonable to expect you to be an expert on AI or to understand how the AI system works.

So as you said, you hire a vendor. And I don’t want to say you’re passing the buck because that’s not fair, but you are hiring somebody to complete a task for which you are responsible. So the question is, who is responsible? Is the employer responsible under Title VII, even if the tools are designed by a software vendor? Or is the software vendor responsible?

JK: It’s a great question because anyone familiar with employment laws is probably mindful that some of these labor agencies will take broad views of who’s the employer, who’s responsible. Joint employment is a concept under many laws.

The EEOC in the recent guidance is clear: the onus is on the employer in terms of getting this right. Now, there could be some factual situations where a vendor was on the hook, but from an enforcement perspective, the investigative focus will be on employers, what they did or didn’t do.

And certainly if you have a vendor who gives you a package that results in these remarkably discriminatory impacts, and the employer did nothing, that’s an easy case, whereas if they do a lot, it seems less likely (the EEOC would get involved). But the EEOC has taken the position that the employer liability here is central, and you can’t get away with the use of software that’s discriminatory just because you didn’t design it.

MC: Not all companies are big enough to hire a vendor. Is there a size of company or a scope of project where if you fall under it, the burden is less, or are you responsible whether you’re hiring one person or 500?

JK: As a threshold matter, Title VII only applies to employers of certain sizes. You have that jurisdictional limit—15 or more employees. But I’m not aware of the EEOC coming out and saying employers of certain sizes beware or suggesting that they’re going to be under scrutiny going forward.

Now, as a practical matter, I think it might be true that larger employers are more likely to be scrutinized for their practices. That’s not because they’re big and bad or anything like that. It’s just the fact that they’re one, like you said, more likely to have the resources to implement that technology and two, disparate impact is really all about statistics.

Anyone with just a rudimentary familiarity with statistics knows that the bigger the dataset, the better that evidence is. The EEOC picks their cases judiciously based upon what they can show or believe they can show. And so I think they would probably have statistically stronger cases with bigger companies. But I’ve not seen the EEOC announce anything formal in that regard.

III. BEING PROACTIVE WILL PROTECT YOU IF YOU MAKE MISTAKES

MC: If I’m an HR executive, I have to be thinking this is a challenging time to be overseeing hiring. If I rely too much on my own discretion, I can get myself in trouble because inevitably my decisions are going to reveal my biases. Part of the reason I would turn to AI is that if I can design a program that’s sufficiently neutral, I eliminate that problem.

But if I rely too much on the algorithms, I can run into the same problem. So for HR execs and employment attorneys who want to do the right thing under the law, what are some best practices to set up systems that comply with Title VII and the EEOC guidance?

JK: Best practices will vary depending on what software you’re using and what you’re using it for. The technology is undoubtedly flexible and getting more flexible, and more powerful. 

Proactivity seems to be the biggest message coming from the EEOC. There’s not a whole lot of specific guidance about if you do X, Y, and Z then you’re in good shape. I think it’s really about not turning a blind eye to what they’re trying to shine a light on as being an issue. 

Even from an enforcement perspective, if the EEOC happens to scrutinize what you do, the more you can show that “we talked to the vendor about this issue, we asked them what they do to monitor and guard against disparate impacts with the use of their technology, and this is what they told us,” the better. The more of a due diligence paper trail you can develop, the more likely you’re not going to have as much of a headache from the EEOC on that issue.

And it’s not only just that work at the outset of rolling out a practice like this that’s important. I think what you need to do going forward is have some amount of periodic reviews of your own use of the procedures, these AI tools and what they’re generating.

Without getting too far out of my specialty as a lawyer, you can filter variables and data points. These vendors should be flexible and cooperative with you in trying to see what, if anything, can be done to minimize any disparate impacts that may invariably result from your practices or procedures.

MC: You used a phrase I like there about not getting too far out of your specialty. That’s an issue that comes up again and again in the legal world. Either related specifically to AI or just in your practice in general, how do you keep up with stuff that is frankly a little beyond you, if I may put it that way?

JK: That’s always a challenge, even just keeping up with legal updates and case law. It’s a similar principle as applied to technology. And I think we’re all learning, and we’re all learning fast. And I think that expectation is not going to go away, that you have to keep up with things, whether it’s from a technological perspective or a legal perspective.

I think it’s an increasing reality of everything we do. There’s people with areas of expertise, and the EEOC is not coming out and saying HR managers now need to be data statisticians. That’s not the message yet, at least.

Working in close partnership with vendors through whom you’re getting this software is likely your best bet to maintain best practices going forward and avoid any sort of enforcement action from the EEOC.

IV. ALL IN THE FAMILY

MC: Now I want to pivot and ask you professional development questions. My first question is, is there anybody in your family who is not an attorney?

JK: My dad, my brother, my sister—we’re all attorneys. My dad practiced for a big local law firm here for 42 years. He did it the old way, he stayed at one firm. My siblings both practiced at another big firm here in St. Louis for a number of years. Two of my grandfathers and an uncle were also attorneys. So we have about seven in the immediate family. I’m genetically predisposed to being a lawyer, I guess.

MC: I like that—genetically predisposed. Did your dad say, “Jim, you’re going to be an attorney?” Was there any chance you weren’t actually going to do that?

JK: It was never impressed upon me that I had to become an attorney, and that’s why I always joked that it’s genetics. I was very fortunate to have the freedom to try to do whatever I wanted and, the law just seemed to be a natural fit.

MC: Are you all in the same area of law?

JK: The first seven, eight years of my practice, I was doing labor and employment law, but I was doing it mostly on the plaintiff’s side. I came to Sandberg Phoenix a few years ago to do labor and employment law, but now I’m primarily, if not exclusively, on the management or defense side.

Neither my siblings nor my dad did labor and employment strictly. If you consider non-compete litigation within the umbrella of employment, which it can be, my dad did quite a bit of that.

It’s fun when I come across a published case and I see his name on it. Same with my grandpa. My brother does commercial litigation. My sister did primarily transactional mergers-and-acquisitions work.

MC: More broadly speaking, how much more prepared do you think you were both for law school and for your career because you grew up immersed in the legal community?

JK: I want to say it was more helpful than it probably was. We weren’t the type of family to talk about my dad’s cases. We would still have our family arguments, but who doesn’t? Maybe that’s part of the lawyer upbringing.

I think it gave me the confidence to know that it’s possible and get through some humps in the practice as far as your first deposition, your first trial, your first experience in certain respects. My dad could do it, my sister did it, my brother did it, so therefore I can do it.

MC: Was there a particular piece of advice or wisdom that your dad passed down that was helpful?

JK: There was. One of the most important things he’s taught me is be resilient. You’re going to have bad days, you’re going to have experiences where you wish you could do it over again. It’s a stressful experience. You want to do a good job. But one of the most helpful things was to not forget that the law is about making sense.

The law is about human beings at the end of the day. He gave that lesson to me in the context of his first jury trial, which, as a matter of law, he should have won. It was fairly clear. But as a young attorney he was just unprepared to deal with a witness on the stand who was very sympathetic and was crying. He said he never wanted to do litigation again after he lost that trial, because the jury ruled against him simply because they found that witness so sympathetic.

As much of what we do is about the law and understanding it and complying with it, when you get to a trial setting, emotion can take over, and there’s many other aspects of the practice you need to take into account as you’re practicing.

MC: I love it that your dad has the wisdom to teach you from a moment of failure rather than a moment of success.

JK: Yeah, he’s a very humble guy. He had a long, distinguished career with pro bono work as well.

MC: One of the things I have found fascinating about the interviews I’ve done about employment law is that it always comes back to representing a human being, whom you are either defending or advocating for. Is that part of what draws you to employment law: that at the heart of each case there’s an individual who needs to be defended or advocated for?

JK: I think that’s right. It makes you feel good when you achieve justice on behalf of a company or an individual when that’s what should happen. There’s a dark side to that, too. Loss of a job can feel like a loss of something important.

There can be emotionally charged situations. And you’re constantly dealing with different situations, things you never thought you’d see. But that’s what keeps you on your toes. 

At the end of the day, you’re there to help your client through a difficult situation more often than not. They know it’s difficult. You know it’s difficult. But having a human-centered approach to the practice will serve you well no matter who you represent.

This AccelPro audio transcript has been edited and organized for clarity. This interview was recorded on July 6, 2023.

AccelPro’s interviews and products accelerate your professional development. Our mission is to improve your day-to-day job performance and make your career goals achievable.

Send your comments and career questions to questions@joinaccelpro.com. You can also call us at 614-642-2235.

If your colleagues in any sector of the employment law field might be interested, please let them know about AccelPro. As our community grows, it grows more useful for its members.

AccelPro’s expert interviews and coaching accelerate your professional development. Our mission is to improve your everyday job performance and make your career goals achievable. How? By connecting with a group of experienced Employment Law professionals.
You’ll get the knowledge and advice you need to navigate your changing field. You’ll hear deep dives with experts on the most important Employment Law topics. You’ll give and receive advice on how to make difficult job decisions. Join now to accelerate your career: https://joinaccelpro.com/employment-law/