
Employers’ Use of AI in Hiring: Discriminatory or the Way of the Future?

Published on Aug 30, 2024

Have you ever submitted a job application online? If so, your application may never have reached the eyes of the intended employer, but instead been rejected by an artificial intelligence (AI) filter. In recent years, companies have increasingly used AI to screen applicants, programming the software to filter applications based on the criteria they are seeking. Such practices have proven legally problematic, however, because these filters can be biased and discriminate against applicants who fall into protected categories.

The first case of this kind, EEOC v. iTutorGroup, Inc., et al., was recently settled. iTutorGroup, an English-language tutoring company for Chinese students, used AI software to automatically reject applicants over a certain age: men 60 years of age or older and women 55 years of age or older were denied jobs even when they were qualified. The case originated when an applicant submitted a job application to the company and was denied, then resubmitted an identical application with a more recent birth date and was offered an interview. In all, the company rejected more than 200 applicants based solely on their age. The case settled in August, with the company paying $365,000 to a group of applicants who were rejected because of their age and agreeing to “extensive and continuing training for those involved in hiring tutors, issuance of a robust new anti-discrimination policy, and strong injunctions against discriminatory hiring based on age or sex and requesting applicants’ birth dates.”
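The public filings describe the outcome of iTutorGroup’s screening rather than its implementation, but the underlying logic is simple to picture. The sketch below is a hypothetical Python illustration of how an automated age cutoff of this kind could operate; the cutoffs mirror the ages reported in the case, while the function names and all other details are invented.

```python
from datetime import date

# Hypothetical sketch of an automated screening rule like the one described in
# EEOC v. iTutorGroup: applicants over an age threshold are rejected outright,
# regardless of qualifications. The cutoffs mirror the ages reported in the
# case (60 for men, 55 for women); all other details are invented.
AGE_CUTOFFS = {"male": 60, "female": 55}

def age_on(birth_date: date, as_of: date) -> int:
    """Return the applicant's age in whole years as of the given date."""
    years = as_of.year - birth_date.year
    if (as_of.month, as_of.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def passes_age_screen(birth_date: date, sex: str, as_of: date) -> bool:
    """Return True if the application clears the automated age filter."""
    return age_on(birth_date, as_of) < AGE_CUTOFFS[sex]

# Two otherwise identical applications, differing only in birth date, receive
# opposite outcomes -- the pattern that exposed the filter in this case.
screen_date = date(2023, 1, 1)
print(passes_age_screen(date(1960, 1, 1), "male", screen_date))  # False: auto-rejected
print(passes_age_screen(date(1985, 1, 1), "male", screen_date))  # True: passes
```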

This company’s conduct violated the Age Discrimination in Employment Act, which prohibits employers from discriminating against job applicants or current employees 40 years of age or older based on their age. The Equal Employment Opportunity Commission (EEOC) enforces the Act.

The EEOC is currently pushing to stop exactly these scenarios, in which employers’ use of AI in hiring decisions results in discrimination. In January, the EEOC published a Draft Strategic Enforcement Plan in the Federal Register. In the Plan, the EEOC lays out its strategy to eliminate barriers in both recruitment and hiring, including addressing “the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups.”

According to a 2023 statistic, approximately 88% of companies globally use AI in human resources and hiring. Yet because AI is a relatively new tool for employers, there is little regulation in this area. A New York City law requiring employers to independently audit their AI software for bias when it is used in hiring decisions recently went into effect, the first of its kind. Companies now face fines if they do not publish the results of those audits on their websites. Other states are likely to follow New York’s lead, leaving employers scrambling to comply.
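New York’s law centers on bias audits, which generally compare how different demographic groups fare under an automated tool. The sketch below uses invented numbers to show one common form such a comparison can take: selection rates by group and each group’s impact ratio relative to the most-selected group, with ratios below 0.8 flagged under the longstanding “four-fifths” rule of thumb. It is an illustration of the general technique, not the specific methodology the New York law prescribes.

```python
# Invented numbers illustrating the kind of comparison a bias audit might run:
# selection rates by group and each group's impact ratio relative to the group
# with the highest rate. Ratios below 0.8 are flagged per the common
# "four-fifths" rule of thumb.
applicants = {"under_40": 500, "40_and_over": 300}
selected = {"under_40": 100, "40_and_over": 30}

selection_rates = {g: selected[g] / applicants[g] for g in applicants}
highest_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    impact_ratio = rate / highest_rate
    flag = " (below the four-fifths threshold)" if impact_ratio < 0.8 else ""
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f}{flag}")
```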

Additionally, preventing companies from engaging in discriminatory AI hiring practices is proving to be a difficult task for the EEOC, because rejected applicants are often unaware that they have been discriminated against. As a result, those applicants never file charges against employers, and the issue is never brought to the EEOC’s attention. In the iTutorGroup case, for example, the applicant only discovered the discrimination after submitting a second application, identical except for the birth date, and being offered an interview.

The benefits of utilizing AI in the hiring process include reduced costs and a faster, more streamlined process. On the other hand, AI can introduce bias and overlook the more distinctive, unconventional talents an applicant may possess. In the end, employers are left to strike a balance between embracing the benefits of AI in hiring and ensuring that their practices do not perpetuate bias or result in discrimination.

Kelly Ryder is a third-year law student at Wake Forest University School of Law. She holds a B.A. from Duke University. This past summer, Kelly interned with Fried Frank in New York City and will be returning full-time after graduation.

Reach Kelly here:

Email:  [email protected]
