THE LAW FIRM FOR EMPLOYERS

Compliance Matters™


New Guidance for Using AI in Employee Selection Procedures



The Equal Employment Opportunity Commission (“EEOC”) recently released a technical assistance document that offers direction to employers seeking to incorporate tools powered by Artificial Intelligence (“AI”) into their hiring, promotion, and termination processes. Here are the key takeaways.



  • The EEOC wants employers to be aware of the potential pitfalls in the increasing use of algorithmic decision-making tools in light of federal anti-discrimination laws. 
  • Examples of those tools include resume scanners that prioritize applications using certain keywords; employee monitoring software; “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides “job fit” scores.
  • When the above tools are used as a basis for an employment decision, they fall under the existing framework of the EEOC's Uniform Guidelines on Employee Selection Procedures.



  1. Under the Guidelines, a selection procedure has a disparate impact when it causes a selection rate for individuals in a protected class that is “substantially” less than the rate for individuals in another group.
  2. This is measured by the 4/5 rule: if the ratio of the selection rates for the two groups is less than 4/5 (or 80%), then the difference between the two groups is “substantial” (see the illustrative sketch after this list).
  3. However, the EEOC cautions that the 4/5 rule is a rule of thumb that will not apply in every situation and will not guarantee immunity from liability.
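To make the arithmetic concrete, below is a minimal sketch in Python of how the 4/5 comparison described above could be computed. The figures and function names are hypothetical and are not drawn from the EEOC document; as the EEOC cautions, this rule of thumb is only a guideline, not a definitive test of disparate impact.

# Illustrative sketch of the four-fifths (4/5) rule of thumb described above.
# All numbers are hypothetical and for demonstration only.

def selection_rate(selected, applicants):
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_check(rate_a, rate_b):
    """Return the ratio of the lower selection rate to the higher one,
    and whether that ratio falls below the 4/5 (80%) threshold."""
    lower, higher = sorted([rate_a, rate_b])
    ratio = lower / higher
    return ratio, ratio < 0.8

# Hypothetical example: 48 of 80 applicants selected in one group,
# 12 of 40 selected in another.
rate_a = selection_rate(48, 80)   # 0.60
rate_b = selection_rate(12, 40)   # 0.30
ratio, substantial = four_fifths_check(rate_a, rate_b)
print(f"Selection rate ratio: {ratio:.2f}")        # 0.50
print(f"Below the 4/5 threshold: {substantial}")   # True -> difference is "substantial"

In this hypothetical, the lower group's selection rate is only 50% of the higher group's, well under the 80% benchmark, so the difference would be considered “substantial” under the rule of thumb.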


  • The burden is on the employer to ensure that the use of these new tools as a selection procedure does not create a disparate impact. This means the tool or procedure cannot negatively affect the selection rate for a protected class or produce different selection results for one group compared to another.
  • Employers are responsible for the use of the above tools even if the tools are designed or administered by another entity, such as a software vendor. The EEOC recommends that employers discuss with their vendors whether potential disparate impact issues have been adequately considered and evaluated.
  • Failing to adopt a less discriminatory algorithm that was considered during the development process may give rise to liability.



The document is titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964” and can be read in its entirety here. A similar document regarding the Americans with Disabilities Act, titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees” can be found here.

As always, we are available to answer any questions and for all your employment-related needs. Please call your firm contact at 818-508-3700 or visit us online at www.brgslaw.com.


Sincerely,



Richard S. Rosenberg

Katherine A. Hren

Olga G. Peña

www.brgslaw.com

LOS ANGELES OFFICE

15760 Ventura Boulevard

18th Floor

Encino, CA 91436

818.508.3700


EAST COAST OFFICE

18067 West Catawba Avenue

Suite 201

Cornelius, NC 28031

704.765.1409


ORANGE COUNTY OFFICE

3 Park Plaza

Suite 1520

Irvine, CA 92614

949.431.0470