What are the EEOC AI hiring guidelines?
The EEOC's AI hiring guidelines are rooted in Title VII of the Civil Rights Act, the Americans with Disabilities Act (ADA), and the Age Discrimination in Employment Act (ADEA). The core principle: employers may not use AI or algorithmic hiring tools that discriminate, whether intentionally or through facially neutral practices with disparate impact on protected groups; a tool that produces disparate impact is unlawful unless shown to be job-related and consistent with business necessity. Under Title VII, employers must track selection rates by race, sex, and national origin; validate tools showing adverse impact; and consider alternative practices with less discriminatory effect. The ADA requires individualized assessment and reasonable accommodation for candidates with disabilities. The ADEA requires monitoring for age-based adverse impact. All three statutes place the legal compliance obligation on the employer, not the AI vendor. EmployArmor helps employers operationalize these obligations by cataloging AI tools, collecting disparate impact data, and maintaining EEOC compliance documentation.
Last updated: March 2026
EEOC AI Hiring Guidelines: Key Requirements
| Law | Protected Class | Key AI Hiring Obligation |
|---|---|---|
| Title VII | Race, color, religion, sex, national origin | Track selection rates; validate if adverse impact found |
| ADA | Disability | Individualized assessment; reasonable accommodation process |
| ADEA | Age (40 and older) | Monitor age-based selection rates; validate if adverse impact |
| UGESP | All protected classes | Job-relatedness validation for tools with disparate impact |
How Employers Can Operationalize EEOC AI Hiring Guidance
1. Catalog Every AI Tool in the Hiring Process
Identify all AI and automated tools used across sourcing, screening, ranking, assessment, and interview stages — including ATS AI features, job board matching algorithms, video interview analyzers, and chatbot pre-screeners. Map each tool to the specific EEOC-covered protected classes it may affect.
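A catalog like this can be kept as structured data so tools can be filtered by stage or affected class. The sketch below is illustrative only: the record fields, tool names, and stage labels are assumptions, not an EEOC-mandated schema.

```python
from dataclasses import dataclass

# Hypothetical catalog entry for one AI hiring tool. Field names and
# values are illustrative assumptions, not a regulatory schema.
@dataclass
class AIToolRecord:
    name: str
    hiring_stage: str            # e.g. "sourcing", "screening", "ranking", "assessment", "interview"
    vendor: str
    protected_classes_affected: list

catalog = [
    AIToolRecord("Resume ranker", "ranking", "VendorX",
                 ["race", "sex", "national origin", "age", "disability"]),
    AIToolRecord("Video interview analyzer", "interview", "VendorY",
                 ["disability", "race", "sex"]),
]

# Flag tools that may affect candidates with disabilities for ADA review
ada_review = [t.name for t in catalog if "disability" in t.protected_classes_affected]
```

Mapping each tool to the protected classes it may affect makes the later steps (selection-rate tracking, ADA accommodation review) queryable rather than ad hoc.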
2. Request Technical Documentation from AI Vendors
Obtain available documentation from each AI vendor — including algorithmic design descriptions, training data provenance, validation studies, and disparate impact data — to establish what evidence exists to support job-relatedness and what gaps require additional employer-initiated analysis.
3. Track Selection Rates by Protected Group
For each AI tool used in hiring, collect and analyze selection rates disaggregated by race, sex, national origin, age group, and disability status. Calculate adverse impact ratios using the 4/5ths rule (each group's selection rate divided by the highest group's selection rate) to identify tools requiring attention.
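The 4/5ths rule calculation described above can be sketched in a few lines. The group labels and counts below are hypothetical numbers, used only to show the arithmetic.

```python
# Sketch of the 4/5ths (80%) rule: compare each group's selection rate
# to the highest group's rate. Counts below are hypothetical.
selections = {            # group -> (selected, applicants)
    "group_a": (48, 100),
    "group_b": (30, 100),
    "group_c": (45, 90),
}

rates = {g: sel / total for g, (sel, total) in selections.items()}
highest = max(rates.values())

# An impact ratio below 0.8 flags the group for adverse impact review
flagged = {g: round(r / highest, 3) for g, r in rates.items() if r / highest < 0.8}
```

Here group_c has the highest rate (0.50), group_a passes at a 0.96 ratio, and group_b is flagged at 0.60, so the tool would require validation or redesign with respect to that group.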
4. Commission Validation Studies for Tools with Adverse Impact
For AI tools showing adverse impact on any protected group, commission an independent validation study demonstrating job-relatedness and business necessity. This requires specifying the job performance criteria the tool is designed to predict and demonstrating a strong correlation between tool scores and actual job performance.
5. Establish ADA Accommodation Processes for AI Assessments
Create a documented process for candidates to request accommodations when AI assessment tools — particularly video interview analyzers and cognitive assessments — may disadvantage individuals with disabilities. Ensure responses to accommodation requests are timely, documented, and consistent.
6. Maintain Comprehensive EEOC Compliance Records
Keep records of all AI tools used, vendor documentation received, selection rate data by protected class, validation study results, and accommodation requests and outcomes. This documentation is evidence of good-faith compliance efforts in the event of EEOC inquiry or litigation.
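Record-keeping like this benefits from a consistent, serializable format per tool. The entry below is a minimal sketch under assumed field names (not a regulatory schema); serializing to JSON gives an auditable, timestamped trail.

```python
import datetime
import json

# Hypothetical compliance-record entry for one AI tool. Field names and
# values are illustrative assumptions, not an EEOC-prescribed format.
record = {
    "tool": "Resume ranker",
    "vendor_docs_received": ["validation_study_2025.pdf"],
    "selection_rate_review_date": "2026-01-15",
    "adverse_impact_found": False,
    "accommodation_requests": 2,
    "logged_at": datetime.date.today().isoformat(),
}

entry = json.dumps(record, sort_keys=True)
```

One such entry per tool per review cycle, retained alongside the underlying selection-rate data and validation reports, gives the documentation trail described above.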
Key Laws and Concepts
Title VII
Federal civil rights law that prohibits discrimination based on race, color, religion, sex, and national origin — foundational to EEOC's AI hiring disparate impact framework. Applies to all employers with 15+ employees.
ADA
Americans with Disabilities Act — prohibits discrimination based on disability and requires reasonable accommodations. Applies to employers with 15+ employees. AI tools that screen out individuals with disabilities without individualized assessment may violate ADA.
ADEA
Age Discrimination in Employment Act — prohibits discrimination against individuals 40 and older. AI hiring tools trained on recent graduate data may produce adverse impact on older workers if career history length or credential age is penalized.
4/5ths Rule
UGESP's disparate impact threshold: if a protected group's selection rate falls below 80% of the highest-selection group's rate, this is generally regarded as evidence of adverse impact, and the employer must validate the tool as job-related and consistent with business necessity.
Frequently Asked Questions
What federal laws govern AI hiring under EEOC guidance?
Three federal statutes ground EEOC AI hiring guidance: Title VII (race, color, religion, sex, national origin), the ADA (disability), and the ADEA (age 40+). EEOC has issued technical assistance documents addressing each statute's application to AI and algorithmic hiring tools, emphasizing that employers cannot delegate discrimination liability to AI vendors.
What is disparate impact theory and how does it apply to AI hiring?
Disparate impact under Title VII means facially neutral practices that disproportionately exclude protected groups are unlawful unless job-related and consistent with business necessity. For AI tools, employers must track selection rates by protected class, identify adverse impact using the 4/5ths rule, and validate tools that show adverse impact as predictive of job performance.
What are an employer's obligations under the ADA when using AI hiring tools?
The ADA requires individualized assessment for candidates potentially screened out by AI tools and a documented accommodation process for AI assessments that may disadvantage individuals with disabilities. Employers cannot rely solely on AI tool outputs when those tools may systematically underrate individuals with disabilities — reasonable accommodations and alternative assessment methods must be considered.
How does the ADEA apply to AI-powered hiring tools?
The ADEA prohibits discrimination against individuals 40 and older. AI hiring tools may inadvertently penalize older workers if trained on data that favors recent graduates or shorter career histories. Employers should monitor selection rates by age group and validate or redesign tools that show age-based adverse impact.
What validation requirements exist for AI hiring tools under EEOC guidance?
Under UGESP, employers must demonstrate job-relatedness for any selection procedure — including AI tools — that produces adverse impact. This typically requires a criterion-related validation study showing the AI tool's scores correlate with actual job performance. Content or construct validity studies may also be appropriate depending on the tool design. The obligation to validate rests with the employer, not the vendor.
How can employers operationalize EEOC AI hiring guidance in practice?
Practical steps include cataloging all AI tools, requesting vendor technical documentation, tracking selection rates by protected class, commissioning validation studies for tools with adverse impact, establishing ADA accommodation processes, and maintaining comprehensive compliance records. EmployArmor automates AEDT cataloging, disparate impact data collection, and EEOC compliance documentation.
EmployArmor automates EEOC AI hiring compliance — run a free compliance scan.
References
- NYC Administrative Code § 20-871–20-875 (Local Law 144 of 2021). NYC Dept. of Consumer and Worker Protection
- Illinois Artificial Intelligence Video Interview Act (820 ILCS 42). Illinois General Assembly
- Colorado SB 24-205, "Consumer Protections for Artificial Intelligence" (2024). Colorado General Assembly
- EEOC Technical Assistance, "The ADA and AI to Assess Job Applicants" (May 2022). U.S. Equal Employment Opportunity Commission