Candidate Rights Under AI Hiring Laws: What Job Seekers Can Demand
Job seekers have more rights than ever when AI is used in hiring decisions. Here's what the law guarantees—and how to exercise those rights.
Category: Candidate Rights
Read Time: 12 min
Published: February 23, 2026
By [Author Name]
If you're applying for jobs in 2026, there's a good chance artificial intelligence will evaluate your application before any human sees it. AI might screen your resume, analyze your video interview, score your assessment responses, or rank you against other candidates. This isn't speculation—it's standard practice at many employers.
But here's what many job seekers don't know: you have legal rights when employers use AI in hiring. Depending on where you live or where the job is located, you may have the right to know when AI is being used, understand how it evaluates you, opt out of AI evaluation, request human review, and even demand deletion of your data.
This guide explains those rights—and how to exercise them.
Know Your Rights:
- ✓ Right to disclosure (know when AI is used)
- ✓ Right to explanation (understand what AI evaluates)
- ✓ Right to opt out (request non-AI evaluation)
- ✓ Right to human review
- ✓ Right to data deletion
- ✓ Right to accommodation (if you have a disability)
Your Rights Vary by Location
AI hiring laws are state and local, not federal (yet). Your rights depend on where you live or where the job is located. Here's the breakdown:
Strongest Protections: Illinois, NYC, California, Colorado, Maryland
If you're applying for jobs in these jurisdictions, you have the most comprehensive rights:
- Illinois: Right to notice and consent before AI video analysis; right to data deletion within 30 days. For more details, see the Illinois Department of Labor guidelines on the Artificial Intelligence Video Interview Act (AIVIA).
- New York City: Right to know AI is used at least 10 business days in advance; right to alternative evaluation under Local Law 144. Visit the NYC Department of Consumer and Worker Protection for official resources.
- California: Right to disclosure before application; right to annual bias audit transparency. Refer to the California Civil Rights Department for enforcement information.
- Colorado: Right to opt out of AI evaluation; right to human review. Check the Colorado Attorney General's Office for updates on AI consumer protections.
- Maryland: Right to consent before facial analysis; right to revoke consent and delete data. See the Maryland Attorney General's Office for biometric data laws.
These states lead in protecting candidates from opaque AI systems, ensuring transparency and fairness in automated hiring processes. For instance, in Illinois, the AIVIA specifically targets video-based AI tools, requiring employers to obtain informed consent and provide clear notices about data usage.
Moderate Protections: Washington, Other Emerging States
States with newer or narrower laws provide some protections, typically around disclosure and notice. For example:
- Washington State: Requires notice for AI use in high-risk employment decisions, with ongoing proposals for expanded opt-out rights. Monitor the Washington State Attorney General for legislative updates.
- Other Emerging Areas: States like New Jersey and Connecticut are introducing bills for AI disclosure in hiring. These proposals typically center on notice and transparency, and often add state-specific requirements, such as mandatory reporting on AI impacts.
While not as robust as the leading states, these protections are evolving rapidly, influenced by national discussions on AI ethics.
Baseline Federal Rights: Everywhere
Even in states without specific AI hiring laws, you have baseline protections under federal law:
- Title VII (Civil Rights Act): Protection from race, sex, national origin, religion discrimination—including algorithmic discrimination. The U.S. Equal Employment Opportunity Commission (EEOC) enforces this, with recent guidance on AI tools (see EEOC's AI and Algorithmic Fairness page).
- ADA (Americans with Disabilities Act): Right to reasonable accommodation if AI tools disadvantage you due to disability. Learn more at the U.S. Department of Justice ADA site.
- ADEA (Age Discrimination in Employment Act): Protection from age-based algorithmic bias (if you're 40+). The EEOC oversees ADEA claims, emphasizing that AI cannot perpetuate age stereotypes.
Federal laws apply nationwide, providing a safety net. For example, the EEOC's 2023 guidance clarifies that employers must ensure AI systems do not disproportionately impact protected groups, or they risk disparate impact claims.
The Right to Know: Disclosure
The most fundamental right: you must be told when AI is being used to evaluate you.
What Employers Must Disclose
In jurisdictions with strong laws, employers must tell you:
- That AI is being used: Not just "technology" or "software"—specifically AI or automated decision-making.
- What the AI evaluates: Resume keywords? Video interview responses? Speech patterns? Facial expressions?
- How it affects decisions: Does it screen you out automatically? Rank you? Provide recommendations to humans?
- What data is collected: Video recordings? Voice patterns? Personal information?
Disclosures must be clear and conspicuous, often in plain language to avoid technical jargon that could confuse candidates.
When Disclosure Must Occur
- NYC: At least 10 business days before AI is used.
- California: Before you submit your application.
- Illinois: Before AI video analysis occurs.
- Colorado: At or before data collection.
Timely disclosure allows you to make informed choices, such as opting out early in the process.
What to Look For
Check for AI disclosures in:
- Job postings (often at the bottom)
- Company career pages
- Application forms (before you submit)
- Interview scheduling emails
Many employers now include standardized notices to comply with multiple laws simultaneously.
Red Flag: No Disclosure
If you suspect AI is being used but see no disclosure, the employer may be in violation. In NYC, California, Colorado, and Illinois, you can file a complaint with state regulators. For federal concerns, contact the EEOC at www.eeoc.gov/file-charge-discrimination.
The Right to Explanation
It's not enough to be told "AI is used." You have the right to understand what the AI evaluates and how.
Questions You Can Ask
Employers must be able to answer:
- "What specific factors does the AI evaluate?" (keywords, speech patterns, facial expressions, etc.)
- "What job qualifications is the AI assessing?"
- "Does the AI's recommendation determine my outcome, or do humans make the final decision?"
- "Has the AI been tested for bias? Can I see the results?"
These questions empower you to assess whether the AI aligns with fair hiring practices.
NYC Transparency Requirement
In NYC, employers must publicly publish bias audit results showing how their AI affects different demographic groups. You can request the URL where this information is posted. This requirement, part of Local Law 144, promotes accountability and helps candidates evaluate potential biases upfront.
Expanding on this, bias audits typically involve statistical analysis of AI outputs across protected classes, such as gender or race, to detect disparities. If audits reveal significant disparities, employers risk discrimination liability if they continue using the tool unchanged.
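The core selection-rate comparison behind many of these audits can be sketched in a few lines. The group names and counts below are hypothetical illustration data, and the 0.8 threshold is the EEOC's informal "four-fifths rule" of thumb; Local Law 144's rules define the exact calculations employers must report.

```python
# Sketch of the "impact ratio" metric commonly reported in bias audits.
# All group names and counts are hypothetical illustration data.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who passed the AI screen."""
    return selected / applicants

def impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate.

    Under the EEOC's informal "four-fifths rule," a ratio below 0.8
    is often treated as evidence of possible adverse impact.
    """
    rates = {g: selection_rate(s, n) for g, (s, n) in groups.items()}
    top = max(rates.values())
    return {g: rates[g] / top for g in rates}

# Hypothetical audit data: (candidates selected, total applicants)
audit = {
    "group_a": (80, 200),   # 40% selection rate
    "group_b": (30, 120),   # 25% selection rate
}

for group, ratio in impact_ratios(audit).items():
    flag = "below 0.8 threshold" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here group_b's 25% selection rate is only 62.5% of group_a's 40%, which falls below the four-fifths threshold and would be flagged for closer review.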
The Right to Opt Out
In Colorado and (via accommodation requests) other states, you can decline AI evaluation and request an alternative process.
What "Alternative Process" Means
Employers must offer a genuinely different evaluation method:
- Phone or live video interview instead of AI-analyzed asynchronous video
- Human resume review instead of AI screening
- Different assessment format (e.g., work sample instead of gamified AI test)
The alternative must be equivalent in rigor and opportunity, ensuring no disadvantage to you.
How to Opt Out
Look for opt-out instructions in the AI disclosure. Common methods:
- Email address: "To request alternative evaluation, contact hiring@company.com"
- Checkbox during application: "I prefer human-only evaluation"
- Phone number or contact form
Document your request with timestamps and responses for potential disputes.
Important: Opting Out Cannot Hurt You
Laws explicitly prohibit employers from penalizing candidates who opt out. If you request alternative evaluation, you should not experience:
- Longer wait times
- More difficult evaluation standards
- Automatic rejection
- Any other adverse treatment
If you believe you were penalized for opting out, document it and file a complaint with your state's labor/employment agency, such as the Colorado Department of Labor and Employment.
The Right to Human Review
Colorado law explicitly requires that humans make the final hiring decision, not AI alone. California and other states are moving in this direction.
What This Means
- AI can provide recommendations or scores
- But a human must review those recommendations
- The human must be able to override AI decisions
- Fully automated rejection (no human involvement) is prohibited
This safeguard prevents "black box" decisions where candidates are rejected without oversight.
How to Verify Human Review Happened
If you're rejected, you can ask:
- "Was my application reviewed by a human, or was I automatically screened out by AI?"
- "Can you tell me which person reviewed my application?"
- "Was the AI's recommendation the sole basis for my rejection, or did a human consider other factors?"
Employers aren't always required to answer these questions, but asking signals that you know your rights—and may prompt more careful review. In cases of suspected violations, escalate to agencies like the EEOC.
The Right to Data Deletion
Illinois: 30-Day Deletion Right
If you're in Illinois and an employer used AI to analyze your video interview, you can request deletion; the employer must destroy the video and all AI-generated data, including copies, within 30 days of your request.
How to request:
- Send an email to the contact address in the AI disclosure
- Subject line: "Video Interview Data Deletion Request"
- Include: Your name, email, position applied for, date of interview
- Request confirmation of deletion
Employers must delete the data within 30 days of your request; ask for written confirmation, such as an email stating the data has been erased, for your records.
Maryland: Consent Revocation
Maryland allows you to revoke consent for facial analysis at any time. Upon revocation, the employer must delete your data within 30 days. This aligns with broader biometric privacy laws in the state.
GDPR (If You're in the EU)
EU residents have a robust "right to erasure" under GDPR. If you're applying for a job with an EU-based employer or a U.S. company with EU operations, you can request deletion of your data. For guidance, see the European Data Protection Board.
In the U.S., while not all states have equivalent rights, the FTC's authority over unfair and deceptive practices can support deletion requests when an employer misrepresents how your data is handled.
The Right to Accommodation (ADA)
If you have a disability, you have federal ADA rights to reasonable accommodation—even if your state has no specific AI hiring law.
When to Request Accommodation
Request accommodation if the AI tool disadvantages you due to:
- Speech impediments: AI analyzes speech patterns or clarity
- Hearing impairments: The AI relies on audio prompts or evaluates your spoken responses
- Vision impairments: AI expects eye contact or visual tasks
- Autism spectrum: AI penalizes atypical eye contact, facial expressions, or speech patterns
- Anxiety/PTSD: AI-analyzed video interviews trigger severe anxiety
- Cognitive disabilities: Timed AI assessments or complex gamified tests
Accommodations ensure equal access, recognizing that AI may inadvertently discriminate against disabled candidates.
How to Request Accommodation
Timing: As soon as you learn AI will be used, preferably before the AI evaluation occurs.
Method: Email or call the HR contact listed in the job posting or AI disclosure.
What to say:
"I am applying for [position] and I have a disability that may affect my performance on the [AI video interview/assessment]. Under the ADA, I am requesting a reasonable accommodation. Specifically, I request [describe alternative: e.g., 'a phone interview instead of video,' 'extended time on the assessment,' 'turning off facial expression analysis'].
Please let me know what documentation you need and how we can proceed."
Keep requests professional and specific to facilitate dialogue.
What Employers Must Do
Employers must engage in an "interactive process"—a conversation about what accommodation is needed and what's reasonable. They cannot:
- Automatically reject your request
- Require extensive medical documentation for obvious accommodations
- Penalize you for requesting accommodation
- Force you to use the AI tool despite your disability-related concerns
Failure to accommodate can lead to ADA complaints filed with the EEOC.
How to File a Complaint
If an employer violates your AI hiring rights, you can file complaints with multiple agencies:
State/Local Agencies
- NYC: Department of Consumer and Worker Protection (for LL144 violations) – www.nyc.gov/site/dca
- California: Attorney General's Office – oag.ca.gov/privacy
- Colorado: Attorney General's Office – coag.gov/office-sections/consumer-protection
- Illinois: Department of Labor (for AIVIA violations) – labor.illinois.gov
Federal Agencies
- EEOC: For race, sex, age, disability, or other protected class discrimination claims – File at www.eeoc.gov
- DOL (Department of Labor): In some cases, for systemic violations – www.dol.gov
What to Include in Your Complaint
- Company name and job title you applied for
- Date of application/interview
- Description of AI tool used (if known)
- What disclosure you did or didn't receive
- What rights violation occurred (no disclosure, no opt-out offered, penalized for accommodation request, etc.)
- Any documentation (screenshots, emails, job posting copies)
Complaints are typically free and can lead to investigations, remedies, or even class actions if patterns emerge.
Practical Tips for Job Seekers
Tip 1: Read Everything Carefully
Don't skip the fine print. AI disclosures are often buried in privacy policies, terms of use, or at the bottom of job postings. Read them before applying. This proactive step can reveal hidden AI use and your available rights early.
Tip 2: Screenshot Everything
If you later need to prove you weren't disclosed to, or that you requested accommodation, you'll need evidence. Screenshot:
- Job postings
- AI disclosures (or lack thereof)
- Application pages
- Emails from employers
Digital records are crucial for building a case with regulators.
Tip 3: Ask Questions
Don't be afraid to email HR and ask about AI use. Questions like:
- "Do you use AI in your hiring process?"
- "What does the AI evaluate?"
- "Can I request human-only review?"
Professional employers will answer. Evasive responses are a red flag, potentially indicating non-compliance.
Tip 4: Optimize for AI (If You Don't Opt Out)
If you choose to proceed with AI evaluation, understand how to optimize:
- Resumes: Use keywords from the job description; avoid unusual formatting, such as tables, columns, or graphics, that AI parsers can mishandle. Tools like ATS-friendly resume builders can help.
- Video interviews: Good lighting, quiet environment, look at the camera, speak clearly. Practice to reduce anxiety and improve scores.
- Assessments: Practice similar assessments beforehand; read instructions carefully. Many platforms offer demo versions.
Balancing optimization with rights awareness maximizes your chances.
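As a rough illustration of why keyword choice matters, here is a hypothetical sketch of a naive keyword screen. The function names, stopword list, and sample text are all invented for illustration; real applicant tracking systems are far more sophisticated, but the underlying idea of overlap with the job description is the same.

```python
# Hypothetical sketch of naive keyword matching in a resume screen.
# Real ATS software is more sophisticated; this only illustrates the idea.
import re

STOPWORDS = {"a", "an", "the", "and", "or", "of", "to", "with", "in", "for"}

def keywords(text: str) -> set[str]:
    """Lowercase word tokens, minus common filler words."""
    words = re.findall(r"[a-z][a-z+#]*", text.lower())
    return {w for w in words if w not in STOPWORDS}

def coverage(resume: str, job_posting: str) -> tuple[float, set[str]]:
    """Fraction of job-posting keywords found in the resume, plus the gaps."""
    wanted = keywords(job_posting)
    missing = wanted - keywords(resume)
    return (len(wanted - missing) / len(wanted), missing)

job = "Seeking analyst with Python, SQL, and Tableau experience"
resume = "Data analyst experienced in Python and SQL reporting"

score, missing = coverage(resume, job)
print(f"Keyword coverage: {score:.0%}; missing: {sorted(missing)}")
```

Note that in this sample, "experienced" does not match "experience": exact-match screens miss word variants, which is one reason mirroring the posting's phrasing can help.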
Tip 5: Know When to Opt Out
Consider opting out if:
- You have a disability that AI might penalize
- You're a non-native English speaker and the AI analyzes speech
- The AI evaluates things not clearly job-related (facial expressions, personality traits)
- You're uncomfortable with video analysis
Opting out is a strategic choice, not a setback, especially in rights-strong states.
The Future: Rights That Are Coming
Proposed legislation and emerging trends suggest candidates will gain additional rights:
Right to Explanation of Decisions
Future laws may require employers to explain why AI rejected you—what factors led to the decision. This "explainable AI" mandate is gaining traction in bills like the federal Algorithmic Accountability Act.
Right to Appeal/Contest
Ability to challenge AI decisions and have a human reconsider without the AI's influence. Pilot programs in states like California are testing appeal mechanisms.
Right to Bias Audit Transparency
Expansion of NYC's model: all employers must publish bias audit results, not just those in NYC. National standards could standardize audits, making them accessible via public databases.
Right to AI-Free Hiring
Some advocates are pushing for opt-out to be the default—candidates must opt in to AI evaluation, not opt out. Organizations like the ACLU are lobbying for this shift to prioritize human-centered hiring.
As AI evolves, federal legislation may eventually harmonize state laws, though no timeline is certain.
Resources for Job Seekers
Where to Learn More
- EEOC AI Guidance: eeoc.gov (search "AI hiring") – Official federal resources on avoiding discrimination.
- NYC DCWP: Information on Local Law 144 rights – www.nyc.gov/site/dca/workers/ai-hiring.page
- ACLU: Resources on AI and civil rights – www.aclu.org/issues/privacy-technology/surveillance-technologies/ai-and-algorithmic-justice
- Electronic Privacy Information Center (EPIC): AI accountability resources – epic.org/issues/ai/
These government and nonprofit sites provide free, reliable information.
Legal Aid
If you believe you've been discriminated against by AI hiring tools and need legal help:
- Contact your state's Legal Aid Society – Search via www.lsc.gov
- EEOC offers free investigation and potential representation – www.eeoc.gov/field-office
- Employment lawyers often work on contingency (no upfront fees) – Find via www.americanbar.org
Seek advice promptly, as time limits apply to filings.
For Employers: Respect Candidate Rights
EmployArmor helps you comply with all candidate rights requirements.
Get Compliant →
Frequently Asked Questions
If I opt out of AI evaluation, will employers think I have something to hide?
Opting out is a legal right, not a red flag. Many legitimate reasons exist: disability, discomfort with video analysis, belief that AI doesn't capture your strengths. Professional employers understand this and won't penalize you. Laws in states like Colorado explicitly protect against retaliation.
Can I find out my AI score after being rejected?
Currently, most laws don't require employers to disclose individual scores. However, you can ask—some employers will provide feedback. If there's evidence the AI score was the sole reason for rejection and you're in a protected class, an EEOC complaint might compel disclosure during investigation. Future laws may mandate this transparency.
What if I'm applying from one state for a job in another state—which laws apply?
Generally, the laws of both your location and the job location may apply. If you're in Illinois applying for a California job, you get the protections of both states. Employers should comply with the stricter standard. Consult resources like the EEOC for multi-jurisdictional guidance.
Can I sue an employer for AI hiring discrimination?
Yes, under federal laws (Title VII, ADA, ADEA) and some state laws. You typically must file an EEOC or state agency complaint first. If the agency doesn't resolve it, you can file a lawsuit. Remedies can include back pay, compensatory damages, and changes to the employer's hiring practices. Consult an employment lawyer for personalized advice.
How do I know if AI was used if there was no disclosure?
Signs: automated immediate rejection, instructions to record video responses on your own time, online assessments with games or puzzles, very fast screening (minutes after applying). If you suspect AI was used without disclosure, ask the employer directly or file a complaint with the relevant agency, such as the FTC for deceptive practices.
Related Resources
- Complete AI Hiring Compliance Guide 2026
- Video Interview AI Compliance
- 2026 AI Hiring Laws: What Changed
Legal Disclaimer: This content is for informational purposes only and does not constitute legal advice. Laws change frequently, and individual circumstances vary. Consult a qualified attorney for advice specific to your situation. EmployArmor is not a law firm.