Tool · 3 min read · June 30, 2026

Do I Need to Disclose AI in Hiring? Decision Tree

Use this simple flowchart to determine if your company needs to disclose AI use to candidates, and which regulations apply to your hiring process.

Devyn Bartell
Founder & CEO, EmployArmor
Published February 20, 2025

Not sure if your hiring tools require candidate disclosure? This decision tree walks you through the key questions to determine your obligations. Answer each question to identify which regulations apply and what actions you need to take.

Start Here: The Main Question

Do you use any technology in hiring that uses AI, machine learning, automated scoring, or algorithmic decision-making?

YES → Continue below
NO → No disclosure required*

*If you're unsure whether your tools use AI, see our guide: What Counts as AI in Hiring?

Question 1: Where Are Your Positions?

Check all locations where you have open positions or may hire employees:

  • New York City

    → NYC Local Law 144 applies (already in effect)

  • Illinois

    → HB 3773 applies (effective January 1, 2026)

  • Colorado

    → Colorado AI Act applies (effective June 30, 2026)

  • California

    → CCPA ADMT rules apply (if meeting business thresholds)

  • Maryland

    → HB 1202 applies if using video interview AI

  • Remote positions (work from anywhere)

    → Any of the above may apply, depending on candidate location

  • None of the above

    → Currently no specific disclosure law, but best practice to disclose

Question 2: What Type of AI Do You Use?

Resume Screening / Candidate Ranking

Do you use AI to screen, score, or rank resumes?

  • YES → Disclosure required (NYC, IL, CO, CA)
  • Examples: LinkedIn Recruiter matching, Indeed recommendations, AI-powered ATS screening

Video Interview Analysis

Do you use AI to analyze video interviews?

  • YES → Disclosure required (NYC, IL, CO, CA, MD)
  • Examples: HireVue, Pymetrics, facial expression or tone analysis

AI-Scored Assessments

Do you use AI-powered skills tests or assessments?

  • YES → Disclosure required (NYC, IL, CO, CA)
  • Examples: Cognitive assessments, personality tests with ML scoring, game-based assessments

Chatbots / Virtual Assistants

Do you use chatbots that screen or evaluate candidates?

  • YES (if it influences decisions) → Disclosure required
  • NO (if scheduling only) → Typically not required

Background Check AI

Do you use AI for background screening beyond verification?

  • YES (if predictive/scoring) → Likely disclosure required
  • NO (basic verification only) → May not require AI disclosure (but FCRA applies)

Decision Summary

If you checked ANY box in Questions 1 AND 2:

You likely need to provide AI disclosure notices to candidates. The specific requirements depend on which jurisdictions apply.
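The decision tree above can be sketched as code. This is a hypothetical illustration of the article's logic, not legal advice: the function name `applicable_laws` and the category labels are invented here, and the mapping simply mirrors the jurisdictions and tool types listed in Questions 1 and 2.

```python
# Illustrative sketch of this article's decision tree. The mappings mirror
# the text above; the names and categories are hypothetical, not a real API.

# Which disclosure law each hiring location triggers (Question 1).
LAWS_BY_LOCATION = {
    "NYC": "NYC Local Law 144",
    "Illinois": "IL HB 3773",
    "Colorado": "Colorado AI Act",
    "California": "CCPA ADMT rules",
    "Maryland": "MD HB 1202 (video interview AI only)",
}

# AI tool categories that generally require disclosure (Question 2).
DISCLOSURE_TRIGGERING_TOOLS = {
    "resume_screening",
    "video_interview_analysis",
    "ai_scored_assessment",
    "evaluating_chatbot",
    "predictive_background_check",
}

def applicable_laws(locations, ai_tools):
    """Return the laws likely requiring disclosure, or an empty set."""
    if not (DISCLOSURE_TRIGGERING_TOOLS & set(ai_tools)):
        return set()  # No covered AI in use: no AI-specific disclosure.
    laws = {LAWS_BY_LOCATION[loc] for loc in locations if loc in LAWS_BY_LOCATION}
    # Maryland's HB 1202 is specific to video interview AI.
    if "video_interview_analysis" not in ai_tools:
        laws.discard("MD HB 1202 (video interview AI only)")
    return laws

# Example: hiring in NYC and Colorado with AI resume screening.
print(applicable_laws({"NYC", "Colorado"}, {"resume_screening"}))
```

Note the scheme is intentionally conservative: any covered tool in any covered location flags a likely obligation, which matches the "when in doubt, disclose" guidance later in this article.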

What You Need to Do by Jurisdiction

NYC (Local Law 144) — Already Required

  • ☐ Independent bias audit (annually)
  • ☐ Post audit summary on website
  • ☐ Notify candidates 10 business days before AEDT use
  • ☐ Explain what AI evaluates and data sources
  • ☐ Offer alternative process if available

Illinois (HB 3773) — January 1, 2026

  • ☐ Notify candidates before or at time of AI use
  • ☐ Explain what AI is used for
  • ☐ Explain data inputs and how outputs influence decisions
  • ☐ Ensure AI doesn't discriminate on protected characteristics

Colorado (AI Act) — June 30, 2026

  • ☐ Complete impact assessment before deployment
  • ☐ Notify candidates before AI is used
  • ☐ Explain purpose and decision type
  • ☐ Offer opt-out from AI profiling
  • ☐ Provide appeal process for adverse decisions
  • ☐ Send adverse decision statement if not selected

California (CCPA ADMT) — Already Required

  • ☐ Pre-use notice explaining ADMT
  • ☐ Describe logic, inputs, and outputs
  • ☐ Offer opt-out from ADMT processing
  • ☐ Respond to access requests within 45 days
  • ☐ Complete risk assessment

Maryland (HB 1202) — If Using Video AI

  • ☐ Obtain consent before using facial recognition in interviews
  • ☐ Provide notice that video AI will be used

Still Not Sure?

Here are some common scenarios:

"We just use LinkedIn Recruiter"

LinkedIn Recruiter uses AI for matching and recommendations. If you're hiring in regulated jurisdictions and using these AI features to inform decisions, disclosure is likely required.

"Our ATS does automatic screening, but it's not really AI"

If the ATS uses machine learning, natural language processing, or algorithmic scoring to filter or rank candidates, it's probably covered. Check with your vendor.

"We only use AI to source candidates, not decide on them"

The laws generally apply when AI is used to make or substantially assist employment decisions. If AI helps determine who gets contacted or considered, disclosure may still apply.

"We're a small company"

NYC and Illinois apply regardless of company size. Colorado applies to "deployers" of high-risk AI. California CCPA has business size thresholds. Check each law's scope carefully.

When in Doubt, Disclose

Disclosing AI use when not strictly required has minimal downside. Failing to disclose when required can result in fines and legal liability. Transparency builds trust.

Edge Cases and Exceptions

Some scenarios require careful analysis:

Internal Promotions and Transfers

Question: Do AI disclosure rules apply to internal hiring?

Answer: It depends on the jurisdiction. NYC Local Law 144 explicitly covers "promotion or selection for hire." Colorado's AI Act applies to "consequential decisions" affecting employment. Illinois HB 3773 applies to "applicants" but is less clear about internal candidates. Best practice: disclose AI use even for internal moves to ensure consistency and avoid claims of disparate treatment. Many organizations extend the same transparency to internal and external candidates.

Contractors and Gig Workers

Question: If we use AI to select contractors, does this count as "hiring"?

Answer: The laws primarily focus on "employment" relationships, which typically means W-2 employees. Independent contractors (1099) may not be covered under employment laws, but this is a grey area. If your contractor relationship looks like employment (exclusive, long-term, controlled work), regulators might argue the protections apply. California's CCPA ADMT rules include some contractor protections. When in doubt, treat contractor selection the same as employee hiring for disclosure purposes.

Pre-Application Screening

Question: What if we use AI to decide who to invite to apply, before they're officially "applicants"?

Answer: This is increasingly common (AI-powered sourcing, predictive analytics on passive candidates). If AI determines who gets recruitment outreach, you're making a consequential decision about who has opportunity. While technical applicability varies by jurisdiction, EEOC guidance on AI in hiring emphasizes transparency at all stages. Disclose AI use when first contacting candidates: "We used AI to identify you as a potential match for this role based on your public profile."

Multi-Tool Scenarios

Question: We use three different AI tools at different stages. Do we need three disclosures?

Answer: You can provide a comprehensive disclosure covering all AI tools used in your hiring process, or stage-specific disclosures. Comprehensive disclosure is simpler operationally and ensures candidates understand the full scope of AI use upfront. If tools vary significantly by role or location, targeted disclosures may be clearer. NYC requires disclosure "at least 10 business days before use" of each AEDT, so timing matters more than number of notices.

Vendor-Hosted AI (You Don't "Own" the AI)

Question: Our ATS vendor uses AI on their platform. Are we responsible for disclosure even though we don't control the AI?

Answer: Yes. As the employer making hiring decisions, you're responsible for compliance regardless of whether you own or just use the AI. The legal concept is "deployer" liability—you deployed the AI in your hiring process. Your vendor should support your compliance (provide documentation, audit support), but the obligation to disclose to candidates is yours. This is why vendor contracts should include compliance cooperation provisions.

AI in Recruiting, Human in Hiring

Question: AI only creates a shortlist, but humans make all final decisions. Still need disclosure?

Answer: Yes. The laws apply when AI "substantially assists" decisions, not just when AI makes final decisions. If AI determines who makes it to the human review stage, it substantially assisted by filtering out other candidates. NYC uses the phrase "substantially assists or replaces" decision-making. Colorado refers to AI that "materially impacts" decisions. The test is whether the AI influenced the opportunity, not whether humans had final say.

Advanced Decision Tree: Disclosure Timing

Beyond whether to disclose, when you disclose matters:

Disclosure Timing Requirements

  • NYC: at least 10 business days before using the AEDT. Penalty for late disclosure: $500-1,500 per violation
  • Illinois: before or at the time of use. Penalty: injunction, damages
  • Colorado: before use. Penalty: $5,000-10,000 per violation
  • California: before ADMT processing. Penalty: CCPA penalties ($2,500-7,500 per violation)
  • Maryland: before video AI use, plus consent. Penalty: statutory damages

Best practice: Provide disclosure in the job posting or application confirmation email to ensure sufficient advance notice for all jurisdictions.
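For NYC's 10-business-day rule, it helps to compute the earliest permissible AEDT use date from the disclosure date. Here is a minimal sketch: the function name is hypothetical, it counts Monday through Friday only, ignores holidays, and the exact counting convention should be confirmed with counsel.

```python
# Hypothetical helper for NYC Local Law 144's "at least 10 business days
# before use" timing. Counts Mon-Fri only; does NOT account for holidays.
from datetime import date, timedelta

def earliest_aedt_use(disclosure_date: date, business_days: int = 10) -> date:
    """Earliest date an AEDT may be used after disclosure on disclosure_date."""
    d = disclosure_date
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return d

# Disclosure sent Monday 2025-03-03 -> earliest use Monday 2025-03-17.
print(earliest_aedt_use(date(2025, 3, 3)))
```

In practice this is why disclosure in the job posting is the safest channel: it starts the clock before any AI screening could plausibly run.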

Disclosure Delivery Methods

How you deliver the disclosure matters as much as what you say:

Job Posting

  • Pros: Maximum advance notice, reaches candidates before they invest time
  • Cons: Not all candidates read full postings, may deter some applicants
  • Best for: High-volume roles, meeting NYC's 10-day requirement

Application Confirmation Email

  • Pros: Reliably delivered, trackable, arrives at natural touchpoint
  • Cons: May not meet 10-day advance notice if AI screening is immediate
  • Best for: Colorado, Illinois, California (immediate notice requirements)

Dedicated Disclosure Page

  • Pros: Comprehensive information in one place, referenceable URL
  • Cons: Candidates must navigate to it, link could break
  • Best for: Detailed transparency for informed candidates

In-App Notice (Before Assessment)

  • Pros: Impossible to miss, contextual to the AI interaction
  • Cons: May not provide sufficient advance notice, disrupts flow
  • Best for: AI-powered assessments, video interview tools

Multi-Channel Approach (Recommended)

  • Disclose in job posting (meets advance notice requirement)
  • Repeat in application confirmation email (ensures receipt)
  • Remind before AI assessment stages (contextual reinforcement)
  • Link to comprehensive disclosure page (for detail-seekers)

Common ATS Platforms: AI Disclosure Checklist

Quick reference for determining if your ATS requires disclosure:

ATS AI Feature Analysis

  • Workday Recruiting: Uses machine learning for candidate matching, job recommendations, and resume parsing. AI disclosure required if using SmartMatch or similar features.
  • Greenhouse: Core product is mostly non-AI, but partner integrations (structured interviews, assessments) may include AI. Check your specific integrations.
  • Lever: Uses AI for candidate matching, sourcing recommendations, and duplicate detection. Disclosure required for matching/ranking features.
  • iCIMS: Includes AI-powered candidate experience tools, screening, and matching. Check with iCIMS about specific features you've enabled.
  • LinkedIn Recruiter: Uses AI for talent pool suggestions, candidate ranking, and InMail prioritization. Disclosure required.
  • Indeed: Uses AI for resume search ranking and candidate recommendations. Disclosure required if using sponsored jobs with AI targeting.
  • Bullhorn (staffing): Includes AI matching and automation. Disclosure required if AI influences candidate submissions.

Note: AI features evolve rapidly. Verify current AI functionality with your vendor and review their terms of service for compliance support obligations.

Federal Considerations (EEOC)

While no federal AI disclosure law exists yet, the EEOC has issued guidance affecting how you should approach AI hiring:

  • Title VII implications: AI tools that produce adverse impact on protected groups can violate federal anti-discrimination law even if state AI laws don't apply
  • ADA concerns: AI that screens out candidates based on disability-related characteristics may violate the Americans with Disabilities Act
  • ADEA issues: Age bias in AI violates the Age Discrimination in Employment Act
  • EEOC enforcement: The EEOC has stated it will hold employers accountable for discriminatory AI, regardless of whether they developed the AI themselves
  • Reasonable accommodation: Candidates with disabilities may request alternatives to AI evaluation as a reasonable accommodation under the ADA

Takeaway: Even if you're not in a state with AI disclosure laws, proactive transparency and bias monitoring protect against federal discrimination claims. The EEOC views "we didn't know the AI was biased" as an insufficient defense.

International Hiring Considerations

Hiring candidates outside the US? Additional regulations may apply:

  • EU AI Act: High-risk AI systems for employment decisions require conformity assessment, transparency, human oversight. Effective 2026-2027 in phases.
  • GDPR (EU): Automated decision-making with legal or similarly significant effects requires explicit consent or another explicit legal basis, plus a right to explanation
  • UK: ICO guidance on AI and data protection, GDPR-style transparency requirements
  • Canada: PIPEDA applies to automated decision-making, right to explanation

If you hire internationally, consult with employment counsel in those jurisdictions. US state laws may be just the beginning of your compliance obligations.

Frequently Asked Questions

What if we start using AI after a candidate already applied?

Provide disclosure as soon as you intend to use AI on their application, even if they applied weeks ago. For NYC, wait 10 business days after disclosure before running AI evaluation. For other jurisdictions, disclosure before AI use is sufficient. Document the disclosure date in your ATS. This is common when implementing new AI tools mid-hiring cycle.

Do we need separate disclosures for each job posting?

Not necessarily. If you use the same AI tools consistently across all roles, a general disclosure about your hiring process can apply to all postings. Link to a central "AI in Our Hiring Process" page from all job postings. However, if AI use varies significantly by role (e.g., technical roles use coding AI, others don't), role-specific disclosures are clearer and avoid over-disclosure.

How detailed does the disclosure need to be?

Requirements vary by jurisdiction. NYC requires explaining what job qualifications/characteristics the AI evaluates. Colorado requires describing the purpose, decision type, and opt-out rights. California requires describing logic, inputs, and outputs. When complying with multiple jurisdictions, provide the most comprehensive disclosure (Colorado/California level) to all candidates. Avoid generic "we may use AI" language—be specific about what the AI does.

Can we comply by just linking to our vendor's disclosure?

No. While you can (and should) leverage vendor-provided documentation, the disclosure must come from you, the employer, and explain how you use the AI in your hiring process. Candidates applied to your company, not your vendor. Your disclosure should explain: "We use [Vendor Tool] to [specific purpose]" and link to vendor documentation for technical details. A bare link to a vendor's generic disclosure doesn't meet the spirit or letter of the laws.

What happens if we forget to disclose to some candidates?

First, stop using AI on those candidates until you've provided proper disclosure. Second, send disclosure immediately (better late than never). Third, wait the required notice period before resuming AI-assisted evaluation. Fourth, document the gap and corrective action. Fifth, investigate why the disclosure failed (technical glitch, process breakdown, human error) and fix it. Proactive self-correction demonstrates good faith and may reduce penalties if regulators inquire. Consider retrospective review: did the disclosure gap disadvantage those candidates?

Quick Self-Assessment

Answer these questions to gauge your disclosure readiness:

  1. ☐ I can name every AI tool we use in hiring and explain what each does
  2. ☐ I know which states/cities we hire in and which AI laws apply
  3. ☐ We have written disclosure notices for all applicable jurisdictions
  4. ☐ Our disclosures are integrated into our ATS and job posting workflow
  5. ☐ We can prove (with logs/records) that disclosures were delivered to candidates
  6. ☐ We provide disclosure with sufficient advance notice (10+ days for NYC)
  7. ☐ Our vendor contracts require them to support our compliance efforts
  8. ☐ We have a process for handling candidate questions about our AI

Score: 8/8 = Excellent. 6-7 = Good, minor gaps. 4-5 = Concerning gaps. 0-3 = Urgent action needed.
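The scoring bands above can be expressed as a tiny helper. This is purely illustrative: the function name `readiness` is invented, and the thresholds simply restate the bands in the line above.

```python
# Hypothetical scorer for the 8-item self-assessment; thresholds mirror
# the article's bands (8 / 6-7 / 4-5 / 0-3). Illustrative only.
def readiness(checked_items: int) -> str:
    if checked_items == 8:
        return "Excellent"
    if checked_items >= 6:
        return "Good, minor gaps"
    if checked_items >= 4:
        return "Concerning gaps"
    return "Urgent action needed"

print(readiness(7))  # -> Good, minor gaps
```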

Next Steps

  1. Audit your tools: Confirm which tools use AI and how (see What Counts as AI)
  2. Map your jurisdictions: Determine which laws apply based on where you hire
  3. Create your notices: Use our disclosure templates customized to your tools and jurisdictions
  4. Integrate disclosures: Build disclosure delivery into job postings, ATS workflows, and assessment platforms
  5. Verify delivery: Test that disclosures reach candidates and document proof of delivery
  6. Get compliant: Take our free compliance scorecard to assess your overall readiness


Disclaimer: This content is for informational purposes only and does not constitute legal advice. Employment laws vary by jurisdiction and change frequently. Consult a qualified employment attorney for guidance specific to your situation. EmployArmor provides compliance tools and resources but is not a law firm.

Ready to get compliant?

Take our free 2-minute assessment to see where you stand.