Meta Facebook Job Ad Discrimination: EEOC Settlement Guide

Meta settled EEOC charges over Facebook's job ad algorithm discriminating by age and gender. What employers using social media for hiring must know in 2026.

Meta Facebook Job Ad Algorithm Discrimination: The EEOC Settlement Employers Must Understand

Meta (formerly Facebook) settled charges with the U.S. Equal Employment Opportunity Commission (EEOC) in 2022 over allegations that its ad delivery algorithm discriminated based on age and gender when displaying job advertisements. The case—combined with a separate $115 million class action settlement over Facebook's job ad targeting features—established that social media advertising platforms are now enforcement territory for employment discrimination law.

Under the EEOC settlement agreement, Meta agreed to changes in how its advertising system handles job postings; separately, the company paid $115 million to resolve the class action brought on behalf of workers who were excluded from seeing job ads due to age and gender targeting.

The case is significant not just because of its size, but because of its breadth: both the platform (Meta) and the employers who used discriminatory targeting features faced legal liability. If your company has used Facebook or Instagram job advertising with demographic audience restrictions, you may have residual legal exposure that requires immediate audit. Track this and related cases in our AI hiring lawsuits tracker.

What Happened: The Facebook Job Ad Discrimination Timeline

2016–2018: Discriminatory Targeting Features Available

Facebook's advertising platform allows advertisers to target job ads by age, gender, and other demographic characteristics. Employers across industries use these features to exclude older workers and women from seeing job postings for specific roles.

2018: ACLU and ProPublica Investigation

The American Civil Liberties Union and ProPublica publish a joint investigation demonstrating that major employers—including Verizon, Amazon, Target, Goldman Sachs, and others—are using Facebook to exclude workers over 40 from seeing job ads. The investigation triggers widespread regulatory attention.

2019: First EEOC Charges

The EEOC files charges against multiple employers who used Facebook's demographic targeting in job advertisements. Simultaneously, a class action lawsuit is filed against Facebook itself, alleging the platform violated the ADEA (29 U.S.C. § 623) and Title VII (42 U.S.C. § 2000e) by enabling discriminatory targeting.

2022: Meta Settlement

Meta reaches a settlement resolving the class action. Meta agrees to:

  • Pay $115 million in damages
  • Eliminate age and gender targeting options for job, housing, and credit ads in the United States
  • Implement a "Special Ad Category" for employment ads requiring broader audience delivery
  • Submit to third-party monitoring of its advertising systems

2023–2026: Employer Enforcement Continues

The EEOC continues pursuing employers who used discriminatory ad targeting. The FTC opens a separate investigation into whether Meta's advertising systems violate unfair trade practice laws. Plaintiff attorneys launch additional class actions against employers who used the targeting features. The OFCCP also examines algorithmic ad delivery for federal contractors as part of routine compliance audits.

Title VII of the Civil Rights Act (42 U.S.C. § 2000e-2(a)(2))

Title VII prohibits employers from limiting, segregating, or classifying job applicants in any way that would deprive or tend to deprive them of employment opportunities because of sex, among other protected characteristics. Configuring a Facebook job ad to show only to male users is a textbook violation of this provision.

The EEOC has long held that discriminatory job advertising is itself an unlawful employment practice—not just evidence of discriminatory hiring intent. The Facebook case brought this principle into the social media advertising context.

The Age Discrimination in Employment Act (29 U.S.C. § 623(a))

The ADEA makes it unlawful for employers to "limit, segregate, or classify" employees or applicants in any age-discriminatory way. Setting a Facebook job ad to appear only to users aged 25–40 explicitly violates 29 U.S.C. § 623(a).

The ADEA covers workers 40 and older. When an employer excludes everyone over 40 from even seeing a job posting, it eliminates them from the applicant pool entirely—a clear ADEA violation.

ADEA penalty exposure: Back pay, front pay, liquidated damages (equal to back pay amount for willful violations), plus attorney's fees. Class actions can aggregate these damages across hundreds of affected workers.
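To make the exposure math concrete, here is a minimal sketch of how these components add up, using hypothetical dollar figures (actual recoveries depend on the facts of each case and exclude the attorney's fees noted above):

```python
def adea_exposure(back_pay: float, front_pay: float, willful: bool) -> float:
    """Rough ADEA damages total: liquidated damages equal to back pay
    are added when the violation is willful (29 U.S.C. § 626(b))."""
    liquidated = back_pay if willful else 0.0
    return back_pay + front_pay + liquidated

# Hypothetical single-worker figures: $60k back pay, $30k front pay, willful.
total = adea_exposure(60_000, 30_000, willful=True)
print(total)  # 150000.0
```

In a class action, this per-worker figure is then multiplied across every class member, which is how aggregate exposure reaches eight figures.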

Disparate Impact: When Algorithmic Delivery Is the Problem

Even without intentional demographic targeting, Facebook's algorithm could produce discriminatory results simply by optimizing for "engagement," because engagement patterns themselves encode past discrimination. If an algorithm learns that software engineering job ads get more clicks from men, it begins delivering those ads primarily to men, even without explicit demographic targeting.

Under the disparate impact doctrine established in Griggs v. Duke Power Co., 401 U.S. 424 (1971), this algorithmic behavior constitutes unlawful employment discrimination. Employers cannot escape liability by claiming the algorithm—not a human—made the targeting decision.
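A common first screen for this kind of skew is the "four-fifths rule" from the EEOC's Uniform Guidelines (29 CFR 1607.4(D)): a group's rate below 80% of the highest group's rate is treated as evidence of adverse impact. A minimal sketch with hypothetical delivery rates (this is a rule of thumb, not a substitute for formal statistical analysis):

```python
def four_fifths_check(rates: dict[str, float]) -> dict[str, bool]:
    """Flag groups whose rate falls below 80% of the highest group's rate,
    the EEOC's four-fifths rule of thumb for adverse impact."""
    top = max(rates.values())
    return {group: rate / top < 0.8 for group, rate in rates.items()}

# Hypothetical ad-delivery rates: share of each group shown the job ad.
flags = four_fifths_check({"under_40": 0.12, "over_40": 0.03})
print(flags)  # {'under_40': False, 'over_40': True}
```

Here workers over 40 see the ad at one quarter the rate of younger workers, well below the four-fifths threshold, so the delivery pattern would warrant investigation.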

State Equivalents

Multiple states have their own equivalents to Title VII and the ADEA with broader coverage or higher penalties. For a full breakdown, see our AI hiring laws by state:

  • California FEHA (Gov. Code § 12940): Covers employers with 5+ employees; no cap on compensatory damages
  • New York State Human Rights Law (Exec. Law § 296): Covers all employers; allows unlimited compensatory damages
  • Illinois Human Rights Act (775 ILCS 5): Covers employers with 15+ employees; broad protected class coverage

The Two-Layer Liability Problem: Platform and Employer

The Facebook/Meta enforcement pattern revealed a "two-layer liability" problem unique to social media job advertising:

Layer 1: Platform Liability (Meta)

Meta was liable for building and operating advertising tools that enabled discriminatory targeting. Meta's settlement required the company to:

  • Remove age and gender targeting for employment ads
  • Implement "Special Ad Category" rules requiring broader audience delivery
  • Monitor for algorithmic bias in ad delivery

Layer 2: Employer Liability

Employers who used the discriminatory targeting features remained separately liable for their own conduct. Using Facebook's age or gender targeting for job ads was itself an unlawful employment practice by the employer—Meta's eventual changes to the platform do not retroactively eliminate employer liability for past conduct.

According to EEOC data, the agency continued pursuing charges against employers who used discriminatory Facebook ad targeting even after Meta's settlement, treating the employer conduct as independently actionable.

This means: if your company used Facebook job ad demographic targeting before 2022 to exclude workers over 40 or to show ads only to one gender, you may have ongoing liability exposure even though Meta has since changed its platform. Use our AI hiring compliance checklist to assess your current exposure.

What "Special Ad Category" Means for Employers in 2026

Following the settlement, Meta implemented "Special Ad Category" (SAC) rules for employment, housing, and credit ads. When you classify your Facebook/Instagram job ad as an employment ad:

  • You cannot target by age, gender, zip code, or other legally protected characteristics
  • Meta's algorithm delivers ads to a broader, more demographically diverse audience
  • You lose the ability to use many standard audience targeting features

Some employers resisted the SAC classification, attempting to run job ads under other categories to preserve targeting options. This practice is itself legally risky—using non-employment ad categories to circumvent SAC rules while recruiting could be characterized as an intentional attempt to evade anti-discrimination requirements. This is an area the FTC has flagged as a deceptive trade practice.

Impact on Algorithmic Delivery: Beyond Intentional Targeting

The Meta case's most important long-term implication is not about intentional targeting—it's about algorithmic delivery bias. Even under SAC rules, Meta's algorithm still optimizes ad delivery based on engagement signals, which can reproduce demographic bias.

Research from the University of Southern California (2021) found that even without explicit demographic targeting, Facebook's algorithm naturally segregates ad delivery along gender and racial lines because historical engagement patterns reflect historical discrimination.

For employers, this means:

  • Running a job ad under SAC rules does not guarantee non-discriminatory reach
  • Employers must monitor delivery reports and analyze demographic patterns
  • If your "neutral" job ad consistently reaches a skewed demographic, you may need to adjust platform, targeting approach, or supplemental outreach

Review your state employment compliance obligations—some states impose additional monitoring requirements beyond federal law.

Comparison: Job Ad Platform Requirements by Law

| Platform / Method | Governing Law | Required Audit | Prohibited Targeting |
| --- | --- | --- | --- |
| Facebook/Meta jobs | Title VII / ADEA | None required (but recommended) | Age, gender, race, religion |
| LinkedIn sponsored jobs | EO 11246 (federal contractors) | Affirmative action review | Demographic proxies |
| Indeed / ZipRecruiter | Title VII | None required | Protected class exclusion |
| ATS/algorithmic screening | NYC Local Law 144 | Annual bias audit | Disparate impact |
| Video interview AI | IL AI Video Interview Act | Disclosure + consent | Demographic screening |

Who Was Actually Hurt: The Plaintiff Classes

The class action against Meta covered workers who were:

  • Excluded from seeing job ads due to age targeting (workers 40+)
  • Excluded from seeing job ads due to gender targeting (primarily women excluded from male-coded job categories)
  • Harmed by algorithmic delivery bias that resulted in demographic skew without explicit targeting

The settlement plaintiffs included workers from across the United States who could demonstrate that, but for the discriminatory ad delivery, they would have applied for positions they were qualified for but never saw.

According to court filings, expert analysis showed that Facebook's job ad delivery patterns produced statistically significant demographic disparities across multiple industries and job categories, consistent with systematic discrimination.

Practical Steps: Audit Your Past and Current Job Advertising

Audit Historical Facebook Job Ad Campaigns

If your company used Facebook job advertising before 2022 with demographic targeting:

  1. Retrieve your historical ad campaign data (Facebook Ad Manager maintains records)
  2. Identify any campaigns where age or gender targeting was used
  3. Assess the potential scope of affected workers
  4. Consult employment counsel about residual liability exposure
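Step 2 can be partly automated once you have exported your campaign data. A minimal sketch, assuming a hypothetical JSON export with one `targeting` dict per campaign (real Ads Manager exports use different field names; adapt `PROTECTED_KEYS` accordingly):

```python
import json

# Hypothetical export field names for age/gender restrictions.
PROTECTED_KEYS = {"age_min", "age_max", "genders"}

def flag_campaigns(path: str) -> list[dict]:
    """Return campaigns whose saved targeting restricted age or gender."""
    with open(path) as f:
        campaigns = json.load(f)
    flagged = []
    for c in campaigns:
        targeting = c.get("targeting", {})
        hits = sorted(PROTECTED_KEYS & targeting.keys())
        # An age_min of 18 alone is the platform default, not a restriction.
        if hits and not (hits == ["age_min"] and targeting["age_min"] <= 18):
            flagged.append({"id": c.get("id"), "restricted_on": hits})
    return flagged
```

The flagged list gives counsel a concrete inventory of campaigns to assess in step 3, rather than a vague sense that "some targeting was used."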

Configure Current Campaigns Correctly

For current Facebook/Instagram job advertising:

  1. Always select "Employment" as the Special Ad Category
  2. Configure targeting by skills, interests, and behaviors—not demographics
  3. Use geographic targeting at the city/county level (zip code targeting is prohibited under SAC)
  4. Review and save your audience configuration settings for records

Monitor Delivery Reports

After each campaign, download delivery reports showing demographic breakdowns of who saw your ads:

  1. Navigate to Ads Manager → Reports
  2. Download audience demographics for each job ad
  3. Compare to qualified labor market demographics for your region and job category
  4. Flag significant deviations for investigation
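Steps 3 and 4 can be scripted against the downloaded report. A minimal sketch, assuming a hypothetical CSV with `group` and `impressions` columns and a benchmark you supply from labor-market data (real report columns vary by platform; the 10% tolerance is an illustrative threshold, not a legal standard):

```python
import csv

def delivery_skew(report_path: str, benchmark: dict[str, float],
                  tolerance: float = 0.10) -> dict[str, float]:
    """Compare each group's share of impressions against a labor-market
    benchmark; return groups deviating beyond the tolerance."""
    counts: dict[str, int] = {}
    with open(report_path) as f:
        for row in csv.DictReader(f):
            counts[row["group"]] = counts.get(row["group"], 0) + int(row["impressions"])
    total = sum(counts.values())
    deviations = {}
    for group, expected in benchmark.items():
        actual = counts.get(group, 0) / total if total else 0.0
        if abs(actual - expected) > tolerance:
            deviations[group] = actual - expected  # positive = over-delivered
    return deviations
```

Run this after every campaign and retain the output with your records: a consistent pattern of deviations is exactly the delivery skew the monitoring step is meant to catch.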

Supplement with Non-Algorithmic Outreach

Do not rely exclusively on social media advertising for recruitment:

  • Post on diversity-focused job boards
  • Partner with professional associations serving underrepresented groups
  • Use the EEOC's Employer Information Report (EEO-1) data to identify gaps
  • Document all supplemental outreach efforts

Get your free AI hiring compliance assessment →

State-Level Protections Beyond EEOC

New York City: NYC Local Law 144

New York City's Local Law 144 covers all employers (regardless of size) and requires bias audits for automated employment decision tools. The NYC Commission on Human Rights has also taken the position that algorithmic job ad delivery producing demographic exclusion may violate the NYCHRL. Visit the NYC CCHR's official LL 144 page for enforcement guidance.

California Fair Employment and Housing Act (Gov. Code § 12940)

California's FEHA prohibits discriminatory advertising and applies to employers with 5 or more employees. Unlike Title VII, FEHA has no cap on compensatory damages—a significant exposure for California employers.

Illinois Human Rights Act (775 ILCS 5/2-102)

Illinois prohibits age and sex discrimination in employment advertising. The Illinois Department of Human Rights has jurisdiction over job advertising complaints, and remedies include actual damages, attorney's fees, and civil penalties.

Frequently Asked Questions

Did Meta's $115 million settlement mean employers were off the hook?

No. Meta's settlement resolved Meta's liability as a platform provider. Employers who used discriminatory targeting features in their job ads remained separately liable for their own conduct. The EEOC continued pursuing employers who used Facebook demographic targeting, treating those cases as independent violations of Title VII (42 U.S.C. § 2000e) and the ADEA (29 U.S.C. § 623).

Is it still possible to discriminate using Facebook job ads in 2026?

Yes, though Meta's platform changes have eliminated explicit demographic targeting. The more significant risk in 2026 is algorithmic delivery bias—Meta's algorithm may still deliver job ads to demographically skewed audiences even without explicit targeting, based on historical engagement patterns. Employers must monitor delivery data and supplement with targeted outreach to ensure equitable reach.

What is the "Special Ad Category" and why does it matter?

Meta's Special Ad Category (SAC) for employment ads restricts certain targeting features to prevent discriminatory ad delivery. When you classify your Facebook job ad as employment-related, you cannot target by age, gender, zip code, or other characteristics that proxy for protected class status. Failure to use SAC for job ads—or attempting to circumvent it by using other ad categories—is legally risky and may be treated as intentional evasion of anti-discrimination requirements.

Can we use "Lookalike Audiences" for job advertising on Facebook?

Lookalike audiences—which target users similar to your existing employees or customers—are prohibited under Facebook's Special Ad Category rules for employment. Using lookalike audiences for job ads encodes your existing workforce's demographic profile into your advertising, reproducing whatever bias exists in your current workforce. This practice is both prohibited by Meta's current policy and potentially unlawful under Title VII (42 U.S.C. § 2000e).

What if our vendor or marketing agency placed the discriminatory ads on our behalf?

You remain liable. Employment discrimination law does not allow employers to outsource liability to advertising agencies or recruitment marketing vendors. Your advertising agency was acting as your agent, and their actions in placing demographic-targeted job ads are attributable to you as the employer. Review your contracts with advertising vendors to ensure they include anti-discrimination compliance obligations.

How long should we retain job advertising records?

Title VII requires employers to retain records relevant to discrimination charges for the duration of any pending proceeding—typically at least 2 years as a baseline. The OFCCP requires federal contractors to maintain affirmative action records for 2 years (41 CFR 60-1.12). Best practice is to maintain job advertising records, including delivery reports and targeting configurations, for 4 years.

What remedies can workers recover for Facebook job ad discrimination?

Under Title VII (42 U.S.C. § 2000e), workers can recover: back pay, front pay, compensatory damages (up to $300,000 for employers with 500+ employees), and attorney's fees. Under the ADEA (29 U.S.C. § 623), willful violations entitle workers to liquidated damages equal to back pay. State laws may allow additional or unlimited damages.

Key Takeaways

  • Facebook job ad demographic targeting was unlawful under Title VII (42 U.S.C. § 2000e) and the ADEA (29 U.S.C. § 623). Age and gender targeting that excluded workers from seeing job ads constituted discriminatory employment practices by the employers who used those features.

  • Meta settled, but employer liability remains. Meta's $115 million settlement resolved the platform's liability—not the liability of employers who used discriminatory targeting. EEOC enforcement against employers who used these features continued independently.

  • Algorithmic delivery bias is still a risk. Even under SAC rules, Facebook's algorithm may deliver job ads to demographically skewed audiences. Monitor delivery reports and supplement with targeted outreach.

  • Special Ad Category is mandatory for job ads. Using any other ad category for employment-related advertising risks being treated as intentional circumvention of anti-discrimination requirements.

  • Audit your history. If your company used Facebook demographic targeting for job ads before 2022, consult employment counsel about residual liability exposure.

  • State laws extend beyond EEOC. Review AI hiring laws by state and your state employment compliance obligations—California, New York, and Illinois provide broader protections than federal law.


EmployArmor monitors your job advertising compliance across all major platforms, flags discriminatory targeting configurations before you publish, and generates audit documentation for EEOC and state agency investigations. Get your free compliance assessment →

Last updated: March 2026. This content is for informational purposes only and does not constitute legal advice. Consult qualified employment counsel for guidance specific to your situation.

Ready to comply?

Get your personalized compliance assessment in 2 minutes — free.