Illinois Human Rights Act Amendment (HB 3773) Guide

Illinois amends its Human Rights Act to require formal AI notice, prohibit proxy discrimination, and hold employers strictly liable for discriminatory effects, regardless of intent

House Bill 3773

Effective January 1, 2026, House Bill 3773 amends the Illinois Human Rights Act to require employers to provide formal notification when using AI in employment decisions. It explicitly prohibits discrimination driven by AI, including the use of zip codes as proxies for protected classes.

Effective January 1, 2026

Compliance required from the start of 2026

Amends the Illinois Human Rights Act

Expands existing civil rights protections to cover AI

Mandates formal AI notice requirements

Employers must notify when AI is used in decisions

Prohibits AI-driven discrimination

Including use of zip codes as proxies for protected classes

Why the Illinois HB 3773 amendment matters

  • Expansion of liability: Employers are strictly liable if AI deployment produces a discriminatory effect on a protected class, regardless of whether the discrimination was intentional. Intent is no longer a defense.
  • Direct civil rights violations: Failure to provide required notices, or the use of prohibited technical proxies such as zip codes, constitutes a formal civil rights violation under the Illinois Human Rights Act.
  • Regulatory scrutiny extends beyond hiring: Under Illinois HB 3773, promotion, renewal, discipline, discharge, and selection for training are all covered employment decisions.
  • The “third-party use” defense no longer stands: The burden of technical oversight sits with the deployer, not the vendor. Employers must conduct proactive due diligence on all third-party HR technology they use.
Zip codes as proxies

HB 3773's explicit prohibition on zip codes in AI models anchors its stance against proxy discrimination. Although zip codes appear geographically neutral, they often function as high-fidelity stand-ins for protected classes.

AI systems trained on historical hiring data can learn these patterns and replicate them at scale, even when the data scientist never intended to use a protected variable. The result is what the law calls a discriminatory effect.
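This proxy mechanism can be seen in a toy simulation. The sketch below uses entirely synthetic data (the zip codes, group proportions, and the "learned" rule are invented for illustration): a scoring model that never receives the protected attribute still produces sharply different selection rates between groups, because zip code encodes group membership.

```python
import random

random.seed(0)

# Synthetic applicant pool: zip code correlates strongly with a
# protected-class attribute `group` (values and rates are invented).
applicants = []
for _ in range(1000):
    if random.random() < 0.5:
        zip_code = "60601"
        group = "A" if random.random() < 0.9 else "B"
    else:
        zip_code = "60621"
        group = "B" if random.random() < 0.9 else "A"
    applicants.append((zip_code, group))

def model(zip_code):
    """A rule 'learned' from biased historical hiring: applicants from
    60601 were hired more often, so the model keys on zip code alone.
    It never sees the protected attribute."""
    return 1 if zip_code == "60601" else 0

def selection_rate(group):
    """Fraction of a group's applicants the model selects."""
    members = [z for z, g in applicants if g == group]
    return sum(model(z) for z in members) / len(members)

print(f"Selection rate, group A: {selection_rate('A'):.2f}")
print(f"Selection rate, group B: {selection_rate('B'):.2f}")
# Group A is selected far more often, even though `group` was
# never an input to the model: zip code acted as its proxy.
```

Removing the protected variable from the training data is therefore not, by itself, a defense under an outcome-focused standard.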

Under HB 3773's Evidence First rule, the focus is on outcome, not intent. If an AI system using zip codes produces disparate impact on a protected class, the employer is liable, irrespective of whether they knew the variable was problematic.
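One common heuristic for flagging such outcomes is the "four-fifths" (80%) rule drawn from general US employment-law practice; note that HB 3773 itself does not prescribe this test, so it is used here only as an illustration of an outcome-based check. A minimal sketch with synthetic selection data:

```python
# Illustrative disparate-impact check using the "four-fifths" (80%) rule,
# a common US employment-law heuristic (not defined by HB 3773 itself).
# All data below is synthetic.

def selection_rate(outcomes):
    """Fraction of applicants in a group who were selected (1 = selected)."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one.
    A value below 0.8 is a common red flag for disparate impact."""
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Outcomes of an AI screening model, split by protected class.
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # selection rate 0.8
group_b = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]  # selection rate 0.3

ratio = adverse_impact_ratio(group_a, group_b)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.3 / 0.8 ≈ 0.38
if ratio < 0.8:
    print("Potential disparate impact: investigate the model's features.")
```

Under an outcome-focused standard, a result like this would trigger liability exposure regardless of which input variable (zip code or otherwise) caused the disparity.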

Industry perspectives

Voices from leading HR innovators

Popp

We partnered with Warden AI for continuous, monthly third-party auditing to detect any potential bias issues, and we publish every result publicly.

Beamery

Warden helps us stay aligned with evolving AI regulations. Ongoing third-party audits provide trusted evidence and reduce legal exposure.

Warden's Platform

Buy, build, and defend AI with confidence

Warden provides the independent assurance needed to prove fairness and satisfy regulators.

Independent AI Audits

Whether your AI is built in-house or provided by vendors, independent bias and compliance testing gives clients and regulators proof that your systems have been assessed.

Continuous AI Monitoring

Ongoing re-testing and oversight help you catch issues early and show that your AI is actively governed, not reviewed once and forgotten.

Defensible Audit Trails

Timestamped, versioned records of AI performance over time help you demonstrate reasonable oversight and respond confidently to legal or client scrutiny.
