A proactive approach to NYC LL 144 can transform a legal obligation into a strategic advantage. While the law mandates specific actions like bias audits and candidate notifications, fulfilling these requirements is also an opportunity to build a more equitable and trustworthy hiring process. By embedding fairness and transparency into your operations, you not only mitigate legal risk but also strengthen your employer brand. This article outlines how to move beyond a simple compliance checklist and develop a comprehensive strategy that demonstrates a genuine commitment to responsible innovation, turning regulatory requirements into a cornerstone of your talent strategy.
Key Takeaways
- Master the Law's Three Pillars: To comply, you must secure annual bias audits from an independent party, provide clear notice to candidates about using automated tools, and publish a summary of the audit results on your website.
- Define Your AEDT Footprint: The law covers any tool that significantly influences hiring or promotion decisions in NYC. Your first step is to conduct an internal inventory of your HR software to identify which systems qualify as an AEDT and therefore require auditing.
- Build a System for Continuous Compliance: Meeting the law's standards requires an ongoing strategy, not a one-time fix. This involves creating repeatable processes for annual audits, managing documentation for transparency, and establishing clear protocols for candidate communication and accommodation requests.
What Is NYC Local Law 144?
New York City has established a new standard for fairness in hiring with a landmark regulation known as Local Law 144. This law directly addresses the growing use of artificial intelligence in employment decisions. For any organization that recruits, hires, or promotes talent within the five boroughs, compliance with this law is a legal requirement. The regulation sets specific rules for using automated tools, aiming to increase transparency and prevent bias in the workplace. It fundamentally changes how employers can leverage technology to build their teams, placing a new emphasis on accountability and equity.
The Law's Purpose and Scope
The primary goal of Local Law 144 is to prevent discrimination from automated hiring software. The law prohibits employers and employment agencies from using an Automated Employment Decision Tool (AEDT) unless the tool has undergone an impartial audit for bias. According to the city's Department of Consumer and Worker Protection, the regulation is designed to ensure these systems do not unfairly screen out candidates based on their race, ethnicity, or gender. The law’s scope is specific, applying to tools that substantially assist or replace human decision-making for hiring or promotion.
Key Dates and Implementation
While the law was passed in 2021, its enforcement did not begin immediately. The rules officially went into effect on July 5, 2023, marking the end of the grace period for businesses to bring their hiring practices into compliance. Since that date, any employer using an AEDT for a New York City-based role must meet the law's requirements, including conducting annual bias audits and notifying candidates about the use of such technology. The law is actively enforced, and companies that fail to comply are subject to financial penalties for each violation.
Who Is Required to Comply?
Compliance is mandatory for any employer or employment agency using an AEDT to screen candidates for a job located in New York City. This applies even if your company headquarters is outside the city; if you are hiring for a position within the five boroughs, the law pertains to you. It covers decisions related to both hiring and promotion. The regulation affects a wide range of organizations, from large enterprises using sophisticated HR software to the staffing and recruitment agencies that source candidates for them. If an automated tool influences who gets an interview or a promotion in NYC, the user of that tool is responsible.
What Is an Automated Employment Decision Tool (AEDT)?
An Automated Employment Decision Tool, or AEDT, is any computational system used to substantially assist or replace human judgment in employment decisions. The definition is intentionally broad, capturing a wide range of technologies that use machine learning, statistical modeling, or artificial intelligence. If a tool automates screening, assesses candidates, or influences who gets hired or promoted, it likely falls under this category. Common examples include résumé scanners that filter applicants, video analysis software, and predictive skills assessments. Understanding which of your systems qualify as an AEDT is the critical first step toward complying with regulations like NYC Local Law 144.
The Role of AEDTs in Hiring
Companies use AEDTs to manage high volumes of applications and streamline the hiring process. These tools can quickly sort through thousands of résumés or conduct initial screenings, saving recruiters significant time. However, this efficiency comes with a risk, as the algorithms powering these tools can inadvertently perpetuate or even amplify existing biases. In response, NYC Local Law 144 does not ban these tools but instead regulates them to promote transparency and equity. The law requires employers to ensure any AEDT they use undergoes a rigorous and impartial AI bias audit before it can be implemented. This audit is meant to identify whether the tool’s outputs result in a disparate impact on candidates based on their race, ethnicity, or gender.
How to Determine if a Tool Is an AEDT
Figuring out if a specific piece of software qualifies as an AEDT can be challenging because the legal definition is broad. The central question is whether the tool "substantially assists or replaces" a person's judgment. Ask yourself: Does this software automatically reject applicants who lack a specific keyword? Does it rank candidates and recommend a top percentage for interviews? If the tool does more than simply organize information, it probably falls under the law. For large organizations, the first step toward compliance is often inventorying all HR technology to identify which systems meet the AEDT criteria and require auditing. This internal review is essential for building a clear and defensible compliance strategy.
What Does NYC LL 144 Require?
New York City's Local Law 144 establishes a clear framework for employers and staffing agencies using automated tools in their hiring and promotion processes. The law is built on three core pillars: independent bias auditing, transparency for candidates, and accountability for employers. For businesses covered by the law, compliance is not a one-time checklist. It requires a systematic approach to how you select, deploy, and communicate about your hiring technology. The regulations are designed to give candidates more insight and control over how their information is used, while placing the responsibility on employers to prove their tools are equitable.
Understanding these specific obligations is the first step for any organization operating in New York City to ensure its practices are fair, transparent, and legally sound. This means examining your current hiring stack, identifying which tools fall under the law's definition of an AEDT, and implementing new procedures for auditing, notification, and disclosure. The law fundamentally shifts the dynamic, moving from a model where automated systems operate in a black box to one where their function and impact must be openly documented and justified. The following sections break down each of these requirements in detail, providing a clear path for what your organization needs to do to meet these new standards of practice.
The Mandate for Bias Audits
The central requirement of Local Law 144 is that any automated employment decision tool (AEDT) must undergo an impartial bias audit before it can be used. This audit must be conducted annually by an independent auditor to assess whether the tool leads to a disparate impact based on sex, race, or ethnicity. The goal is to identify and measure potential biases in the algorithm's outcomes. After the audit is complete, a summary of the results, including the date of the audit and the source and type of data used, must be published on the employer’s website. This public disclosure is a key part of the law’s push for accountability. A thorough AI bias audit provides the necessary analysis to meet these standards.
Requirements for Notifying Candidates
Transparency with candidates is another cornerstone of the law. Employers must notify applicants or employees about the use of an AEDT at least 10 business days before it is used. This notice must be clear and conspicuous, whether posted on a careers page or sent directly to individuals. The notification must explain that an automated tool will be used in connection with the assessment or evaluation. It also needs to specify the job qualifications and characteristics the tool will use to make its assessment. Crucially, this notice must also inform candidates of their right to request an alternative selection process or a reasonable accommodation, giving them agency in how they are evaluated for a role.
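To make the timing requirement concrete, here is a small Python sketch that works out the earliest date an AEDT could be used after notice goes out. It counts weekdays only and ignores holidays, which is a simplifying assumption for illustration, not a legal interpretation of "business days."

```python
# Sketch: earliest permissible AEDT use date after candidate notice.
# Assumes "business days" means weekdays and ignores holidays.
from datetime import date, timedelta

def earliest_use_date(notice_sent: date, business_days: int = 10) -> date:
    """Advance from the notice date by the required number of weekdays."""
    d = notice_sent
    counted = 0
    while counted < business_days:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            counted += 1
    return d

# Notice sent on Friday, March 1, 2024
print(earliest_use_date(date(2024, 3, 1)))  # → 2024-03-15
```

In practice this kind of helper would sit inside the applicant tracking workflow, gating when the automated assessment step can run for a given requisition.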
Obligations for Data Disclosure
Beyond the initial notice, employers have an ongoing duty to provide information upon request. If a candidate who lives in New York City applies for a position, they have the right to ask for more details about the AEDT. Upon request, the employer must disclose the type of data the tool collects, the source of that data, and the company's data retention policy. This requirement empowers candidates to understand what information is being used to evaluate their qualifications. For staffing and recruitment agencies that manage large volumes of applicants, having a clear and efficient process for handling these requests is essential for maintaining compliance and building trust with their talent pool.
Providing an Alternative Selection Process
The law ensures that candidates are not forced to be evaluated by an automated system. The required notice must explicitly state that an individual can request an alternative selection process. While the law does not define what this alternative must look like, it is generally understood to be a method that does not rely on the AEDT, such as a manual review by a human recruiter. This option must be a reasonable alternative and not disqualify the candidate from consideration. This provision, along with the option to request a reasonable accommodation, ensures a more accessible and equitable hiring process. Adhering to a standard like Warden Assured can help organizations build the comprehensive compliance framework needed to manage these alternatives effectively.
How Bias Audits Work Under Local Law 144
Under Local Law 144, a bias audit is a specific, impartial analysis of your automated employment decision tools (AEDTs). It is a formal process designed to measure whether a tool produces biased outcomes for candidates based on their race, ethnicity, or gender. The law sets clear expectations for how these audits must be conducted, who can perform them, and what information must be shared publicly. For employers in New York City, understanding this process is the first step toward responsible AI use and legal compliance. The audit serves as a critical mechanism for accountability, requiring companies to look closely at the tools they use to screen, assess, and hire talent.
Requirements for an Independent Auditor
Local Law 144 mandates that the bias audit must be performed by an “independent auditor.” This means you cannot ask your internal data science team or the tool’s vendor to conduct the audit themselves. The auditor must be impartial and capable of exercising objective judgment without any conflicts of interest related to the employer or the tool being assessed. This requirement ensures the integrity and credibility of the audit results. An independent auditor brings an unbiased perspective to the AI bias auditing process, which is essential for building trust with both regulators and candidates. Their role is to provide an honest evaluation of the tool's performance against established fairness metrics.
Bias Testing and Statistical Standards
The core of the audit involves rigorous statistical testing. An AEDT, which is any computational tool using machine learning or AI to aid hiring decisions, is analyzed by calculating the selection rate for each demographic category (e.g., male, female, Black, White, Hispanic) and comparing those rates. The law requires auditors to calculate an "impact ratio" for each category by dividing its selection rate by the selection rate of the most selected category. The goal is to identify whether the tool has a statistically significant, adverse impact on any specific group. This quantitative analysis provides concrete evidence of a tool's fairness, or lack thereof, in practice.
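The calculation itself is simple arithmetic. Here is a minimal Python sketch of the selection-rate and impact-ratio computation described above; the candidate counts are made up for demonstration and do not come from any real audit.

```python
# Illustrative impact-ratio calculation in the style LL 144 audits use.
# All counts below are hypothetical.

def impact_ratios(selected: dict, total: dict) -> dict:
    """Return (selection rate, impact ratio) per category.

    The impact ratio divides each category's selection rate by the
    highest selection rate across all categories.
    """
    rates = {cat: selected[cat] / total[cat] for cat in total}
    top = max(rates.values())
    return {cat: (rate, rate / top) for cat, rate in rates.items()}

counts_total = {"Male": 400, "Female": 400}       # applicants assessed
counts_selected = {"Male": 120, "Female": 90}     # advanced by the tool

for cat, (rate, ratio) in impact_ratios(counts_selected, counts_total).items():
    print(f"{cat}: selection rate {rate:.2f}, impact ratio {ratio:.2f}")
```

In this hypothetical, men are selected at a rate of 0.30 and women at 0.225, giving women an impact ratio of 0.75 relative to the most selected group. An auditor would report these ratios for every category with sufficient data.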
Publicly Disclosing Audit Results
Transparency is a cornerstone of Local Law 144. After the audit is complete, you are required to publish a summary of the results on your company’s website. This summary must be easily accessible to job applicants and must include the date of the most recent audit, the source and type of data used to conduct the audit, and the tool's distribution date. You must also disclose your data retention policy. This public disclosure allows candidates to understand how they are being evaluated. By making this information available, companies can demonstrate their commitment to fairness, and tools that meet these standards can be recognized in a public Warden Assured Directory.
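As a sketch of what sits behind a published summary, here is one possible data shape. The field names are illustrative, not prescribed by the law; the law does require the audit date, the source and type of data, the tool's distribution date, and the resulting impact ratios to be disclosed.

```python
# One possible (hypothetical) shape for a published audit summary.
# Tool name and all values are invented for illustration.
audit_summary = {
    "tool_name": "ExampleResumeScreener",
    "audit_date": "2024-06-01",
    "distribution_date": "2023-01-15",
    "data_source": "Historical applicant data provided by the employer",
    "data_type": "Selection outcomes by sex and race/ethnicity",
    "impact_ratios": {
        "sex": {"Male": 1.00, "Female": 0.91},
        "race_ethnicity": {"White": 1.00, "Black": 0.88, "Hispanic": 0.93},
    },
}

for field in ("audit_date", "data_source", "distribution_date"):
    print(f"{field}: {audit_summary[field]}")
```

Keeping the summary in a structured form like this makes it straightforward to render on a careers page and to regenerate each year after the next annual audit.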
Common Compliance Challenges for Employers
Complying with NYC Local Law 144 introduces new operational responsibilities for employers and HR vendors. While the goal is to promote fairness, the path to compliance presents several distinct challenges. These range from understanding the legal language to implementing new technical and communication workflows. Addressing these hurdles requires a clear strategy and a detailed understanding of the law's specific mandates. For many, it means rethinking how they select, deploy, and manage the automated tools that have become integral to modern recruiting.
Interpreting the Law's Complexities
Understanding the law's specific terms is a primary hurdle. The regulation applies to "automated employment decision tools" (AEDTs), but the definition can feel ambiguous, leaving many to question which of their software tools fall under its scope. The law's core requirements are twofold: first, you must conduct a bias audit on the tool, and second, you must notify candidates and employees about its use. This sounds straightforward, but determining what constitutes a compliant bias audit or an adequate notice requires careful interpretation of the legal text and its associated rules, which can be a significant undertaking for internal teams.
Finding a Qualified Auditor
Local Law 144 requires that the bias audit be performed by an "independent" auditor. This stipulation immediately raises a critical question for employers: who is qualified to perform this audit? The auditor must not only be external but also possess the technical expertise to analyze complex algorithms and the statistical knowledge to measure adverse impact accurately. Finding a partner who understands both the legal requirements and the technical nuances of AI in hiring is a significant challenge. This process involves vetting potential auditors for their experience, methodology, and ability to produce a report that will withstand regulatory scrutiny.
Managing Compliance Costs
Compliance is an ongoing commitment, not a one-time fix. The law mandates that an AEDT must have undergone a bias audit within the year prior to its use. This means employers must budget for annual audits for every applicable tool in their hiring stack. These recurring costs can be substantial, especially for organizations that use multiple automated systems. The key is to view this not as a simple expense but as a necessary investment in risk management and operational integrity. Proactive and continuous monitoring can help streamline this process, making it more predictable and manageable over the long term.
Communicating with Candidates and Employees
The law also focuses on transparency with job applicants and current employees. You must provide advance notice before using an AEDT, explaining its function and the data it uses. Crucially, this notice must also inform individuals of their right to request an alternative selection process or a reasonable accommodation. Implementing this requirement presents a logistical challenge. Teams must create clear, accessible notices and integrate them seamlessly into the hiring workflow. They also need to be prepared to handle requests for alternatives, ensuring the process remains fair and efficient for all candidates.
The Penalties for Non-Compliance
Failing to comply with NYC Local Law 144 carries significant financial and legal risks. The law is an enforceable regulation with clear penalties for violations. For employers, staffing agencies, and HR vendors, understanding these consequences is a critical part of developing a compliance strategy. The costs of non-compliance extend beyond initial fines, creating potential for ongoing legal battles, reputational harm, and operational disruption.
Fines and Enforcement Actions
The New York City Department of Consumer and Worker Protection (DCWP) is the agency responsible for enforcing Local Law 144. According to the Office of the New York State Comptroller, the DCWP can fine companies between $500 and $1,500 for each violation, and each day a violation continues is treated as a separate violation. A single compliance failure, such as not publishing a bias audit summary, can therefore result in accumulating daily penalties until the issue is resolved. The law’s requirements are specific, covering everything from the audit itself to disclosing the types of data collected and the company’s data retention policy. An oversight in any of these areas can trigger enforcement actions.
Litigation Risks and Legal Exposure
The financial penalties for non-compliance can escalate quickly. According to analysis from Deloitte, the first violation incurs a fine of up to $500, but subsequent violations can cost between $500 and $1,500. Crucially, each day a violation persists is treated as a separate offense. This structure creates significant legal exposure for organizations that are slow to address compliance gaps. Beyond direct fines, failure to adhere to the law opens the door to civil litigation. Candidates who believe they were assessed by a non-compliant tool could pursue legal action, leading to costly court battles and damage to the company’s brand.
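The fine structure described above compounds quickly. Here is a back-of-the-envelope Python sketch of worst-case exposure for a single unresolved violation, using the dollar figures cited in this section; actual fines are determined by the DCWP case by case.

```python
# Rough worst-case penalty exposure for one violation left unresolved.
# Fine amounts are taken from the ranges cited in this article; real
# penalties are set by the DCWP and may differ.

def penalty_exposure(days_out_of_compliance: int,
                     first_fine: int = 500,
                     daily_fine: int = 1500) -> int:
    """First-day fine plus the maximum daily fine for each day after."""
    if days_out_of_compliance <= 0:
        return 0
    return first_fine + daily_fine * (days_out_of_compliance - 1)

print(penalty_exposure(30))  # one violation unresolved for 30 days → 44000
```

At 30 days, a single violation could reach $44,000 of exposure, and that is before multiplying across multiple tools or multiple distinct failures (missing audit, missing notice, missing disclosure).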
The DCWP Complaint Process
The law empowers individuals to hold employers accountable. If a candidate or employee believes a company used an AEDT without following the rules, they can file a complaint online with the DCWP. This process allows anyone to report a failure to conduct a bias audit, publish the results, or provide the required notices. However, a report from the New York State Comptroller has raised questions about the current system, noting that the process for handling complaints may not be fully effective. This suggests that while the official complaint channel exists, employers should also be prepared for scrutiny from other sources, including public interest groups and legal advocates.
How to Report a Violation
New York City’s Local Law 144 gives job applicants and employees clear rights and establishes procedures for addressing potential non-compliance. If an individual believes an employer has failed to meet its obligations, the law provides specific avenues for recourse. Understanding these mechanisms matters both for individuals seeking to exercise their rights and for employers aiming to maintain compliant, transparent hiring practices. The law also includes safeguards that protect individuals throughout the process.
Filing a Complaint with the DCWP
If you believe an employer or hiring firm has used an automated employment decision tool without adhering to the law, you have the right to take action. The primary channel for this is the Department of Consumer and Worker Protection (DCWP). An individual can file a complaint if a company fails to conduct a required bias audit, does not make the audit results public, or neglects to provide the necessary notifications about the tool's use. The DCWP is responsible for investigating these claims and enforcing the law, which may include issuing penalties for violations. This process ensures there is a formal and accessible way to report and resolve compliance issues.
Understanding Anti-Retaliation Protections
The law includes strong anti-retaliation provisions to protect individuals who exercise their rights. Employers are explicitly prohibited from taking adverse action against a job applicant or employee for filing a complaint or requesting information about an AEDT. This means a company cannot legally penalize you, for instance by rescinding a job offer or terminating employment, for questioning its use of an automated tool or reporting a potential violation. These protections are designed to ensure that people feel safe holding employers accountable without fear of reprisal. They reinforce a culture of transparency and encourage compliance by removing the threat of negative consequences for speaking up.
An Applicant's Right to Transparency
A fundamental aspect of Local Law 144 is an applicant’s right to know how they are being evaluated. Companies must inform candidates if an AEDT will be used in their assessment and specify which job qualifications or characteristics the tool will consider. Beyond this initial notice, you have the right to request more detailed information. Upon request, an employer must disclose the type of data the tool collects, its source, and the company’s data retention policy. This level of transparency allows candidates to better understand the hiring process. For employers, fulfilling these disclosure requirements is a key part of demonstrating good faith and adhering to the law's standards.
A Strategic Approach to Compliance
Meeting the requirements of NYC Local Law 144 involves more than a last-minute compliance check. It calls for a thoughtful, ongoing strategy to ensure your hiring tools are fair and transparent. By integrating compliance into your operational workflow, you not only address legal risks but also build a more equitable and trustworthy recruitment process. A proactive approach helps you stay ahead of regulatory changes and demonstrates a genuine commitment to fairness. The following steps outline a strategic framework for managing compliance effectively and responsibly.
Conduct Regular Bias Audits
Local Law 144 mandates that any automated employment decision tool, or AEDT, must undergo a bias audit before it can be used. This is not a one-time task. The law requires these audits to be performed annually to ensure the tool remains fair over time. Think of it as a regular health check for your AI, confirming it operates without creating adverse impacts on candidates based on their race, ethnicity, or gender. An AI bias audit must be conducted by an independent, impartial auditor. The results of this audit, including a summary of the findings, must then be made publicly available on your company’s website, creating a new layer of transparency for your hiring practices.
Maintain Comprehensive Documentation
Under the law, transparency is not optional. You must be prepared to provide candidates with clear information about the AI tools you use. This requires maintaining thorough documentation that details what type of data your AEDT collects, where that data comes from, and how long you retain it. If a candidate asks, you need to be able to explain the tool’s function and the data informing its outputs. Keeping these records organized is essential for responding to candidate inquiries promptly and for demonstrating due diligence to regulators. This documentation serves as your evidence of a compliant and defensible process, should your practices ever be questioned.
Establish Proactive Monitoring
Compliance is a continuous effort, not a single event. Because an AEDT must be audited within one year of its use, organizations need a system for ongoing oversight. Proactive monitoring helps you track your tool's performance between formal audits, identifying potential issues like model drift, where an AI's accuracy changes over time. By establishing a system for continuous evaluation, you can catch and correct biases before they become systemic problems or lead to non-compliance. An AI assurance platform can help operationalize this process, providing a consistent framework for testing and validation that keeps your tools aligned with fairness standards year-round.
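Between annual audits, ongoing oversight can be as simple as recomputing impact ratios each review period and raising an alarm when any category dips. The sketch below uses the common "four-fifths" (0.80) heuristic as the alert threshold; LL 144 itself sets no pass/fail cutoff, so treat this purely as an internal early-warning signal, and all figures here are hypothetical.

```python
# Minimal drift-monitoring sketch: flag categories whose impact ratio
# falls below an internal alert threshold. The 0.80 cutoff borrows the
# common four-fifths heuristic; it is not a legal threshold under LL 144.

def flag_drift(period_rates: dict, threshold: float = 0.80) -> list:
    """Return categories whose impact ratio dips below the threshold."""
    top = max(period_rates.values())
    return [cat for cat, rate in period_rates.items() if rate / top < threshold]

# Hypothetical quarterly selection rates for one demographic axis
q1 = {"Male": 0.30, "Female": 0.28}
q2 = {"Male": 0.31, "Female": 0.22}

print(flag_drift(q1))  # → []
print(flag_drift(q2))  # → ['Female']
```

Running a check like this on each quarter's hiring data turns "proactive monitoring" into a concrete, repeatable step, so a drifting model is caught months before the next formal audit would surface it.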
Develop Clear Communication Protocols
Effective communication is fundamental to LL 144 compliance. Employers are required to notify candidates that an AEDT is being used in the hiring or promotion process. This notice must also inform applicants of their right to request an alternative selection process. Your communication should be clear, direct, and easy for any candidate to understand. Developing standardized protocols for these notifications ensures consistency and helps build trust with your applicant pool. By being upfront about your use of technology, you reinforce your commitment to a fair process, which is a core principle of the Warden Assured standard for responsible AI.
Related Articles
- NYC Local Law 144 - Solution - Warden AI
- HR Tech Compliance: Everything You Need to Know about NYC Local Law 144 - Warden AI
- Colorado SB 205 vs. NYC Local Law 144: How Do AI Hiring Regulations Compare? - Warden AI
- Navigating the NYC Bias Audit Law for HR Tech platforms - Warden AI
- What is a Third-Party AI Audit? A Simple Guide - Warden AI
NYC LL 144 FAQs
My company isn't based in New York City. Does this law still apply to us?
Yes, it very well might. The law's jurisdiction is determined by the location of the job, not the location of your company's headquarters. If you are using an automated tool to screen candidates for a position based within the five boroughs of New York City, you are required to comply with Local Law 144. This applies whether the role is fully in-person, hybrid, or a remote position for a resident of the city.
How do I know if a specific software I use is considered an AEDT?
The key question to ask is whether the tool "substantially assists or replaces" human decision-making. If a piece of software does more than simply organize applications, it likely qualifies. For example, if it automatically scores résumés, ranks candidates based on predictive assessments, or filters out applicants before a human sees them, it almost certainly falls under the definition of an Automated Employment Decision Tool and would require a bias audit.
Can the vendor who sold us the tool perform the required bias audit?
No, the law is very clear on this point. The audit must be conducted by an impartial and independent third party. This means neither your own company nor the vendor that developed or sold you the software can perform the audit. The purpose of this rule is to ensure the analysis is objective and free from any potential conflicts of interest, which is essential for the credibility of the results.
What happens if a bias audit finds that our tool has a disparate impact?
The law requires you to publish a summary of the audit results, regardless of the findings. It does not explicitly prohibit you from using a tool that shows bias, but continuing to do so creates significant legal and reputational risk. A finding of disparate impact should be a signal to take immediate action, which could involve working with the vendor to correct the issue, adjusting how the tool is used, or discontinuing its use altogether in favor of a more equitable alternative.
What does an "alternative selection process" actually mean in practice?
The law requires you to offer an alternative but does not provide a strict definition of what it must be. Generally, it is understood to be a process that does not involve the use of the automated tool. A common example would be a manual review of a candidate's application by a human recruiter. The alternative must be a reasonable accommodation that allows the candidate to be fairly considered for the role without being penalized for opting out of the automated assessment.