NYC Local Law 144: What HR Tech Companies Need to Know in 2026

If your AI tool screens, scores, or ranks job candidates, and any of your customers have employees in New York City, you need a bias audit. This isn’t a suggestion. It’s the law.

NYC Local Law 144, which went into full enforcement in 2023, requires independent bias audits for any Automated Employment Decision Tool (AEDT) used in hiring or promotion decisions within New York City. And the implications extend far beyond NYC’s borders.

What Is Local Law 144?

Local Law 144 is a New York City regulation that requires employers and employment agencies to conduct an annual independent bias audit of any automated tool used to substantially assist or replace discretionary decision-making in hiring or promotion. The law also requires employers to notify candidates that an AI tool is being used and to make the audit results publicly available on their website.

An Automated Employment Decision Tool (AEDT), as defined by the law, includes any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified outputs — scores, classifications, or recommendations — used to substantially assist or replace discretionary decision-making in employment decisions.

Who Does It Apply To?

This is where it gets important for HR tech companies: Local Law 144 doesn’t just apply to NYC-based companies. It applies to any employer using an AI tool for hiring or promotion decisions involving candidates or employees who work in New York City — or who would work in New York City if hired.

That means if you build an AI hiring tool and even one of your customers uses it to evaluate candidates for a role in NYC, your tool needs to be audited. The practical effect is that most AI hiring tools with any enterprise customer base will need a bias audit, because the odds of having zero NYC exposure are essentially zero.

What Does the Audit Involve?

The bias audit is a statistical analysis that calculates the impact of the AI tool on different demographic groups. Specifically, the audit examines disparate impact across race/ethnicity and gender categories.

The auditor calculates impact ratios, comparing each demographic group’s selection rate against that of the group with the highest selection rate. The widely used benchmark is the four-fifths rule: if any group’s selection rate is less than 80% of the highest group’s rate, that’s a flag for potential bias. For tools that output scores rather than binary selections, the DCWP rules use an analogous “scoring rate” based on the share of candidates scoring above the median. The audit must also examine intersectional categories, looking at combinations of race/ethnicity and gender, not just each dimension in isolation.
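To make the impact-ratio math concrete, here is a minimal sketch of the calculation described above. The group names, selection counts, and totals are invented for illustration; a real Local Law 144 audit would use the employer’s actual historical data and the full set of intersectional categories.

```python
# Hypothetical illustration of an impact-ratio calculation.
# (selected, total evaluated) per (race/ethnicity, gender) intersectional group
data = {
    ("White", "Male"): (48, 100),
    ("White", "Female"): (40, 100),
    ("Black", "Male"): (30, 100),
    ("Black", "Female"): (25, 100),
}

# Selection rate = selected / total evaluated, per group
selection_rates = {g: sel / total for g, (sel, total) in data.items()}

# Impact ratio = each group's rate divided by the highest group's rate
highest = max(selection_rates.values())
impact_ratios = {g: rate / highest for g, rate in selection_rates.items()}

for group, ratio in impact_ratios.items():
    flag = "  <- below 0.80 (four-fifths benchmark)" if ratio < 0.80 else ""
    print(f"{group}: impact ratio {ratio:.2f}{flag}")
```

In this invented example, two intersectional groups fall below the 0.80 benchmark, which is exactly the kind of disparity the published audit results would surface.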

The audit must be conducted by an independent auditor — meaning someone who has no financial interest in the outcome and was not involved in developing or deploying the tool.

What Happens If You Don’t Comply?

The NYC Department of Consumer and Worker Protection can impose civil penalties of up to $500 for a first violation and $500 to $1,500 for each subsequent violation. Each day the tool is used without a valid audit, and each candidate evaluated without the required notice, counts as a separate violation, so the fines can add up quickly.
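To show how quickly per-day and per-candidate violations compound, here is a back-of-the-envelope sketch. The day count, candidate count, and the choice of the low end of the penalty range are all hypothetical; actual penalties are determined by the DCWP.

```python
# Hypothetical exposure estimate -- all figures are invented for illustration.
days_noncompliant = 30            # days the tool ran without a valid audit
candidates_without_notice = 200   # candidates evaluated without required notice
per_violation = 500               # low end of the statutory penalty range

violations = days_noncompliant + candidates_without_notice
total = violations * per_violation
print(f"${total:,}")  # prints $115,000
```

Even at the minimum per-violation amount, a single month of non-compliance at modest hiring volume reaches six figures.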

But the real risk isn’t the fines. It’s losing enterprise customers. Large employers are increasingly requiring bias audit documentation before they’ll purchase or renew an AI hiring tool. If your competitor has a clean audit and you don’t have one at all, you’re going to lose deals. The audit has become a de facto sales requirement for the HR tech market.

Colorado and Other States Are Next

NYC was first, but it won’t be the last. The Colorado AI Act, signed into law in 2024, establishes broader requirements for high-risk AI systems — including hiring tools — and will require impact assessments and risk management practices. Illinois has enacted the AI Video Interview Act. Maryland restricts facial recognition in hiring. Multiple other states have bills in progress.

The trend is clear: multi-state AI compliance is coming, and companies that get ahead of it now will have a significant advantage. The good news is that a well-designed bias audit can be structured to satisfy multiple jurisdictions simultaneously. One rigorous audit, properly documented, can cover NYC, Colorado, and emerging state requirements.

Need an AI Hiring Bias Audit?

Book a free scoping call with our team. We conduct independent bias audits that satisfy NYC Local Law 144, the Colorado AI Act, and emerging state requirements — all in one engagement.

Book a Free Scoping Call →


Bellavi AI © 2026 | All Rights Reserved