How employers can prepare for New York City's AI law


As technology has advanced, companies have increasingly turned to automated hiring tools to help them pick the right person for the job faster and with more certainty. But do employers know whether their hiring tools come with gender and racial biases built into them? 

New York City's Local Law 144 requires them to find out. 

Local Law 144, otherwise known as New York City's AI law, regulates the use of automated employment decision tools, or AEDTs, making it unlawful for employers to use hiring and promotion tools that have not been audited annually by an independent party for bias. New York City's Department of Consumer and Worker Protection will begin enforcing Local Law 144 on July 5, 2023, six months after the AI law was originally set to take effect. Yet even with the extended deadline, employers still have their work cut out for them.


From identifying whether a hiring tool counts as an AEDT and retaining an independent auditor, to being prepared to offer candidates and employees subjected to an AEDT an alternative process upon request, Local Law 144 comes with a host of compliance challenges. But the law is ultimately trying to minimize discriminatory hiring practices, underlines Daniel Kadish, an associate in the labor, employment and benefits practice at law firm Morgan Lewis.

"One of the biggest concerns New York City is trying to address is whether these artificial intelligence tools have some form of bias inherent to them," says Kadish. "To help [employers] understand if the tool is resulting in some sort of impact on certain categories of individuals and create an awareness of that record."

Identifying the AEDTs
Kadish first advises employers to reach out, preferably alongside their legal counsel, to HR and talent acquisition to identify whether, and how many, AEDTs are present in their hiring and promotion processes. And while some employers may assume an "AI law" won't touch something as common as resume-scanning software, they could very well be wrong. Notably, AEDTs are defined as any process that uses machine learning, statistical modeling, data analytics or artificial intelligence to produce a simplified output, such as a score, classification or recommendation, that substantially assists or replaces human decision-making. 


"If the tool looks at all the resumes the organization has collected and ranks them or shows a percentage match with the company, that tool would probably be covered by the AI law," says Kadish. "But if someone takes a test, and the tool identifies if the [candidate] scores well after an interview or multiple interviews, that tool may not be covered. To know requires a significant amount of internal analysis for organizations."

In other words, deciding whether a hiring tool is subject to an audit may come down to how hiring managers use it. For example, if a test score is just one consideration in the overall hiring process, the tool may not be considered an AEDT. But if the hiring manager is instructed to only allow candidates who score above a certain threshold to proceed to the next interview stage, then it likely would fall under Local Law 144. 

Doing the audit right
Once an employer identifies its AEDTs, it then needs to retain an independent auditor: a person or group that is not involved in using or developing the AEDT and has no financial interest in the tool. Employers will generally need to supply historical data from their own use of the AEDT; if an employer has not yet started using the tool and therefore lacks sufficient data of its own, it may rely on historical data from other employers using the same tool. 


The audit itself must include data examining how race, ethnicity and gender impact the tool's conclusions, with the law even stipulating that the auditor must test for how the tool responds to race and gender on an intersectional level. 

What the audit measures depends on how the tool is used. For hiring, the auditor looks at the rate at which individuals in each category are selected to move forward in the interview process, or how the tool classifies them; for promotions, the auditor looks at the rate at which the tool gives individuals from a specific group a score above the sample median. However, auditors can exclude from the calculation any sex, race or ethnicity category representing less than 2% of the data. 
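
To make the math concrete, here is a minimal, purely illustrative Python sketch of the kind of selection-rate and impact-ratio comparison an auditor might run, including the exclusion of very small categories. The category labels, data and helper function are hypothetical assumptions, and an actual bias audit must follow the Department of Consumer and Worker Protection's published methodology and be performed by an independent auditor.

```python
# Purely illustrative sketch -- not the official audit methodology.
# Category labels, sample data and the 2% cutoff handling are assumptions.
from collections import defaultdict

def impact_ratios(records, min_share=0.02):
    """Compute each category's selection rate and its ratio to the
    highest-rate category, dropping categories below min_share of the data."""
    counts = defaultdict(lambda: {"total": 0, "selected": 0})
    for r in records:
        counts[r["category"]]["total"] += 1
        if r["selected"]:
            counts[r["category"]]["selected"] += 1

    total = sum(c["total"] for c in counts.values())
    # Categories making up less than ~2% of the data may be excluded.
    included = {cat: c for cat, c in counts.items()
                if c["total"] / total >= min_share}

    rates = {cat: c["selected"] / c["total"] for cat, c in included.items()}
    top_rate = max(rates.values())
    return {cat: {"selection_rate": round(rate, 3),
                  "impact_ratio": round(rate / top_rate, 3)}
            for cat, rate in rates.items()}

# Hypothetical usage with made-up numbers:
sample = (
    [{"category": "White male", "selected": True}] * 40
    + [{"category": "White male", "selected": False}] * 60
    + [{"category": "Black female", "selected": True}] * 25
    + [{"category": "Black female", "selected": False}] * 75
)
print(impact_ratios(sample))
# {'White male': {'selection_rate': 0.4, 'impact_ratio': 1.0},
#  'Black female': {'selection_rate': 0.25, 'impact_ratio': 0.625}}
```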

Transparency is key
But the AI law's requirements don't end there.

"Employers have to post the results of the bias audit at least on an annual basis," says Kadish. "The law also requires organizations to provide notice to candidates that they will be subject to an AEDT."


This is easier said than done, notes Kadish. Employers have to publish the audit results in a "clear and conspicuous manner" on the employment section of their website, and must also tell candidates that an AEDT will be used, either on the employment section of the website, in the job posting or via email, at least 10 business days before the tool is used on the candidate. The candidate can then request an alternative assessment process.

Employers that fail to comply face penalties of $500 to $1,500 per violation, and each day of noncompliance counts as a separate violation. 

"The law itself provides pretty strict penalties," says Kadish. "Each day there is a failure to produce the required information is an independent violation. And failure to provide notice to each individual candidate could also constitute a separate violation."

Local Law 144 is not the end
Kadish points out that while New York City's AI law is more thorough than earlier laws in Illinois and Maryland, it marks only the beginning of a broader move toward regulating AI, and state and federal governments are bound to follow suit sooner or later.


Kadish advises employers to review their hiring and promotion processes on an ongoing basis and reconsider the role their technology plays in who advances at their organization.

"This is just step one. There will likely be continued focus on this topic," he says. "So stay aware of any new developments. It's always a challenge for an organization to try complying with differing laws on similar topics."
