NYC Local Law 144-21 aims to bring transparency to the use of AI and other algorithms in employment decisions. Here’s what we know about New York City’s legislation so far, including requirements for a bias audit of automated employment decision tools (AEDTs) and steps affected organizations can take now to prepare.
As organizations increasingly use artificial intelligence (AI) and other algorithms, especially in high-impact decisions such as employment, stakeholders are concerned about potential bias in AI models: where the models are used, how they operate, what data underlies them, and how organizations monitor and oversee them.
New York City’s law (NYC LL 144-21) is part of a broader trend in which regulatory bodies, including US federal, state, and local governments and agencies as well as international bodies such as those within the European Union, are seeking to bring transparency to the use of automated decision systems, including AI, in employment and other decisions.
Organizations that use automated employment decision tools (AEDTs) to substantially assist or replace discretionary decision-making for employment decisions (e.g., hiring and promotions) in New York City will be required to have a bias audit performed by an independent auditor. These tools include data analytics, statistical modeling, machine learning (ML), and AI systems that generate simplified outputs such as candidate scores, classifications, or hiring recommendations.
The bias audit shall include, but not be limited to, testing AEDTs to assess the tools’ disparate impact on employment decisions for candidates or employees based on protected categories (e.g., sex, ethnicity, and race). Where individuals assessed by an AEDT fall within an unknown category and are therefore not included in the required calculations, organizations will need to indicate the number of such individuals. In addition, an independent auditor may exclude from the required impact-ratio calculations any category that represents less than 2% of the data being used for the bias audit.
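To make the selection-rate and impact-ratio calculation concrete, the sketch below shows one way it could be computed for a single demographic dimension. It is a minimal illustration only: the function name `bias_audit_summary`, the (category, selected) record layout, and the `small_category_threshold` parameter are assumptions for this example, and it omits analyses the final rules also contemplate, such as intersectional categories and, for tools that score candidates, rates based on scoring above the median.

```python
from collections import Counter

def bias_audit_summary(records, small_category_threshold=0.02):
    """Illustrative selection-rate and impact-ratio calculation for one
    demographic dimension (e.g., sex). `records` is a list of
    (category, selected) pairs; category is None when unknown."""
    known = [(cat, sel) for cat, sel in records if cat is not None]
    # Individuals in an unknown category are excluded from the calculations,
    # but their number must be reported.
    unknown_count = len(records) - len(known)

    totals = Counter(cat for cat, _ in known)
    selections = Counter(cat for cat, sel in known if sel)

    # Selection rate: share of assessed individuals in the category who were
    # selected (e.g., advanced or hired).
    selection_rates = {cat: selections[cat] / totals[cat] for cat in totals}

    # The auditor may exclude categories representing less than 2% of the data
    # from the impact-ratio calculation (threshold made configurable here).
    eligible = {cat: rate for cat, rate in selection_rates.items()
                if totals[cat] / len(known) >= small_category_threshold}

    # Impact ratio: each category's selection rate divided by the selection
    # rate of the most-selected category.
    highest_rate = max(eligible.values(), default=0.0)
    impact_ratios = ({cat: rate / highest_rate for cat, rate in eligible.items()}
                     if highest_rate > 0 else {})

    return {"unknown_count": unknown_count,
            "selection_rates": selection_rates,
            "impact_ratios": impact_ratios}
```

An audit methodology would typically report these figures separately for each protected category and combination of categories; the single-dimension version above is only meant to show how the ratio relates selection rates to the most-selected group.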
Prior to using an AEDT, organizations will need to make the date of the most recent bias audit, a summary of its results, and the distribution date of the AEDT publicly available in a clear and conspicuous manner on the employment section of their websites.
Organizations will also be required to notify candidates or employees who reside in New York City, prior to the use of an AEDT, of the following:
• That an AEDT will be used as part of the assessment of the employee or candidate
• The job qualifications and characteristics (i.e., data elements) that the AEDT will use in the assessment of the employee or candidate
• If not already disclosed on the organization’s website, information about the type of data collected for the AEDT, the source of such data, and the organization’s data retention policy, which shall be made available within 30 days of a written request by a candidate
Penalties for organizations that fail to comply with NYC LL 144-21 include a civil penalty of $500 for a first violation (and for each additional violation occurring on the same day as the first) and a penalty of between $500 and $1,500 for each subsequent violation. Each day an organization uses an AEDT in violation of NYC LL 144-21 constitutes a separate violation, as does any failure to provide a required notice to an employee or candidate. For further details regarding the current rules, refer to the Notice of Adoption.
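As a rough illustration of how the per-day penalty structure described above could accumulate, the sketch below estimates a hypothetical exposure range. The function and its simplifying assumptions (one violation per day of non-compliant use) are ours for illustration only and are not guidance on how penalties would actually be assessed.

```python
def estimated_penalty_range(days_in_violation):
    """Hypothetical exposure under the structure described above: $500 for the
    first violation and $500 to $1,500 for each subsequent one, treating each
    day of non-compliant AEDT use as a separate violation. Actual assessments
    are at the enforcing agency's discretion."""
    if days_in_violation <= 0:
        return (0, 0)
    low = 500 + (days_in_violation - 1) * 500
    high = 500 + (days_in_violation - 1) * 1_500
    return (low, high)
```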
Organizations continue to evaluate the implications of NYC LL 144-21. As of the date of publication of this document, enforcement of the law has been postponed from the originally announced date of January 1, 2023, to July 5, 2023. Stakeholders continue to raise key considerations and areas where further clarification is needed.
On April 6, 2023, the NYC Department of Consumer and Worker Protection (DCWP) issued a Notice of Adoption containing the final rules and additional guidance to clarify the requirements of NYC LL 144-21.
While questions related to NYC LL 144-21 remain, there are several practices that organizations using an AEDT can consider in order to be proactive and prepared for the law going into effect.
In addition to the practices that address the current requirements of NYC LL 144-21, more holistic practices around governance, model development and testing, and the review function can provide a strong foundation for organizations, especially those involved in developing automated decision systems, to prepare for potentially expanded requirements around the use of automated decision systems, including AEDTs, as the regulatory environment continues to evolve. We have included several leading practices for organizations to consider below.
Deloitte & Touche LLP continues to monitor the regulatory environment and can help organizations prepare for potential bias audits and respond to current proposed requirements regarding automated decision systems as well as strengthen overarching practices for governance, model development and testing, and the review function.