Are you ready for NYC's new algorithmic hiring law?

AI hiring tools must undergo a bias audit before January 1, 2023, and Credo AI is here to help you achieve compliance.

NYC Algorithmic Hiring Law (Local Law 144)

Measure & Manage Bias in your AI Hiring Tools

New York City is the first jurisdiction to require that any algorithmic hiring tool used to screen its residents be audited annually for disparate impact. Credo AI can help employers and HR vendors comply with New York City's new bias audit requirements.
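At the core of a disparate-impact audit is a simple calculation: for each demographic category, the selection rate is divided by the selection rate of the most-selected category to produce an impact ratio. The sketch below illustrates that calculation with made-up numbers; it is not Credo AI's implementation, and the category names and counts are hypothetical.

```python
# Sketch of the impact-ratio calculation at the heart of a bias audit:
# impact ratio = category selection rate / highest category selection rate.
# All data below is hypothetical.

def selection_rates(outcomes):
    """outcomes maps category -> (candidates selected, candidates scored)."""
    return {cat: sel / tot for cat, (sel, tot) in outcomes.items()}

def impact_ratios(outcomes):
    """Each category's selection rate relative to the best-performing one."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical screening outcomes for two demographic groups
outcomes = {"group_a": (48, 100), "group_b": (30, 100)}
print(impact_ratios(outcomes))  # -> {'group_a': 1.0, 'group_b': 0.625}
```

A ratio near 1.0 means a category is selected at roughly the same rate as the most-selected category; the further below 1.0 it falls, the larger the disparity the audit must surface.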

NYC Employers

Get Ready for Your Bias Audit

Evaluate HR vendor risk and manage bias audits with an AI use case registry.

The Credo AI Governance Platform provides your organization with a centralized repository to track bias audits across all of your HR use cases. Request bias audit reports from your vendors and verify that they meet all of NYC's new requirements, using a Policy Pack tailored to the New York City law.

HR Vendors

Get Ready for Customer Requests

Empower your internal audit team to measure bias effectively, making reporting and compliance a breeze.

Assessing bias in your machine learning models is straightforward with the open-source Credo AI Lens Assessment Framework, which plugs directly into your CI/CD pipelines so you can verify that the law's requirements are met before every release. Generate dynamic reports with Credo AI to share your results with customers and regulators.
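A CI/CD integration of this kind typically reduces to a gate that fails the pipeline when a fairness metric degrades. The sketch below is an illustrative gate, not the Lens API: it raises when any category's impact ratio falls below a chosen threshold. The 0.8 "four-fifths" value comes from EEOC guidance and is used here only as an example; NYC's law requires that impact ratios be published, not that they exceed any particular number.

```python
# Illustrative CI gate (not the Lens API): fail the build when any
# demographic category's impact ratio drops below a chosen threshold.
# 0.8 is the EEOC "four-fifths" rule of thumb, used here as an example only.

THRESHOLD = 0.8

def check_impact_ratios(ratios, threshold=THRESHOLD):
    """Raise AssertionError listing every category below the threshold."""
    failures = {cat: r for cat, r in ratios.items() if r < threshold}
    if failures:
        raise AssertionError(f"Impact ratio below {threshold}: {failures}")

# Passes silently: both categories clear the threshold
check_impact_ratios({"group_a": 1.0, "group_b": 0.85})
```

Run as a test step in the pipeline, a failing check blocks deployment of a model whose measured disparity has worsened, which is the behavior a "plugs into CI/CD" integration is meant to provide.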

See a Demo of HR Vendor Risk & Bias Audit

