AI hiring tools used on New York City candidates must be audited for bias before January 1, 2023, and Credo AI is here to help you achieve compliance.
New York City is the first locality to pass a law requiring any algorithmic hiring tool used on its residents to be audited annually for disparate impact. Credo AI can help employers and HR vendors comply with New York City's new bias audit requirements.
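At its core, the disparate impact analysis the law calls for compares selection rates across demographic categories and expresses each category's rate as a ratio of the highest-selected category's rate. Below is a minimal sketch of that calculation on a hypothetical pandas DataFrame; the column and category names are illustrative only and are not taken from Credo AI's tooling or any real dataset.

```python
import pandas as pd

# Hypothetical screening outcomes from an automated hiring tool.
# Column and category names are illustrative, not real data.
results = pd.DataFrame({
    "category": ["A", "A", "A", "B", "B", "B", "B", "C", "C", "C"],
    "selected": [1, 1, 0, 1, 0, 0, 0, 1, 1, 1],
})

# Selection rate per demographic category.
selection_rates = results.groupby("category")["selected"].mean()

# Impact ratio: each category's selection rate divided by the
# highest selection rate observed across categories.
impact_ratios = selection_rates / selection_rates.max()

print(selection_rates)
print(impact_ratios)
```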
The Credo AI Governance Platform provides your organization with a centralized repository to keep track of bias audits across all of your HR use cases. Request bias audit reports from your vendors and confirm they meet all of NYC's new requirements with a Policy Pack tailored to the New York City law.
Assessing bias in your machine learning models is easier than ever with the open source Credo AI Lens Assessment Framework, which plugs directly into your CI/CD pipelines to ensure you're meeting these requirements. Generate dynamic reports with Credo AI to share your results with customers and regulators.
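To give a sense of what plugging an assessment into CI/CD can look like, here is a generic sketch of a pipeline gate that fails the build when any impact ratio drops below a chosen threshold. This is not the Lens API; the threshold, file path, and column names are all assumptions for illustration.

```python
import sys

import pandas as pd

# Assumed threshold and file path for illustration only; a real audit
# would follow the metrics and reporting your auditor and the law specify.
IMPACT_RATIO_THRESHOLD = 0.8
RESULTS_PATH = "screening_results.csv"  # hypothetical export of tool outcomes


def check_impact_ratios(path: str, threshold: float) -> bool:
    """Return True if every category's impact ratio meets the threshold."""
    results = pd.read_csv(path)  # expects 'category' and 'selected' columns
    selection_rates = results.groupby("category")["selected"].mean()
    impact_ratios = selection_rates / selection_rates.max()
    return bool((impact_ratios >= threshold).all())


if __name__ == "__main__":
    # A non-zero exit code fails the CI job, blocking the release.
    sys.exit(0 if check_impact_ratios(RESULTS_PATH, IMPACT_RATIO_THRESHOLD) else 1)
```

Wiring a check like this into the pipeline means a regression in fairness metrics surfaces before a model ships, rather than at the annual audit.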