OCC Fair Lending AI Guidance

United States

2023

Lending

Consumer Protection

Overview

The OCC Fair Lending AI Guidance refers to policy statements and supervisory expectations issued by the Office of the Comptroller of the Currency (OCC) regarding the use of artificial intelligence and machine learning in credit underwriting, risk modeling, and decision-making. While not a standalone regulation, this guidance builds on existing fair lending laws, including the Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA), to ensure that AI-driven systems do not result in unlawful discrimination.

The OCC emphasizes the need for model explainability, bias detection, and ongoing monitoring of AI systems to comply with fair lending obligations. Banks and lenders are expected to evaluate whether automated credit decisions disproportionately impact protected classes, even when the impact is unintentional. These expectations apply to national banks, federal savings associations, and third-party fintech partners using AI in consumer credit.

Key Obligations

  • Ensure AI and ML models are explainable and auditable
  • Conduct disparate impact analysis to detect potential discrimination
  • Monitor models regularly for accuracy, bias, and data drift
  • Comply with ECOA and FHA regardless of model complexity
  • Document model governance, controls, and remediation plans
  • Oversee third-party vendors and tools involved in AI decision-making
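One common first-pass check for the disparate impact analysis mentioned above is the adverse impact ratio, compared against the "four-fifths" (80%) rule of thumb. The sketch below is illustrative only: the group data, threshold, and function names are assumptions, not anything prescribed by the OCC guidance.

```python
# Minimal sketch of a disparate impact check using the adverse impact
# ratio (AIR) and the common "four-fifths" rule of thumb.
# Group data and the 0.8 threshold are illustrative assumptions.

def approval_rate(decisions):
    """Share of applications approved; decisions is a list of True/False."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Protected group's approval rate divided by the reference group's."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical model outcomes for two applicant groups.
reference_group = [True, True, True, False, True, True, False, True]
protected_group = [True, False, False, True, False, True, False, False]

air = adverse_impact_ratio(protected_group, reference_group)
print(f"Adverse impact ratio: {air:.2f}")
if air < 0.8:  # rule-of-thumb threshold, not a legal bright line
    print("Potential disparate impact - investigate features and outcomes")
```

In practice this is only a screening step; a ratio below 0.8 does not itself establish unlawful discrimination, and a ratio above it does not rule it out, but it flags models that warrant deeper statistical review.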

FAQ

Is the OCC AI guidance legally binding?

No. It is supervisory guidance rather than a binding regulation, but it reflects the OCC’s expectations during examinations and can inform enforcement actions.

What does “explainability” mean in this context?

Lenders must be able to explain credit decisions made by AI in a clear, consumer-friendly way to comply with adverse action notice requirements.
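One way explainability is operationalized is by ranking each feature's contribution to a score and reporting the most negative ones as adverse action reasons. The sketch below assumes a simple linear scoring model with hypothetical feature names, weights, and baseline values; it is not the OCC's method, just an illustration of the idea.

```python
# Illustrative sketch: deriving adverse action "reason codes" from a
# hypothetical linear scoring model by ranking each feature's negative
# contribution relative to an average applicant.

weights = {"credit_utilization": -2.0, "payment_history": 1.5, "account_age_years": 0.3}
applicant = {"credit_utilization": 0.9, "payment_history": 0.4, "account_age_years": 1.0}
average = {"credit_utilization": 0.3, "payment_history": 0.8, "account_age_years": 6.0}

# Each feature's contribution to the applicant's score vs. the average.
contributions = {f: weights[f] * (applicant[f] - average[f]) for f in weights}

# The most negative contributions become the top reasons for the denial,
# which would then be translated into consumer-friendly notice language.
reasons = sorted(contributions, key=contributions.get)[:2]
print("Top adverse action reasons:", reasons)
```

For complex models (gradient boosting, neural networks), lenders typically substitute model-agnostic attribution techniques for the simple weight-times-deviation calculation shown here, but the ranking-and-translation step is the same.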

Are third-party fintech vendors covered by the guidance?

Yes. Banks are responsible for ensuring compliance even when they use external AI or underwriting tools from vendors.

What happens if AI models show bias?

Lenders must take corrective action to remediate any discriminatory outcomes and demonstrate strong governance around AI use.