
An Introduction to Artificial Intelligence and Solutions to the Problems of Algorithmic Discrimination

Published 8 Nov 2019 in cs.CY and cs.LG (arXiv:1911.05755v1)

Abstract: There is substantial evidence that AI and Machine Learning (ML) algorithms can generate bias against minorities, women, and other protected classes. Federal and state laws have been enacted to protect consumers from discrimination in credit, housing, and employment, and regulators and agencies are tasked with enforcing these laws. Additionally, there are laws in place to ensure that consumers understand why they are denied access to services and products, such as consumer loans. In this article, we provide an overview of the potential benefits and risks associated with the use of algorithms and data, focusing specifically on fairness. While our observations generalize to many contexts, we concentrate on the fairness concerns raised in consumer credit and the legal requirements of the Equal Credit Opportunity Act. We propose a methodology for evaluating algorithmic fairness and minimizing algorithmic bias that aligns with the provisions of federal and state anti-discrimination statutes outlawing overt discrimination, disparate treatment, and, specifically, disparate impact discrimination. We argue that while the use of AI and ML algorithms heightens potential discrimination risks, these risks can be evaluated and mitigated; doing so, however, requires a deep understanding of these algorithms and of the contexts and domains in which they are used.
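The abstract's focus on disparate impact can be illustrated with the "four-fifths rule," a screening test commonly used by U.S. regulators to flag potential disparate impact; this is a standard heuristic, not necessarily the evaluation methodology the paper itself proposes, and the approval data below is purely hypothetical:

```python
# Minimal sketch of a disparate-impact screen via the four-fifths rule.
# A common regulatory heuristic, not the paper's specific methodology.

def approval_rate(decisions):
    """Fraction of applicants approved (1 = approved, 0 = denied)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference group's."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical loan-approval decisions for two applicant groups.
protected_group = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]  # 30% approved
reference_group = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]  # 70% approved

air = adverse_impact_ratio(protected_group, reference_group)
print(f"Adverse impact ratio: {air:.2f}")
# A ratio below 0.8 is often treated as prima facie evidence of disparate impact.
print("Flag: potential disparate impact" if air < 0.8 else "No flag under four-fifths rule")
```

Note that passing this screen does not establish fairness: as the abstract argues, a meaningful evaluation also requires understanding the algorithm and the domain in which it is deployed.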

Citations (12)
