The EEOC wants to make AI hiring fairer for people with disabilities



That hiring algorithms can disadvantage people with disabilities is not exactly new information. In 2019, for my first piece at the Brookings Institution, I wrote about how automated interview software is definitionally discriminatory against people with disabilities. In a broader 2018 review of hiring algorithms, the technology advocacy nonprofit Upturn concluded that “without active measures to mitigate them, bias will arise in predictive hiring tools by default,” later noting that this is especially true for people with disabilities. In its own report on the topic, the Center for Democracy and Technology found that these algorithms have “risk of discrimination written invisibly into their codes” and that for “people with disabilities, those risks can be profound.” In short, there has long been broad consensus among experts that algorithmic hiring technologies are often harmful to people with disabilities, and, given that as many as 80% of businesses now use these tools, this problem warrants government intervention.


The EEOC’s concerns are largely focused on two problematic outcomes: (1) algorithmic hiring tools inappropriately penalize people with disabilities; and (2) inaccessible digital assessments dissuade people with disabilities from completing an application process.

Illegally “screening out” people with disabilities

First, the guidance clarifies what constitutes illegally “screening out” a person with a disability from the hiring process. The new EEOC guidance treats any disadvantaging effect of an algorithmic decision against a person with a disability as a violation of the ADA, provided the person can perform the job with legally required reasonable accommodations. Under this interpretation, the EEOC is saying it is not enough to hire candidates with disabilities in the same proportion as candidates without disabilities. This differs from EEOC criteria for race, religion, sex, and national origin, which say that selecting candidates from a protected group at a significantly lower rate (say, less than 80% as many women as men) constitutes illegal discrimination.
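The “less than 80%” comparison mentioned above is the EEOC’s so-called four-fifths rule: a group’s selection rate is compared to the most-selected group’s rate, and a ratio below 0.8 is taken as evidence of adverse impact. A minimal sketch of that arithmetic, using invented hiring numbers for illustration:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate.

    Under the four-fifths rule of thumb, a ratio below 0.8 is treated
    as evidence of adverse impact against group A.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Hypothetical example: 30 of 100 women hired vs. 50 of 100 men.
ratio = adverse_impact_ratio(30, 100, 50, 100)
print(round(ratio, 2))   # 0.3 / 0.5 = 0.6, below the 0.8 threshold
```

The article’s point is that for disability the EEOC is not applying this rate-based test: even a tool that selects candidates with disabilities at proportionate rates can still violate the ADA if it disadvantages an individual who could perform the job with reasonable accommodations.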

Author(s): Alex Engler

Publication Date: 26 May 2022

Publication Site: Brookings





We study the results of a massive nationwide correspondence experiment sending more than 83,000 fictitious applications with randomized characteristics to geographically dispersed jobs posted by 108 of the largest U.S. employers. Distinctively Black names reduce the probability of employer contact by 2.1 percentage points relative to distinctively white names. The magnitude of this racial gap in contact rates differs substantially across firms, exhibiting a between-company standard deviation of 1.9 percentage points. Despite an insignificant average gap in contact rates between male and female applicants, we find a between-company standard deviation in gender contact gaps of 2.7 percentage points, revealing that some firms favor male applicants while others favor women. Company-specific racial contact gaps are temporally and spatially persistent, and negatively correlated with firm profitability, federal contractor status, and a measure of recruiting centralization. Discrimination exhibits little geographical dispersion, but two-digit industry explains roughly half of the cross-firm variation in both racial and gender contact gaps. Contact gaps are highly concentrated in particular companies, with firms in the top quintile of racial discrimination responsible for nearly half of lost contacts to Black applicants in the experiment. Controlling false discovery rates to the 5% level, 23 individual companies are found to discriminate against Black applicants. Our findings establish that systemic illegal discrimination is concentrated among a select set of large employers, many of which can be identified with high confidence using large-scale inference methods.
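The abstract’s claim that 23 companies can be named while “controlling false discovery rates to the 5% level” rests on multiple-testing corrections of the Benjamini–Hochberg type: with 108 firm-level hypothesis tests, some small p-values are expected by chance, so rejections are chosen to bound the expected share of false positives. A minimal sketch of the standard Benjamini–Hochberg procedure (the paper uses more sophisticated large-scale inference methods; the p-values below are invented for illustration):

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return indices of hypotheses rejected while controlling the
    false discovery rate at level alpha (Benjamini-Hochberg procedure)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k / m) * alpha,
    # then reject the k hypotheses with the smallest p-values.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

# Invented p-values for five hypothetical firm-level tests:
p = [0.001, 0.008, 0.039, 0.041, 0.50]
print(benjamini_hochberg(p))  # rejects the two smallest: indices [0, 1]
```

Note that the step-up structure matters: 0.039 would pass an uncorrected 0.05 cutoff, but it exceeds its rank-adjusted threshold of (3/5) × 0.05 = 0.03, so it is not rejected.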

Author(s): Patrick M. Kline, Evan K. Rose, and Christopher R. Walters

Publication Date: July 2021, Revised August 2021

Publication Site: NBER Working Papers, also Christopher R. Walters’s own webpages