Making hiring technology accessible means ensuring both that candidates can use the technology and that the skills it measures don’t unfairly screen out candidates with disabilities, says Alexandra Givens, CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.
AI-powered hiring tools often fail to account for people with disabilities in their training data, she says. Such people have long been excluded from the workforce, so algorithms modeled on a company’s previous hires won’t reflect their potential.
Even if a model could account for outliers, the way a disability manifests varies widely from person to person. Two people with autism, for example, may have very different strengths and challenges.
“As we automate these systems and employers strive for what is fastest and most efficient, they lose the opportunity for people to actually demonstrate their skills and ability to get the job done,” Givens says. “And this is a huge loss.”
A hands-on approach
It is difficult for government regulators to track AI hiring tools. In December 2020, 11 senators wrote a letter to the U.S. Equal Employment Opportunity Commission expressing concern about the use of recruiting technologies in the wake of the COVID-19 pandemic. The letter asked about the agency’s authority to investigate whether these tools are discriminatory, especially against people with disabilities.
In January, the EEOC responded with a letter that was leaked to MIT Technology Review. In it, the commission indicated that it cannot investigate AI hiring tools without a specific charge of discrimination. The letter also raised concerns about the industry’s hesitancy to share data and said that variation in software from company to company would prevent the EEOC from issuing any general rules.
“I was surprised and disappointed when I saw the response,” says Roland Boehm, a lawyer and advocate for people with mental health conditions. “The whole tenor of that letter made the EEOC seem more like a passive bystander than a law enforcement agency.”
The agency usually opens an investigation after an individual files a discrimination claim. With AI hiring technology, however, most candidates don’t know why they were rejected. “I believe the reason we haven’t seen more enforcement action or private litigation in this area is because candidates don’t know that they’re being screened or evaluated by a computer,” says Keith Sonderling, an EEOC commissioner.
Sonderling believes artificial intelligence will improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He says he welcomes oversight from Congress.
…