December 23, 2019

Artificial Intelligence May Make HR's Job Easier, but Employment Discrimination Still Abounds

Proponents of artificial intelligence (AI) and machine learning have promised that these tools will usher in a new age of digitized work. AI tools can now scan thousands of documents and reduce repetitive work tasks, algorithms can predict your shopping habits and recommend products before you even think of them, and machine learning software can be trained to identify cancer from MRIs. Often, the creators and designers of these tools tout AI's supposed objectivity. What technologists are less eager to publicize, however, is how AI can be used to reinforce discriminatory policing, violate civil rights, enable employment discrimination, and entrench class, gender, and race disparities.

In a recent New York Times op-ed, Dr. Ifeoma Ajunwa of Cornell's Industrial and Labor Relations School highlighted hiring companies' and HR departments' increased use of these tools. Ajunwa points out that employers are not merely using these technologies to screen candidates; they are actively barring candidates from consideration for employment. As an example, she posits a company that relies on a hiring algorithm trained to seek candidates without gaps in their employment. As Ajunwa notes, such a stipulation would automatically screen out women who have taken time off for child care, as well as applicants who have had long-term medical issues. And because the AI applies only the specific rules its human designers gave it, there is no way for the technology to check itself against employment law or ethical norms about employment discrimination; it simply filters out applicants who don't meet the criteria.
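To see why such a rule discriminates mechanically rather than deliberately, consider a minimal sketch of the kind of gap-based filter Ajunwa describes. This is a hypothetical illustration, not any real vendor's system: the `has_employment_gap` function, the 90-day threshold, and the sample applicants are all assumptions for demonstration. The point is that the filter sees only the gap, never the reason for it.

```python
from datetime import date

MAX_GAP_DAYS = 90  # assumed threshold, chosen only for illustration


def has_employment_gap(jobs, max_gap_days=MAX_GAP_DAYS):
    """Return True if any gap between consecutive jobs exceeds the threshold."""
    jobs = sorted(jobs, key=lambda j: j["start"])
    for prev, nxt in zip(jobs, jobs[1:]):
        if (nxt["start"] - prev["end"]).days > max_gap_days:
            return True
    return False


applicants = [
    {"name": "A", "jobs": [{"start": date(2015, 1, 1), "end": date(2017, 6, 1)},
                           {"start": date(2017, 7, 1), "end": date(2019, 12, 1)}]},
    # Applicant B took a year off, perhaps for child care or medical leave.
    # The filter cannot see the reason; it only sees the dates.
    {"name": "B", "jobs": [{"start": date(2015, 1, 1), "end": date(2017, 6, 1)},
                           {"start": date(2018, 7, 1), "end": date(2019, 12, 1)}]},
]

# Applicant B is silently excluded; nothing here consults employment law
# or asks whether the gap reflects a legally protected circumstance.
shortlist = [a["name"] for a in applicants if not has_employment_gap(a["jobs"])]
print(shortlist)  # ['A']
```

The rule executes exactly as written, which is the problem Ajunwa identifies: the discrimination is an automatic consequence of the criterion itself, with no step at which legal or ethical norms are consulted.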

Dr. Ajunwa is not the only one sounding the alarm about employers' increasing reliance on AI and other tools that their creators purport to be objective. According to Cathy O'Neil, author of Weapons of Math Destruction, such algorithmic bias is common in hiring, especially for low-wage jobs, where massive retail companies rely on sophisticated AIs that weigh aspects of your life you would not think have any bearing on employment, such as your credit score, medical and mental health histories, personality tests, and driving record.

In recent years, several lawsuits and investigations into AI discrimination have emerged, and researchers in tech have begun developing methods to illuminate the hidden bias in machine learning and AI technologies. However, as Dr. Ajunwa notes, there are few concrete laws on the books that can protect applicants from algorithmic discrimination. Moreover, the Harvard Business Review has cautioned that, unlike other forms of employment testing, many of these AI-based tools remain empirically untested, leaving the door open to ethical and legal problems.


Berke-Weiss Law Attends City Bar Webinar on Pregnancy During the Pandemic

June 25, 2020
Pregnancy Discrimination
Since the end of March, we’ve spent a great deal of time talking about the economic and social impacts of coronavirus and the lockdowns on working parents, but today we want to talk about how the pandemic is affecting pregnancy: specifically, what is and isn’t being done to help pregnant women during this strange new time.

Berke-Weiss Law Weekly Roundup

June 19, 2020
In this edition, we’re looking at several employment-related stories, including more news on the childcare front, new considerations for coronavirus workplace safety, and a project in which the Firm is participating.

Title VII Now Applies to Gay and Transgender People, the Supreme Court Rules

June 15, 2020
In a stunning victory for LGBT employees and the movement at large, the U.S. Supreme Court has held 6-3 that gay and transgender people are protected by Title VII of the 1964 Civil Rights Act, which bans employment discrimination “because of sex.”
