
Tutoring firm settles US agency’s first bias lawsuit involving AI software


By Daniel Wiessner

(Reuters) – A China-based tutoring company has agreed to settle a U.S. government agency’s novel lawsuit claiming it used hiring software powered by artificial intelligence to illegally weed out older job applicants.

The 2022 lawsuit against iTutorGroup Inc was the first by the U.S. Equal Employment Opportunity Commission (EEOC) involving a company’s use of AI to make employment decisions.

The commission, which enforces workplace bias laws, in 2021 launched an initiative to ensure that AI software used by U.S. employers complies with anti-discrimination laws. The EEOC has warned that it will focus enforcement efforts on companies that misuse AI.

ITutorGroup agreed to pay $365,000 to more than 200 job applicants allegedly passed over because of their age, according to a joint filing made in New York federal court on Wednesday. The settlement must be approved by a federal judge.

The company, which provides English-language tutoring to students in China, denied wrongdoing in the settlement.

The EEOC had alleged that iTutorGroup in 2020 programmed online recruitment software to screen out women aged 55 or older and men who were 60 or older.

ITutorGroup, a unit of China's Ping An Insurance Group Co, did not immediately respond to a request for comment. An EEOC spokesperson said the agency would not comment until the settlement is approved.

At least 85% of large U.S. employers are using AI in some aspects of employment, according to recent surveys.

That includes software that screens out job applicants before a human reviews any applications, human resources “chatbots,” and programs that conduct performance reviews and make recommendations for promotions.

Many worker advocates and policymakers are concerned about the potential for existing biases to be baked into AI software, even unintentionally.

In a pending proposed class action in California federal court, Workday is accused of designing hiring software used by scores of large companies that screens out Black, disabled and older applicants. Workday has denied wrongdoing.

Experts expect an increasing number of lawsuits accusing employers of discriminating through their use of AI software.
