Check your artificial intelligence 'bossware' tools for bias, says U.S. agency head

FILE - The emblem of the U.S. Equal Employment Opportunity Commission is shown on a podium, Tuesday, Feb. 16, 2016, in Denver. Charlotte Burrows, chair of the Equal Employment Opportunity Commission, told The Associated Press, Thursday, May 18, 2023, that the agency is trying to educate employers and technology providers about their use of these surveillance tools as well as AI tools that streamline the work of evaluating job prospects. (AP Photo/David Zalubowski, File)

The head of the U.S. agency charged with enforcing civil rights in the workplace says artificial intelligence-driven “bossware” tools that closely track the whereabouts, keystrokes and productivity of workers can also run afoul of discrimination laws.

Charlotte Burrows, chair of the Equal Employment Opportunity Commission, told The Associated Press that the agency is trying to educate employers and technology providers about their use of these surveillance tools as well as AI tools that streamline the work of evaluating job prospects.

And if they aren't careful with, say, draconian schedule-monitoring algorithms that penalize breaks for pregnant women, or with faulty software that screens out graduates of women’s or historically Black colleges, they can't blame AI when the EEOC comes calling.

“I’m not shy about using our enforcement authority when it’s necessary,” Burrows said. “We want to work with employers, but there’s certainly no exemption to the civil rights laws because you engage in discrimination some high-tech way.”

The federal agency put out its latest set of guidance Thursday on the use of automated systems in employment decisions such as who to hire or promote. It explains how to interpret a key provision of the Civil Rights Act of 1964, known as Title VII, that bars job discrimination based on race, color, national origin, religion or sex, which includes sexual orientation and gender identity.

Burrows said one important example involves widely used resume screeners, which can produce biased results if they are built on biased data.

“What will happen is that there’s an algorithm that is looking for patterns that reflect patterns that it’s already familiar with,” she said. “It will be trained on data that comes from its existing employees. And if you have a non-diverse set of employees currently, you’re likely to end up with kicking out people inadvertently who don’t look like your current employees.”

Amazon, for instance, abandoned its own resume-scanning tool to recruit top talent after finding it favored men for technical roles, in part because it was comparing job candidates against the company’s own male-dominated tech workforce.
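As a rough illustration of the dynamic Burrows describes, the toy Python sketch below (entirely hypothetical data and scoring, not any vendor's actual system) shows how a screener trained only on a company's past hires can score an equally qualified but unfamiliar candidate lower:

```python
# A toy sketch of how a resume screener trained only on a company's past
# hires can inherit that workforce's skew. Data and scoring are hypothetical.
from collections import Counter

# "Training" set: resumes of current employees, mostly one profile.
past_hires = [
    "state university chess club software engineer",
    "state university chess club software engineer",
    "state university rugby team software engineer",
]

# Learn which words look like a "typical hire."
vocab = Counter(word for resume in past_hires for word in resume.split())

def score(resume: str) -> float:
    """Score a candidate by word overlap with past hires (higher = more familiar)."""
    words = resume.split()
    return sum(vocab[w] for w in words) / len(words)

# Two equally qualified candidates; the second attended a women's college,
# a phrase the model has simply never seen among past hires, so it scores low.
print(score("state university chess club software engineer"))
print(score("womens college debate club software engineer"))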

The EEOC and other federal agencies, including the Department of Justice, have been sounding the alarm for the past year, with previous sets of guidance about how some AI tools could discriminate against people with disabilities and violate the Americans with Disabilities Act.

In some cases, the EEOC has taken action. In March, the operator of a tech job-search website settled with the agency to end an investigation over allegations it was allowing job posters to exclude workers of U.S. national origin in favor of immigrants seeking work visas. To settle the case, the parent company, DHI Group, agreed to rewrite its programming to “scrape” for discriminatory language such as “H-1Bs Only,” a reference to a type of work visa.
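DHI's actual implementation is not public; what follows is a minimal Python sketch of the kind of keyword scan the settlement describes, with hypothetical example patterns:

```python
# A minimal sketch of a scan that flags job postings containing
# visa-restrictive language. These patterns are hypothetical examples,
# not DHI Group's actual rules.
import re

DISCRIMINATORY_PATTERNS = [
    r"\bh-?1bs?\s+only\b",          # "H-1Bs Only", "H1B only"
    r"\bvisa\s+holders?\s+only\b",  # "visa holders only"
    r"\bno\s+(u\.?s\.?\s+)?citizens\b",
]

def flag_posting(text: str) -> list[str]:
    """Return every pattern that matches the posting, for human review."""
    lowered = text.lower()
    return [p for p in DISCRIMINATORY_PATTERNS if re.search(p, lowered)]

print(flag_posting("Senior Java developer wanted. H-1Bs Only."))
# -> one pattern matched, so the posting is flagged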

Much of the EEOC’s work involves investigating complaints filed by employees who believe they were discriminated against. And while it’s difficult for workers to know whether a biased hiring tool resulted in them being denied a job, Burrows said there is “generally more awareness” among workers about the tools increasingly being used to monitor their productivity.

Those tools have ranged from radio-frequency devices that track nurses, to minute-by-minute monitoring of the tightly controlled schedules of warehouse workers and delivery drivers, to logging the keystrokes or mouse clicks of the many office employees who began working from home during the pandemic. Some may violate civil rights laws, depending on how they are used.

Burrows noted that the 好色tv Labor Relations Board is also looking at such AI tools. The NLRB sent a memo last year warning that overly intrusive surveillance and management tools can impair the rights of workers to communicate with each other about union activity or unsafe conditions.

“I think that the best approach there -- I’m not saying not to use it, it’s not per se illegal -- but is to really think what it is that employers are looking to measure and maybe measure that directly,” Burrows said. “If you’re trying to see if the work is getting done, maybe check that the work is getting done.”
