Rodexo
Economy News

AI reveals, injects gender bias in the workplace



Artificial intelligence takes the human bias out of the research on gender bias. (Photo: Shutterstock)

While lots of people worry about artificial intelligence becoming aware of itself, then running amok and taking over the world, others are using it to uncover gender bias in the workplace. And that’s more than a little ironic, since AI actually injects not just gender, but racial bias into its data—and that has real-world consequences.

A Fox News report highlights AI-driven research from Boston-based Palatine Analytics that reveals workplace bias. The firm, which studies workplace issues, “analyzed a trove of data—including employee feedback and surveys, gender and salary information and one-on-one check-ins between managers and employees—using the power of artificial intelligence.”

Its findings? In the companies it studied, Palatine found that the C-suite was on average heavily dominated by men, while entry-level positions split 45 percent female to 55 percent male. After reviewing its data on more than a dozen major variables, “including gender, tenure, position, the name of the manager, salaries, and the number of promotions/raises received,” says Palatine Analytics CEO Archil Cheishvili in a statement, the data revealed that women received fewer salary increases and promotions.

In addition, the AI-powered study found that men and women were equally likely to meet goals, but men received 25 percent more positive evaluations than women in the same role.

The data also revealed that while women gave nearly identical performance review scores to men and women, 70 percent of men rated men more highly than women. This disparity was more pronounced in senior positions, it found, where approximately 75 percent of men provided higher reviews to men than to women.

But wait—there’s more. A Bloomberg report tells how a former student at Stanford University’s Artificial Intelligence Lab, Timnit Gebru, now a member of a Microsoft Corp. team called FATE (Fairness, Accountability, Transparency and Ethics in AI), is working to find and remove “biases that creep into data and can skew results.” In other words, AI can actually introduce bias—both gender-based and race-based—into decision-making, unfairly penalizing women and minorities.

With companies and governments making use of machine learning, image recognition and other AI tools, says the report, “to help predict everything from the creditworthiness of a loan applicant to the preferred treatment for a person suffering from cancer,” they’re using “tools [that] have big blind spots that particularly affect women and minorities.”

Not only is AI more likely to identify a nurse in a photo as female, simply because more women than men are nurses; its algorithms also absorb biases from their training data, such as gender biases that “could lead algorithms to do things like conclude people named John would make better computer programmers than ones named Mary.” Sometimes it’s as simple as AI learning its biases from language or from photographs.
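To see how word associations like these arise, here is a minimal sketch of the analogy arithmetic used in word-embedding models. The four tiny vectors are invented for illustration; real systems such as word2vec learn vectors with hundreds of dimensions from large text corpora.

```python
import numpy as np

# Toy 3-dimensional "embeddings" (illustrative only; real vectors
# are learned from large text corpora, not hand-written).
emb = {
    "man":   np.array([ 1.0, 0.0, 0.3]),
    "woman": np.array([-1.0, 0.0, 0.3]),
    "king":  np.array([ 1.0, 0.9, 0.1]),
    "queen": np.array([-1.0, 0.9, 0.1]),
}

def analogy(a, b, c, emb):
    """Answer 'a is to b as c is to ?' by finding the word whose
    vector is closest (by cosine similarity) to b - a + c."""
    target = emb[b] - emb[a] + emb[c]
    best, best_sim = None, -np.inf
    for word, vec in emb.items():
        if word in (a, b, c):
            continue  # exclude the query words themselves
        sim = np.dot(target, vec) / (
            np.linalg.norm(target) * np.linalg.norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("man", "king", "woman", emb))  # -> queen
```

The same arithmetic, applied to vectors learned from biased text, is what can produce associations like “John” with “programmer” but “Mary” with “homemaker.”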

Scientists at Boston University and Microsoft’s New England lab, says the report, explain in a paper called “Man is to Computer Programmer as Woman is to Homemaker?” how they “combed through the data, keeping legitimate correlations (man is to king as woman is to queen, for one) and altering ones that were biased (man is to doctor as woman is to nurse). In doing so, they created a gender-bias-free public dataset and are now working on one that removes racial biases.”
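The “altering” step the researchers describe can be sketched as a simple projection: remove a vector’s component along a learned gender direction so that a word like “programmer” no longer leans toward either gender. The vectors and the `neutralize` helper below are illustrative assumptions, not the paper’s actual dataset or code.

```python
import numpy as np

# Toy embeddings (illustrative only).
emb = {
    "man":        np.array([ 1.0, 0.2, 0.5, 0.1]),
    "woman":      np.array([-1.0, 0.2, 0.5, 0.1]),
    "programmer": np.array([ 0.6, 0.9, 0.1, 0.4]),
}

# Estimate a gender direction from a gendered word pair,
# then normalize it to unit length.
g = emb["man"] - emb["woman"]
g = g / np.linalg.norm(g)

def neutralize(vec, direction):
    """Remove the component of vec lying along the bias direction."""
    return vec - np.dot(vec, direction) * direction

debiased = neutralize(emb["programmer"], g)

# After neutralizing, "programmer" has zero projection onto the
# gender direction, so it no longer favors "man" over "woman".
print(np.dot(debiased, g))  # -> 0.0
```

In practice, legitimate correlations (king/queen) are kept gendered while occupation words are neutralized, which is the distinction the paper draws.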

We’ll have to hope they hurry.
