How to avoid accidentally racist AI models – Forrester Report

A new Forrester report warns firms against accidentally implementing discriminatory AI and machine learning.

A Forrester report by Brandon Purcell, ‘The Ethics Of AI: How To Avoid Harmful Bias And Discrimination’, cautions brands using artificial intelligence (AI) and machine learning that, without careful implementation, these technologies can pick up and amplify societal bias. Purcell advises readers to pay close attention to the data used to develop analytical models, drawing on the example of accidental racism in Google Photos’ machine learning image analysis.

Google Photos is a platform developed by Google that analyses images uploaded by users, identifying and categorising the objects within them and using this data to train a deep neural network and grow its source library. In 2015, a black Haitian-American programmer in Brooklyn uploaded images of himself and a friend for analysis; drawing on the source material it had been trained on, Google Photos labelled the pair as ‘gorillas’. Forrester’s report says that, given inappropriate source data, machine learning algorithms can learn to discriminate on anything, including race, gender and sexual orientation.

Creating biased models can harm both customers and brand image. Forrester warns of damaged reputations, legal consequences and potential losses in market share, quoting one female consumer, aged 27: “I would definitely stop doing business with any company altogether if I found out that they discriminate against anyone!”

Forrester reports that machine learning models can become biased through either an unrepresentative input data library or redundant algorithmic encodings. The report advises using data that is proportionally representative of the population the model will be applied to, otherwise known as independent and identically distributed (IID) training data, lest firms risk situations such as FaceApp’s ‘hot’ filter lightening the skin tone of darker-skinned users.
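
To make the idea of proportional representation concrete, here is a minimal Python sketch (not taken from the Forrester report; the group labels, column name and five-percentage-point tolerance are assumptions for illustration) that compares the demographic mix of a training set against the population the model is meant to serve:

import pandas as pd

# Hypothetical shares of each group in the population the model will serve
POPULATION_SHARES = {"group_a": 0.60, "group_b": 0.25, "group_c": 0.15}
TOLERANCE = 0.05  # flag groups whose training share drifts by more than 5 points

def check_representativeness(training_df: pd.DataFrame, group_column: str) -> None:
    """Compare group shares in the training data with the target population."""
    observed = training_df[group_column].value_counts(normalize=True)
    for group, expected in POPULATION_SHARES.items():
        actual = observed.get(group, 0.0)
        status = "OK" if abs(actual - expected) <= TOLERANCE else "CHECK REPRESENTATION"
        print(f"{group}: training={actual:.1%}, population={expected:.1%} -> {status}")

# Toy example: group_c is under-represented relative to the assumed population
df = pd.DataFrame({"group": ["group_a"] * 80 + ["group_b"] * 15 + ["group_c"] * 5})
check_representativeness(df, "group")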

The report also warns against developers inadvertently including a variable that serves as a proxy for bias. Forrester’s example: lending decisions that depend on an applicant’s status as a single parent are inherently sexist, because 82% of single parents are women.
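
Forrester does not prescribe a specific test for spotting such proxies, but one simple screen is to compare how a protected attribute is distributed within each value of a candidate feature against its distribution overall. The sketch below uses made-up applicant data and a hypothetical 20-point threshold:

import pandas as pd

def proxy_check(df: pd.DataFrame, feature: str, protected: str, protected_value: str) -> None:
    """Flag feature values whose protected-group share diverges sharply from the overall share."""
    overall = (df[protected] == protected_value).mean()
    by_value = df.groupby(feature)[protected].apply(lambda s: (s == protected_value).mean())
    print(f"Overall share of '{protected_value}': {overall:.0%}")
    for value, share in by_value.items():
        flag = "possible proxy" if abs(share - overall) > 0.20 else "ok"
        print(f"{feature}={value}: {share:.0%} ({flag})")

# Made-up applicant pool in which 82% of single parents are women
applicants = pd.DataFrame({
    "single_parent": [True] * 100 + [False] * 300,
    "gender": ["female"] * 82 + ["male"] * 18 + ["female"] * 150 + ["male"] * 150,
})
proxy_check(applicants, "single_parent", "gender", "female")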

 

Josh Loh is assistant editor at MarketingMag.com.au
