Australian AI company says sorry for asking potential staff to describe their skin tone

Appen removes ‘paper bag test’ from its application form after US woman’s tweet goes viral

An Australian artificial intelligence company has apologised for a question on a recruitment application which asked potential employees to describe their skin tone.

The Australian Securities Exchange-listed company, Appen, boasts 1 million contractors at 70,000 locations across the globe who label photographs, text, audio and other data to improve AI systems used by large tech companies.

Houston-based Charné Graham was approached by recruiters on LinkedIn to apply for a contract social media evaluator role with Appen, so she started filling out an application form.

After ticking a box saying she is “Black or African American”, she was asked to select her complexion, from light to brown to black. Her tweet about the application form went viral, gaining 16,400 retweets and 73,100 likes.

Has anyone ever seen this on a job application? pic.twitter.com/OVqwNbTioR

— Charné Graham (@CharneGraham) May 10, 2021

She said she had not continued with her application for the role after seeing the “paper bag test” – a term used to describe a 20th-century discriminatory practice in which an African American person’s skin colour was compared to a brown paper bag.

Guardian Australia has sought comment from Graham. She told Nine newspapers she could not understand how information about her complexion was relevant for the tasks involved in the job.

“I’m aware that Appen is an artificial intelligence company but as a Black woman the question is very off-putting and triggering with no clear explanation as to why you would need that information,” she said.

Appen’s senior vice-president of human resources and crowdsourcing, Kerri Reynolds, told Guardian Australia in a statement the question had been removed after Graham pointed it out.

“We collect data from our crowd of contractors in an effort to take the bias out of AI,” she said. “We acknowledge that without an explanation up front as to why it is so important to ask some of these questions, and the way the question was presented, it missed the mark and that’s on us to fix …

“To be clear, there is no intended racism in our hiring processes, practices or policies. We continually work to reflect the cultural and ethnic diversity both in our workforce, and with crowd workers in 170+ countries who speak 235+ languages.”

It comes amid increased focus on ethics in AI. Two Google engineers quit the company in February over concerns about the impact its research could have on marginalised groups.

Three groups – Black in AI, Queer in AI and Widening NLP – wrote an open letter this week stating they would no longer take Google funding in response to the company’s treatment of the two engineers.

“The potential for AI technologies to cause particular harm to members of our communities weighs heavily on our organisations,” they said. “We share a mandate to not merely increase the representation of members from our respective communities in the field of AI, but to create safe environments for them and to protect them from mistreatment.”

Contributor

Josh Taylor

The Guardian
