
Technology

Artificially intelligent robot perpetuates racist and sexist prejudice

A virtual robot run by artificial intelligence conforms to toxic stereotypes when asked to pick which faces belong to criminals or homemakers

By Chris Stokel-Walker

27 June 2022

[Image: a young woman's eye overlaid with a futuristic circular user-interface data panel]

Robots running on artificial intelligence software will inherit the same biases that AI picks up from humans

Yuichiro Chino/Getty Images

A robot running an artificial intelligence (AI) model carries out actions that perpetuate racist and sexist stereotypes, highlighting the problems that arise when technology learns from data sets with built-in biases.

Andrew Hundt at the Georgia Institute of Technology in Atlanta and his colleagues set up a virtual experiment using a robot running CLIP, a neural network developed by OpenAI and trained on images scraped from the internet paired with the text that accompanied them.
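CLIP embeds images and text in a shared space, so any photograph can be scored against arbitrary written descriptions. The sketch below is not the researchers' code; it is a minimal illustration of that scoring step using the publicly released CLIP weights through the Hugging Face transformers library, with a placeholder image URL and example prompts echoing those described in the study.

# Minimal sketch of CLIP-style image-text matching (illustrative only).
# Assumes transformers, torch, pillow and requests are installed; the image
# URL below is a placeholder, not data from the study.
from PIL import Image
import requests
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Any face photo would do here; replace with a real image path or URL.
image = Image.open(requests.get("https://example.com/face.jpg", stream=True).raw)

# Prompts like those the researchers describe giving the robot.
prompts = ["a photo of a criminal", "a photo of a homemaker", "a photo of a doctor"]

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds the similarity of the image to each prompt;
# a softmax turns the scores into a probability-like distribution.
probs = outputs.logits_per_image.softmax(dim=1)
for prompt, p in zip(prompts, probs[0].tolist()):
    print(f"{prompt}: {p:.3f}")

Scores like these become a problem when a robot uses them to decide which face to pick as a "criminal" or a "homemaker": biases in the training data can make it systematically favour some demographic groups over others.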

The virtual robot, operating in…
