Man in a suit, woman in a bikini: GPT-3 reproduces clichés when generating images

Scientists from George Washington University have investigated how the AI language model GPT-3 behaves when asked to complete images. Unlabeled images from the Internet served as the AI's training material. The result: when completing images, the AI falls back on prejudices and clichés. Pictures of men mostly showed them in suits, while women were shown in bikinis or with low necklines.

“These models can contain all kinds of harmful human prejudices from images we post on the Internet, and we need to think carefully about who is creating them, how and why,” the scientists commented.

Read the whole article at MIXED

3 February 2021, Doreen Nagelmüller