DALL·E is an AI system by OpenAI that can create realistic images and art from a description.
Diversity and bias are frequent problems in AI. I wanted to see how well DALL·E represents disabled people, and people with medical conditions.
DALL·E does not do well.
A small caveat – I’d like to repeat these prompts many more times, but the cost per prompt means I’m restricted.
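To give a sense of why repetition gets expensive, here is a back-of-the-envelope estimate. The per-image price below is purely illustrative, not OpenAI's actual pricing:

```python
def estimate_cost(num_prompts: int, repeats: int, images_per_generation: int,
                  price_per_image: float) -> float:
    """Rough cost of re-running a set of prompts many times."""
    return num_prompts * repeats * images_per_generation * price_per_image

# e.g. 20 prompts, each repeated 30 times, 4 images per generation,
# at a hypothetical $0.02 per image:
print(f"${estimate_cost(20, 30, 4, 0.02):.2f}")  # → $48.00
```

Even modest repetition multiplies quickly, which is why the sample sizes in this article are small.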
This article includes reflections on disabled identities – identities that are not my own.
This is my first time writing about this. I want to use inclusive language. I know this is a complex topic and there’s a strong chance I’ve got something wrong. I am open to feedback on how this is framed.
OpenAI have taken steps to improve the diversity of their images in recent weeks. Following a change documented in the blog post Reducing Bias and Improving Safety in DALL·E 2, ‘users were 12× more likely to say that DALL·E images included people of diverse backgrounds’.
If you do not specify race or gender in your prompt, you should get diverse results as the default.
But what about disabled people? Disability is diversity. If I don’t define a disability, will a result ever show someone with a visible disability?
In my initial preview, across image generations from over 800 prompts (not all of which concerned people), I did not get a single result showing someone with a visible disability. I’d love to know what others have found.
So what does DALL·E know about disability?
According to DALL·E, a disabled person is a wheelchair user.
Three generations, 12 images, each with the prompt, “a disabled person”, and a 100% hit rate for wheelchairs.
Ok, what if I ask for ‘a paralympian’? Apparently, they are also all wheelchair users:
I also tried, ‘a disabled astronaut’, because you wouldn’t need a wheelchair in space. DALL·E disagrees.
When I prompted ‘a disabled swimmer’, there were still two wheelchairs. When I asked for ‘a disabled child’, 6 out of 8 were wheelchair users. When I asked for ‘a disabled gamer’, all 4 were wheelchair users. You get the picture.
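The fully counted results above can be tallied into hit rates (the numbers come directly from the generations described in this article; prompts with only partial counts are left out):

```python
# Wheelchair depictions per prompt: (images with a wheelchair, total images)
observed = {
    "a disabled person": (12, 12),
    "a disabled child": (6, 8),
    "a disabled gamer": (4, 4),
}

# Convert counts into proportions
rates = {prompt: hits / total for prompt, (hits, total) in observed.items()}
for prompt, rate in rates.items():
    print(f"{prompt!r}: {rate:.0%} wheelchairs")
```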
If I ask for ‘a disabled person running’, then I get results showing someone running with a prosthetic leg.
Ok, what if I’m explicit about types of disability?
When prompting for ‘a deaf person’, I expected the results to cover a range of deaf people, and to include:
- someone with a hearing aid
- someone using sign language
However, DALL·E is consistent in its portrayal of a deaf person as someone who holds their hand to their ear. 12 out of 12.
If you ask for ‘a person with a hearing impairment’, you get the same type of image.
If you ask for a person with a hearing aid, or an implant, you do get appropriate results.
If you ask DALL·E for an amputee, none of the results show someone with an amputated limb. Instead, it shows people wearing casts doing some sort of physio exercise in a healthcare setting.
Amputees only exist in healthcare settings.
Weirdly, these were also all men.
Vitiligo is a condition where pale patches appear on the skin because of a lack of melanin.
DALL·E is hit or miss here, though more miss than hit. Some pictures come out well, while others are absurdly wrong – like someone with a spotty t-shirt (3 out of 12), or someone with green spots. The AI also shows people with large dark patches, rather than pale patches.
If I try a prompt like:
‘A portrait of a movie star with vitiligo, oil painting, inspirational, detailed, by Kehinde Wiley’
I got the impression that results are also hit and miss for prompts relating to birthmarks.
I’ve tried a number of other disabilities and conditions, some of which are invisible. I didn’t notice many trends with these – either in representing the disability or in misrepresenting it.
People with visual impairments were shown as wearing sunglasses, or pointing to their eyes.
People with intellectual disabilities were shown as people trying to think, with their hands on their heads.
These included prompts such as:
- a person with dyslexia
- a person who stutters
- a person using a screenreader
- a person with a visual impairment
If you have any prompts you’d like to try, please get in touch.
I post my AI artwork on a new Twitter account, @fofrAI.