Grace Dow Writes

AI Inspiration Porn

Artificial intelligence (AI) is increasingly influencing how we consume media and interact with one another. One of the more concerning trends I’ve seen recently is the emergence of AI-generated “inspiration porn,” a phrase coined by the late Stella Young. Inspiration porn is content that exploits disabled people, portraying them as inspirational solely because of their existence or their ability to overcome obstacles.

Typically, these posts pair an image of a disabled parent holding an infant with a second image of that child as an adult. I’ve seen several examples where the child has become an astronaut, a teacher, or a doctor. What concerned me most were the comments on these posts, saying how moving the stories were.

One issue with inspiration porn is that it takes the rich, multidimensional lives of disabled people and flattens them into one-dimensional narratives. Rather than being viewed as whole individuals with their own stories, strengths, and weaknesses, disabled people are reduced to symbols of bravery or fortitude, as if simply living with a disability were extraordinary.

This content may seem harmless. However, it has far-reaching, negative consequences both for disabled individuals and for the disability community as a whole.

AI-generated social media posts also take away opportunities for disabled people to tell their own stories. Social media can be a powerful tool for raising awareness of disability. AI, however, is not an effective storytelling platform, and it can perpetuate harmful stereotypes.

According to researchers at Penn State’s College of Information Sciences and Technology (IST), artificial intelligence is driven by learned associations that frequently contain biases toward disabled people. “We wanted to examine if the nature of a discussion or an NLP (natural language processing) model’s learned associations contributed to disability bias,” said Pranav Narayanan Venkit, a PhD student at the College of IST and the paper’s first author. “This is important because real-world organizations that outsource their AI needs may unknowingly deploy biased models.”

Fetzer, Mary. “Trained AI Models Exhibit Learned Disability Bias, IST Researchers Say.” Penn State University, 30 Nov. 2023, http://www.psu.edu/news/information-sciences-and-technology/story/trained-ai-models-exhibit-learned-disability-bias-ist.

Narayanan Venkit, Pranav, et al. “Automated Ableism: An Exploration of Explicit Disability Biases in Sentiment and Toxicity Analysis Models.” Proceedings of the 3rd Workshop on Trustworthy Natural Language Processing (TrustNLP 2023), edited by Anaelia Ovalle et al., Association for Computational Linguistics, 2023, pp. 26–34, https://doi.org/10.18653/v1/2023.trustnlp-1.3.

Young, Stella. “I’m Not Your Inspiration, Thank You Very Much.” TEDxSydney, uploaded by TED, 9 June 2014, http://www.youtube.com/watch?v=8K9Gg164Bsw.
