AI is here, and it has seen a major boost in development over the past five years. According to research by Our World in Data, systems that two decades ago were still learning to play backgammon at below-human levels can now recognize language and images at a human level.
AI can be used for everyday tasks: checking the weather, transcribing audio, translating languages, and more. Perhaps more alarmingly, it can also create images and videos, write articles and essays, and help (or hurt) you in the hiring process. AI is not new, but its growing use and investment could have an impact on LGBTQ artists and what they contribute.
Many LGBTQ artists are feeling the effects of AI-generated imagery on their already underfunded voices. According to Gallup, roughly 7.6 percent of U.S. adults self-identify as LGBTQ, and 2021 UCLA research found that roughly 17 percent of LGBTQ individuals are living in poverty. If left unchecked, this software can take away employment opportunities from those who have devoted much of their lives to their craft and who rely on their industries’ need for authenticity, creating an even bigger disparity for LGBTQ people, especially people of color and those with non-cis identities.
In early 2023, a group of artists brought a class-action lawsuit against the AI image generators Midjourney and Stability AI. The software companies were accused of ripping off the artists’ previously existing, original work: the generators produced images that were either eerily similar in style or outright copies. Research additionally indicated that the generators used the artists’ names and work as prompts for specific design choices in user-requested depictions, like “Lord of the Rings-style art.”
The software used and retooled the art without providing consent, compensation or credit to the original creators. While NCSL reports that various states are working to rein in this expanding technology, there is currently very little federal legislation requiring a rigorous system of checks and balances for AI use or addressing the ethical dilemmas these artists’ stories raise. The results could be especially disastrous for artists of marginalized backgrounds, including those who are LGBTQ.
AI-generated imagery is currently all the rage on social media. As this technology receives more investment and more access to real, previously existing works of art (photography, graphic design, film, drawing, etc.), its output grows more and more realistic, and many who do not take the time to research an image’s validity are falling prey to scams in which AI images are passed off as real. This opens a new channel for misinformation to spread, compounding an already problematic “infodemic” in the U.S. that directly impacts LGBTQ organizations and communities.
Video AI technology such as OpenAI’s Sora, which can generate video from text, is now being promoted by art institutions themselves, including long-respected organizations like Tribeca Enterprises, even as it halts the active work of people in the film industry. Earlier this year, Hollywood producer Tyler Perry put an $800 million studio expansion on hold after seeing the power and magnitude of what Sora could do.
Even if the technology eventually hits a breaking point and general disinterest sets in, the underrepresentation of queer identities and storytelling can still widen as artists’ work is taken out of their control. Additionally, much of popular art, such as film, has historically been told from a heteronormative perspective. In an age when queer cinema is expanding and intersectional voices are growing louder, AI generators could snuff them out, relegating audiences to the same stories that have been told for hundreds of years, without any ingenuity.
The energy and water required to run AI generators have become a trending topic, and the issue connects directly to the economic and environmental disparities faced by LGBTQ individuals.
According to a report from The Washington Post, using ChatGPT to draft an email requires a little over a bottle of water to cool the equipment housed in the company’s data centers. If about 16 million people used one generator weekly for a year, it would consume as much water as all Rhode Island households do in 1.5 days. Water cooling can deplete local areas of natural resources, contributing to local climate change.
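For readers who want to check the scale of that claim, here is a minimal back-of-envelope sketch. The per-email figure (roughly 519 milliliters, a little over one bottle) and the 16 million weekly users are assumptions drawn from the Post’s reporting, not independent measurements:

```python
# Back-of-envelope check of the Washington Post's water estimate.
# All inputs are assumptions taken from the reporting, not measured values.

LITERS_PER_EMAIL = 0.519       # roughly one bottle of water per AI-drafted email
WEEKLY_USERS = 16_000_000      # about one in ten working Americans
WEEKS_PER_YEAR = 52
LITERS_PER_GALLON = 3.785

total_liters = LITERS_PER_EMAIL * WEEKLY_USERS * WEEKS_PER_YEAR
total_gallons = total_liters / LITERS_PER_GALLON

# Roughly 432 million liters (about 114 million gallons) per year,
# the volume the Post compares to all Rhode Island household
# water use over a 1.5-day period.
print(f"{total_liters:,.0f} liters = about {total_gallons:,.0f} gallons per year")
```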
A 2024 UCLA research study found that LGBTQ+ couples are at greater risk of exposure to the negative effects of climate change than heterosexual couples, largely due to their increased likelihood of living in lower-income communities. In 2023, an article published on Earth.org, “Climate Change’s Unequal Burden: Why Do Low-Income Communities Bear the Brunt?,” highlighted key problems that lower-income communities face from climate change, including loss of livelihood and increased health risks. Because LGBTQ+ individuals are more likely to live in poverty or have lower incomes than non-LGBTQ+ individuals, it is not hard to see how the depletion of local resources like water through overuse of generative AI can put local LGBTQ+ communities in danger.
This story is part of the Digital Equity Local Voices Fellowship lab through News is Out. The lab initiative is made possible with support from Comcast NBCUniversal.