Is Generative AI the Future of Fandom?

An AI-generated image by Max Justh

After ‘Stranger Things’ ended with a finale that disappointed many fans, some are using a new tool to take matters into their own hands: generative AI.

With a 53% audience score on Rotten Tomatoes, Season Five is by far the show’s lowest-rated. Fans cite plot holes, failed romances and queerbaiting as the biggest sources of disappointment. Some even suggested that a secret ninth episode — the “real” finale — had yet to air. (The widespread theory, known as ‘Conformity Gate,’ has since been debunked.)

Some fans channeled their dismay into creating fanfiction, fanart or fan edits — videos that rearrange clips to highlight themes or imagine romances. Now, some editors have gone a step further, using AI tools to alter clips or create new ones that bring their headcanons to life. Experts, like University of Southern California media scholar Henry Jenkins and Columbia University data journalism professor Jonathan Soma, say these edits are likely safe from copyright claims.

Wondering where the Demogorgons were in the final battle? Max Justh recreated it on YouTube, complete with an AI-generated army of Demogorgons spliced into existing clips.

Torn that Will Byers and Mike Wheeler didn’t end up together? Instagram creator @kiss.topher, who asked to be identified only by his handle due to potential legal liability, is one of many creators who make AI reels of the characters confessing their love and kissing.

“What I often say is fandom is born of a mixture of fascination and frustration, right? If it didn’t fascinate you, you wouldn’t keep watching. But if it didn’t on some level frustrate you, you wouldn’t feel compelled to rework it,” said Jenkins, who wrote ‘Textual Poachers: Television Fans and Participatory Culture.’ “Fans are one group that’s trying this technology out, and I guess ‘Stranger Things’ has come along at the right moment to benefit from this enhanced technology and skillset.”

Jenkins credits ‘Star Trek’ fan Kandy Fong with creating the first fan edit in 1974. Fong put ‘Star Trek’ film clips inside a slide carousel and presented the slideshow at sci-fi conventions, manually flipping through the images while playing parody folk music.

Later, fans would connect two VCRs to copy and arrange clips on a blank videotape. By the late 1990s, digital editing allowed them to more persuasively recontextualize footage. Jenkins recalled a ‘Star Trek’ video that used a split-screen effect to make two characters appear to touch. It was the first time he had seen the characters brought together in fabricated footage.

Today, generative AI has brought fan edits to a new level of realism. Justh uses AI image and video generation tools like Nano Banana Pro and Hailuo AI to create hyperrealistic shots, which he later edits into videos of about a minute and a half.

For Justh, generating the visuals is relatively fast. It only takes a few minutes for Nano Banana Pro to turn a text prompt into an image, and for Hailuo AI to turn an image into a video. But the process requires precision.

“You have to take into account who you want to generate, the camera angle, where the action takes place, the cinematic style and so on,” he said.

The real work comes in postproduction, where Justh can spend hours in Adobe Premiere Pro performing color corrections, adding sound effects and editing the footage to make the video look as real as possible.

Alternatively, @kiss.topher makes his edits of Will and Mike in ComfyUI, software that gives him access to image and video generation models like Z-Image Turbo and LTX-2. Before he can make videos, though, he has to train those models to recognize the characters’ faces; out of the box, the AI doesn’t know who Will and Mike are.

To do this, @kiss.topher uploads more than 100 images of Will and Mike to a training tool in ComfyUI, each paired with an AI-generated description. That way, the models learn who Will and Mike are and can represent them accurately. The training alone can take up to 20 hours. Then, working through ComfyUI, he uses a text prompt to generate images, and uses those images to generate a video.

These AI models open up doors for creative expression, Justh said.

“As a kid, I always wanted to be a movie director,” he said. “I think those modern tools can give you that possibility.”

For now, AI fan edits are probably safe from copyright claims. Copyright has always been a concern for fanwork, Jenkins said. But as “transformative works” that remix source material into something new, most fanwork is protected under fair use. AI companies, he said, likely rely on the same transformative-use argument to avoid copyright infringement.

Soma echoed this sentiment.

“AI companies have a lot more to lose as compared to a random person making a fake ‘Stranger Things’ ending,” he said. “Any org could file a takedown notice for these things, but it’s usually not worth their time.”

Copyright issues aside, the reception of these videos among fans is divided. Some are excited to see fanfiction come to life, but others are troubled by the rise of AI in creative work.

Noah Murase, a Stanford University student who has written almost 100 fanfictions, worries about artistic theft. He used to post fanfiction on Archive of Our Own, an open-source repository of fanwork run by the nonprofit Organization for Transformative Works, but removed his stories out of concern that large language models would use them as training data.

“Your work is no longer your work. It can become anyone’s work, which is just terrifying to me,” Murase said.

Given fan concerns like Murase’s, the archive has code in place to mitigate large-scale scraping of the site that could be used to train LLMs.

For Murase, AI-generated content lacks what he calls “literary drive,” which he says sets human-made work apart.

“I once wrote, within three months, 200,000 words for a [fanfiction],” Murase said. “That, I feel like, is literary drive. The fact that I had an idea, I saw the vision, the gears were turning in my head. And I created something that was true to me, true to the work, and that … made it good.”

Author

  • Emily Tarinelli

    Emily Tarinelli graduated from Mount Holyoke College in 2025, with a B.A. in English and two minors in journalism and gender studies. She received high honors in English based on her undergraduate thesis, which analyzed contemporary feminist and queer horror media. Emily was an active member of Mount Holyoke’s student newspaper, where she developed a beat in covering the intersection of gender and athletics. She extended her passion for covering feminist and LGBTQ+ issues to local communities beyond the realm of sports through journalism internships with the United States Coast Guard Academy Alumni Association, Amherst Media and the School Superintendents Association. After studying for a semester in Edinburgh, Scotland, she returned to the United States determined to use journalism to help audiences expand their worldview as hers was abroad. In her free time, Emily enjoys swimming, reading, watching movies, spending time outdoors and sharing her love of coffee. She grew up in southeastern Connecticut, and while she is excited to explore the West Coast, she remains proud of her New England roots.
