Ardent Forest: Behind The Scenes Of The Analog Horror Movie

Ardent Forest Thumbnail, showing Ollie in front of a forest.

On April Fool’s Day 2023, I released a new project: a TTRPG-themed analog horror short called Ardent Forest. It’s something I’ve wanted to make for a long time, despite it being quite different from my usual work.

Since it aired on Twitch, quite a few people have asked me questions about it. So this post will break down the short and its creation.

If you’ve not seen it yet, I’ve uploaded the whole thing to YouTube, so you can watch it here!

Why make this?

I made this for two main reasons. The first reason is that I love fourth-wall-breaking media, including ARGs, Unfiction, and Analog Horror. 

Secondly, I’ve been fascinated by the speedy rise of AI. While AI-generated content is rife with ethical issues, one thing that has really caught my attention is the constant arms race between users and platform owners. 

AI sites, keen to avoid getting into legal trouble, keep putting in safeguards to stop their AIs from generating false, dangerous, or offensive statements. On the other hand, users keep finding ways to trick the AI into circumventing those restrictions. In my eyes, this is a fantastic metaphor for how we treat our identity and the identities of those around us, with society often forcing people into niches they don’t want, frequently for others’ gain.

So, the rise of AI-generated streams (like the infinite Seinfeld and infinite Steamed Hams streams) felt like the perfect place to explore this concept.

Was it made with AI?

The main question I’ve been asked since the stream ended is: “Did you use AI to make this?”

The answer is no. There was no AI used during Ardent Forest’s creation. 

The text was generated using a Python script and then edited by hand, the animation was handled with Unity and DaVinci Resolve, and the dialogue was recorded via text-to-speech.

Making the text

The bulk of the text was created using Markov chains. To put it simply (and terribly), a Markov chain takes previously written text and works out the probability of one word following another. It then uses those probabilities to make new text. Think Mad Libs with roll tables.
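To make that idea concrete, here’s a tiny toy sketch of my own (not the actual script): it records which words follow which in a source text, then walks those pairings at random to produce new sentences.

```python
import random
from collections import defaultdict

def build_chain(text):
    # Map each word to every word that follows it in the source text.
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=12):
    # Walk the chain, picking each next word with the same odds it had in the source.
    word, output = start, [start]
    for _ in range(length):
        options = chain.get(word)
        if not options:
            break
        word = random.choice(options)
        output.append(word)
    return " ".join(output)

corpus = "the wizard rolls a die and the die rolls off the table and the wizard sighs"
print(generate(build_chain(corpus), "the"))
```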

To make my model, I fed Markovify a massive load of my writing. This included my social media posts, the text from all my games, several years of campaign notes, several unpublished essays, and other random bits of writing. This meant the model was built from 956,137 of my words spanning over 1,099 pages.

Which, honestly, impressed even me, as I had no idea I’d written that much stuff!

Word-Count for Ardent Forest
That’s a lot of words…

I then generated sentences in massive batches and imported them into a spreadsheet. From there, I scrolled through, picked out the best generations, edited them a little, and decided which character would fit each line best.
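For anyone curious about the mechanics of that step, here’s a rough sketch of the workflow with Markovify. The file names and batch size are placeholders of mine; the actual script and corpus aren’t published.

```python
import csv
import markovify

# Build the model from one big text file of past writing.
# ("corpus.txt" is a placeholder name, not the real file.)
with open("corpus.txt", encoding="utf-8") as f:
    model = markovify.Text(f.read())

# Generate a large batch of candidate lines and dump them into a CSV,
# which can then be opened as a spreadsheet for hand-editing and casting.
with open("candidates.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["line", "character", "keep?"])
    for _ in range(5000):
        sentence = model.make_sentence()  # returns None if it can't build a good sentence
        if sentence:
            writer.writerow([sentence, "", ""])
```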

Once that was done, I sorted the lines into scenes, trying to make sure each one flowed well while still feeling like it was generated by a computer.

Interestingly, the randomly generated lines really helped me shape the plot. At first, I was going to be way less overt with the analog horror sub-narrative, but the model created some lines that fit the idea I was going for, so I decided to focus on that a little more heavily. 

Making the characters

Initially, I planned to copy Critical Role’s format of having 1 DM and 7 players. However, I quickly realized this would be hard to pull off, as not only would I need to give each character a distinct voice, but I would also need to animate them all. 

Because of that, I quickly reduced the cast to 1 DM and 2 players. These characters were called DM, Alpha, and Omega in the original draft, but once I was sure of the format, I gave them proper names.

The DM became Darien. This is a shoutout to the DIC dub of Sailor Moon, which opted to rename Mamoru Chiba (Tuxedo Mask’s civilian identity) to Darien Shields. Because I fully believe that every project I work on should have at least one outdated pop-culture reference. 

Omega became Olga because the alliteration made it easier for me to keep track of the names while recording. It was also a name the text-to-speech could easily pronounce, which makes it more obvious when she starts to have issues.

Finally, Alpha ended up being named Ollie. Now, I know what you’re thinking. To keep with the alliterative theme, why doesn’t she have a name starting with the letter A? Well…she did. 

Originally, her name was Allie, but I made a typo, and I didn’t notice it until after I had edited the cast intros. By then, I was too far into the process, so I opted to run with it rather than re-record and re-edit several scenes.

Getting the audio

Recording the audio was the most straightforward bit of the project. I grabbed a simple text-to-speech generator, used it to read out each bit of the script, and then edited those readings together to form the dialogue.
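As a rough illustration of that step, batch-rendering lines to audio files looks something like this, with pyttsx3 standing in for whatever generator you prefer (the sample lines below are placeholders, not the actual script):

```python
import pyttsx3

# Placeholder lines, not actual dialogue from the short.
lines = [
    ("Darien", "placeholder line for the DM"),
    ("Olga", "placeholder line for a player"),
]

engine = pyttsx3.init()
engine.setProperty("rate", 160)  # slow the default speaking rate a touch

for index, (character, text) in enumerate(lines):
    # Save each reading as its own file so the clips can be edited together later.
    engine.save_to_file(text, f"{index:03d}_{character}.wav")

engine.runAndWait()
```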

Animating the short 

Animating the short was my biggest worry, as I’m not an animator.

After experimenting with several methods, I opted to animate the whole thing using Unity and its Cinematic Studio, mostly because I already had a little Unity experience from making my TTRPG review show IHF, which made the process feel less intimidating.

Thankfully, most real AI streams use simple visuals, so I could get away with simply posing the limbs and having them wave around clumsily without having to worry too much about realism or human anatomy.

Ardent Forest Timeline
The Timeline From A Distance

To make it look more AI-generated, I heavily reduced the color range and resolution of the footage in DaVinci Resolve.

I then made a series of animations featuring randomly moving dots and lines and overlaid these on the edited footage. These videos make the environment look like it is constantly shifting and wobbling, almost like it is being generated on the fly, further adding to the project’s uncanny valley vibe.

Ardent Forest Screenshot
Before Effects
Ardent Forest Screenshot
After Effects

Fake crashes and the desktop

The last things to be filmed were the fake crash desktop scenes. Those were thankfully pretty simple, as I just set up a virtual machine and recorded it in OBS. The fake terminals and error pop-ups were made in both Batch and Visual Basic, and I launched them with shortcuts whenever I wanted one to appear.
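The originals were little Batch and Visual Basic files, but to give a flavor of the idea, here’s a minimal Python stand-in that throws up a Windows error dialog on demand (the wording and title are placeholders, not what appears in the short):

```python
import ctypes

# Windows-only: pop a standard error dialog via the Win32 MessageBoxW call.
MB_ICONERROR = 0x10

ctypes.windll.user32.MessageBoxW(
    None,
    "A critical error has occurred.",  # placeholder body text
    "Stream Error",                    # placeholder title
    MB_ICONERROR,
)
```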

The desktop section also includes a cheeky reference to Alan Resnick, one of my favorite artists. A copy of his “Live Forever As You Are Now” video can be seen in the top right of the screen during these segments. I included this as it touches on many of the same themes Ardent Forest does, so it felt right to shout it out. 

The only tricky bit of the desktop scenes was getting the sound right, as without good sound design, the scenes wouldn’t have the “something is going wrong” vibe I wanted.

I wanted the audience to hear how the character running the stream reacted to the AI glitching out. However, it didn’t feel right to have the character speak, so I opted to convey their emotions via their breathing, almost as if they had forgotten to mute their mic during the chaos.

Of course, I wanted to make it sound accidental. As if the mic wasn’t fully set up and was a good distance away from the keyboard. To get that effect, I wrapped a (clean) sock around my mic to muffle the sound.

I’ll tell you this, nothing makes you question your life choices more than pretending to hyperventilate in front of a sock-covered microphone at three in the afternoon. It makes me pity my neighbors. 

Things I would do differently 

I’ll be the first to admit this project wasn’t smooth sailing. I set aside about two months to take it from concept to completion, but life issues cost me a good chunk of that time. This meant the final week was a massive rush, but I’m proud of what I achieved.

However, during production, some ideas were cut for various reasons. These include:

Longer runtime: Originally, I wanted to run the stream for 24-48 hours before things started to go wrong, to make the project feel more realistic. However, I soon realized this would take several extra months of writing and animating to pull off, and I unfortunately didn’t have the time.

Original characters and models: All the models used during the short are from the Unity Asset Store. I did consider paying an artist to make new models, but I decided against it. The timescale was too tight to be fair to an artist, and asking someone to spend hours on characters that would only be used once didn’t sit right with me.

Conclusion

I’m very proud of Ardent Forest, especially as I did the project solo. It means a lot to me because of its very personal themes, and because it forced me to experiment with software and mediums I don’t usually get to use.

Hopefully, I’ll get the chance to do more stuff like this in the future.

Jonathon Greenall is a freelance writer, artist, and tabletop roleplaying game designer who has written for CBR, Polygon, Nintendo Life, Gayley Dreadful, Enbylife, and many other publications. They have also published several popular and highly praised tabletop roleplaying games including “You Have One Ability….The Ability To Fuck This Up,” “Macarons, Milkshakes, And Magic,” and “Wander Wizards.”

Jonathon has always been fascinated by media, from the big hitters to the small, obscure, and often overlooked titles that linger on the sidelines, capturing both the on- and off-camera stories that make these shows so fascinating.

Jonathon is also a major anime fan, having been exposed to the medium through shows like Sailor Moon and Revolutionary Girl Utena. Since then, Jonathon has maintained a passion for anime, watching most new shows each season and hunting down overlooked gems from previous ones.