I took Easter off from thinking about the future of storytelling, media and the world at large, which was a really nice thing to do. Not that I don’t like thinking about these things (and putting my thoughts into practice) but it’s seriously nice to rest the brain every now and then. And with the world being in the state it is, it’s even more taxing on that poor brain. So, instead I spent it with my family at a cultural-centre-slash-fancy-ish-hotel from the 70s, complete with a glass sauna by the sea and a truly magnificent breakfast. Thus invigorated, I thought I’d take a stab at narrowing down those commandments or commitments I alluded to in the last post.

First off, and something I feel is perhaps almost too obvious, is transparency. In the world of generative AI it’s becoming increasingly clear that being transparent is the right thing to do. Your audience has a definite right to know when you’ve used AI to create the stories and the media they consume. Now, this shouldn’t detract or distract from the story, and not every single use of AI requires a disclaimer or a separate note. But if you’ve used AI substantially, to create the story or the voices or the art, you should probably disclose the what, the how and the how much. When I gave a talk for European documentary filmmakers a couple of weeks ago on the use of AI in factual and documentary storytelling, it was easy to agree that in the documentary business this is even more important.

It does go a bit further than just explaining what you’ve created with AI, though. As I talked about in an earlier post, you as the creator set the limits and the boundaries for AI within your separate projects, and these boundaries (and who is setting them!) should also be part of the disclosure.

Building on that last sentence, another commitment would be the commitment to disclose the origin of your AI content. This is something that is easy to forget in the heat of the creative moment, but you should strive to make it possible to trace where something generated by AI has come from. What model has been used, and what prompts and human decisions helped shape it? The more official your content and media is, the more important this point becomes. Creating something for your friend group? Not very necessary to explain all that went into creating the thing. Creating something for a government? Or a major network? Or an educational institution? Quite a lot more necessary.

Also, drawing on my own experiences and discussions I’ve had, I’d argue that it’s a good thing to commit to not adding to cultural exploitation or re-colonialisation with the help of AI. I’ve worked on a few projects where I’ve mixed AI-generated material with indigenous Sámi story worlds and beings from those worlds. I’ve not a single drop of Sámi blood in my veins, and I was deeply aware that the Sámi people have been treated fairly horribly over the years. Theirs is an oral tradition, and as such it is even more sensitive to easily generated material ostensibly depicting those stories and that world. The people and communities that are in possession of that knowledge and those traditions should have the final say regarding how AI is used in connection to them (which is also how we, ultimately, produced our series).

Then, of course, you have the vocal judges of everything you create: the audience, and the trust you try to build with them. AI carries an inherent risk of upending that trust if not applied correctly (and, to be fair, I’m not so sure it can’t upend it even when applied correctly…).

I went and re-read Jeff Gomez’s latest essay on the Mythic Field (which I encourage everyone else to do as well!). The Mythic Field, Jeff argues, is the origin level of any storyworld. It’s the layer that reflects what the creator envisioned for the story and the storyworld, and it’s the layer that the storyworld and the stories draw on to build the internal logic that makes audiences feel secure and safe and eager to engage with the storyworld.

The Mythic Field is not a story bible, though it may be expressed through one. It is not canon, though canon emerges from it. And it is not a set of constraints imposed on creativity. Rather, it is the origin level of a storyworld, the underlying structure of meaning that reflects the intrinsic vision, experience, and symbolic language of its creators. It begins to take shape long before characters act or events unfold, coalescing out of the artistic, emotional, and subjective impulses that compelled the world into existence in the first place. – Jeff Gomez, March 2026

When all of this works, audiences feel it. As Jeff writes, it’s like a narrative gravity of sorts: you can feel that there’s a weight to the world, that there are consequences to events and actions, that there is a certain logic to how characters react and behave… it’s something we as an audience react to instinctively, that we respond to and relate to, and that gives us the confidence to engage and interact with the stories and the storyworld.

This Mythic Field, much like the Story Core I talked about earlier, is not something that AI can create. It’s built on lived experiences, on being human, on a creator’s notion of what is important and true and what matters.

If we don’t ensure we are able to properly harness the creative capabilities of AI, our Mythic Field runs the risk of being diluted or distorted, something the audience will feel very quickly. And once that trust in the core of the storyworld and the stories we want to tell is gone, it’s really, really difficult to get back.

This brings me to my last point, the one about committing to a certain narrative integrity. I think we all feel it when someone is trying to tell us a story and invite us into a larger storyworld while not being honest. Some of us perhaps don’t care that the storyteller isn’t being honest; others might feel that a bit of discomfort is worth it to generate some brilliant piece of content. And if we go back to the craft of being a documentary filmmaker, many of us have been doing re-enactments or illustrations for decades, thereby arguably distorting the truth already. But the use of AI now… it makes this line between illustrating and fabricating (dare I say lying?) much, much easier to cross.

There is, of course, a bit more to this question of integrity than that. As creators we sign an invisible contract with our audiences. They step into our storyworld armed with promises we’ve made them as creators of the stories as well as their guides. Generative AI, with new content created at breathtaking speeds, can ruin this commitment in a heartbeat, unless we commit to governing it actively.

I’ll be back shortly with some thoughts on all this in practice. Until then!
