Representing Writing

By Mark Rickerby


There is no universal method of representing writing with computers beyond strings, which are linear sequences of characters. To create writing generators that operate at the level of narrative structures or use expressive, persuasive and literary elements of discourse, we need to think carefully about how we represent these latent aspects of the text. The way we represent these hidden structures is an important force in shaping what we can do with the generator.


One way to think about this problem is to look at generated writing as a series of layers, like rock strata, moving from the low-level atoms of language (phonemes, letters and words) up to the higher levels of meaning in sentences and paragraphs, and larger groupings like passages and chapters. Building on top of these layers leads us to representations of narrative, through elements of discourse, tone, characters, and plot structures. In concrete terms, we can look at all these components of writing as a tree formed out of the different levels of structure found in a text.
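As a minimal sketch of this tree model (the node structure and level names here are illustrative choices, not a standard representation), each node carries a structural level, and only the leaves carry surface text:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    level: str                       # e.g. "paragraph", "sentence", "word"
    text: str = ""                   # only leaf nodes carry surface text
    children: list["Node"] = field(default_factory=list)

    def render(self) -> str:
        """Flatten the tree back into a linear string."""
        if not self.children:
            return self.text
        return " ".join(child.render() for child in self.children)

# A tiny fragment: one sentence built from word leaves.
sentence = Node("sentence", children=[
    Node("word", "The"), Node("word", "rain"), Node("word", "stopped."),
])
paragraph = Node("paragraph", children=[sentence])
print(paragraph.render())            # -> "The rain stopped."
```

Rendering walks the tree depth-first and flattens it back into the linear string a reader actually sees; every level above the leaves exists only to organise and constrain what ends up there.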

This model is immensely helpful in tackling the complexity of text generation. It makes it easier to understand the different methods of working with generated text by thinking about what level they apply to.

Another way of thinking about this model isn't so much about the levels themselves as about the direction we're operating in: whether we're ascending or descending through the levels. Depending on which path we take, we end up with very different levels of control over the resulting text.


We could start with structure (plots, themes, tropes or narrative archetypes) and figure out how to turn that structure into text by working down towards sentences. Or we could start with a large corpus of existing texts and work upwards from the smallest pieces of syntax, with sense and meaning emerging along the way.

It turns out these aren’t just ways of classifying generative writing methods, but also AI philosophies that define the way we approach authorship.

The symbolic approach is about templates and top-down organisation, encoding our formalist ideas about the rules, patterns and constraints we want to apply to a piece of writing. It's more intentionally directed and crafted, but also potentially complex, requiring a lot of manual effort to get right.
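A toy expansion grammar, in the spirit of systems like Tracery, shows what this top-down encoding looks like in practice. The rules and the #symbol# notation below are illustrative assumptions, not any particular library's API:

```python
import random

# A minimal expansion grammar. The rules below are illustrative;
# a real generator would have far more alternatives per symbol.
GRAMMAR = {
    "story": ["#hero# found #object# in the #place#."],
    "hero": ["the cartographer", "a tired soldier", "the youngest sister"],
    "object": ["a brass key", "an unsent letter", "a cracked mirror"],
    "place": ["ruined chapel", "flooded library", "orchard"],
}

def expand(symbol: str) -> str:
    """Recursively expand #symbol# references until only plain text remains."""
    template = random.choice(GRAMMAR[symbol])
    while "#" in template:
        start = template.index("#")
        end = template.index("#", start + 1)
        inner = template[start + 1:end]
        template = template[:start] + expand(inner) + template[end + 1:]
    return template

print(expand("story"))
# e.g. "a tired soldier found a cracked mirror in the flooded library."
```

Every generated sentence is guaranteed to follow the authored structure; variety comes only from the alternatives the author wrote into each rule.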

The statistical approach involves turning text into data that we can operate on mathematically, and processing it using algorithms that aren’t traditionally associated with natural language or text. Statistical methods usually avoid encoding rules about language or rules about narratives and plots. They treat text as a distribution of probabilities. It’s more akin to musical sampling and remixing than anything associated with traditional writing.
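A word-level Markov chain makes the contrast concrete: it encodes no rules about grammar or narrative at all, only the observed probability of one word following another. This is a rough sketch over a tiny inline corpus; a real model would be trained on a much larger text:

```python
import random
from collections import defaultdict

# Tiny illustrative corpus; any body of text would do.
corpus = ("the rain fell on the roof and the rain fell on the road "
          "and the road shone under the moon").split()

# Build the transition table: word -> list of observed successors.
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start: str, length: int = 10) -> str:
    """Walk the chain, sampling each next word from observed successors."""
    word, out = start, [start]
    for _ in range(length - 1):
        successors = transitions.get(word)
        if not successors:           # dead end: no observed successor
            break
        word = random.choice(successors)  # sampling respects corpus frequencies
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

Because successors are sampled in proportion to how often they appeared, the output statistically resembles the corpus without the author ever stating a rule about language.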

Symbolic ('top-down'): authored rules and world models. Examples: grammars, graph rewriting, agent-based systems, goal-directed planning.

Statistical ('bottom-up'): sampled from existing texts. Examples: n-grams, Markov chains, word vectors, recurrent neural networks.

A lot of generative writing in recent years tends to cluster around one or the other of these poles, exemplified by the weird and wonderful experimental works coming out of #NaNoGenMo and PROCJAM, which are often made using a single specific method.

Every generative method is captivating and interesting by itself, but it's important to emphasise that the choice between these methods is not an either/or proposition. They all have different strengths and weaknesses, and operate at different levels of the writing process. Statistical methods require meticulous corpus selection and pruning to get right, while symbolic methods require a big investment in modelling and design.

There are fascinating possibilities for building writing machines composed of multiple generative methods, each addressing a specific level of language or narrative, with their inputs and outputs feeding one another. Think of a Markov chain name generator embedded inside an expansion grammar that generates story fragments, or a machine learning model that generates a plot which is then filled in with templated sentences and entities from a world model.
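As a rough sketch of such a composition (the seed names, grammar rules and order-1 character chain are all illustrative simplifications), a Markov name generator can be wired in as one symbol of an expansion grammar:

```python
import random
from collections import defaultdict

# Statistical layer: a character-level Markov chain trained on seed names.
SEED_NAMES = ["elara", "maren", "thessaly", "corin", "isolde", "rowan"]

transitions = defaultdict(list)
for name in SEED_NAMES:
    padded = "^" + name + "$"            # ^ marks start, $ marks end
    for a, b in zip(padded, padded[1:]):
        transitions[a].append(b)

def markov_name() -> str:
    """Sample characters from the chain until an end marker is drawn."""
    ch, out = "^", []
    while len(out) < 12:                 # cap length so generation always ends
        ch = random.choice(transitions[ch])
        if ch == "$":
            break
        out.append(ch)
    return "".join(out).capitalize()

# Symbolic layer: a grammar whose #name# symbol delegates to the Markov model.
GRAMMAR = {
    "fragment": ["#name# waited by the #place#, rehearsing an apology."],
    "place": ["harbour", "north gate", "empty stage"],
}

def expand(symbol: str) -> str:
    if symbol == "name":
        return markov_name()             # hand off to the statistical layer
    template = random.choice(GRAMMAR[symbol])
    while "#" in template:
        start = template.index("#")
        end = template.index("#", start + 1)
        template = (template[:start] + expand(template[start + 1:end])
                    + template[end + 1:])
    return template

print(expand("fragment"))
```

The grammar controls the shape of the fragment while the statistical layer supplies the raw novelty, each method working at the level it is best suited to.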

As always, the needs of the story shape its generator, while the generator determines the limits of the story that can be told. In generative writing, the author is not effaced so much as working in collaboration with the system. Whether this involves curation, pruning, parameter tweaking or writing microcopy and sentence fragments is totally up to our imagination and creative vision for what we want to produce.