In a recent scientific study highlighted by The Atlantic, researchers from the University of Vermont and the University of Adelaide set out to determine the core emotional trajectories of stories. Taking advantage of advances in computing power and natural language processing, they analyzed the emotional arcs of 1,737 English-language works of fiction available in the online library Project Gutenberg.

How did they do it and what did they discover?

Their research, published online, involved assigning happiness scores to 10,000 frequently used words, with the scores determined by a crowdsourcing project. The researchers then broke each work into blocks of text, scored the happiness of each block, and mapped those scores onto an emotional trajectory (a minimal sketch of this block-scoring idea follows the list of arcs below). They discovered six emotional arcs that form the foundation for complex narrative storytelling:


  1. Rags to Riches (rise)
  2. Riches to Rags (fall)
  3. Man in a Hole (fall then rise)
  4. Icarus (rise then fall)
  5. Cinderella (rise then fall then rise)
  6. Oedipus (fall then rise then fall) 
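
To make the method a bit more concrete, here is a minimal Python sketch of that block-scoring idea. The tiny HAPPINESS dictionary, its scores, and the emotional_arc helper are hypothetical stand-ins for the study's crowdsourced 10,000-word list, and the fixed-block splitting is a simplification of the researchers' actual windowing:

```python
from typing import Dict, List

# Hypothetical excerpt of a crowdsourced happiness lexicon (word -> score, 1 to 9).
# The real study used a list of roughly 10,000 crowd-rated words.
HAPPINESS: Dict[str, float] = {
    "love": 8.4, "joy": 8.2, "happy": 8.3,
    "death": 1.5, "grief": 2.0, "terrorist": 1.3,
}

def emotional_arc(text: str, blocks: int = 20) -> List[float]:
    """Split a text into equal-sized blocks, average the happiness scores of
    the known words in each block, and return the resulting trajectory."""
    words = [w.strip(".,;:!?\"'").lower() for w in text.split()]
    size = max(1, len(words) // blocks)
    arc: List[float] = []
    for i in range(0, len(words), size):
        block = words[i:i + size]
        scores = [HAPPINESS[w] for w in block if w in HAPPINESS]
        # Blocks containing no scored words fall back to the neutral midpoint (5.0).
        arc.append(sum(scores) / len(scores) if scores else 5.0)
    return arc
```

Plot the returned scores in reading order and you get the kind of rise-and-fall curve the researchers clustered into the six arcs above.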

The researchers recognize that emotional arcs and plot structures may be similar, but are not necessarily the same. They also point out multiple well-known theories that all narratives can be boiled down to three plots, seven plots, a different seven plots, 20 plots, or even 36 plots, depending on whom you read and believe. Furthermore, the researchers realize that multiple, connected emotional arcs appear in longer, more complex works of fiction, so they limited their study sample to works between 10,000 and 200,000 words.

Finally, the researchers used download data from Project Gutenberg to gauge the popularity of each of the six core emotional arcs. Based on the arc mapping and that download data, the most popular emotional arcs are:

  1. Cinderella (rise then fall then rise)
  2. Oedipus (fall then rise then fall)
  3. Two sequential Man in a Hole arcs (fall then rise then fall then rise)
  4. Cinderella with a tragic ending (rise then fall then rise then fall)

Why is this important to filmmakers?

First, I would like to point out that I am certain all of the mathematicians and scientists involved in this study are way smarter than I am. Still, I can already hear some of you reading this and going, "Duh," especially once you realize this research proves that all narratives string together a series of emotional rises and falls, which most of us would accept as common knowledge.

Perhaps more interesting is the data about the most popular emotional arcs. Audiences are interested in emotionally complex stories, not necessarily simple emotional journeys. Oh, and happy endings after some emotional turmoil are popular, but audiences like tragedies, too. Again, this is something I'd say most of us knew. But now we have scientific evidence to back it up, thanks to AI.

The Atlantic article goes a bit further, though. When asked about the implications of this research, lead author Andy Reagan, a Ph.D. candidate in mathematics at the University of Vermont, explains that this research could serve as a foundation for scientists who are currently trying to teach computers how to write original stories. Reagan notes, "This is an active area of research, and there are a lot of hard problems yet to be solved. In addition to the plot, structure, and emotional arc, to write great stories, a computer will need to create characters and dialogue that are compelling and meaningful."

HAL 9000: meet the writer of tomorrow.

But how do I feel about this as a screenwriter?

I'm not so keen to have computers write screenplays, and not just because I thought Sunspring, the recent short sci-fi film written by the AI screenwriter Benjamin, was terrible (though perversely interesting to watch for about two minutes before I got fed up with it). Storytelling is fundamentally human. Before we as a species even created written languages, humans told stories and created art to entertain, to educate, and to pass on history. Just like computers, written words themselves are merely tools we use to communicate. We string words together to imbue our stories with meaning and emotion.

We can teach computers the supposed emotional value of words (although I don't recall Jane Austen ever using the saddest crowdsourced word of all—terrorist—to describe Mr. Darcy). We can teach computers to understand and generate natural language. But can we teach computers how to feel? Should we? Can a computer tell a story with emotional resonance if it doesn't understand what an emotion feels like? And even if we figure out how to teach a computer how to feel, do we really want computers to create stories for us?

Admittedly, most of my experience with AI is through the fictional lens of movies. And I can tell you that I found the ending of Spike Jonze's Her, when all of the sentient operating systems disappear into the network, to be incredibly terrifying (Samantha is Skynet!). Of course, I was supposed to be horrified when Ava walks out of her creator's lair in the wake of murder and imprisonment at the end of Ex Machina. I don't want to live inside The Matrix and I certainly don't need to get caught between Sarah Connor and whichever version of The Terminator they've sent back this time.

Final image of 'Her': the moment when I realized 'Her' was actually a horror movie.

I find joy in telling stories. If I didn't, I certainly wouldn't keep banging my head against my computer screen to figure out how to break yet another story for a new screenplay. I particularly find joy in the emotional reactions I evoke from an audience when I tell a story, whether around the dinner table, through a screenplay, or up on the big screen. I believe this joy is evident in my storytelling, regardless of whether my characters' emotional journeys end in happiness or tragedy. But I sincerely doubt a computer or AI that we train to write stories will ever be able to find joy, no matter how much emotional value we assign to its database of words.

Sorry, Wall-E.

Source: The Atlantic