Tool to Identify Social Biases in Creative Writing Ultimately Leaves Artists in Control
Studious Asians, sassy yet helpless women and greedy shopkeepers: These tired stereotypes of literature and film not only offend the people they caricature, but can also drag down what might otherwise be a compelling narrative.
Researchers at the University of Maryland’s Human-Computer Interaction Lab are working to combat these clichés with the creation of DramatVis Personae (DVP), a web-based visual analytics system powered by artificial intelligence that helps writers identify stereotypes they might be unwittingly giving fictional form among their cast of characters (or dramatis personae).
“DVP is designed to integrate smoothly with the writer’s own creative process,” said Naimul Hoque, a third-year doctoral student in information studies who recently presented DVP at the annual ACM SIGCHI Conference on Designing Interactive Systems.
The tool allows writers to analyze existing literature for research, upload their written content as it becomes available, or even write in the tool itself, with its text analytics and visualizations updating in real time, he said.
Using a database of previous literature and natural language processing methods, DVP automatically detects characters and collects data about them as the story progresses, including their aliases, mentions and actions. The author can then furnish demographic information for each character, such as their age, ethnicity, gender and more.
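The character-tracking step described above can be illustrated with a toy sketch. This is not DVP's actual code; the alias table, character names and sample text are all hypothetical, and a real system would infer aliases automatically with NLP rather than take them hard-coded:

```python
import re
from collections import defaultdict

# Hypothetical alias table; in DVP, aliases are detected automatically
# and can be refined by the author.
ALIASES = {
    "Elizabeth": ["Elizabeth", "Lizzy"],
    "Darcy": ["Darcy"],
}

def count_mentions(text, aliases=ALIASES):
    """Count how often each character is mentioned, folding aliases together."""
    counts = defaultdict(int)
    for character, names in aliases.items():
        for name in names:
            # Word-boundary match so partial words are not counted.
            counts[character] += len(re.findall(rf"\b{re.escape(name)}\b", text))
    return dict(counts)

story = "Lizzy smiled. Darcy bowed, and Elizabeth laughed as Darcy left."
print(count_mentions(story))  # → {'Elizabeth': 2, 'Darcy': 2}
```

Aggregating such counts per chapter or scene would yield the kind of "presence over time" data a dashboard could visualize.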
“The DVP dashboard uses this continually growing dataset to visualize the presence of characters and social identities over time,” said Niklas Elmqvist, a professor in the College of Information Studies with an appointment in the University of Maryland Institute for Advanced Computer Studies.
Prior to developing DVP, Hoque and Elmqvist conducted an interview study with nine creative writers. The team, which also includes Bhavya Ghai, a computer science doctoral student at Stony Brook University, asked them about their creative process, how they navigate harmful stereotypes and how they could benefit from the tool’s support.
After their initial design and implementation, the researchers again approached writers through formative interviews and focus groups to test their tool. The team then conducted a user study with 11 participants to evaluate DVP's effectiveness; it revealed that participants could answer questions related to bias detection more efficiently using DVP than with a simple text editor.
Hoque, who is advised by Elmqvist, said that the participants particularly appreciated how the tool does not interfere with their artistic freedom, since it’s important for them to write about existing problems.
“Writers are in full control of the system as they get to define what bias is, and how they can mitigate that,” he said. “This tool makes it easy to find otherwise unconscious and nuanced social biases.”