Using Artificial Intelligence to combat sexism in the literary world

Posted on Tuesday, April 25, 2017

The team of undergraduate students who created Just Review, a new advocacy project that aims to tackle gender bias in book reviews.

By McGill Reporter Staff

Last year, Professor Andrew Piper caused a bit of a stir in the international world of books and those who review them. In a piece for The New Republic called "Women write family, men write war," Piper, the William Dawson Scholar in the Department of Languages, Literatures, and Cultures, argued that stereotypes about what interests women and men remain entrenched and ironclad even today.

Women and men write books about all kinds of topics, but research shows that only certain books get reviewed. In simple terms, women get reviewed when they write about family, emotions, and sexuality, while men are treated as the experts on the economy, politics, statecraft, and the military. Gender bias in the literary world has been a hot topic in recent years, with many women speaking out about the widespread inequality they face as authors, and feminist activists fighting back against the overrepresentation of male authors in book reviews.

Andrew Piper, William Dawson Scholar in the Department of Languages, Literatures, and Cultures. / Photo: Owen Egan

The work Piper has been doing in his lab, called .txtLAB, inspired a group of students to do computerized content analysis of literary reviews. They formed an advocacy project and have uncovered important findings about gender bias in book reviews. They also developed tools that editors can use to self-assess for bias. “Pretty impressive for a cohort of McGill undergrads!” says Piper.

Piper says the students’ proposal was a totally novel idea. “Cultural advocacy? It made perfect sense, so I put together an internship on ‘computational cultural advocacy’ focusing on women and the public sphere,” he says.

Professor Carrie Rentschler in Communication Studies also helped supervise because of her background in feminist media studies. “We received excellent applications and chose five students to lead the project. They came from different backgrounds — social justice advocacy, women’s studies, literary studies, computer science, linguistics — but were united by a concern for gender equality,” says Piper. “They spent the year identifying the problem in greater detail, focusing on this question of ‘topic bias’ (i.e. what women can be experts on); developing connections with key stakeholders; undertaking surveys and developing a self-assessment tool. The goal was to take research out into the world and try to effect positive change.”

Maxine Dannatt, Rosie Decter, Eva Portelance, Ariane Schang, and Sophie Stuart-Sheppard launched Just Review, a new advocacy project that aims to tackle gender bias in book reviews. The Just Review website features research and resources designed to help publications and the general public work towards an equitable book review culture. Just Review is building on this work by adding a new lens for understanding inequality: topic bias.

Just Review’s website highlights research conducted on this important issue so far, including an analysis of topic bias in over 10,000 book reviews in the New York Times.

Just Review’s research has shown that women are much more likely to be reviewed when writing about topics associated with family and sexuality, while men are more likely to be reviewed when writing on public-facing topics like science, politics, and military conflict. This topic bias reinforces stereotypes about masculinity and femininity and unspoken rules for what men, women, and gender non-conforming authors can and cannot write about.
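The kind of topic-bias measurement described above can be approximated with a simple keyword-matching sketch. To be clear, this is an invented illustration: the topic word lists and review snippets below are hypothetical, and a real analysis like Just Review's would use full-scale topic modeling over thousands of actual reviews.

```python
from collections import Counter

# Hypothetical topic lexicons -- illustrative only, not Just Review's actual lists.
TOPICS = {
    "family": {"mother", "daughter", "marriage", "childhood", "home"},
    "politics": {"election", "policy", "government", "war", "statecraft"},
}

def topic_counts(reviews):
    """Count how many reviews mention each topic at least once."""
    counts = Counter()
    for text in reviews:
        words = set(text.lower().split())
        for topic, keywords in TOPICS.items():
            if words & keywords:
                counts[topic] += 1
    return counts

# Toy snippets standing in for reviews of books by women and by men.
reviews_of_women = [
    "a moving portrait of a mother and her daughter",
    "an intimate story of marriage and home",
    "a sweeping account of war and government",
]
reviews_of_men = [
    "a gripping study of government policy",
    "a history of war and statecraft",
    "a memoir of childhood and home",
]

women_counts = topic_counts(reviews_of_women)
men_counts = topic_counts(reviews_of_men)

for topic in TOPICS:
    w = women_counts[topic] / len(reviews_of_women)
    m = men_counts[topic] / len(reviews_of_men)
    print(f"{topic}: {w:.0%} of reviews of women vs {m:.0%} of reviews of men")
```

On real data, a systematic skew in these proportions, rather than anything about any single review, is what would signal topic bias.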

But Just Review doesn’t stop at research: it seeks to motivate change. JustReview.org has resources to help combat gender bias in book reviews, including the Bias Tracker tool, which allows publications to self-assess for topic bias, and a tip sheet with simple strategies editors can use to keep bias out of their pages. Just Review is working toward a culture where all genders can write about all topics and receive fair coverage, providing the tools to make equity a reality.

Professor Piper reacts with fervour to the idea that feminism and feminist content analysis is passé.

“Passé? As long as there is gender bias, feminism has a key role to play in our society,” he says. “For me, data and computational analysis is an important new dimension for addressing gender inequality. It can show us all of the ways that our beliefs about gender continue to conform to incredibly rigid stereotypes. There is no reason to accept these states of affairs as norms. We can do better.”

Piper and the Just Review team are leading the way in the area of cultural analytics, or the use of computation to study culture.

“The humanities has developed a variety of highly sophisticated methods to study culture in great detail — how a poem or painting works, the nature of our computational interfaces or platforms, etc.,” says Piper. “But these disciplines suffer from a key deficit, which is the ability to talk about the vast amounts of cultural production that have been going on for decades if not centuries now.

“Using data allows us to try to understand human creativity and cultural practices at a much broader scale. It allows us to assess things like gender bias in the media, not simply through counting labels (how many women get reviewed?) but also by studying the language surrounding these works,” Piper continues. “That is the key new dimension: our ability to move past counting people and start understanding the nuanced representations that inform culture, whether it be language, sound, or visual art.”
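The distinction Piper draws — counting people versus studying the language surrounding their work — can be illustrated with a toy sketch. The corpora and the function below are invented for illustration and are not the .txtLAB codebase: a log-odds ratio is one standard way to ask which words are distinctively associated with one body of text over another.

```python
import math
from collections import Counter

def log_odds(word, corpus_a, corpus_b):
    """Log-odds ratio of a word between two corpora, with add-one smoothing.

    Positive values mean the word is relatively more frequent in corpus_a;
    negative values mean it is relatively more frequent in corpus_b.
    """
    a = Counter(corpus_a.lower().split())
    b = Counter(corpus_b.lower().split())
    p_a = (a[word] + 1) / (sum(a.values()) + 1)
    p_b = (b[word] + 1) / (sum(b.values()) + 1)
    return math.log(p_a / p_b)

# Toy corpora standing in for review language about women's and men's books.
reviews_of_women = "her family her marriage her emotions her home her family"
reviews_of_men = "his war his politics his career his war his empire"

print(log_odds("family", reviews_of_women, reviews_of_men))  # positive
print(log_odds("war", reviews_of_women, reviews_of_men))     # negative
```

A simple headcount (how many women get reviewed?) cannot surface this kind of pattern; comparing the language itself can.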

Piper says the students’ work is important because they are drawing attention to something we didn’t know before and trying to do something about it. It is already known that women are less likely to be reviewed, or to be reviewers, of books. But even when they achieve equality of numbers, the language and topics surrounding the books that do get reviewed are still significantly biased towards emotions and domesticity.

“And finally, the work is important because they are students,” says Piper. “It shows that with effort and mentoring young women can become cultural advocates. They can take their academic interests and apply them to existing problems in the world and effect change. We’ll be measuring the problem in the years to come to see how much of a dent we can make, but they have made a major step towards getting this process started.”


2 Responses to Using Artificial Intelligence to combat sexism in the literary world

  1. Carmen says:

    Great article on gender bias in publishing. However, it was surprising not to see a picture of Co-Supervisor Professor Carrie Rentschler. Perhaps even an article on gender bias still shows bias by not making more visible the work of women in academia!!!

  2. Daphnée Azoulay says:

    This is awesome. Thank you so much for doing this work.
