To Reduce Bias, You Must Identify It First! Towards Automated Gender Bias Detection

Lorena Göritz; Daniel Stattkus; Jan Heinrich Beinke; Oliver Thomas
In: Proceedings of the Forty-Third International Conference on Information Systems (ICIS 2022), December 9-14, Copenhagen, Denmark, AIS, November 2022.


Stereotypical gender representation in textbooks influences the personal and professional development of children. For example, if women do not pursue a STEM career because of gender stereotypes, this is not only an individual problem but also detrimental to society as a whole. It is therefore crucial that textbooks do not convey gender stereotypes but are gender-balanced. Currently, textbook analysis is predominantly conducted manually, if at all. However, this is time-consuming and consequently cost-intensive. Therefore, as part of a design science research project, we developed a gender language analyzer. Our initial prototype is already capable of automatically analyzing textbooks and offering suggestions for gender balancing. We will further improve our prototype in the next design science research cycle (e.g., by integrating self-learning techniques). With this tool, publishers will be able to automatically analyze textbooks to reduce gender bias. Moreover, we provide the scientific community with design knowledge regarding the automated identification of gender bias.
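The abstract does not disclose how the analyzer detects imbalance. A minimal sketch of one common baseline approach is a lexicon-based frequency comparison: count occurrences of male- and female-associated terms in a passage and flag passages whose ratio deviates strongly from parity. The term lists, threshold, and function names below are illustrative assumptions, not the authors' actual method.

```python
import re
from collections import Counter

# Hypothetical lexicons; the paper does not publish its word lists.
MALE_TERMS = {"he", "him", "his", "man", "men", "boy", "boys", "father"}
FEMALE_TERMS = {"she", "her", "hers", "woman", "women", "girl", "girls", "mother"}

def gender_balance(text):
    """Count gendered terms; return (male_count, female_count, ratio).

    ratio is female / (male + female); 0.5 indicates balance.
    """
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    male = sum(counts[t] for t in MALE_TERMS)
    female = sum(counts[t] for t in FEMALE_TERMS)
    total = male + female
    ratio = female / total if total else 0.5  # no gendered terms: treat as balanced
    return male, female, ratio

def suggest(text, tolerance=0.1):
    """Flag a passage whose gendered-term ratio deviates from parity."""
    male, female, ratio = gender_balance(text)
    if abs(ratio - 0.5) > tolerance:
        skew = "male" if ratio < 0.5 else "female"
        return f"Passage skews {skew} ({male} male vs {female} female terms)."
    return "Passage appears gender-balanced."
```

A production system would go well beyond raw counts (e.g., handling role nouns, coreference, and context), which is presumably where the self-learning techniques mentioned above come in.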
