AI tool identifies bias in learning materials - StereOFF wins award in "Together it's AI" ideas competition

| Learning & Education | Awards | Smart Enterprise Engineering | Osnabrück / Oldenburg

StereOFF is an AI tool that automatically identifies gender-stereotypical language in learning materials. The cross-sector idea was developed by researchers from the German Research Center for Artificial Intelligence (DFKI) in cooperation with the company Didactic Innovations. At re:publica 2023 in Berlin, StereOFF won a prize in the "Together it's AI" ideas competition organised by the Civic Innovation Platform, the think tank of the Federal Ministry of Labour and Social Affairs.

© Thomas Rafalzyk
Lorena Göritz and Daniel Stattkus (from right to left) received the award on behalf of DFKI's Smart Enterprise Engineering research department; Jannick Eckle accepted it for Didactic Innovations.

Representing Federal Minister Hubertus Heil, State Secretary Nermin Fazlic presented awards in Berlin to a total of 27 ideas across three thematic areas. StereOFF was among the winners in the category Teaching and Knowledge. Lorena Göritz and Daniel Stattkus accepted the award on behalf of DFKI's Smart Enterprise Engineering research department, as did Jannick Eckle for Didactic Innovations.

According to the submitted project outline, the European Commission identifies gender stereotypes as one of the main causes of gender inequality, because stereotypical roles and norms limit the choices and freedoms of girls and boys, women and men. StereOFF strives for gender equality in education and training. "Reducing, for example, the often stereotypically male portrayals in STEM (German: MINT) learning content aims to inspire more girls and women to enter STEM fields," says DFKI scientist Lorena Göritz. The idea is therefore particularly interesting for training and further education institutes as well as textbook publishers.

Gender Bias Lexicon

The text analyses are based on publicly available text corpora, such as film transcripts, that exhibit gender bias, i.e. distortions shaped by gender-related stereotyping or prejudice. From this data, the researchers build a gender bias lexicon using data-driven Natural Language Processing techniques. With such a lexicon, publishers can automatically identify gender-stereotypical role models in textbooks, and social media agencies can screen large volumes of published text for stereotypical language in real time.

StereOFF not only analyses learning materials but also suggests alternative wording. The aim is to present sectors such as STEM or the profession of educator in a more balanced way, to facilitate integration, to promote diversity and thereby to counter the shortage of skilled workers. The software developed within StereOFF will be made publicly available in the future.
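The flag-and-suggest step described above can be sketched with a simple replacement table. The table entries here are hypothetical examples; StereOFF's actual suggestions are not detailed in the article:

```python
import re

# Hypothetical mapping from stereotyped terms to neutral alternatives
# (illustrative only, not StereOFF's curated suggestion list).
NEUTRAL_ALTERNATIVES = {
    "chairman": "chairperson",
    "manpower": "workforce",
    "policeman": "police officer",
}

def suggest_alternatives(text):
    """Return (flagged_word, suggestion, character_offset) tuples."""
    findings = []
    for match in re.finditer(r"[A-Za-z]+", text):
        word = match.group().lower()
        if word in NEUTRAL_ALTERNATIVES:
            findings.append((word, NEUTRAL_ALTERNATIVES[word], match.start()))
    return findings

sample = "The chairman asked for more manpower."
for word, alt, pos in suggest_alternatives(sample):
    print(f"{word!r} at offset {pos}: consider {alt!r}")
```

Keeping the character offset alongside each suggestion lets an editor or authoring tool highlight the flagged span in place rather than merely listing problems.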

 

© Thomas Rafalzyk
State Secretary Nermin Fazlic presents the award to DFKI researcher Lorena Göritz.