
Publication

Evaluating German Transformer Language Models with Syntactic Agreement Tests

Karolina Zaczynska; Nils Feldhus; Robert Schwarzenberg; Aleksandra Gabryszak; Sebastian Möller
In: Sarah Ebling; Don Tuggener; Manuela Hürlimann; Mark Cieliebak; Martin Volk (Eds.). Proceedings of the 5th Swiss Text Analytics Conference (SwissText) & 16th Conference on Natural Language Processing (KONVENS). Swiss Text Analytics Conference & Conference on Natural Language Processing (SwissText & KONVENS-2020), Joint Online Conference, June 23-25, Zürich, Switzerland, Vol. 2624, CEUR Workshop Proceedings, June 2020.

Abstract

Pre-trained transformer language models (TLMs) have recently refashioned natural language processing (NLP): Most state-of-the-art NLP models now operate on top of TLMs to benefit from contextualization and knowledge induction. To explain their success, the scientific community has conducted numerous analyses. Among other methods, syntactic agreement tests have been used to analyse TLMs. However, most of these studies were conducted for English. In this work, we analyse German TLMs. To this end, we design numerous agreement tasks, some of which consider peculiarities of the German language. Our experimental results show that state-of-the-art German TLMs generally perform well on agreement tasks, but we also identify and discuss syntactic structures that push them to their limits.
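To illustrate the kind of evaluation the abstract describes, the following is a minimal sketch of a subject-verb agreement test for a German masked language model. It assumes the Hugging Face transformers library and the publicly available bert-base-german-cased checkpoint; the example sentence and verb pair are illustrative assumptions, not items from the paper's actual test suite.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical choice of German TLM; the paper may have used other checkpoints.
tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-german-cased")
model.eval()

def verb_probability(verb: str) -> float:
    """Probability the model assigns to `verb` at the masked verb position."""
    # Plural subject ("Die Kinder") with an intervening relative clause,
    # a structure known to distract models from the true agreement head.
    text = f"Die Kinder, die im Garten spielen, {tokenizer.mask_token} laut."
    inputs = tokenizer(text, return_tensors="pt")
    mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_index]
    probs = torch.softmax(logits, dim=-1)
    return probs[tokenizer.convert_tokens_to_ids(verb)].item()

# The model passes this agreement item if it prefers the plural verb form
# ("sind") over the singular distractor ("ist") for the plural subject.
p_plural, p_singular = verb_probability("sind"), verb_probability("ist")
print(f"P(sind)={p_plural:.4f}  P(ist)={p_singular:.4f}  "
      f"{'pass' if p_plural > p_singular else 'fail'}")

A model is scored by how often it assigns higher probability to the grammatically correct form across many such minimally differing pairs.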
