Publication

Comparing BERT with an intent based question answering setup for open-ended questions in the museum domain

Md Mahmud Uz Zaman; Stefan Schaffer; Tatjana Scheffler
In: 32. Konferenz Elektronische Sprachsignalverarbeitung (ESSV-2021), March 3-5, 2021, Berlin/Virtual, Germany. TUDpress, Dresden, 2021.

Abstract

BERT-based models achieve state-of-the-art performance on factoid question answering tasks. In this work, we investigate whether a pre-trained BERT model can also perform well on open-ended questions. We set up an online experiment, from which we collected 111 user-generated open-ended questions. These questions were passed to a pre-trained BERT QA model and a dedicated intent-recognition-based module. We found that the simple intent-based module answered correctly around 25% more often than the pre-trained BERT model, indicating that open-ended questions still require different solutions than factoid questions.
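The abstract does not specify how the intent-recognition module works; as a purely hypothetical illustration of the general approach, the sketch below matches a visitor's question against hand-authored intents by keyword overlap and returns the best-matching intent. The intent names, keywords, and answers are invented for this example and are not taken from the paper.

```python
import re
from typing import Optional

# Hypothetical intent inventory for a museum guide: each intent pairs
# trigger keywords with a canned answer (invented for illustration).
INTENTS = {
    "opening_hours": {
        "keywords": {"open", "hours", "close", "closing"},
        "answer": "The museum is open daily from 10am to 6pm.",
    },
    "artist_info": {
        "keywords": {"artist", "painter", "painted", "who"},
        "answer": "This work was created by the artist named on the label.",
    },
}


def match_intent(question: str) -> Optional[str]:
    """Return the intent with the largest keyword overlap, or None."""
    tokens = set(re.findall(r"[a-z]+", question.lower()))
    best, best_score = None, 0
    for intent, spec in INTENTS.items():
        score = len(tokens & spec["keywords"])
        if score > best_score:
            best, best_score = intent, score
    return best
```

A real module would likely use a trained intent classifier rather than raw keyword overlap, but the contrast with span-extracting BERT QA is the same: the intent module selects among curated answers instead of extracting text from a passage.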
