To date, audio and media guides have accompanied exhibition visits, offering explanations of museum objects. The CHIM project aims to develop a learning, conversational museum guide that enables interactive knowledge transfer. The "chatbot in the museum" is a learning, multimodal dialog system and a potential "game changer" in the growing market of museum knowledge transfer. With the help of artificial intelligence, the voice-based dialog system will guide visitors through exhibitions, with the goal of answering questions about objects as correctly as possible.

CHIM uses automatic speech recognition and a hybrid system for language understanding, which determines the intention of a user utterance and assigns meaning to the text, using machine learning methods. For this purpose, CHIM combines structured knowledge from museum databases with language models trained on large data sets. Many factual questions can be answered directly from the museum database. One of our research questions is how well CHIM can answer open-ended questions using large language models when a specific object and its associated text are provided as context. To obtain training material, we first conducted a successful question collection campaign. The next step is to map the collected questions to existing content. This data will be used to create AI models that can generate relevant answers.

Computer scientists, multimedia guide specialists, and museum experts are collaborating to realize the project, which aims to develop an economically viable solution. CHIM is to be connected to various knowledge transfer systems while offering users comprehensible information. The technical pillars of the project are the structuring of data by means of semantic annotation, adequate dialog strategies, and multimodal intention recognition.
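The following sketch is only an illustration of the hybrid answering idea described above, not the CHIM implementation: factual questions are answered from structured museum data, and open-ended questions are passed to a large language model together with the object's associated text as context. All class and function names, including the `generate` callback, are hypothetical placeholders.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class MuseumObject:
    # Minimal stand-in for a record from a museum database (hypothetical schema).
    title: str
    facts: dict[str, str]   # structured fields, e.g. {"artist": "...", "year": "..."}
    description: str        # curated text associated with the object


def answer_factual(question: str, obj: MuseumObject) -> Optional[str]:
    """Toy lookup against structured fields; a real system would use
    intent recognition and slot filling instead of keyword matching."""
    for field, value in obj.facts.items():
        if field.lower() in question.lower():
            return f"{obj.title}: {field} = {value}"
    return None


def answer_open_ended(question: str, obj: MuseumObject,
                      generate: Callable[[str], str]) -> str:
    """Ground the language model in the object's text, as in the research
    question of answering open questions from a given context."""
    prompt = (
        f"Object: {obj.title}\n"
        f"Context: {obj.description}\n"
        f"Visitor question: {question}\n"
        "Answer using only the context above."
    )
    return generate(prompt)  # `generate` stands in for any LLM call


def answer(question: str, obj: MuseumObject,
           generate: Callable[[str], str]) -> str:
    # Try the structured database first, then fall back to the language model.
    return answer_factual(question, obj) or answer_open_ended(question, obj, generate)
```

The fallback order mirrors the division of labor described above: factual questions are served directly from museum data, while open-ended questions are handed to a language model constrained by the object's text.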
Partners
Linon Medien KG