
Publication

Using Large Language Models to Generate Authentic Multi-agent Knowledge Work Datasets

Desiree Heim; Christian Jilek; Adrian Ulges; Andreas Dengel
In: INFORMATIK Proceedings 2024. AI@WORK Workshop (AI@WORK-2024), INFORMATIK Festival 2024, September 24-26, Wiesbaden, Germany, Gesellschaft für Informatik e.V. Bonn, 2024.

Abstract

Current publicly available knowledge work data collections lack diversity, extensive annotations, and contextual information about the users and their documents. These issues hinder objective and comparable data-driven evaluations and optimizations of knowledge work assistance systems. Due to the considerable resources needed to collect such data in real-life settings and the necessity of censoring the data, collecting such a dataset appears nearly impossible. For this reason, we propose a configurable, multi-agent knowledge work dataset generator. This system simulates collaborative knowledge work among agents producing Large Language Model-generated documents and accompanying data traces. Additionally, the generator captures all background information, given in its configuration or created during the simulation process, in a knowledge graph. Finally, the resulting dataset can be utilized and shared without privacy or confidentiality concerns. This paper introduces our approach’s design and vision and focuses on generating authentic knowledge work documents using Large Language Models. In a study with human raters, 53% of the generated documents and 74% of the real documents were judged realistic, demonstrating the potential of our approach. Furthermore, we analyze the authenticity criteria mentioned in the participants’ comments and elaborate on potential improvements for common issues we identified.
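To make the idea described above more concrete, the following Python sketch illustrates how agents might produce LLM-generated documents while recording the accompanying background information as triples in a knowledge graph. This is a minimal illustration under assumed names, not the authors’ implementation: `call_llm`, `Agent`, and `KnowledgeGraph` are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's code): agents generate documents via an
# LLM and the simulation records who produced what as knowledge-graph triples.
from dataclasses import dataclass, field


def call_llm(prompt: str) -> str:
    """Placeholder for a Large Language Model call; swap in a real client."""
    return f"[document generated for prompt: {prompt!r}]"


@dataclass
class KnowledgeGraph:
    triples: list = field(default_factory=list)

    def add(self, subject: str, predicate: str, obj: str) -> None:
        self.triples.append((subject, predicate, obj))


@dataclass
class Agent:
    name: str
    role: str

    def work_on(self, task: str, kg: KnowledgeGraph) -> str:
        prompt = f"As a {self.role}, write a document for the task: {task}"
        document = call_llm(prompt)
        # Record the data trace: which agent produced which document for which task.
        doc_id = f"doc-{len(kg.triples)}"
        kg.add(self.name, "workedOn", task)
        kg.add(self.name, "produced", doc_id)
        kg.add(doc_id, "derivedFromTask", task)
        return document


if __name__ == "__main__":
    kg = KnowledgeGraph()
    agents = [Agent("alice", "project manager"), Agent("bob", "software engineer")]
    tasks = ["draft a kick-off agenda", "summarise the sprint review"]
    for agent, task in zip(agents, tasks):
        print(agent.work_on(task, kg))
    for triple in kg.triples:
        print(triple)
```

In the actual system, the configuration (agent roles, tasks, collaboration structure) and all simulation events would feed the same knowledge graph, so the generated documents come with full contextual annotations.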
