Publication
AutoQIR: Auto-Encoding Questions with Retrieval Augmented Decoding for Unsupervised Passage Retrieval and Zero-shot Question Generation
Stalin Varanasi; Muhammad Umer Butt; Günter Neumann
In: Large Language Models for Natural Language Processing. International Conference on Recent Advances in Natural Language Processing (RANLP-2023), September 4-6, 2023, Varna, Bulgaria, pages 1171-1179, ISBN 978-954-452-092-2, INCOMA Ltd., Shoumen, Bulgaria, September 2023.
Abstract
Dense passage retrieval models have become state-of-the-art for information retrieval on many Open-domain Question Answering (ODQA) datasets. However, most of these models rely on supervision obtained from the ODQA datasets, which hinders their performance in low-resource settings. Recently, retrieval-augmented language models have been proposed to improve both zero-shot and supervised information retrieval. However, these models use pre-training tasks that are agnostic to the target task of passage retrieval. In this work, we propose Retrieval Augmented Auto-encoding of Questions for zero-shot dense information retrieval. Unlike other pre-training methods, ours is built around the target task of passage retrieval, which makes the pre-training more efficient. Our method consists of a dense IR model that encodes questions and retrieves documents during training, and a conditional language model that maximizes the likelihood of each question by marginalizing over the retrieved documents. As a by-product, this conditional language model can be used for zero-shot question generation from documents. We show that the IR model obtained through our method improves on the current state of the art in zero-shot dense information retrieval, and we improve the results even further by training on a synthetic corpus created through zero-shot question generation.
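
A rough sketch of the training objective described above, in our own notation (the paper's exact formulation may differ): given a question q, the question encoder retrieves a top-k set of passages D_k(q), and the decoder is trained to reconstruct q by marginalizing over those passages, in the spirit of RAG-style marginalization:

    \mathcal{L}(q) \;=\; -\log \sum_{d \in D_k(q)} p_{\eta}(d \mid q)\, p_{\theta}(q \mid d)

Here p_{\eta}(d \mid q) denotes a normalized retrieval score assigned to passage d by the question encoder, and p_{\theta}(q \mid d) is the likelihood the conditional language model assigns to the question given the passage; both symbols are illustrative placeholders rather than the paper's notation.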