
Publication

IDEBench: A Benchmark for Interactive Data Exploration

Philipp Eichmann; Emanuel Zgraggen; Carsten Binnig; Tim Kraska
In: David Maier; Rachel Pottinger; AnHai Doan; Wang-Chiew Tan; Abdussalam Alawini; Hung Q. Ngo (Eds.). Proceedings of the 2020 International Conference on Management of Data. ACM SIGMOD International Conference on Management of Data (SIGMOD-2020), June 14-19, Pages 1555-1569, ACM, 2020.

Abstract

Existing benchmarks for analytical database systems such as TPC-DS and TPC-H are designed for static reporting scenarios. The main metric of these benchmarks is the performance of running individual SQL queries over a synthetic database. In this paper, we argue that such benchmarks are not suitable for evaluating database workloads originating from interactive data exploration (IDE) systems where most queries are ad-hoc, not based on predefined reports, and built incrementally. As a main contribution, we present a novel benchmark called IDEBench that can be used to evaluate the performance of database systems for IDE workloads. As opposed to traditional benchmarks for analytical database systems, our goal is to provide more meaningful workloads and datasets that can be used to benchmark IDE query engines, with a particular focus on metrics that capture the trade-off between query performance and quality of the result. As a second contribution, this paper evaluates and discusses the performance results of selected IDE query engines using our benchmark. The study includes two commercial systems, as well as two research prototypes (IDEA, approXimateDB/XDB), and one traditional analytical database system (MonetDB).
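The abstract's central measurement idea, weighing query latency against the quality of (possibly approximate) results, can be illustrated with a small sketch. The following Python snippet is an assumption-laden illustration, not the benchmark's actual harness: the per-query time budget, the mean-relative-error quality measure, and all function names are hypothetical and are not taken from the paper.

```python
"""Hedged sketch of a latency-vs-result-quality measurement (illustrative only)."""
import time
from typing import Callable, Dict


def score_query(run_query: Callable[[], Dict[str, float]],
                ground_truth: Dict[str, float],
                time_budget_s: float = 1.0) -> Dict[str, float]:
    """Run one (possibly approximate) query; report latency and result quality.

    Quality here is the mean relative error of returned aggregate values
    against exact ground-truth values -- an assumed metric, not necessarily
    the one defined in IDEBench.
    """
    start = time.perf_counter()
    result = run_query()  # engine may return early with approximate aggregates
    latency = time.perf_counter() - start

    errors = []
    for key, exact in ground_truth.items():
        approx = result.get(key, 0.0)
        denom = abs(exact) if exact != 0 else 1.0
        errors.append(abs(approx - exact) / denom)

    return {
        "latency_s": latency,
        "within_budget": latency <= time_budget_s,
        "mean_relative_error": sum(errors) / len(errors) if errors else 0.0,
    }


if __name__ == "__main__":
    # Stand-in "engine" returning slightly-off aggregates, to show the trade-off.
    exact = {"avg_delay": 12.4, "count": 50_000.0}
    fake_engine = lambda: {"avg_delay": 12.1, "count": 49_200.0}
    print(score_query(fake_engine, exact, time_budget_s=1.0))
```

In this sketch, an approximate engine that answers within the budget but with nonzero relative error can be compared against an exact engine that exceeds the budget, which is the kind of trade-off the abstract says the benchmark's metrics are meant to capture.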
