

AnyQ: An Evaluation Framework for Massively-Parallel Queue Algorithms

Michael Kenzel; Stefan Lemme; Richard Membarth; Matthias Kurtenacker; Hugo Devillers; Markus Steinberger; Philipp Slusallek
In: Proceedings of the 37th IEEE International Parallel & Distributed Processing Symposium (IPDPS). IEEE International Parallel & Distributed Processing Symposium (IPDPS-2023), May 15-19, St. Petersburg, FL, USA, Pages 736-745, IEEE, 5/2023.


Concurrent queue algorithms have been the subject of extensive research. However, the target hardware and evaluation methodology behind the published results for any two given concurrent queue algorithms often share only minimal overlap, making a meaningful comparison exceedingly difficult. With the continuing trend towards increasingly heterogeneous systems, it is becoming ever more important not only to evaluate and compare novel and existing queue algorithms across a wider range of target architectures, but also to continuously re-evaluate queue algorithms in light of novel architectures and capabilities. To address this need, we present AnyQ, an evaluation framework for concurrent queue algorithms. We design a set of programming abstractions that enable mapping concurrent queue algorithms and benchmarks to a wide variety of target architectures. We demonstrate the effectiveness of these abstractions by showing that a queue algorithm expressed in a portable, high-level manner can achieve performance comparable to hand-crafted implementations. We design a system for testing and benchmarking queue algorithms. Using the developed framework, we investigate concurrent queue algorithm performance across a range of both CPU and GPU architectures. In hopes that it may serve the community as a starting point for building a common repository of concurrent queue algorithms as well as a base for future research, all code and data are made available as open source software at
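To give a flavor of the kind of algorithm such a framework benchmarks, the sketch below shows one well-known concrete instance: a bounded multi-producer/multi-consumer queue in the style of Dmitry Vyukov's array-based design, behind a `try_enqueue`/`try_dequeue` interface. This is not AnyQ's actual abstraction layer (the paper's own abstractions are not reproduced here); the class and method names are illustrative assumptions.

```cpp
#include <atomic>
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative bounded MPMC queue (Vyukov-style); NOT AnyQ's API.
// Each cell carries a sequence number that encodes whether it is
// ready for the next enqueue or the next dequeue.
template <typename T>
class BoundedMPMCQueue {
public:
    explicit BoundedMPMCQueue(std::size_t capacity)
        : buffer_(capacity), mask_(capacity - 1) {
        assert(capacity >= 2 && (capacity & (capacity - 1)) == 0 &&
               "capacity must be a power of two");
        for (std::size_t i = 0; i < capacity; ++i)
            buffer_[i].seq.store(i, std::memory_order_relaxed);
    }

    bool try_enqueue(const T& value) {
        std::size_t pos = tail_.load(std::memory_order_relaxed);
        for (;;) {
            Cell& cell = buffer_[pos & mask_];
            std::size_t seq = cell.seq.load(std::memory_order_acquire);
            std::intptr_t diff = (std::intptr_t)seq - (std::intptr_t)pos;
            if (diff == 0) {
                // Cell is free; claim the slot via CAS on the tail.
                if (tail_.compare_exchange_weak(pos, pos + 1,
                                                std::memory_order_relaxed)) {
                    cell.value = value;
                    cell.seq.store(pos + 1, std::memory_order_release);
                    return true;
                }
            } else if (diff < 0) {
                return false;  // queue is full
            } else {
                pos = tail_.load(std::memory_order_relaxed);  // lost a race
            }
        }
    }

    bool try_dequeue(T& out) {
        std::size_t pos = head_.load(std::memory_order_relaxed);
        for (;;) {
            Cell& cell = buffer_[pos & mask_];
            std::size_t seq = cell.seq.load(std::memory_order_acquire);
            std::intptr_t diff = (std::intptr_t)seq - (std::intptr_t)(pos + 1);
            if (diff == 0) {
                if (head_.compare_exchange_weak(pos, pos + 1,
                                                std::memory_order_relaxed)) {
                    out = cell.value;
                    // Mark the cell reusable for the enqueue one lap later.
                    cell.seq.store(pos + mask_ + 1, std::memory_order_release);
                    return true;
                }
            } else if (diff < 0) {
                return false;  // queue is empty
            } else {
                pos = head_.load(std::memory_order_relaxed);
            }
        }
    }

private:
    struct Cell {
        std::atomic<std::size_t> seq{0};
        T value{};
    };
    std::vector<Cell> buffer_;
    const std::size_t mask_;
    std::atomic<std::size_t> head_{0};
    std::atomic<std::size_t> tail_{0};
};
```

Porting exactly this kind of per-cell sequencing and CAS logic across CPU and GPU memory models is the portability problem the abstract describes: the same algorithm must be expressed once and mapped onto each target's atomics and progress guarantees.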

