Publication

Quantifying and explaining machine learning uncertainty in predictive process monitoring: an operations research perspective

Nijat Mehdiyev; Maxim Majlatow; Peter Fettke
In: Annals of Operations Research (AOR), pp. 1-40, Springer, April 2024.

Abstract

In the rapidly evolving landscape of manufacturing, the ability to make accurate predictions is crucial for optimizing processes. This study introduces a novel framework that combines predictive uncertainty with explanatory mechanisms to enhance decision-making in complex systems. The approach leverages Quantile Regression Forests for reliable predictive process monitoring and incorporates Shapley Additive Explanations (SHAP) to identify the drivers of predictive uncertainty. This dual-faceted strategy serves as a valuable tool for domain experts engaged in process planning activities. Supported by a real-world case study involving a medium-sized German manufacturing firm, the article validates the model’s effectiveness through rigorous evaluations, including sensitivity analyses and tests for statistical significance. By seamlessly integrating uncertainty quantification with explainable artificial intelligence, this research makes a novel contribution to the evolving discourse on intelligent decision-making in complex systems.
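The abstract names two building blocks: Quantile Regression Forests for prediction intervals and SHAP for attributing the resulting predictive uncertainty to input features. The following is a minimal Python sketch of that pipeline, not the paper's implementation: it approximates quantile estimates by taking quantiles over per-tree predictions of a scikit-learn random forest (rather than Meinshausen's leaf-weight formulation), substitutes synthetic data for the manufacturing case study, and explains the prediction-interval width through a hypothetical surrogate model combined with shap.TreeExplainer.

# Illustrative sketch (not the authors' code): approximate quantile-based
# uncertainty with a random forest, then explain the interval width with SHAP.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
import shap

# Synthetic stand-in for process data (e.g., predicting processing times).
X, y = make_regression(n_samples=2000, n_features=8, noise=15.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Forest whose per-tree predictions approximate a predictive distribution;
# a true Quantile Regression Forest would use leaf-weight quantiles instead.
forest = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
forest.fit(X_train, y_train)

per_tree = np.stack([t.predict(X_test) for t in forest.estimators_])  # (trees, samples)
q05, q50, q95 = np.quantile(per_tree, [0.05, 0.50, 0.95], axis=0)
interval_width = q95 - q05  # width of the 90% prediction interval as an uncertainty proxy

# Explain what drives the uncertainty: fit a surrogate model on the interval
# width and attribute it to the input features with SHAP.
uncertainty_model = RandomForestRegressor(n_estimators=300, random_state=0)
uncertainty_model.fit(X_test, interval_width)

explainer = shap.TreeExplainer(uncertainty_model)
shap_values = explainer.shap_values(X_test)
print("Mean |SHAP| per feature (uncertainty drivers):",
      np.abs(shap_values).mean(axis=0).round(3))

In this reading of the abstract, a wide interval flags a prediction that a process planner should treat cautiously, and the SHAP attributions indicate which input features contribute most to that width.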
