

Conceptual Explanations of Neural Network Prediction for Time Series

Ferdinand Küsters; Peter Schichtel; Sheraz Ahmed; Andreas Dengel
In: International Joint Conference on Neural Networks (IJCNN), IEEE, 2020.

Abstract

Deep neural networks are black boxes by construction. Explanation and interpretation methods are therefore pivotal for trustworthy applications. Existing methods are mostly based on heatmapping and focus on locally determining the relevant input parts that trigger the network prediction. However, these methods struggle to uncover global causes. While such causes are rare in the image or NLP domains, they are highly relevant for time series. This paper presents a novel framework, Conceptual Explanation, designed to evaluate the effect of abstract (local or global) input features on the model behavior. The method is model-agnostic and allows incorporating expert knowledge. On three time series datasets, Conceptual Explanation demonstrates its ability to pinpoint the data-inherent causes that trigger the correct model prediction.
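
As a rough illustration of the general idea, not the authors' exact procedure, the sketch below probes a black-box time series model with one abstract, global concept (the high-frequency content of the input) and measures how much the prediction changes when that concept is altered. The function names suppress_high_freq and concept_relevance, the FFT-based concept, and the mean-absolute-change score are assumptions made for this example only.

```python
# Minimal, hypothetical sketch of a model-agnostic concept probe for time series.
# Names and the specific concept are illustrative, not taken from the paper.
import numpy as np

def suppress_high_freq(x, cutoff=5):
    """Example concept: remove high-frequency components of a 1-D series."""
    spec = np.fft.rfft(x)
    spec[cutoff:] = 0.0
    return np.fft.irfft(spec, n=len(x))

def concept_relevance(predict_fn, X, concept_fn):
    """Mean absolute change in model output when the concept is altered.

    Only predict_fn is queried, so any black-box model can be probed.
    """
    base = predict_fn(X)
    perturbed = predict_fn(np.stack([concept_fn(x) for x in X]))
    return np.mean(np.abs(base - perturbed), axis=0)

# Usage with any predictor exposing a predict-like callable:
# relevance = concept_relevance(model.predict, X_test, suppress_high_freq)
```

Because the model is accessed only through its predictions, the probe is model-agnostic; domain experts could supply other concept functions (local or global) to test the input properties they consider meaningful.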