Attribution-Guided Visualization Explanation

Given the critical need for more reliable autonomous driving models, explainability has become a focal point within the research community. When testing autonomous driving models, even slight differences in perception can dramatically influence decision-making. Understanding the specific reasons why a model decides to stop or to continue forward remains a significant challenge. This paper presents a novel attribution-guided visualization method that explores the triggers behind decision shifts, providing clear insights into the "why" and "why not" of such decisions. Specifically, we propose a cumulative layer fusion attribution method that identifies the parameters most critical to decision-making. These attributions then guide the visualization updates, ensuring that changes in decisions are driven only by modifications to critical information. Furthermore, we develop an indirect regularization method that improves visualization quality without requiring extra hyperparameters. Experiments on large datasets demonstrate that our method produces valuable visualization explanations and outperforms state-of-the-art methods in both qualitative and quantitative evaluations.
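The sketch below illustrates the two ideas in the abstract under stated assumptions: fusing gradient-based attributions accumulated across several layers, and restricting visualization updates to the most highly attributed input regions so that decision shifts are driven only by critical information. It assumes a PyTorch image model; the function and argument names (cumulative_layer_attribution, guided_update, keep_ratio, ...) are illustrative and are not the paper's released implementation.

```python
import torch
import torch.nn.functional as F


def cumulative_layer_attribution(model, layers, x, target_class):
    """Fuse activation*gradient attributions from several layers into one
    input-resolution relevance map (a sketch of cumulative layer fusion)."""
    activations, handles = {}, []

    def hook(name):
        def _hook(_, __, output):
            activations[name] = output
        return _hook

    for name, layer in layers.items():
        handles.append(layer.register_forward_hook(hook(name)))

    x = x.clone().requires_grad_(True)
    score = model(x)[:, target_class].sum()

    fused = torch.zeros_like(x[:, :1])  # single-channel map at input resolution
    for name in layers:
        act = activations[name]
        grad = torch.autograd.grad(score, act, retain_graph=True)[0]
        attr = (act * grad).sum(dim=1, keepdim=True).relu()   # channel-summed relevance
        attr = F.interpolate(attr, size=x.shape[-2:], mode="bilinear",
                             align_corners=False)
        fused = fused + attr / (attr.amax() + 1e-8)           # accumulate across layers

    for h in handles:
        h.remove()
    return fused / len(layers)


def guided_update(x, attribution, delta, keep_ratio=0.1):
    """Apply a perturbation only where the fused attribution is highest."""
    thresh = torch.quantile(attribution.flatten(), 1.0 - keep_ratio)
    mask = (attribution >= thresh).float()
    return x + mask * delta


# Hypothetical usage with a ResNet-style backbone:
# attr_map = cumulative_layer_attribution(
#     model, {"layer3": model.layer3, "layer4": model.layer4}, x, target_class=0)
# x_shifted = guided_update(x, attr_map, delta=0.1 * torch.randn_like(x))
```

Confining the update to the top-attributed region is what lets a resulting decision shift be read as evidence about that region, rather than about incidental changes elsewhere in the input.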

Articles

“Exploring Decision Shifts in Autonomous Driving with Attribution-Guided Visualization.” IEEE Transactions on Intelligent Transportation Systems. Article GitHub Repo

“Visualization comparison of vision transformers and convolutional neural networks.” IEEE Transactions on Multimedia. Article GitHub Repo

“Understanding contributing neurons via attribution visualizations.” Neurocomputing. Article GitHub Repo

“Group visualization of class-discriminative features.” Neural Networks. Article GitHub Repo

Other Applications

To obtain an intuitive and integral understanding of neuron attributions, we propose a new viewpoint that interprets the neuron attributions of an entire layer as a whole, i.e., visualizing their collective meaning rather than each neuron in isolation. This video shows how the mask and the neural network visualization are generated, and a small sketch of the idea follows below.
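The following is a minimal sketch of visualizing a whole layer's neuron attributions integrally, assuming a PyTorch model: per-channel attribution scores are computed once, the layer's channels are weighted by those scores, and the result is collapsed into a single input-sized mask. The function name layer_attribution_mask and the weighting scheme are illustrative assumptions, not the released code.

```python
import torch
import torch.nn.functional as F


def layer_attribution_mask(model, layer, x, target_class):
    """Return an input-sized mask summarizing what an entire layer attends to."""
    store = {}

    def hook(_, __, output):
        store["act"] = output

    handle = layer.register_forward_hook(hook)
    score = model(x)[:, target_class].sum()
    handle.remove()

    act = store["act"]                                     # (B, C, H, W)
    grad = torch.autograd.grad(score, act)[0]
    channel_scores = (act * grad).flatten(2).sum(-1)       # per-neuron attribution, (B, C)
    weights = channel_scores.clamp(min=0)                  # keep positively contributing neurons
    weights = weights / (weights.sum(dim=1, keepdim=True) + 1e-8)

    mask = (weights[..., None, None] * act).sum(dim=1, keepdim=True).relu()
    mask = F.interpolate(mask, size=x.shape[-2:], mode="bilinear",
                         align_corners=False)
    return mask / (mask.amax() + 1e-8)                     # normalize to [0, 1]


# Hypothetical usage: overlay the mask on the image to view the layer's
# attributions as one integral visualization.
# masked_input = x * layer_attribution_mask(model, model.layer4, x, target_class=3)
```

Aggregating all channels into one weighted map is what gives the "integral" reading of the layer: the mask reflects the layer's combined contribution to the decision instead of a separate heatmap per neuron.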