R.I.P.
👻
Ghosted
GNNShap: Scalable and Accurate GNN Explanation using Shapley Values
January 09, 2024 · Entered Twilight · The Web Conference
Repo contents: .gitignore, LICENSE, README.md, baselines, cppextension, dataset, examples, gnnshap, models, pretrained, requirements.txt, results, run_baseline_experiments.sh, run_gnnshap.py, run_gnnshap_experiments.sh, train.py, train_large.py
Authors
Selahattin Akkas, Ariful Azad
arXiv ID
2401.04829
Category
cs.LG: Machine Learning
Cross-listed
cs.SI
Citations
25
Venue
The Web Conference
Repository
https://github.com/HipGraph/GNNShap
⭐ 42
Last Checked
2 months ago
Abstract
Graph neural networks (GNNs) are popular machine learning models for graphs, with many applications across scientific domains. However, GNNs are considered black-box models, and it is challenging to understand how they make predictions. Game-theoretic Shapley value approaches are popular explanation methods in other domains but are not well studied for graphs. Some studies have proposed Shapley value based GNN explanations, yet they have several limitations: they consider only a limited number of samples to approximate Shapley values; some focus mainly on small and large coalition sizes; and they are an order of magnitude slower than other explanation methods, making them inapplicable even to moderate-size graphs. In this work, we propose GNNShap, which explains predictions in terms of edges, since edges provide more natural and more fine-grained explanations for graphs. We overcome the limitations by sampling from all coalition sizes, parallelizing the sampling on GPUs, and speeding up model predictions by batching. GNNShap gives better fidelity scores and faster explanations than baselines on real-world datasets. The code is available at https://github.com/HipGraph/GNNShap.
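The abstract's core idea — approximating Shapley values by sampling coalitions — can be illustrated with a minimal Monte Carlo sketch. This is plain Python, not the paper's GPU-parallel implementation; the toy `value` function is a stand-in for a GNN prediction restricted to a subgraph induced by a set of edges:

```python
import itertools
import math
import random

def exact_shapley(players, value):
    """Exact Shapley values by enumerating all coalitions (exponential in n)."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for r in range(n):
            for S in itertools.combinations(others, r):
                # Shapley weight for a coalition of size r out of n players.
                w = math.factorial(r) * math.factorial(n - r - 1) / math.factorial(n)
                phi[p] += w * (value(set(S) | {p}) - value(set(S)))
    return phi

def sampled_shapley(players, value, num_perms=2000, seed=0):
    """Monte Carlo estimate: average marginal contributions over random permutations."""
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(num_perms):
        perm = players[:]
        rng.shuffle(perm)
        coalition, prev = set(), value(set())
        for p in perm:
            coalition.add(p)
            cur = value(coalition)
            phi[p] += cur - prev  # marginal contribution of p in this permutation
            prev = cur
    return {p: v / num_perms for p, v in phi.items()}

# Hypothetical value function: "GNN prediction score on the subgraph with edge set S".
# Edges e1 and e2 are jointly important; e3 adds a small independent contribution.
edges = ["e1", "e2", "e3"]
def value(S):
    return (1.0 if {"e1", "e2"} <= S else 0.0) + (0.2 if "e3" in S else 0.0)
```

Permutation sampling visits coalitions of all sizes by construction, which is the property the paper contrasts with methods biased toward very small and very large coalitions; GNNShap additionally parallelizes the sampling on GPUs and batches the model evaluations.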
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms