The goal of the hackathon is to implement a Python package that allows us to apply game-theoretic concepts such as Shapley Values and Shapley Interactions to machine learning models.
You should implement the issues listed in the repository as thoroughly as possible, ensuring sufficient tests and documentation.
You should also keep the implementation as modular as possible so that the package can be extended later.
As references, you can use Explaining by Removing: A Unified Framework for Model Explanation (Section 4.2) and Explaining Machine Learning Models with Conditional Shapley Values in R and Python (Section 2.2).
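To ground the game-theoretic terminology, here is a minimal sketch of the exact Shapley value: each player's value is its marginal contribution averaged over all orderings of the players. The function names and the toy two-player game are illustrative, not part of the package to be built.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over all orderings of the players."""
    shap = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            shap[p] += value(frozenset(coalition)) - before
    return {p: s / len(perms) for p, s in shap.items()}

# Toy game: v({}) = 0, v({1}) = 1, v({2}) = 2, v({1, 2}) = 4
v = {frozenset(): 0.0, frozenset({1}): 1.0,
     frozenset({2}): 2.0, frozenset({1, 2}): 4.0}
print(shapley_values([1, 2], v.__getitem__))  # → {1: 1.5, 2: 2.5}
```

Note that the values sum to v({1, 2}) = 4, illustrating the efficiency axiom; this exhaustive enumeration is exponential in the number of players, which is why practical implementations rely on sampling and on the removal strategies discussed in the references above.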
Package management is handled via uv, which is one of the fastest package managers for Python.
After installing uv, you can install the required packages from pyproject.toml using `uv sync --dev`.
We have already added some required packages; additional packages can be added using `uv add <package>`, see here.
We aim for high modularity even in the package management, thus we have different groups for the dependencies.
Core dependencies, needed in every case, are added via `uv add <package>`.
Dependencies that are only needed in certain cases, e.g. the torch dependency group in pyproject.toml, are added via `uv add <package> --group <dependency_group_name>`.
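The resulting split might look like the following pyproject.toml excerpt; the `torch` group name mirrors the example above, while the concrete package lists are illustrative only:

```toml
[project]
dependencies = [
    "numpy",  # core: needed in every case
]

[dependency-groups]
dev = ["pytest", "ruff"]  # tooling installed by `uv sync --dev`
torch = ["torch"]         # only needed for torch-based functionality
```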
Finally, you can start the implementation; issues for this are defined here.
XAI packages that use imputation/masking:
- shap.maskers: https://github.com/shap/shap
- shapiq.imputer: https://github.com/mmschlk/shapiq
- fippy.samplers: https://github.com/gcskoenig/fippy
- sklearn.impute: https://scikit-learn.org/stable/api/sklearn.impute.html
- shapr: https://github.com/NorskRegnesentral/shapr
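As a conceptual sketch of what the imputers/maskers in the packages above do: "removed" features are replaced with plausible substitutes, in the simplest (marginal) case the feature-wise mean of a background dataset. The class and method names here are hypothetical, not taken from any of the listed packages.

```python
class MarginalImputer:
    """Minimal sketch of a marginal imputer: masked ('removed')
    features are replaced with a baseline from a background
    dataset, here simply the feature-wise mean."""

    def __init__(self, background):
        n_features = len(background[0])
        self.baseline = [
            sum(row[j] for row in background) / len(background)
            for j in range(n_features)
        ]

    def impute(self, x, mask):
        # mask[j] is True when feature j is kept, False when removed
        return [xj if keep else bj
                for xj, keep, bj in zip(x, mask, self.baseline)]

background = [[0.0, 2.0], [2.0, 4.0]]
imputer = MarginalImputer(background)
print(imputer.impute([10.0, 10.0], [True, False]))  # → [10.0, 3.0]
```

Conditional variants (as in shapr or fippy) instead sample the removed features conditional on the observed ones, which is the distinction drawn in Section 2.2 of the conditional Shapley values reference above.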