prosem2/topics_iml.csv

index,title,chapter,who,mail,name,note1,note2
1,Why do we care about IML?,3.1-3.3,Daniel,daniel.wilmes@cs.uni-dortmund.de,1,e.g. Is IML required?,
2,How to do IML research?,3.4-3.6,Daniel,daniel.wilmes@cs.uni-dortmund.de,2,e.g. How to evaluate interpretability?,
3,Linear Models,5.1-5.3,Jelle,jelle.huentelmann@cs.tu-dortmund.de,3,Simple models are simple to explain,Programming task: Do a linear regression on a simple dataset!
4,Decision Trees,5.4,Jelle,jelle.huentelmann@cs.tu-dortmund.de,4,Programming task: Train a decision tree on a simple dataset!,Could be combined with 5
5,Rule-Based Methods,5.5-5.6,Jelle,jelle.huentelmann@cs.tu-dortmund.de,5,,Could be combined with 4
6,Partial Dependence Plot,8.1,Carina,carina.newen@cs.uni-dortmund.de,6,How much does changing a feature change the output?,Could be combined with 7
7,Accumulated Local Effects,8.2,Carina,carina.newen@cs.uni-dortmund.de,7,How much effect does changing a feature have on the average prediction?,Could be combined with 6
8,Feature Interactions,8.3,Carina,carina.newen@cs.uni-dortmund.de,8,In general features are not independent,Measure the effect of interactions between them
9,Functional Decomposition,8.4,Daniel,daniel.wilmes@cs.uni-dortmund.de,9,Describe a function by its individual feature effects and their interactions,
10,Permutation Feature Importance,8.5,Bin,bin.li@tu-dortmund.de,10,How much does the prediction error change if we permute a feature's values?,
11,Global Surrogates,8.6,Bin,bin.li@tu-dortmund.de,11,Replace a complicated model with an interpretable one,
12,Prototypes,8.7,Bin,bin.li@tu-dortmund.de,12,Represent model output by well-fitting data instances,
13,Individual Conditional Expectation,9.1-9.2,Chiara,chiara.balestra@cs.uni-dortmund.de,13,Show the effect one feature has on the prediction for each individual instance,
14,Counterfactual Explanations,9.3-9.4,Chiara,chiara.balestra@cs.uni-dortmund.de,14,What would have to change to get a different prediction?,
15,Shapley Values,9.5-9.6,Chiara,chiara.balestra@cs.uni-dortmund.de,15,Use game theory to explain the output of a model,
16,Learned Features,10.1,Benedikt,benedikt.boeing@cs.tu-dortmund.de,16,Convolutional neural networks contain interpretable features,Programming task: Visualise your own classifier!
17,Saliency Maps,10.2,Simon,simon.kluettermann@cs.uni-dortmund.de,17,Different parts of an image have different importance for its classification,Programming task: Generate a saliency map yourself!
18,Concept Detection,10.3,Simon,simon.kluettermann@cs.uni-dortmund.de,18,Replace features by concepts,
19,Adversarials,10.4,Benedikt,benedikt.boeing@cs.tu-dortmund.de,19,Slight changes to the input can change a neural network's output drastically,
20,Influential Instances,10.5,Simon,simon.kluettermann@cs.uni-dortmund.de,20,Single training examples can change the output of a NN drastically,