prosem2/topics_iml.csv
"index";"title";"chapter";"who";"mail";"name";"note1";"note2"
1;"Why do we care about IML?";"3.1-3.3";"Daniel";"daniel.wilmes@cs.uni-dortmund.de";1;"e.g. Is IML required?";"{null}"
2;"How to do IML research?";"3.4-3.6";"Daniel";"daniel.wilmes@cs.uni-dortmund.de";2;"e.g. How to evaluate Interpretability";"{null}"
3;"Linear Models";"5.1-5.3";"Jelle";"jelle.huentelmann@cs.tu-dortmund.de";3;"Simple Models are simple to explain";"Programming task: Do a linear regression on a simple dataset!"
4;"Decision Trees";"5.4";"Jelle";"jelle.huentelmann@cs.tu-dortmund.de";4;"Programming task: Train a decision Tree on a simple dataset!";"Could be combined with 5"
5;"Rule Based Methods";"5.5-5.6";"Jelle";"jelle.huentelmann@cs.tu-dortmund.de";5;"{null}";"Could be combined with 4"
6;"Partial Dependence Plot";"8.1";"Carina";"carina.newen@cs.uni-dortmund.de";6;"How much does chainging a feature change the output?";"Could be combined with 7"
7;"Accumulated Local Effects";"8.2";"Carina";"carina.newen@cs.uni-dortmund.de";7;"How much effect does changing a feature have on the average prediction";"Could be combined with 6"
8;"Feature Interactions";"8.3";"Carina";"carina.newen@cs.uni-dortmund.de";8;"In general features are not independent";"Measure the effect of interactions between them"
9;"Functional Decomposition";"8.4";"Daniel";"daniel.wilmes@cs.uni-dortmund.de";9;"Describe a function by feature interactions and their interactions";"{null}"
10;"Permutation Feature Importance";"8.5";"Bin";"bin.li@tu-dortmund.de";10;"How much does a feature change, if we permute its values";"{null}"
11;"Global Surrogates";"8.6";"Bin";"bin.li@tu-dortmund.de";11;"Replace a complicated model by an interpretable one";"{null}"
12;"Prototypes";"8.7";"Bin";"bin.li@tu-dortmund.de";12;"Represent some model output by well fitting data instances";"{null}"
13;"Individual Conditional Expectation";"9.1-9.2";"Chiara";"chiara.balestra@cs.uni-dortmund.de";13;"Show the effect one feature has on the prediction";"{null}"
14;"Counterfactual Explanations";"9.3-9.4";"Chiara";"chiara.balestra@cs.uni-dortmund.de";14;"What to do to change a prediction?";"{null}"
15;"Shapley Values";"9.5-9.6";"Chiara";"chiara.balestra@cs.uni-dortmund.de";15;"Use game theory to explain the output of a model";"{null}"
16;"Learned Features";"10.1";"Benedikt";"benedikt.boeing@cs.tu-dortmund.de";16;"Conv. NN contain Intepretable Features";"Programming task: Visualise your own classifier!"
17;"Saliency Maps";"10.2";"Simon";"simon.kluettermann@cs.uni-dortmund.de";17;"Different parts of an image have different effect/importance on the classification of an image";"Programming task: Generate one Saliency Map yourself!"
18;"Concept Detection";"10.3";"Simon";"simon.kluettermann@cs.uni-dortmund.de";18;"Replace Features by Concepts";"{null}"
19;"Adversarials";"10.4";"Benedikt";"benedikt.boeing@cs.tu-dortmund.de";19;"Slight changes in a neural network can change its output drastically";"{null}"
20;"Influential Instances";"10.5";"Simon";"simon.kluettermann@cs.uni-dortmund.de";20;"Single examples can change the output of a NN drastically";"{null}"