Goal for this case study: have better hyperparameters than pyod's defaults!
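
One possible way to make "better than pyod" measurable (my sketch, not part of the assignment text): score one of the pyod detectors, here PCA, with every hyperparameter left at its default and treat that score as the baseline to beat. The synthetic data and the ROC-AUC criterion below are assumptions for illustration only.

    import numpy as np
    from sklearn.metrics import roc_auc_score
    from pyod.models.pca import PCA

    # Synthetic stand-in data (assumption); swap in the real benchmark set.
    rng = np.random.default_rng(0)
    X_inliers = rng.normal(0.0, 1.0, size=(950, 10))     # bulk of the data
    X_outliers = rng.uniform(-6.0, 6.0, size=(50, 10))   # scattered anomalies
    X = np.vstack([X_inliers, X_outliers])
    y = np.r_[np.zeros(950), np.ones(50)]                 # 1 marks an outlier

    baseline = PCA()   # every hyperparameter at its pyod default
    baseline.fit(X)    # unsupervised fit; y is only used for scoring
    auc = roc_auc_score(y, baseline.decision_scores_)
    print(f"pyod default PCA ROC AUC: {auc:.3f}")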

So each of you gets assigned two algorithms:

  - One fairly simple one beforehand
  - One more complicated one today

Try to find the best possible hyperparameters for your algorithms:

  - Try to be clever (for example, PCA needs $n_{components} < n_{features}$; maybe keep $\frac{n_{components}}{n_{features}}$ constant?). See the sketch right after this list.
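
One hedged way to test the constant-ratio idea (the ratio grid and the ROC-AUC criterion are my assumptions, not part of the assignment): sweep a few ratios on a labeled set, for example the X, y defined above, and keep the best one.

    from sklearn.metrics import roc_auc_score
    from pyod.models.pca import PCA

    def best_pca_ratio(X, y, ratios=(0.1, 0.25, 0.5, 0.75, 0.9)):
        """Return (best_ratio, best_auc) for pyod's PCA detector on labeled data."""
        n_features = X.shape[1]
        results = {}
        for ratio in ratios:
            # keep n_components at least 1 and strictly below n_features
            n_components = max(1, min(n_features - 1, round(ratio * n_features)))
            detector = PCA(n_components=n_components)
            detector.fit(X)  # unsupervised fit; labels only enter the evaluation
            results[ratio] = roc_auc_score(y, detector.decision_scores_)
        best = max(results, key=results.get)
        return best, results[best]

Calling best_pca_ratio(X, y) on the data above returns the winning ratio together with its ROC AUC; the same pattern transfers to the second, more complicated algorithm.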

Afterwards:

  - Write down your findings as a simple function (given data, what are my best hyperparameters?); see the sketch after this list.
  - Write down your findings in a report (together, double column; 6 pages per student, plus a comparison of the algorithms with each other)
  - One final presentation together in front of my colleagues, about 10 minutes per student.
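
A minimal sketch of what such a "given data, what are my best hyperparameters" function could look like. The interface (data in, hyperparameter dictionary out) follows the assignment text; the concrete rules of thumb inside are placeholder assumptions, and KNN stands in for whatever second algorithm a student was assigned.

    import numpy as np

    def recommend_hyperparameters(X: np.ndarray) -> dict:
        """Map simple data statistics to hyperparameter suggestions per algorithm."""
        n_samples, n_features = X.shape
        return {
            "PCA": {
                # placeholder rule: keep n_components / n_features roughly constant
                "n_components": max(1, min(n_features - 1, round(0.5 * n_features))),
            },
            "KNN": {
                # placeholder rule: grow the neighbourhood size with the sample count
                "n_neighbors": max(5, int(round(np.sqrt(n_samples)))),
            },
        }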