KDDM
KDDM@LMU
File details

| Flashcards | 203 |
|---|---|
| Language | German |
| Category | Computer Science |
| Level | University |
| Created / Updated | 01.08.2019 / 03.08.2019 |
| Web link | https://card2brain.ch/box/20190801_kddm |
Name two famous kernels.
- polynomial kernel
  - includes correlations between features = cross-terms
- radial basis function (RBF) / Gaussian kernel
  - weights the distance between feature vectors non-linearly
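The two kernels above can be sketched in NumPy; the function names and parameter values (`degree`, `c`, `gamma`) are illustrative, not from the deck:

```python
import numpy as np

def polynomial_kernel(x, y, degree=2, c=1.0):
    # K(x, y) = (x . y + c)^degree — the expansion contains all cross-terms
    # of the features, i.e. their correlations.
    return (np.dot(x, y) + c) ** degree

def rbf_kernel(x, y, gamma=0.5):
    # K(x, y) = exp(-gamma * ||x - y||^2) — a non-linear (exponential)
    # weighting of the squared distance between the feature vectors.
    return np.exp(-gamma * np.linalg.norm(np.asarray(x) - np.asarray(y)) ** 2)

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
print(polynomial_kernel(x, y))  # (1*0.5 + 2*(-1) + 1)^2 = 0.25
print(rbf_kernel(x, x))         # zero distance -> kernel value 1.0
```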
What are sources of error? (2) How can they be tackled? (2, 3)
- overfitting = large gap between test error (large) and training error (small)
  - reduce VC(H), the VC dimension of the hypothesis space
  - increase the sample size |S| = m
- underfitting = large training error
  - increase VC(H)
  - increase d (the number of features)
  - or: the problem is not learnable
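A minimal sketch of both error sources, using polynomial degree as a stand-in for VC(H) (the data, degrees, and noise level are invented for illustration): a degree-1 fit to quadratic data underfits (large training error), while a degree-14 fit through 15 points overfits (near-zero training error, larger test error).

```python
import numpy as np

rng = np.random.default_rng(0)

# noisy training samples of a quadratic target, plus a noise-free test grid
x_train = np.linspace(-1, 1, 15)
y_train = x_train**2 + rng.normal(0, 0.05, x_train.size)
x_test = np.linspace(-1, 1, 100)
y_test = x_test**2

def errors(degree):
    # fit a polynomial of the given degree, return (train MSE, test MSE)
    coeffs = np.polyfit(x_train, y_train, degree)
    mse = lambda x, y: np.mean((np.polyval(coeffs, x) - y) ** 2)
    return mse(x_train, y_train), mse(x_test, y_test)

for d in (1, 2, 14):
    tr, te = errors(d)
    print(f"degree={d:2d}  train={tr:.5f}  test={te:.5f}")
# degree 1: large training error (underfitting)
# degree 14: tiny training error, larger test error (overfitting)
```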
What are learning curves good for? How to act in each case?
- show whether train- and test-error are likely to converge
  - if not: reduce the complexity of the learner
  - if they do: increase m until convergence
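The convergence check above can be sketched by tracing train- and test-error for growing m (the linear target, noise level, and sample sizes are assumed for illustration): for a learnable problem with a suitable hypothesis class, both errors converge toward the noise floor as m grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(m):
    # noisy samples of a linear target y = 2x + 1
    x = rng.uniform(-1, 1, m)
    return x, 2 * x + 1 + rng.normal(0, 0.1, m)

x_test, y_test = sample(1000)
A_test = np.vstack([x_test, np.ones_like(x_test)]).T

for m in (5, 20, 100, 500):
    x, y = sample(m)
    A = np.vstack([x, np.ones_like(x)]).T
    w, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares linear fit
    train_err = np.mean((A @ w - y) ** 2)
    test_err = np.mean((A_test @ w - y_test) ** 2)
    print(f"m={m:4d}  train={train_err:.5f}  test={test_err:.5f}")
# train- and test-error converge toward the noise variance (0.01) as m grows,
# so increasing m helps — the curves indicate the problem is learnable.
```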