Pattern


Deck Details

Cards 28
Language English
Category Computer Science
Level University
Created / Updated 20.01.2021 / 17.05.2022
Licensing Not specified
Weblink
https://card2brain.ch/box/20210120_pattern

Which of these formulas are correct?

You are using a binary classifier on i.i.d. data. A point p is classified into class c1 if its L2 distance to the mean of c1 is smaller than its L2 distance to the mean of c2. The two classes have equal prior probabilities. Which assumptions must be made so that a Bayes classifier is equivalent?
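The equivalence the question is after can be checked numerically: with equal priors and identical isotropic Gaussian class densities, the Bayes rule reduces to comparing squared distances to the class means. A minimal numpy sketch (the means mu1, mu2 and variance sigma2 are hypothetical, not from the deck):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical class means; both classes share the isotropic covariance sigma2 * I.
mu1, mu2 = np.array([0.0, 0.0]), np.array([3.0, 1.0])
sigma2 = 1.5

def nearest_mean(p):
    """Assign class 1 if p is closer (L2) to mu1 than to mu2."""
    return 1 if np.linalg.norm(p - mu1) < np.linalg.norm(p - mu2) else 2

def bayes(p):
    """Bayes rule with equal priors and Gaussians N(mu_i, sigma2 * I).
    The shared normalising constant and priors cancel, so only the
    exponent -||p - mu_i||^2 / (2 sigma2) decides the class."""
    log_p1 = -np.sum((p - mu1) ** 2) / (2 * sigma2)
    log_p2 = -np.sum((p - mu2) ** 2) / (2 * sigma2)
    return 1 if log_p1 > log_p2 else 2

# Both rules agree on random test points under these assumptions.
points = rng.normal(size=(1000, 2)) * 2
assert all(nearest_mean(p) == bayes(p) for p in points)
```

If the covariances differ between classes, are anisotropic, or the priors are unequal, the two rules no longer coincide.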

If the covariance matrices of the normal distributions of your two classes are isotropic and identical, which of the following two methods can you then use to classify a new point x?

If two events A and B are independent, then:

Given three statistically dependent random variables a,b,c. You want to factorize the joint density p(a,b,c) into the marginal densities of the individual variables a,b,c using the product rule of probability. How many different factorizations are there?
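Each ordering of the three variables gives one chain-rule factorization, so the count is the number of permutations, 3! = 6. A short illustration:

```python
from itertools import permutations

# Each ordering of (a, b, c) yields one product-rule factorization, e.g.
# p(a,b,c) = p(a) p(b|a) p(c|a,b) for the ordering (a, b, c).
factorizations = [
    f"p({x}) p({y}|{x}) p({z}|{x},{y})" for x, y, z in permutations("abc")
]
print(len(factorizations))  # 6 = 3!
```

Note that because the variables are dependent, the conditionals do not simplify, so all six factorizations are genuinely distinct expressions of the same joint.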

Why is the log-likelihood often computed instead of the likelihood itself (MC)?
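One standard reason is numerical: a likelihood is a product of many densities smaller than one and underflows in floating point, while the log turns the product into a stable sum. A small sketch with hypothetical standard-normal data:

```python
import numpy as np

# 1000 i.i.d. samples under a standard normal: the raw likelihood is a
# product of 1000 small densities and underflows to exactly 0.0 in float64.
x = np.random.default_rng(1).normal(size=1000)
dens = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

likelihood = np.prod(dens)             # underflows to 0.0
log_likelihood = np.sum(np.log(dens))  # stays finite

print(likelihood)       # 0.0
print(log_likelihood)   # a finite negative number around -1400
```

The log also simplifies optimization: sums are easier to differentiate than products, and the log is monotone, so the maximizer is unchanged.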

How do the introduced skin and non-skin priors influence the classification results on the test images, and why does this happen?

You are estimating the density function of a random variable x. You decide to model x with a Gaussian Mixture Model (GMM) consisting of M components, because a single Gaussian distribution captures your data of size N poorly. Each component has a mixing coefficient c_i and parameters theta_i. Select the correct statements.
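The key constraints in such questions are that the mixing coefficients c_i must be non-negative and sum to 1, so that the mixture density integrates to 1. A toy numerical check with hypothetical one-dimensional parameters (M = 2 components, not from the deck):

```python
import numpy as np

# Hypothetical GMM parameters: mixing coefficients c, means mu, std devs sigma.
c = np.array([0.3, 0.7])
mu = np.array([-2.0, 1.0])
sigma = np.array([0.5, 1.5])
# c_i >= 0 and sum(c) = 1 are required for p(x) to be a valid density.
assert np.all(c >= 0) and np.isclose(c.sum(), 1.0)

def gmm_pdf(x):
    """Mixture density p(x) = sum_i c_i N(x | mu_i, sigma_i^2)."""
    comp = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return float(np.dot(c, comp))

# Riemann-sum check that the mixture integrates to (approximately) 1.
grid = np.linspace(-10.0, 10.0, 20001)
dx = grid[1] - grid[0]
area = sum(gmm_pdf(x) for x in grid) * dx
print(area)  # approximately 1.0
```

In practice the parameters theta_i and coefficients c_i are fit jointly, typically with expectation-maximization, which a single Gaussian fit cannot replicate when the data is multimodal.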