The search returned 37 matches

by sroth
25 May 2010 11:42
Forum: Statistisches Maschinelles Lernen
Topic: EX2 Problem 2
Replies: 2
Views: 493

Re: EX2 Problem 2

To the original question: You should randomize first, then split, even if that leads to slightly different proportions of the classes. To the follow-up: The purpose of the randomization is, more than anything else, to ensure that the order of the data points doesn't matter. The proportion of the diffe...
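The shuffle-then-split idea can be sketched as follows (a Python illustration; the function and parameter names are my own, not part of the assignment, which is in Matlab):

```python
import numpy as np

def shuffled_split(X, y, train_frac=0.8, seed=0):
    """Shuffle the data points first, then split into train and test sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))        # random order of the indices
    n_train = int(train_frac * len(X))   # split point after shuffling
    train, test = idx[:n_train], idx[n_train:]
    return X[train], y[train], X[test], y[test]
```

Note that the class proportions in the two halves are whatever the shuffle produces; as stated above, that is acceptable.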
by sroth
25 May 2010 11:36
Forum: Statistisches Maschinelles Lernen
Topic: Problem1: the problem?
Replies: 1
Views: 461

Re: Problem1: the problem?

p(y|x) somehow needs to relate to the distribution of z, since y = x+z...

Note that the x and y in the explanatory formulas leading up to the task are not necessarily the same as the x and y of the task itself.
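Concretely, if z is independent of x, the conditional follows by a change of variables (a sketch; p_z denotes the density of z):

```latex
p(y \mid x) = p_z(y - x),
\qquad\text{e.g. } z \sim \mathcal{N}(0, \sigma^2)
\;\Rightarrow\; p(y \mid x) = \mathcal{N}(y \mid x, \sigma^2).
```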
by sroth
25 May 2010 11:31
Forum: Statistisches Maschinelles Lernen
Topic: kde_density formula
Replies: 1
Views: 464

Re: kde_density formula

Short answer: There is a typo in the Bishop book; see http://research.microsoft.com/en-us/um/people/cmbishop/prml/prml-errata-1st-pr-2009-09-09.pdf
Long answer: To grasp the concepts it is important to understand the formulas rather than memorize them. If you understand where the pre-factor comes fr...
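For illustration, a 1-D Gaussian-kernel estimate with a correctly normalized pre-factor might look like this (a Python sketch, not the assignment's code; `kde_density` is simply a name matching the thread title):

```python
import numpy as np

def kde_density(x, data, h):
    """Gaussian-kernel density estimate at point x (1-D case).

    Each data point contributes a Gaussian bump of width h; the factor
    1/sqrt(2*pi*h^2) makes each kernel integrate to 1, so the average of
    N normalized kernels is itself a proper density.
    """
    n = len(data)
    kernels = (np.exp(-(x - data) ** 2 / (2 * h ** 2))
               / np.sqrt(2 * np.pi * h ** 2))
    return kernels.sum() / n
```

If the pre-factor is wrong (the typo in question), the estimate no longer integrates to one, which is an easy sanity check to run numerically.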
by sroth
10 May 2010 18:33
Forum: Statistisches Maschinelles Lernen
Topic: Matlab tasks
Replies: 4
Views: 1495

Re: Matlab tasks

Yes, you should use the closed-form analytical solution. And it is fine to use std, mean, cov, etc.
What I mean by "by hand" is that you are not allowed to use the NaiveBayes class from the Statistics Toolbox, the mle function, or similar high-level functions.

I hope that clarifies it,
Stefan Roth
by sroth
10 May 2010 11:33
Forum: Statistisches Maschinelles Lernen
Topic: Matlab tasks
Replies: 4
Views: 1495

Re: Matlab tasks

For problem 2 the use of the Statistics Toolbox is fine, i.e. you may use mvnpdf (as also mentioned in the problem text).
For problem 4 you may also use mvnpdf and normpdf, but you should code the classifier and the parameter estimation procedure by hand.

Best regards,
Stefan Roth
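As a rough illustration of what "by hand" means here, a diagonal-Gaussian classifier with ML parameter estimates could be sketched as follows (a Python sketch; the actual assignment is in Matlab and all names below are my own):

```python
import numpy as np

def fit_gaussian_classifier(X, y):
    """Per-class ML estimates of mean, variance (diagonal), and prior."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0), len(Xc) / len(X))
    return params

def log_normpdf(x, mu, var):
    """Log of the univariate normal pdf, applied elementwise."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def predict(params, x):
    """Pick the class with the largest log posterior (up to a constant)."""
    scores = {c: log_normpdf(x, mu, var).sum() + np.log(prior)
              for c, (mu, var, prior) in params.items()}
    return max(scores, key=scores.get)
```

The point of the restriction above is that the estimation (mean, variance, prior) and the decision rule are written out explicitly rather than delegated to a ready-made classifier.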
by sroth
4 May 2010 17:52
Forum: Statistisches Maschinelles Lernen
Topic: Assignment 1 Problem 2
Replies: 3
Views: 590

Re: Assignment 1 Problem 2

You have two options (both are fine):
* If you plot them into separate coordinate systems (e.g. with subplot), then either p(x|C_1) or p(x|C_1) p(C_1) is fine, since the prior just rescales the function.
* If you plot both into the same coordinate system (i.e. with "hold"), you should use the join...
by sroth
30 Apr 2010 11:32
Forum: Statistisches Maschinelles Lernen
Topic: Assignment 1 Problem 2
Replies: 3
Views: 590

Re: Assignment 1 Problem 2

You have to do both parts: the questions in the bullet points (10 points each) belong to Part B, i.e. the bivariate case. Part A, on the other hand (20 points), is about univariate Gaussians; only the tasks described in the six lines of text starting at "Part A" are relevant for this problem. Hope ...
by sroth
26 Jun 2008 23:46
Forum: Statistisches Maschinelles Lernen
Topic: p3
Replies: 3
Views: 789

Re: p3

No, by marginal independence I mean independence of the type p(a,b) = p(a) * p(b). Basically plain independence, i.e. *without* conditioning on other variables.
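A tiny numerical illustration of marginal independence on a discrete joint (Python; the probability values are chosen arbitrarily for the example):

```python
import numpy as np

# Joint distribution p(a, b) over two binary variables; rows index a, columns b.
p_ab = np.array([[0.3, 0.3],
                 [0.2, 0.2]])
p_a = p_ab.sum(axis=1)   # marginal p(a)
p_b = p_ab.sum(axis=0)   # marginal p(b)
# Marginal independence holds iff p(a, b) == p(a) * p(b) for every (a, b).
independent = np.allclose(p_ab, np.outer(p_a, p_b))
```

Here the product of the marginals reproduces the joint exactly, so a and b are (marginally) independent; no conditioning on a third variable is involved.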
by sroth
26 Jun 2008 23:44
Forum: Statistisches Maschinelles Lernen
Topic: P1
Replies: 7
Views: 1132

Re: P1

You shouldn't use random xi's; rather, they should be determined by the optimization. So you need to make sure that they are part of the optimization (i.e. they need to be minimized over as well), and then you should see the setting of C change the result. Is that what you are doing now?
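For reference, in the soft-margin SVM the slack variables xi_i are decision variables of the quadratic program, minimized jointly with w and b (a generic statement of the objective, not copied from the assignment sheet):

```latex
\min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{N}\xi_i
\quad\text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1 - \xi_i,\qquad \xi_i \ge 0.
```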
by sroth
25 Jun 2008 16:57
Forum: Statistisches Maschinelles Lernen
Topic: P1
Replies: 7
Views: 1132

Re: P1

No, in problem 1 the value of b needs to come out of the quadratic optimization (remember that you optimize over w and b).

(Sorry for the late reply, but as you know I am away at a conference)
by sroth
13 Jun 2008 21:13
Forum: Statistisches Maschinelles Lernen
Topic: p3 ML
Replies: 4
Views: 843

Re: p3 ML

The EM algorithm that you need to implement is actually doing (approximate) ML estimation. If you look back at the slides for mixture models, you will notice that there is no closed form solution for exact ML estimation in mixture models. Hence we need to approximate somehow. The EM algorithm allows...
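To illustrate the approximation: a minimal EM loop for a 1-D Gaussian mixture alternates an E-step over responsibilities with closed-form M-step updates (a Python sketch with my own names; the assignment sheet gives the exact formulas to implement):

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100, mu_init=None, seed=0):
    """EM for a 1-D Gaussian mixture: approximate ML where no closed form exists."""
    rng = np.random.default_rng(seed)
    mu = (np.array(mu_init, dtype=float) if mu_init is not None
          else rng.choice(x, size=k, replace=False))  # initial means
    var = np.full(k, x.var())   # start with one broad variance per component
    pi = np.full(k, 1.0 / k)    # uniform mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[n, j] = p(component j | x_n)
        dens = (np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
                / np.sqrt(2 * np.pi * var))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form parameter updates given the responsibilities
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi
```

Each iteration increases (or leaves unchanged) the data likelihood, which is why EM serves as an approximate ML procedure for mixtures.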
by sroth
13 Jun 2008 14:40
Forum: Statistisches Maschinelles Lernen
Topic: p3 ML
Replies: 4
Views: 843

Re: p3 ML

Not really. The E-step of the EM algorithm looks somewhat similar, but otherwise no.

But just to make sure: All the formulas that you need to implement are given on the assignment sheet. You do not need to make any derivations.
by sroth
13 Jun 2008 14:37
Forum: Statistisches Maschinelles Lernen
Topic: p4
Replies: 1
Views: 575

Re: p4

You should give a written answer to the questions...
by sroth
5 Jun 2008 19:52
Forum: Statistisches Maschinelles Lernen
Topic: Ex 3 P1
Replies: 7
Views: 1094

Re: Ex 3 P1

I unfortunately can't say much more without giving away the solution.

One more hint: You should keep in mind that the Gaussian does not have to have zero mean.
by sroth
5 Jun 2008 16:16
Forum: Statistisches Maschinelles Lernen
Topic: Ex 3 P1
Replies: 7
Views: 1094

Re: Ex 3 P1

Yes, all terms that contain w will be part of the Gaussian. All terms that contain beta, but are not already part of the Gaussian will be part of the Gamma distribution.
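In symbols, this is the usual Normal-Gamma factorization (a sketch of the structure being described; m, S, a, b are generic placeholders, not the assignment's notation):

```latex
p(w, \beta \mid \mathcal{D}) \;\propto\;
\underbrace{\mathcal{N}\!\left(w \mid m,\ \beta^{-1} S\right)}_{\text{all terms containing } w}
\;\underbrace{\mathrm{Gam}(\beta \mid a, b)}_{\text{remaining terms in } \beta}.
```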

Go to advanced search