3.1.b Gaussian Features -> sequential Bayesian learning?

Moderator: Statistisches Maschinelles Lernen

ichitaka
Newbie
Posts: 3
Registered: 14 Jun 2018 18:48

3.1.b Gaussian Features -> sequential Bayesian learning?

Post by ichitaka »

The homework uses terminology that I cannot find in any of the literature (searching for "Gaussian features" in the Bishop book or on the web returns no useful results), so I hope someone here can help me.

By "features" I suppose we mean weights. Then the mean of every weight is in a linear relationship with all the other weights, which makes sense, since the sum of all features at every point x is supposed to be 1.

Are we supposed to plot the design matrix of each feature over time? Where does the time component come from? Are we supposed to do some kind of sequential learning, i.e. train our model on the data? But since our input is one-dimensional and the features are supposed to sum to 1 at every x, wouldn't that mean we just project each data point onto itself in the end? Why isn't the terminology based on the common literature that is actually used in the lecture? This is unnecessarily confusing for a 4-point task.
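For reference, here is what I currently assume the task means: normalized Gaussian basis functions phi_j(x) whose values sum to 1 at every input x, stacked row-wise into a design matrix. This is only a sketch of my interpretation; the means and the bandwidth s are my own choices, not from the exercise sheet.

```python
import numpy as np

def gaussian_features(x, means, s=0.1):
    """Normalized Gaussian basis functions evaluated at the inputs x.

    Returns the design matrix Phi with shape (len(x), len(means)),
    where row n holds phi(x_n) and each row sums to 1.
    """
    # Unnormalized Gaussian bumps, one column per basis-function mean.
    raw = np.exp(-(x[:, None] - means[None, :]) ** 2 / (2 * s ** 2))
    # Normalize each row so the features sum to 1 at every point x.
    return raw / raw.sum(axis=1, keepdims=True)

x = np.linspace(0, 1, 50)        # 50 one-dimensional inputs
means = np.linspace(0, 1, 10)    # 10 basis-function centers (assumed)
Phi = gaussian_features(x, means)

print(Phi.shape)                          # (50, 10)
print(np.allclose(Phi.sum(axis=1), 1.0))  # True: rows sum to 1
```

Note that the sum-to-one property holds for the feature values, not for the weights, so fitting the model does not reduce to projecting each data point onto itself.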

MFB
First-year
Posts: 21
Registered: 3 Oct 2013 17:10

Re: 3.1.b Gaussian Features -> sequential Bayesian learning?

Post by MFB »

On slide 5 of the linear regression lecture, features are defined as phi(x).

You can download a free ebook on Gaussian processes for machine learning here:

http://www.gaussianprocess.org/gpml/
