1.3c: Numerical Optimization

Moderator: Statistisches Maschinelles Lernen

Polona
Windoof-User
Posts: 40
Registered: 10 Oct 2010 10:45

1.3c: Numerical Optimization

Post by Polona » 18 May 2016 18:53

Hello,

I wrote a simple gradient descent algorithm for Rosenbrock's function. After 100 steps I get a local minimum at

Code:

[-0.9932861   0.99665107  0.99833032  0.99916774  0.9995852   0.99979328
  0.99989698  0.99994866  0.99997441  0.99998725  0.99999365  0.99999683
  0.99999842  0.99999921  0.9999996   0.99999978  0.99999986  0.99999987
  0.9999998   0.99999963]
How am I supposed to plot a learning curve? I don't change alpha. Or should I update alpha at each iteration?
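
Roughly, my setup looks like the following minimal sketch (plain gradient descent with a fixed step size on the generalized Rosenbrock function; the zero start, alpha = 1e-3 and the helper names are just illustrative here, not exactly my code):

Code:

import numpy as np

def rosenbrock(x):
    # Generalized Rosenbrock function, summed over consecutive coordinate pairs.
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    # Analytic gradient: each pair term contributes to x_i and to x_{i+1}.
    g = np.zeros_like(x)
    g[:-1] += -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def gradient_descent(x0, alpha=1e-3, steps=100):
    # Plain gradient descent with a fixed step size; records f(x_i) at every step.
    x = x0.copy()
    history = [rosenbrock(x)]
    for _ in range(steps):
        x = x - alpha * rosenbrock_grad(x)
        history.append(rosenbrock(x))
    return x, history

x_min, history = gradient_descent(np.zeros(20), alpha=1e-3, steps=100)
print(x_min)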

Regards,
Polona

M_UE
Neuling
Posts: 6
Registered: 11 May 2016 19:20

Re: 1.3c: Numerical Optimization

Post by M_UE » 26 Apr 2018 15:30

You should plot the value f(x_i), where x_i is the iterate at step i (for 10000 steps).
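
For example, something along these lines (assuming you keep a list of the f(x_i) values while iterating, as in the sketch above; the variable names are just a guess at your setup):

Code:

import matplotlib.pyplot as plt

# 'history' holds the recorded objective values f(x_0), f(x_1), ... from gradient descent.
plt.plot(history)
plt.yscale("log")          # log scale makes the slow tail of the curve visible
plt.xlabel("iteration i")
plt.ylabel("f(x_i)")
plt.title("Learning curve, fixed alpha")
plt.show()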

Polona
Windoof-User
Posts: 40
Registered: 10 Oct 2010 10:45

Re: 1.3c: Numerical Optimization

Post by Polona » 26 Apr 2018 15:33

Hi,

Well, I asked this question about two years ago. But hey, thanks! :D

Kind regards
Polona

M_UE
Neuling
Posts: 6
Registered: 11 May 2016 19:20

Re: 1.3c: Numerical Optimization

Post by M_UE » 26 Apr 2018 15:47

oh. ok.
