Weekly update for week 38

FYS-STK3155/4155 weekly update

Hi all,

here follows the weekly digest for FYS-STK3155/4155.

Last week we ended (for now) our discussion of linear regression and resampling techniques. Most of this material is covered by chapters 3 and 7 of the text by Hastie et al. and the lecture slides on regression. Project 1 addresses many of these aspects, in particular resampling techniques like the bootstrap and cross-validation, as well as an analysis of the bias and the variance as discussed in chapter 7.3 of Hastie et al., https://www.springer.com/gp/book/9780387848570
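As a small illustration (this is just a sketch, not code from the course material), the bootstrap simply resamples the data with replacement and re-evaluates a statistic on each resample; the spread of the resulting bootstrap distribution then estimates the uncertainty of that statistic:

```python
import numpy as np

rng = np.random.default_rng(2018)
data = rng.normal(loc=2.0, scale=1.0, size=200)

def bootstrap(data, statistic, n_boot=1000, rng=rng):
    """Resample with replacement and evaluate the statistic each time."""
    n = len(data)
    return np.array([statistic(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])

boot_means = bootstrap(data, np.mean)
# The variance of the bootstrap distribution estimates the variance of
# the sample mean, which should be close to sigma^2 / n = 1/200 here.
print(boot_means.var())
```

Replacing np.mean with, say, a fitted model's test error gives the bootstrap estimates you need for the bias-variance analysis in project 1.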

 

This week we will continue our discussion of logistic regression for classification problems; see chapter 4 (and section 4.4 in particular) of Hastie et al. This leads us to an important topic, since we cannot derive analytical expressions for the parameters β as we could in linear regression (ordinary least squares and Ridge regression). We will instead need to evaluate the derivatives of the cost function numerically, using for example gradient descent methods. This material is covered by the slides at https://compphysics.github.io/MachineLearning/doc/pub/Splines/html/Splines-bs.html
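To give a flavor of what is coming (a minimal sketch, not the lecture-note implementation): for logistic regression the gradient of the cross-entropy cost has no closed-form zero, so we iterate β ← β - η ∇C(β), where η is the learning rate:

```python
import numpy as np

def sigmoid(z):
    # logistic function mapping linear scores to probabilities in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, eta=0.1, n_iter=1000):
    """Minimize the cross-entropy cost for logistic regression.

    The gradient of the cost is X^T (p - y) / n, so we simply step
    against it; no analytical solution for beta exists here.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        probs = sigmoid(X @ beta)
        grad = X.T @ (probs - y) / n
        beta -= eta * grad
    return beta

# tiny separable example: intercept column plus one feature
X = np.c_[np.ones(4), np.array([-2.0, -1.0, 1.0, 2.0])]
y = np.array([0, 0, 1, 1])
beta = gradient_descent(X, y)
```

The same loop structure, with a different cost function and gradient, will reappear throughout the rest of the semester.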

(note: the material will be updated today and tomorrow).

 

In most of our machine learning algorithms from now on, we will have to resort to numerical evaluation of the derivatives of the cost function. Gradient methods will follow us for the rest of the semester.

This also applies if you wish to write your own Lasso code for project 1. There, however, you should feel free to use the scikit-learn functionality.
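For those going the scikit-learn route, a minimal usage sketch (the data here is made up for illustration; alpha is scikit-learn's name for the regularization strength λ):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2018)
n = 100
X = rng.standard_normal((n, 5))
# only the first two features actually matter
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

# fit Lasso; the L1 penalty drives irrelevant coefficients toward zero
lasso = Lasso(alpha=0.1)
lasso.fit(X, y)
print(lasso.coef_)
```

Comparing lasso.coef_ for a range of alpha values against your own OLS and Ridge results is a natural check for the project.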

 

At the lab we will work on project 1; however, at the beginning of each lab session we will briefly discuss how to structure and write the report. See again how we evaluate the projects at https://github.com/CompPhysics/MachineLearning/tree/master/doc/Projects/2018/HowWeEvaluateProjects


Best wishes to you all,

Bendik, Kristine, Morten and Øyvind

Published Sep. 18, 2018 09:06 - Last modified Sep. 18, 2018 09:06