Weekly plans and update for week 39

Hi all and welcome back to a new week with FYS-STK3155/4155. Last week we started our discussion of logistic regression and classification problems, and tried to summarize linear regression and project 1.

This week we extend the logistic regression formalism to more than two classes; last week we limited our discussion to the binary classification problem. As we also saw last week, logistic regression leads to an optimization problem which, unlike Ridge or OLS regression, cannot be solved analytically. This brings us to a central theme (and the workhorses of all supervised learning algorithms), namely the family of gradient methods. We will devote some time to these and discuss both standard gradient descent and stochastic gradient descent methods.
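
As a small illustration of why we need iterative methods here, the following is a minimal sketch (not the course code) of multinomial, i.e. softmax, logistic regression trained with plain gradient descent. The synthetic data, the learning rate eta and the number of iterations are arbitrary choices made only for illustration.

import numpy as np

def softmax(z):
    # subtract the row-wise maximum for numerical stability
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, p, K = 300, 2, 3                 # samples, features, classes (illustrative)
X = rng.normal(size=(n, p))         # synthetic design matrix (no intercept, for brevity)
y = rng.integers(0, K, size=n)      # synthetic integer class labels
Y = np.eye(K)[y]                    # one-hot encoded labels

beta = np.zeros((p, K))             # model parameters, one column per class
eta = 0.1                           # learning rate (illustrative)

for _ in range(1000):
    P = softmax(X @ beta)           # predicted class probabilities
    grad = X.T @ (P - Y) / n        # gradient of the cross-entropy cost
    beta -= eta * grad              # plain gradient-descent update

print("Trained coefficients:\n", beta)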

The week is thus going to look like this:

Wednesday: Lab and project 1

Thursday: Finalizing logistic regression and starting our discussion of gradient descent, steepest descent and Newton-Raphson approaches

Friday: The family of stochastic gradient descent methods (a small sketch follows below)
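
To give a first flavour of Friday's topic, here is a minimal sketch of mini-batch stochastic gradient descent, applied for simplicity to an ordinary least-squares cost on synthetic data. The batch size, learning rate, number of epochs and the data set are illustrative assumptions, not values prescribed by the course material.

import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(0, 1, size=(n, 1))
y = 2.0 + 3.0 * x + 0.1 * rng.normal(size=(n, 1))   # noisy straight line
X = np.hstack([np.ones((n, 1)), x])                 # design matrix with intercept column

theta = np.zeros((2, 1))                            # parameters: intercept and slope
eta, batch_size, n_epochs = 0.1, 20, 100            # illustrative hyperparameters

for epoch in range(n_epochs):
    perm = rng.permutation(n)                       # reshuffle the data each epoch
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2.0 / len(idx) * Xb.T @ (Xb @ theta - yb)   # MSE gradient on the mini-batch
        theta -= eta * grad                         # stochastic gradient step

print("Estimated intercept and slope:", theta.ravel())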

We will continue with these topics next week as well, in connection with our start on neural networks.

The recommended reading for both days is "Aurelien Geron's chapter 4":"https://github.com/CompPhysics/MachineLearning/blob/master/doc/Textbooks/TensorflowML.pdf" and "Murphy sections 8.3 and 8.5":"https://github.com/CompPhysics/MachineLearning/blob/master/doc/Textbooks/MachineLearningMurphy.pdf"

There is also a video with the main aims of what will be covered at /studier/emner/matnat/fys/FYS-STK3155/h20/forelesningsvideoer/OverarchingAimsWeek39.mp4?vrtx=view-as-webpage

The slides for week 39 are at https://compphysics.github.io/MachineLearning/doc/pub/week39/html/week39-reveal.html

Best wishes to you all,

the fys-stk gang

Published 22 Sep. 2020 23:55 - Last modified 22 Sep. 2020 23:55