Messages

Published 11 Dec. 2020 16:09

Dear all, first, thanks to everybody for the heroic efforts with projects 2 (and 1). We are almost done with the feedback (we are missing 4-5 of you, but it should be there soon).

And the reports look really good, congratulations to you all. We are proud of you!

 

Since we have received several requests for an extended deadline (due to conflicts with other exams, etc.), we have extended the deadline to December 16 at midnight.

If you have already submitted, you can always update or change your report.

 

If needed, we can always organize a one-to-one zoom session.

 

Best wishes to you all,

Morten et al

Published 8 Dec. 2020 11:08

Dear all,

we hope you all are doing well and that project 3 is on track.

Tomorrow we will have a new digital lab (the last one this semester) from 10am to 12pm. The Zoom link is the same as the one we have used for our lectures; see the link below.

Best wishes to you all,

Morten et al.

 

/// Zoom link

Topic: Lectures FYS-ST3155/4155
Time: This is a recurring meeting. Meet anytime.

Join Zoom Meeting
https://msu.zoom.us/j/95069361786?pwd=OE8vRXFDWHNNTWJDZFFnZldHYnZTUT09

Meeting ID: 950 6936 1786
Passcode: 356225

Published 1 Dec. 2020 18:34

Hi all, hope you are doing well. Just a quick reminder that we are running a question and answer digital lab tomorrow from 10am to 12pm.

The Zoom link is the same as the one we used for the lectures.

 

Topic: Lectures FYS-ST3155/4155
Time: This is a recurring meeting. Meet anytime.

Join Zoom Meeting
https://msu.zoom.us/j/95069361786?pwd=OE8vRXFDWHNNTWJDZFFnZldHYnZTUT09

Meeting ID: 950 6936 1786
Passcode: 356225

 

Best wishes to you all and good luck with project 3. 

Morten et al,

PS: we will run a similar Q&A lab next Wednesday as well, same time and same Zoom link.

Published 24 Nov. 2020 08:40

Hi all! Sadly, this is our last week, and the semester is coming to an end. Hopefully you are all doing well and have been able to start working on the last project.

This Wednesday is our last session with in-person labs; we will also keep the digital lab open from 8am to 6pm. In addition, we will offer digital labs on Wednesday December 2 from 10am-12pm and Wednesday December 9 from 10am-12pm. The Zoom link will be mailed to you later.

 

Else, on Thursday we wrap up our discussion of support vector machines, recapping the linear classification approach discussed last week and venturing into kernels for non-linear problems and regression, with examples. Finally, we summarize, with perspectives for future ML studies and a discuss...

Published 19 Nov. 2020 05:30

Dear all, on Friday many of you will give presentations about possible data sets for project 3. There are many exciting topics.

Here are the various projects that will be presented during the first lecture (and possibly parts of the second lecture as well). All titles are tentative. Talks are approximately 5-10 minutes, with roughly 5 minutes for discussion. The schedule is flexible, and we hope you will enjoy the broad creativity expressed in these suggestions. We love them!

 

 

* Maria Emine Nylund: Lego Bricks Classifier

* Fabio Rodrigues Pereira: Financial Machine Learning

* Markus Borud Pettersen: Machine Learning and Brain Grid Cells

* Jing Sun and Endrias Getachew Asgedom: Machine learning-based approaches to denoising microseismic data

* Felicia...

Published 18 Nov. 2020 05:17

Dear all, we hope you are all doing well, and thanks again to everybody for the great efforts on project 2.

We hope to be able to send feedback in approximately two weeks, in due time before the final project 3.

This week and next week we still have in-person labs, but we will keep the digital labs open throughout the day. Next week also marks the end of the regular teaching semester.

The Zoom link for the digital lab is the same as the one for the labs from 8-10am and 4pm-6pm today; see

https://uio.zoom.us/j/8226009630

 

Else, this Friday (first lecture) we will have presentations from several of you about possible variants of project 3. Please feel free to submit a proposal and also make a small...

Published 13 Nov. 2020 05:18

Dear All, since most of you are stressed with the coming deadline, I would like to propose that we postpone today's lecture on support vector machines to next Thursday. Thus, the lecture of Friday November 13 is cancelled.  

 

We will reconvene next Thursday (November 19) and cover support vector machines (today's topic).

 

Project 3 is also available, see https://compphysics.github.io/MachineLearning/doc/Projects/2020/Project3/html/Project3-bs.html

 

In project 3, we encourage you to select data sets which may be closer to your own research and/or interests. The project also has a suggested topic on solving differential equations using deep learning methods like neural networks.

If you have some specific data set...

Published 12 Nov. 2020 07:07

Published 9 Nov. 2020 23:32

Dear All, we hope you are all doing well and staying healthy. 

First, and perhaps most importantly, we are still allowed to have the in-person labs. We will let you know asap if this changes.

 

Else, last week we discussed ensemble methods like boosting, bagging, voting and random forests. 

On Thursday we will wrap up our discussion of gradient boosting with further examples during the first lecture. For the second lecture on Thursday (1:15pm-2pm) we will have a guest lecturer, John Aiken, who is an expert on applications of gradient boosting and xgboost to the social sciences. John recently defended his thesis, whose primary focus was on boosting methods applied to data from the social sciences.

 

On Friday we start with our second-to-last topic: Support Vector Machines for classifi...

Published 4 Nov. 2020 12:12

Dear all, we hope you all are doing well and are staying healthy.

Here comes the weekly FYS-STK3155/4155 update. Our labs on Wednesdays will continue as normal: in-person labs and two digital labs. Please feel free to attend the labs.

You should now have received feedback for project 1, but let us know if we have forgotten somebody or something seems wrong.

Else, concerning project 2, several of you have asked for a small extension. We will update you before the weekend. Please stay tuned. 

 

Besides the labs, this Thursday we will wrap up what we did last week, finalizing our discussion of bagging and voting before moving to random forests and boosting methods.

Friday is dedicated to boosting and gradient boosting.
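The bagging idea mentioned above (bootstrap samples plus averaging) can be sketched in a few lines of numpy. This is only an illustration with a made-up toy data set and cubic-polynomial weak learners, not part of the course material:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
x = rng.uniform(-3, 3, n)
y = np.sin(x) + 0.3 * rng.normal(size=n)   # noisy toy regression data

# Weak learner: a cubic polynomial fitted by least squares
n_estimators = 50
models = []
for _ in range(n_estimators):
    idx = rng.integers(0, n, n)            # bootstrap sample (with replacement)
    models.append(np.polyfit(x[idx], y[idx], 3))

x_test = np.linspace(-3, 3, 101)
# Bagged prediction: average the predictions of all bootstrap models
preds = np.mean([np.polyval(m, x_test) for m in models], axis=0)
```

Averaging over bootstrap fits reduces the variance of the prediction compared with a single fit; random forests add feature subsampling on top of this idea.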

As reading background we recommend

Ge...

Published 4 Nov. 2020 12:11

Dear all,  since many of you have asked for a possible extension, we decided that the new deadline (and this is the final one) is Friday (midnight) November 13 instead of Monday November 9. We hope this helps in improving project 2.

Else, except for a few glitches, we believe everybody should by now have received the feedback for project 1.

The deadline for project 3 has been changed by a week and is now set to December 14.   

 

Best wishes to you all,

Kristian, Michael, Morten, Nicolai, Per-Dimitri, Stian and Øyvind

Published 27 Oct. 2020 17:46

Hi all and welcome to a new week. We hope all is well.  The most important message this week (until new information on Thursday at 12pm) is that we will have our normal in-person labs on Wednesday. This may change after this coming Thursday but we hope to be able to keep the in-person labs. These are essential to the course. 

Else, we have now (almost) finalized our evaluations of project 1, and the feedback and final scores will be posted by the end of this week.

 

Last week we discussed how to solve ordinary differential and partial differential equations using deep learning and we presented a widely used unsupervised learning algorithm, namely the Principal Component analysis (PCA). We will wrap up the PCA material during the first lecture on Thursday and then start with decision trees, bagging, boosting and random forests. This will keep us busy for the nex...
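The PCA discussion above can be illustrated with a short numpy sketch. The toy data and variable names are our own, and the SVD of the centered design matrix is just one standard route to the principal components:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))          # toy data: 100 samples, 3 features
X = X - X.mean(axis=0)                 # center each feature

# SVD of the centered design matrix gives the principal directions
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Explained variance of each component
explained_var = S**2 / (X.shape[0] - 1)
ratio = explained_var / explained_var.sum()

# Project onto the first two principal components
Z = X @ Vt[:2].T                       # shape (100, 2)
```

The rows of `Vt` are the principal directions, sorted by decreasing singular value, so keeping the first few rows gives the dimensionality reduction.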

Published 20 Oct. 2020 16:42

Dear all, some good news: this Wednesday we are back to in-person labs. Please do use them; it is a unique opportunity in these odd times to discuss projects and many other topics related to FYS-STK, machine learning in general, and more. We look forward to seeing you in real life. For those who cannot attend, we will run our digital labs as usual according to schedule.

Good news thus.

 

Else, on Thursday we will wrap up the recurrent NN discussion with examples using TensorFlow and Keras.

Thereafter we will look at an interesting application of deep learning and neural networks, namely solving ordinary differential equations and partial differential equations. The slides for week 43 are not yet ready; hopefully they will be ready by tomorrow (I have a bug in one of the codes, yes!).

On Friday the hope is to discuss the partial compone...

Published 13 Oct. 2020 22:45

Dear all, we hope you are all doing well. We have had an unfortunate outbreak of covid-19 due to a party at a nearby dorm. The consequence for us is that we have to run the Wednesday lab session online only.

The zoom link is the same as the one for the two digital labs, that is 

Zoom link

Join Zoom Meeting

https://uio.zoom.us/j/8226009630

 

Next week we will most likely return to  normal in-person sessions (except for the two digital labs).

 

Else, last week we finalized our discussion on feed forward neural networks and how to write our own code (highly relevant for project 2). We also discussed how to use the APIs (application programming interfaces) Keras and TensorFlow in order to set up a feed forward neural network. We ended with a motivation for why convolutional NNs...
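As a minimal illustration of the feed-forward step discussed above (writing your own code rather than using Keras/TensorFlow), here is a numpy sketch of one hidden layer with sigmoid activations; all sizes and names are made up for the example:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))            # 5 samples, 4 input features

# One hidden layer with 3 nodes and a single output node
W1 = rng.normal(size=(4, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)

a1 = sigmoid(X @ W1 + b1)              # hidden-layer activations
out = sigmoid(a1 @ W2 + b2)            # network output, each value in (0, 1)
```

Training such a network then consists of propagating the error backwards through these same matrix products (backpropagation), which is what the project 2 code does.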

Published 7 Oct. 2020 07:44

Dear all, we hope all is fine with project 1 close to the deadline. 

Project 2 is now ready if you would like to get started. Just go to the GitHub address of the course, see for example https://compphysics.github.io/MachineLearning/doc/web/course.html and scroll down to project 2.

If you spot typos, errors, inconsistencies, etc., please let us know asap. The deadline is set to one month from now, November 9 (Monday).

 

Concerning project writing, we remind you that we posted a video early September on how to write projects, see ...

Published 2 Oct. 2020 11:23

Dear all, since many of you have asked for an extension, we have decided to extend the deadline for project 1 to next Saturday. The new deadline is Saturday October 10 (midnight). You can upload a pdf file of the report (with links to your GitHub repo) or just upload the link to your GitHub/GitLab folder. There is a time stamp on GitHub, and that serves as proof of delivery time.

 

Your folder should typically contain three things:

1) the report as a pdf file (we can then annotate feedback on the pdf file)

2) a folder with your codes/jupyter notebook

3) a folder with test runs of your codes. This allows us to benchmark your code against those results.

 

You should also have a  README file which explains where we can find the various files. 
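The three-part layout described above could, for example, look like this (all file and folder names are hypothetical):

```
Project1/
├── README.md          <- explains where the various files are
├── report/
│   └── report.pdf     <- 1) the report (feedback can be annotated on the pdf)
├── code/
│   └── project1.ipynb <- 2) your codes / Jupyter notebook
└── runs/
    └── benchmarks.txt <- 3) test runs used to benchmark your code
```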

 

Hope this helps and best wishes to you all.

 

Morten et al.

Published 30 Sep. 2020 07:42

Hello everybody,

we hope you are all doing fine. What comes here is our weekly summary with plans for this week's lectures and lab.

The lab session is pretty obvious: we keep working on project 1.

Last week we discussed logistic regression and gradient methods, and started to look at stochastic gradient descent and its family of methods.

Although it does not sound like a very exciting topic (and far away from all the interesting data we can analyze with ML methods), finding the minima of the cost/risk function is the true bottleneck of all machine learning methods. For our 'simpler' methods, linear regression and logistic regression, we end up with clearly convex cost functions, and the search for an optimal minimum is facilitated by that...
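To make the convexity point concrete: for linear regression the MSE cost is convex, so plain gradient descent converges to the OLS minimum. A minimal sketch with toy data; the learning rate and iteration count are chosen only for this example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 1, n)
X = np.c_[np.ones(n), x]                   # design matrix with intercept
y = 2.0 + 3.0 * x + 0.1 * rng.normal(size=n)

beta = np.zeros(2)
eta = 0.1                                  # learning rate
for _ in range(5000):
    grad = (2.0 / n) * X.T @ (X @ beta - y)   # gradient of the MSE cost
    beta -= eta * grad

beta_ols = np.linalg.pinv(X) @ y           # closed-form OLS for comparison
```

Because the cost surface is a convex paraboloid, the iterates approach `beta_ols` for any sufficiently small learning rate; stochastic gradient descent replaces the full gradient with minibatch estimates of it.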

Published 23 Sep. 2020 09:42

Good morning everybody,

here's a small note on cross-validation (CV) and project 1. We discussed this during the lecture on Thursday September 17.

What we try to summarize here is a possible approach to the CV tasks in project 1.

 

1) It is sufficient to perform what many call the standard CV approach in order to obtain the estimated MSE test error (on the test data).

This means, with say five folds, splitting your data into five folds, training on four of them and testing on the remaining one. This is repeated for all possible choices of the test fold (five here). The final MSE is then given by the average of these five contributions.

This is what is often done. And it would be sufficient in project 1. 
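The standard five-fold procedure described above can be sketched as follows; the toy regression data and the OLS fit are only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2020)
n = 100
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=n)
X = np.c_[np.ones(n), x]                   # design matrix with intercept

k = 5
indices = rng.permutation(n)               # shuffle before splitting
folds = np.array_split(indices, k)

mse_scores = []
for i in range(k):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    beta = np.linalg.pinv(X[train_idx]) @ y[train_idx]   # OLS fit on 4 folds
    y_pred = X[test_idx] @ beta
    mse_scores.append(np.mean((y[test_idx] - y_pred) ** 2))

cv_mse = np.mean(mse_scores)               # estimated test MSE, averaged over folds
```

Each data point is used exactly once for testing, and the average of the five fold-wise MSEs is the cross-validated estimate of the test error.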

Many people have, however, criticized this approach; see for example ...

Published 22 Sep. 2020 23:56

Hi all and welcome back to a new week with FYS-STK3155/4155. Last week we started our discussion of logistic regression and classification problems, as well as trying to summarize linear regression and project 1. 

This week we extend the logistic regression formalism to include more classes. We limited our discussion last week to a binary classification problem. As we also saw last week, logistic regression leads to an optimization problem which cannot be solved analytically, unlike what we obtained for Ridge or OLS regression. This leads us to a central theme (and the workhorses of all supervised learning algorithms), namely the family of gradient methods. We will devote some time to this and discuss both standard gradient descent and stochastic gradient methods.
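Since the logistic regression optimization problem has no closed-form solution, the cross-entropy cost is minimized numerically. A minimal gradient-descent sketch on a made-up two-blob binary problem (all data and parameter choices are for illustration only):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(10)
n = 200
# Two Gaussian blobs as a toy binary classification problem
X = np.vstack([rng.normal(-1, 1, (n // 2, 2)),
               rng.normal(1, 1, (n // 2, 2))])
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]
X = np.c_[np.ones(n), X]                   # add intercept column

beta = np.zeros(3)
eta = 0.1
for _ in range(2000):
    p = sigmoid(X @ beta)                  # predicted probabilities
    grad = X.T @ (p - y) / n               # gradient of the cross-entropy cost
    beta -= eta * grad

accuracy = np.mean((sigmoid(X @ beta) > 0.5) == y)
```

The same gradient expression, with the sigmoid replaced by the softmax, carries over to the multiclass case discussed this week.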

The we...


Published 16 Sep. 2020 17:09

Hi all, here comes our weekly mailing (late this week, sorry for this) with updates and summaries. 

 

Last week we discussed and derived the expressions for shrinkage methods like Ridge and Lasso, and we made an analysis in terms of the singular value decomposition of a matrix and a Bayesian approach.

We also showed that the matrix $X^T X$ is proportional to the covariance matrix of $X$. This means that, for example, Ridge regression shrinks most those singular values which are smallest (or those directions which have the smallest covariance). The consequence, since the variance of $\beta$ is proportional to the inverse of this matrix, is that Ridge regression shrinks 'away' those $\beta$ values that have a large variance.

Lasso does a similar job. 
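The shrinkage statement above can be checked directly: writing the Ridge solution in terms of the SVD of $X$ shows each singular direction scaled by $\sigma_j^2/(\sigma_j^2 + \lambda)$, so the smallest singular values are shrunk the most. A numpy sketch with toy data (the data and $\lambda$ are ours, only for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 3))
X = X - X.mean(axis=0)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

U, S, Vt = np.linalg.svd(X, full_matrices=False)
lam = 1.0

# Ridge solution via the SVD: beta = V diag(s_j / (s_j^2 + lam)) U^T y
beta_ridge = Vt.T @ np.diag(S / (S**2 + lam)) @ U.T @ y

# Each singular direction is scaled by s_j^2 / (s_j^2 + lam);
# the smallest singular values are shrunk the most
shrinkage = S**2 / (S**2 + lam)

# Check against the standard normal-equation form
beta_check = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
```

Since `S` is returned in decreasing order, the shrinkage factors decrease along the array: the direction with the smallest singular value (largest variance of $\beta$) is suppressed hardest.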

This wee...

Published 8 Sep. 2020 11:10

Hi everybody and welcome back.

What follows is our weekly digest with plans as well.

First, I wanted to let you know that I am in the process of reorganizing the lecture material. As of now it looks more like chapters of a book. I will reorganize it as both a book and weekly lecture slides. Hopefully this will make it easier for you to read the material beforehand. As of now the material is subdivided into topics, and each file can be pretty long and difficult to navigate. I plan to have a first version of the lecture book by the end of this week. It will be updated continuously, and if you spot typos, inconsistencies, etc., it will be extremely helpful if you could convey them to us. Stay tuned, thus.

 

Else, this week we will end our discussion of Regression methods and discuss the mathematics and coding of Ridge and Lasso. This is co...

Published 1 Sep. 2020 22:59

Hi all, we hope the week started the best possible way. Here comes our weekly summary from last week and this week's plans (week 36).

Last week we ended with a discussion on statistics and probability theory. The lecture notes at https://compphysics.github.io/MachineLearning/doc/pub/Statistics/html/._Statistics-bs000.html give you some of the elements that were discussed; similarly, chapters 2 and 7.1 of Murphy's text are also good reads, see ...

Published 26 Aug. 2020 10:16

We have had some technical issues with this course, but this is hopefully fixed and you should now be able to change groups in StudentWeb, given available places in the group you want.

There are also about five places available in the course.


Regards
Espen Murtnes
Student administration

Published 25 Aug. 2020 17:56

Dear all, here follows an update on the lab sessions for our first lab day, Wednesday August 25.

For the digital labs at 8:15-10am (lab 2) and 4:15-6pm (lab 7), the Zoom link is

Zoom link

Join Zoom Meeting

https://uio.zoom.us/j/8226009630

 

Many of you have asked the study admin of the Department of Physics (write to studieadm@fys.uio.no) for changes in the lab schedule. And some of you have not been assigned a lab session yet. We hope this can be solved by the end of this week.

 

We are also planning to: 

1) keep lab 8 open so you can stay in room FØ434 till 6pm; this will be in person.

2) offer an additional digital lab nr 9. This needs to be approved by the studieadmin.

 

Irrespective of an additional lab, we keep lab 8 open; with social distancing the room has a capacity of 20...