Regularization Methods for Machine Learning 2016
RegML

regularization_methods_for_machine_learning_2016/ (16 files)
- Class 1 - Statistical Learning Theory-E1bIqR8Bqr0.mkv (816.82 MB)
- Class 2 - Tikhonov regularization and kernels-xevu1vRdX6w.mp4 (576.41 MB)
- Class 3 - Early Stopping and Spectral Regularization-D4C0GfbV4kE.mkv (508.79 MB)
- Class 4 - Regularization for multi-task learning-FX9wIzyhGSA.mp4 (392.28 MB)
- Class 5 - Sparsity based regularization-ItfAmoSZRJs.mp4 (647.77 MB)
- Class 6 - Structured sparsity-uusYjnAhH98.mkv (659.39 MB)
- Class 7 - Dictionary learning-mqDzeVsiyig.mkv (488.30 MB)
- Class 8 - Deep learning-vRJllrpBao0.mp4 (484.25 MB)
- lectures/lec3.pdf (244.75 kB)
- lectures/lec4.pdf (5.91 MB)
- lectures/lec5.pdf (430.89 kB)
- lectures/lec6.pdf (238.18 kB)
- lectures/lec7.pdf (1.36 MB)
- lectures/lec8.pdf (4.60 MB)
- lectures/lect1.pdf (1.80 MB)
- lectures/lect2.pdf (522.21 kB)
Type: Course
Tags:

Bibtex:
@article{regml2016,
title= {Regularization Methods for Machine Learning 2016},
keywords= {},
journal= {},
author= {RegML},
year= {},
url= {http://lcsl.mit.edu/courses/regml/regml2016/},
license= {},
abstract= {Understanding how intelligence works and how it can be emulated in machines is an age-old dream and arguably one of the biggest challenges in modern science. Learning, with its principles and computational implementations, is at the very core of this endeavor. Recently, for the first time, we have been able to develop artificial intelligence systems that solve complex tasks considered out of reach for decades. Modern cameras recognize faces, smartphones respond to voice commands, cars can see and detect pedestrians, and ATMs automatically read checks. In most cases, at the root of these success stories are machine learning algorithms, that is, software that is trained rather than programmed to solve a task. Among the variety of approaches to modern computational learning, we focus on regularization techniques, which are key to high-dimensional learning. Regularization methods make it possible to treat a huge class of diverse approaches in a unified way, while providing tools to design new ones. Starting from classical notions of smoothness, shrinkage and margin, the course will cover state-of-the-art techniques based on the concepts of geometry (aka manifold learning) and sparsity, and a variety of algorithms for supervised learning, feature selection, structured prediction, multitask learning and model selection. Practical applications for high-dimensional problems, in particular in computational vision, will be discussed. The classes will focus on algorithmic and methodological aspects, while trying to give an idea of the theoretical underpinnings. Practical laboratory sessions will give the opportunity to gain hands-on experience.


RegML is a 20-hour advanced machine learning course including theory classes and practical laboratory sessions. The course covers foundations as well as recent advances in Machine Learning, with emphasis on high-dimensional data and a core set of techniques, namely regularization methods. In many respects, the course is a compressed version of the 9.520 course at MIT.

| CLASS | DAY      | TIME          | SUBJECT                                                                                              | FILES  |
|-------|----------|---------------|------------------------------------------------------------------------------------------------------|--------|
| 1     | Mon 6/27 | 9:30 - 11:00  | Introduction to Statistical Machine Learning                                                         | Lect_1 |
| 2     | Mon 6/27 | 11:30 - 13:00 | Tikhonov Regularization and Kernels                                                                  | Lect_2 |
| 3     | Mon 6/27 | 14:00 - 16:00 | Laboratory 1: Binary classification and model selection                                              | Lab 1  |
| 4     | Tue 6/28 | 9:30 - 11:00  | Early Stopping and Spectral Regularization                                                           | Lect_3 |
| 5     | Tue 6/28 | 11:30 - 13:00 | Regularization for Multi-task Learning                                                               | Lect_4 |
| 6     | Tue 6/28 | 14:00 - 16:00 | Laboratory 2: Spectral filters and multi-class classification                                        | Lab 2  |
| -     | Wed 6/29 | 9:30 - 10:00  | Workshop: Federico Girosi - Health Analytics and Machine Learning                                    |        |
| -     | Wed 6/29 | 10:00 - 10:30 | Workshop: Massimiliano Pontil - A Class of Regularizers based on Optimal Interpolation               |        |
| -     | Wed 6/29 | 10:30 - 11:00 | Workshop: Gadi Geiger - Visual and Auditory Aspects of Perception in Developmental Dyslexia          |        |
| -     | Wed 6/29 | 11:00 - 11:30 | Coffee Break                                                                                         |        |
| -     | Wed 6/29 | 11:30 - 12:00 | Workshop: Alessandro Verri - Extracting Biomedical Knowledge through Regularized Learning Techniques |        |
| -     | Wed 6/29 | 12:00 - 12:30 | Workshop: Thomas Vetter - Learning the Appearance of Faces: Probabilistic Morphable Models           |        |
| -     | Wed 6/29 | Afternoon     | Free                                                                                                 |        |
| 7     | Thu 6/30 | 9:30 - 11:00  | Sparsity Based Regularization                                                                        | Lect_5 |
| 8     | Thu 6/30 | 11:30 - 13:00 | Structured Sparsity                                                                                  | Lect_6 |
| 9     | Thu 6/30 | 14:00 - 16:00 | Laboratory 3: Sparsity-based learning                                                                | Lab 3  |
| 10    | Fri 7/1  | 9:30 - 11:00  | Data Representation: Dictionary Learning                                                             | Lect_7 |
| 11    | Fri 7/1  | 11:30 - 13:00 | Data Representation: Deep Learning                                                                   | Lect_8 |},
superseded= {},
terms= {}
}
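The course's central technique, as named in the abstract and in Class 2, is Tikhonov regularization (ridge regression in the least-squares case). As a quick orientation only, here is a minimal, hypothetical Python/NumPy sketch of the closed-form Tikhonov solution; it is not the course's lab code and is not part of the files listed above.

```python
# Minimal sketch of Tikhonov (ridge) regularization for least squares:
#   w = argmin_w ||X w - y||^2 + lam * ||w||^2
# This is an illustrative example, not material from the RegML labs.
import numpy as np

def tikhonov_ls(X, y, lam):
    """Closed-form solution of (X^T X + lam I) w = X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy usage: noisy linear data, moderate regularization strength.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.1 * rng.standard_normal(100)

w_hat = tikhonov_ls(X, y, lam=1.0)
print("estimation error:", np.linalg.norm(w_hat - w_true))
```

The regularization parameter `lam` trades data fit against the size of the solution; choosing it by validation is the kind of model selection covered in Laboratory 1.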

