Info hash | 8b47f45382645882a23e0f8d9d9fbb764b3eb378 |
Last mirror activity | 5d,00:11:24 ago |
Size | 41.06GB (41,055,413,520 bytes) |
Added | 2016-09-09 03:55:15 |
Views | 1949 |
Hits | 5899 |
ID | 3375 |
Type | multi |
Downloaded | 2112 time(s) |
Uploaded by | joecohen |
Folder | mit_9.520_statistical_learning_theory_fall_2015 |
Num files | 26 files |
Mirrors | 14 complete, 0 downloading = 14 mirror(s) total |
mit_9.520_statistical_learning_theory_fall_2015 (26 files)
9.520 - 09_09_2015 - Class 01 - Prof. Tomaso Poggio - The Course at a Glance-6AWZS4Ho2Z8.mp4 | 1.90GB |
9.520 - 09_14_2015 - Class 02 - Prof. Tomaso Poggio - The Learning Problem and Regularization-iO919hIhO-w.mp4 | 873.84MB |
9.520 - 09_16_2015 - Class 03 - Carlo Ciliberto & Charlie Frogner - Math Camp-DTVRj8LJjZo.mp4 | 1.55GB |
9.520 - 10_05_2015 - Class 08 - Prof. Lorenzo Rosasco - Regularized Least Squares-uy4fM16pKPo.mkv | 1.79GB |
9.520 - 10_07_2015 - Class 09 - Prof. Lorenzo Rosasco - Iterative Regularization via Early Stopping-0cBWfVsxjUE.mkv | 1.71GB |
9.520 - 10_13_2015 - Class 10 - Prof. Lorenzo Rosasco - Sparsity Based Regularization-mYJilxJZ7EI.mkv | 1.49GB |
9.520 - 10_14_2015 - Class 11 - Prof. Lorenzo Rosasco - Proximal Methods-9PRqrZLQktA.mkv | 1.68GB |
9.520 - 10_19_2015 - Class 12 - Prof. Lorenzo Rosasco - Structured Sparsity Regularization-c0rQyTkd13I.mkv | 1.75GB |
9.520 - 10_21_2015 - Class 13 - Prof. Lorenzo Rosasco - Multiple Kernel Learning-QdXqMUVr4Xo.mkv | 1.64GB |
9.520 - 10_26_2015 - Class 14 - Charlie Frogner - Generalization Bounds, Intro to Stability-25D8BfFYk7Y.mkv | 1.17GB |
9.520 - 10_28_2015 - Class 15 - Charlie Frogner - Stability of Tikhonov Regularization-o7WpkR21N2o.mkv | 1.32GB |
9.520 - 11_16_2015 - Class 19 - Prof. Lorenzo Rosasco - Regularization for Multi-Output Learning I-SXcKHyz6Rhs.mkv | 1.75GB |
9.520 - 11_18_2015 - Class 20 - Carlo Ciliberto - Regularization for Multi-Output Learning II-0HSxR_REzV0.mkv | 1.86GB |
9.520 - 11_23_2015 - Class 21 - Prof. Lorenzo Rosasco - Learning Data Representation - from Fourier...-Ai9B9rKzY1Y.mkv | 1.60GB |
9.520 - 11_25_2015 - Class 22 - Prof. Lorenzo Rosasco - Learning Data Representation - Autoencoders...-mO3HJhEyR8c.mkv | 1.78GB |
9.520 - 11_25_2015 - Class 23 - Prof. Lorenzo Rosasco - Learning Data Representation...-VfXAi3nasIE.mkv | 1.83GB |
9.520 - 11_2_2015 - Class 16 - Prof. Lorenzo Rosasco - Consistency, Learnability and Regularization-vSqwlC5prtw.mkv | 1.71GB |
9.520 - 11_4_2015 - Class 17 - Prof. Lorenzo Rosasco - On-line Learning-P11YKe83oOY.mkv | 1.74GB |
9.520 - 11_9_2015 - Class 18 - Prof. Lorenzo Rosasco - Manifold Regularization-f6pb4RqgqE4.mkv | 1.77GB |
9.520 - 12_2_2015 - Class 24 - Prof. Tomaso Poggio - Learning Data Representation - Deep Theory I-6rTveDIT_go.mkv | 1.50GB |
9.520 - 12_7_2015 - Class 25 - Dr. Gemma Roig - Learning Data Representation - DNN Tips and Tricks-XFMVFfKfSfs.mkv | 897.56MB |
9.520 - 12_9_2015 - Class 26 - Prof. Tomaso Poggio - Learning Data Representation - Deep Theory II--4MbMDCeyh4.mkv | 918.65MB |
9.520 - 9_21_2015 - Class 04 - Prof. Lorenzo Rosasco - Reproducing Kernel Hilbert Spaces-KZZD5sBwGCA.mkv | 1.71GB |
9.520 - 9_23_2015 - Class 05 - Prof. Lorenzo Rosasco - Dictionaries, Feature Maps and Mercer Theorem-SRJF8ZhjfBw.mkv | 1.73GB |
9.520 - 9_28_2015 - Class 06 - Prof. Lorenzo Rosasco - Tikhonov Regularization and the ...-7XsJxCMnLm4.mkv | 1.65GB |
9.520 - 9_30_2015 - Class 07 - Prof. Lorenzo Rosasco - Logistic Regression and Support ...-19mTiF0MmyA.mkv | 1.72GB |
Type: Course
Tags:
Bibtex:
@article{, title= {MIT Course 9.520 - Statistical Learning Theory and Applications, Fall 2015}, keywords= {}, journal= {}, author= {Center for Brains, Minds and Machines (CBMM)}, year= {2015}, url= {http://www.mit.edu/~9.520/fall15/}, license= {}, abstract= {Course description

The class covers foundations and recent advances of Machine Learning from the point of view of Statistical Learning Theory.

Understanding intelligence and how to replicate it in machines is arguably one of the greatest problems in science. Learning, its principles and computational implementations, is at the very core of intelligence. During the last decade, for the first time, we have been able to develop artificial intelligence systems that can solve complex tasks considered out of reach: ATMs read checks, cameras recognize faces, smart phones understand your voice and cars can see and avoid obstacles. The machine learning algorithms at the root of these success stories are trained with labeled examples rather than programmed to solve a task.

Among the approaches in modern machine learning, the course focuses on regularization techniques, which provide a theoretical foundation for high-dimensional supervised learning. Besides classic approaches such as Support Vector Machines, the course covers state-of-the-art techniques exploiting data geometry (aka manifold learning), sparsity, and a variety of algorithms for supervised learning (batch and online), feature selection, structured prediction and multitask learning. Concepts from optimization theory useful for machine learning are covered in some detail (first order methods, proximal/splitting techniques, ...).

The final part of the course will focus on deep learning networks. It will introduce a theoretical framework connecting the computations within the layers of deep learning networks to kernel machines. It will study an extension of the convolutional layers in order to deal with more general invariance properties and to learn them from implicitly supervised data. This theory of hierarchical architectures may explain how visual cortex learns, in an implicitly supervised way, data representations that can lower the sample complexity of a final supervised learning stage.

The goal of this class is to provide students with the theoretical knowledge and the basic intuitions needed to use and develop effective machine learning solutions to challenging problems.

Prerequisites

We will make extensive use of linear algebra, basic functional analysis (we cover the essentials in class and during the math camp), and basic concepts in probability theory and concentration of measure (also covered in class and during the math camp). Students are expected to be familiar with MATLAB.
| Class | Date | Title | Instructor(s) |
|-------|------|-------|---------------|
| Class 01 | Wed Sep 09 | The Course at a Glance | TP |
| Class 02 | Mon Sep 14 | The Learning Problem and Regularization | TP |
| Class 03 | Wed Sep 16 | Math Camp | CF/CC |
| Class 04 | Mon Sep 21 | Reproducing Kernel Hilbert Spaces | LR |
| Class 05 | Wed Sep 23 | Dictionaries, Feature Maps and Mercer Theorem | LR |
| Class 06 | Mon Sep 28 | Tikhonov Regularization and the Representer Theorem | LR |
| Class 07 | Wed Sep 30 | Logistic Regression and Support Vector Machines | LR |
| Class 08 | Mon Oct 05 | Regularized Least Squares | LR |
| Class 09 | Wed Oct 07 | Iterative Regularization via Early Stopping | LR |
| Mon Oct 12 - Columbus Day | | | |
| Class 10 | Tue Oct 13 | Sparsity Based Regularization | LR |
| Class 11 | Wed Oct 14 | Proximal Methods | LR |
| Class 12 | Mon Oct 19 | Structured Sparsity Regularization | LR |
| Class 13 | Wed Oct 21 | Multiple Kernel Learning | LR |
| Class 14 | Mon Oct 26 | Generalization Bounds, Intro to Stability | CF/TP |
| Class 15 | Wed Oct 28 | Stability of Tikhonov Regularization | CF/TP |
| Class 16 | Mon Nov 02 | Consistency, Learnability and Regularization | LR |
| Class 17 | Wed Nov 04 | On-line Learning | LR |
| Class 18 | Mon Nov 09 | Manifold Regularization | LR |
| Wed Nov 11 - Veterans Day | | | |
| Class 19 | Mon Nov 16 | Regularization for Multi-Output Learning I | LR |
| Class 20 | Wed Nov 18 | Regularization for Multi-Output Learning II | CC |
| Class 21 | Mon Nov 23 | Learning Data Representation: from Fourier to Compressed Sensing | LR |
| Class 22 | Wed Nov 25 | Learning Data Representation: Autoencoders and Dictionary Learning | LR |
| Class 23 | Mon Nov 30 | Learning Data Representation: Deep Neural Networks (DNNs) | LR |
| Class 24 | Wed Dec 02 | Learning Data Representation: Deep Theory I | TP |
| Class 25 | Mon Dec 07 | Learning Data Representation: DNN Tips and Tricks | Gemma Roig |
| Class 26 | Wed Dec 09 | Learning Data Representation: Deep Theory II | TP |
}, superseded= {}, terms= {} }
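For anyone previewing the lectures: the regularization scheme at the heart of Classes 04-08 is Tikhonov-regularized (kernel) least squares, where the representer theorem reduces training to a single linear solve, alpha = (K + n*lambda*I)^(-1) y. The sketch below is a minimal NumPy illustration under assumed choices (Gaussian kernel, bandwidth `sigma`, regularization parameter `lam`); it is not taken from the course materials, which use MATLAB.

```python
# Illustrative sketch (not part of this torrent): kernel regularized least
# squares, i.e. Tikhonov regularization with the square loss (Classes 06-08).
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||A[i] - B[j]||^2 / (2 sigma^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krls_fit(X, y, lam=1e-2, sigma=1.0):
    """Solve (K + n*lam*I) alpha = y; by the representer theorem the
    regularized solution is f(x) = sum_i alpha_i k(x, x_i)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krls_predict(X_train, alpha, X_test, sigma=1.0):
    """Evaluate f on new points via the cross Gram matrix."""
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# Toy usage: noisy sine regression (illustrative data, not from the course).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
alpha = krls_fit(X, y, lam=1e-2, sigma=0.5)
X_grid = np.linspace(-3, 3, 200)[:, None]
y_hat = krls_predict(X, alpha, X_grid, sigma=0.5)
```

The regularization parameter `lam` trades data fit against smoothness of the solution; the lectures on early stopping (Class 09) and sparsity (Classes 10-12) replace this penalty with other regularization mechanisms for the same basic problem.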