Semester: WS 2018
Type: Lecture
Lecturer:
Credits: V3 + Ü1 (6 ECTS credits)
#  Date  Time  Room  Type  Title
1    10:30–12:00  TEMP2  Lecture  Word Embeddings
2    10:30–12:00  TEMP1  Lecture  Recurrent Neural Networks I
3    10:30–12:00  TEMP2  Lecture  Recurrent Neural Networks II
News
 We are aware that there are currently some problems with registration for this course in the new RWTHonline system. We are working to fix them. Please be patient.
Lecture Description
The goal of Machine Learning is to develop techniques that enable a machine to "learn" how to perform certain tasks from experience.
The important part here is the learning from experience. That is, we do not try to encode the knowledge ourselves; the machine should learn it itself from training data. The tools for this are statistical learning and probabilistic inference techniques. Such techniques are used in many real-world applications. This lecture will teach the fundamental machine learning know-how that underlies such capabilities. In addition, we show current research developments and how they are applied to solve real-world tasks.
Example questions that could be addressed with the techniques from the lecture include:
 Is this email important or spam?
 What is the likelihood that this credit card transaction is fraudulent?
 Does this image contain a face?
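As a minimal illustration of "learning from training data" in the sense above, one can fit class-conditional Gaussians to labelled examples and classify new points with the Bayes decision rule (the topic of the opening lectures). All data and numbers here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D training data for two classes (e.g. one feature of spam vs. ham emails).
spam = rng.normal(loc=4.0, scale=1.0, size=200)  # class 1
ham = rng.normal(loc=0.0, scale=1.0, size=200)   # class 0

# Maximum-likelihood estimates of the class-conditional Gaussian parameters.
mu1, s1 = spam.mean(), spam.std()
mu0, s0 = ham.mean(), ham.std()

def gauss(x, mu, s):
    # Gaussian probability density.
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def classify(x, prior1=0.5):
    # Bayes decision rule: choose the class with the larger posterior.
    p1 = gauss(x, mu1, s1) * prior1
    p0 = gauss(x, mu0, s0) * (1 - prior1)
    return int(p1 > p0)

print(classify(3.5))   # near the "spam" mean -> class 1
print(classify(-0.5))  # near the "ham" mean -> class 0
```

Note that nothing about the classes was hand-coded: the decision boundary emerges entirely from the parameters estimated on the training data.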
Exercises
The class is accompanied by exercises that will allow you to collect handson experience with the algorithms introduced in the lecture.
There will be both pen-and-paper exercises and practical programming exercises (roughly 1 exercise sheet every 2 weeks). Please submit your solutions electronically through the L2P system.
We ask you to work in teams of 2–3 students.
Literature
The first half of the lecture will follow the book by Bishop. For the second half, we will use the Deep Learning book by Goodfellow as a reference.
 Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006
 Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, MIT Press, 2016
Wherever research papers are necessary for a deeper understanding, we will make them available in the L2P.
Additional Resources
 Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.
Python Resources
 A comprehensive (and quite long) Python tutorial
 A very basic introduction to Python and control flow (a subsection of the tutorial above)
 An overview of Python data structures such as lists, dictionaries etc. (also a subsection of the tutorial above)
 A basic NumPy tutorial
 PyCharm, a Python IDE
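To complement the links above, here is a short sketch of the kind of material they cover: core Python data structures and the element-wise arithmetic that NumPy arrays add on top of plain lists. The example values are invented for illustration:

```python
import numpy as np

# Core Python data structures covered by the tutorial links above.
grades = [1.0, 2.3, 1.7]                           # list: ordered, mutable
course = {"title": "Machine Learning", "ects": 6}  # dictionary: key-value pairs

# NumPy arrays support fast element-wise arithmetic, unlike plain lists.
a = np.array(grades)
print(a.mean())        # arithmetic mean of the grades
print(a * 2)           # element-wise multiplication
print(course["ects"])  # dictionary lookup
```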
Date  Title  Content  Material 

Introduction  Introduction, Probability Theory, Bayes Decision Theory, Minimizing Expected Loss  
Prob. Density Estimation I  Parametric Methods, Gaussian Distribution, Maximum Likelihood  
Prob. Density Estimation II  Bayesian Learning, Nonparametric Methods, Histograms, Kernel Density Estimation  
Prob. Density Estimation III  Mixture of Gaussians, k-Means Clustering, EM Clustering, EM Algorithm  
Linear Discriminants I  Linear Discriminant Functions  
Exercise 1  Python Tutorial, Probability Density, GMM, EM  
Linear Discriminants II  Linear Discriminant Functions  
Linear SVMs  Linear SVMs  
Non-Linear SVMs  Non-Linear SVMs  
Model Combination  Model Combination, AdaBoost, Exponential error, Sequential Additive Minimization  
Exercise 2  Linear Discriminants, SVMs  
Neural Networks I  SingleLayer Perceptron, MultiLayer Perceptron, Mapping to Linear Discriminants, Error Functions, Regularization, Multilayer Networks, Chain rule, Gradient Descent  
Neural Networks II  Backpropagation, Computational Graphs, Stochastic Gradient Descent, Minibatch Learning, Optimizers (Momentum, RMSProp, AdaGrad, Adam)  
Tricks of the Trade  Initialization (Glorot, He), Nonlinearities, Dropout, Batch Normalization, Learning Rate Schedules  
Exercise 3  AdaBoost, Deep Learning Companion Slides (preparation for Exercise 4)  
Convolutional Neural Networks I  CNNs, Convolutional Layers, Pooling Layers, LeNet  
Convolutional Neural Networks II  ImageNet Challenge, Notable Architectures, AlexNet, VGGNet, Inception, Visualizing CNNs  
Exercise 4  Softmax, Backpropagation, Deep Learning Implementation from scratch  
TensorFlow Tutorial  TensorFlow Tutorial  
Convolutional Neural Networks III  Residual Networks, Applications of CNNs  
Exercise 5  Convolutional Neural Networks  
Word Embeddings  Dealing with Discrete Data, Word Embeddings, word2vec, GloVe, Hierarchical Softmax, Motivation for Recurrent Networks  
Recurrent Neural Networks I  Plain RNNs, Backpropagation through Time, Practical Issues, Initialization  
Recurrent Neural Networks II  LSTM, GRU, Applications of RNNs  
Exercise 6  RNNs  
Repetition  Repetition 
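The gradient-descent machinery that runs through the Neural Networks lectures above can be sketched in a few lines. This is an illustrative toy (a single linear neuron trained with squared error on invented data), not material from the course itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: targets generated by y = 2*x plus a little noise.
X = rng.normal(size=(100, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=100)

w = 0.0   # single weight of a linear "neuron"
lr = 0.1  # learning rate

# Plain (batch) gradient descent on the mean squared error.
for _ in range(100):
    pred = w * X[:, 0]
    grad = 2 * np.mean((pred - y) * X[:, 0])  # d/dw of mean (pred - y)^2
    w -= lr * grad

print(round(w, 2))  # converges close to the true slope 2.0
```

The lectures build on exactly this loop: backpropagation computes the gradient through many layers, and optimizers such as Momentum or Adam replace the plain update `w -= lr * grad`.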