# Lecture notes

All of our lecture notes have been written specifically for Jacobs courses and are detailed and comprehensive, such that they can completely replace external textbooks. Feel free to distribute them, giving credit to the respective authors.

| Lecture | Coverage |
| --- | --- |
| Machine Learning (2nd year Master programs, RUG) | The concept of “modeling”, the ML landscape at large, decision trees and random forests, linear regression, temporal learning tasks, state-based vs. signal-based timeseries modeling, Takens’ theorem, K-means clustering, PCA, self-organizing maps, comparison and mutual transformations between discrete symbolic and continuous real-valued data formats and methods, the bias-variance dilemma, regularization, cross-validation, mixtures of Gaussians and the EM algorithm, Parzen windows, Bayesian model estimation, sampling methods (especially MCMC sampling), online adaptive modeling, the LMS algorithm, (stochastic) gradient descent optimization, multi-layer perceptrons and the backpropagation algorithm |
| Neural Networks (AI) (2nd year Bachelor programs, RUG) | Rehearsal of machine learning basics, feedforward and recurrent neural networks, a glimpse at deep learning, Hopfield networks, the Boltzmann distribution, MCMC sampling, simulated annealing, the Boltzmann machine and restricted Boltzmann machine, reservoir computing. The course includes a fast yet broad introduction to dynamical systems (slides) |
| Materials for the ML and NN semester projects at RUG | |
| Formal languages and logic (legacy Jacobs University, undergraduate) | Regular and context-free languages with their automata and grammars; first-order logic; the mathematical basis of logic programming |
| Computability and complexity (legacy Jacobs University, undergraduate) | Turing machines, random access machines, recursive functions, lambda calculus, undecidability theorems; complexity classes, hierarchy and cross-class theorems, NP-completeness, model-theoretic characterizations of complexity |
| Machine learning (legacy Jacobs University, undergraduate) | Curse of dimensionality and feature extraction, K-means clustering, linear regression, learning optimal decision functions, the bias-variance dilemma, regularization, cross-validation, multi-layer perceptrons, gradient descent optimization and the backpropagation algorithm, probability refresher |
| Machine learning (legacy Jacobs University, graduate) | Bias-variance dilemma and curse of dimensionality; time series prediction and Takens’ theorem; essentials of probability and estimation theory; linear classifiers and RBF networks; K-means clustering; linear adaptive filters and the LMS algorithm; multilayer perceptrons; recurrent neural networks; hidden Markov models and the EM algorithm |
| Algorithmical and statistical modeling (legacy Jacobs University, graduate) | Essentials of probability and estimation theory; multivariate Gaussians; representing and estimating distributions by mixtures of Gaussians and Parzen windows; maximum likelihood estimation algorithms based on gradient descent and on EM; elementary and MCMC sampling methods, demo: constructing phylogenetic trees; simulated annealing and the energy interpretation of distributions, spin glass models, Boltzmann machines; Bayesian networks and graphical models, the join-tree inference algorithm; introduction to fuzzy logic |
| Principles of Statistical Modeling (legacy Jacobs University, graduate) | These lecture notes provide a detailed, example-rich, carefully explained, mathematically rigorous introduction to the basic concepts of probability theory, not shying away from sigma-fields as most other introductory texts for non-mathematicians do. The different mindsets of frequentist vs. Bayesian statistics are explained. Overview of the uses of probability concepts in the natural sciences, signal processing, statistics, mathematics, and machine learning. Introduction to the approach to statistical thinking laid out by J. C. Kiefer |
| Machine Learning: a general introduction for non-computer-scientists (legacy Jacobs University, transdisciplinary; 4-week course module) | Richly illustrated, low-math, low-tech introduction to machine learning, with emphasis on neural networks and many examples (annotated slides) |
| Boolean logic and some elements of computational complexity (legacy Jacobs University, transdisciplinary; 4-week course module) | Step-by-step explanation of the very basic operations of digital information processing, plus a glance at how “computation” at large can be understood |
| Introduction to dynamical systems | Slides of an 8-hour, wide-scope crash course on dynamical systems |
| Essentials of measure theory, integration, and probability (legacy Jacobs University, graduate tutorial, by Manjunath Gandhi) | A “pocket guide” summary of the main definitions and theorems |