News
All inverted classroom sessions take place virtually until further notice. You can ask questions in our Mattermost group at any time.
The lecture (LSF) will be taught in English and is aimed at Master's and PhD students in Mathematics or related fields. We will use a hybrid format: lecture videos that can be accessed asynchronously, combined with on-site inverted-classroom sessions and practical exercises.
Links
Exercises and Downloads
The main resource for videos, PDFs, and exercise material is this website, which you can access with a password provided in the lecture. The following table specifies when each topic will be discussed (i.e., by which date you should have studied which material).
Date | Chapter | Chapter Name | File name(s) |
---|---|---|---|
11.10.22, 11h15 | 0 | Organizational stuff | organization |
18.10.22, 11h15 | 1.1 | History of AI and ML | omml_1.1 |
18.10.22, 11h15 | 1.2 | Some Concepts | omml_1.2 |
1.11.22, 11h15 | 2.1 | Case Study: Text Classification | omml_2.1a, omml_2.1b |
1.11.22, 11h15 | 2.2 | Case Study: Image Recognition | omml_2.2 |
8.11.22, 11h15 | 2.3 | Universal approximation theorem | omml_2.3 |
8.11.22, 11h15 | 2.4 | Incorporating Domain Knowledge | omml_2.4 |
15.11.22, 11h15 | 3.1 | Overview: Optimization Problem | omml_3.1 |
22.11.22, 11h15 | 3.2 | Overview: Methods | omml_3.2a, omml_3.2b |
29.11.22, 11h15 | 4.1 | Derivatives for general functions F | omml_4.1 |
6.12.22, 11h15 | 4.2 | Backpropagation as a Special Case of AD | omml_4.2 |
6.12.22, 11h15 | 4.3 | Deep Learning and Differential Equations | omml_4.3 |
13.12.22, 11h15 | 5.1-2 | Preliminaries, First Convergence Results | omml_5.1-2 |
10.1.23, 11h15 | 5.3 | Convergence Results for Strong Convexity | omml_5.3 |
10.1.23, 11h15 | 5.4-5.5 | Convergence Results for General Objectives, Work Complexity | omml_5.4 |
17.1.23, 11h15 | 6 | Noise Reduction Methods | omml_6 |
17.1.23, 11h15 | 7 | Second-Order Methods | omml_7 |
24.1.23, 11h15 | 8.1 | Gradient Methods with Momentum | 8.1_gradientmomentum (no video) |
24.1.23, 11h15 | 8.2 | Gradient Methods with Acceleration | 8.2_acceleratedgradient (no video) |
24.1.23, 11h15 | 8.3 | Alternating Direction Methods | 8.3_admm (no video) |
18.10.22, 11h15 | 9.01 | AI and the Work Market | omml_9.01 |
1.11.22, 11h15 | 9.02 | AI and the Work Market 2 | omml_9.02 |
1.11.22, 11h15 | 9.03 | AI and Creativity | omml_9.03 |
8.11.22, 11h15 | 9.04 | How the Enlightenment ends | omml_9.04 |
15.11.22, 11h15 | 9.05 | AI and Consciousness | omml_9.05 |
22.11.22, 11h15 | 9.06 | Why are Myths so important? | omml_9.06 |
29.11.22, 11h15 | 9.07 | AI and TÜV | omml_9.07 |
6.12.22, 11h15 | 9.08 | Algorithms and Humanism | omml_9.08 |
13.12.22, 11h15 | 9.09 | AI and Dataism | omml_9.09 |
20.12.22, 11h15 | 9.10 | AI and The Matrix | omml_9.10 |
10.1.23, 11h15 | 9.11 | AI and Research | omml_9.11 |
17.1.23, 11h15 | 9.12 | Evolution and Intelligent Design | omml_9.12 |
24.1.23, 11h15 | 9.13 | AI, Economy, and Politics | omml_9.13 |
24.1.23, 11h15 | 9.14 | AI, Social Scoring, and Fairness | omml_9.14 |
The PDF slides may be updated and extended during the semester. At the end of the course, the complete material will be made available as a single PDF (omml.pdf), also in a printable 4-on-1 format.
Information
- Lecture with 4+2 SWS and 9 ECTS credits
- Inverted classroom on Tue 11h15-12h45 (G05-211) and exercises on Fri 13h15-14h45 (G05-117). The time slot on Wed 7h15-8h45 is reserved for studying the asynchronous lecture material.
- Lecturer and practical exercises:
Requirements
Mathematical basics (Analysis and Linear Algebra), programming skills, and Introduction to Optimization. The lecture Nonlinear Optimization is highly recommended, but not strictly necessary.
Module description
The lecture is a Master's lecture in the mathematics curriculum and is described in the module handbook (currently page 31) as a Wahlpflicht module:
- WPF MA (Module 12, 13, 14)
- WPF MA;M 1-3 (Module M3D)
A translation of the module description:
- Goals and competences: The students acquire competences in modeling and algorithmically solving the optimization problems that underlie modern machine learning techniques. A rigorous mathematical analysis of convergence theory and of implementation aspects of different algorithms is the guiding theme of the lecture. In the exercises, the students learn how to implement algorithms efficiently on a computer and how to apply them to concrete problem instances.
- Content: An introduction to formulating machine learning problems mathematically in a generalized way, calculating derivatives, stochastic and deterministic derivative-based algorithms, and convergence theory. See above for a table of contents; the core problem formulation is sketched below.
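To give a concrete flavor of this content, the finite-sum training problem and the basic stochastic gradient iteration treated in Chapters 3 and 5 can be written as follows. The notation here is purely illustrative and does not necessarily match the lecture slides:

```latex
% Generic finite-sum (empirical risk) training problem with n samples,
% per-sample loss f_i, and parameters w:
\min_{w \in \mathbb{R}^d} \; F(w) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(w)

% Basic stochastic gradient (SGD) step with step size \alpha_k and a
% sample index i_k drawn uniformly at random:
w_{k+1} \;=\; w_k \;-\; \alpha_k \, \nabla f_{i_k}(w_k)
```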
The lecture is also open to other Master's and PhD students of OVGU. In particular, there is an agreement that ORBA students may choose the lecture as a Wahlpflicht module (with 10 CP to account for the independent study of the mathematical foundations necessary to follow the lecture). However, please note that the lecture is aimed at mathematics Master's students and assumes a good understanding of mathematical basics, especially in its second part. If you are mainly interested in applying machine learning and not so much in analyzing the training process, other lectures might be better suited for you. Note that the lecture Concepts and Algorithms of Optimization is not sufficient as a prerequisite; you will have to invest additional time to acquire the necessary mathematical background.
Material: mathematical background
- Roadmap of Mathematics for Deep Learning, a collection of the necessary mathematical background with links to online resources in Mathematics
Material: machine learning
- Jason Mayes Machine Learning 101
- 7 steps of Machine Learning (Google Cloud)
- Neural Networks
- Neural networks and backpropagation (see also the short NumPy sketch after this list)
- Convolutional Networks
- Corresponding text
- Convolution arithmetic with animations
- Generative Adversarial Networks
- Overview of Neural Networks
- Datasets for Machine Learning
- Free ML courses from Amazon ML University
- Lecture Deep Learning by Yann LeCun
- Neural Networks by 3Blue1Brown
- Approximation Properties of Neural Networks
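To complement the neural network and backpropagation resources above, here is a minimal, self-contained NumPy sketch of a one-hidden-layer network trained by hand-coded backpropagation on a toy regression task. It is not taken from any of the linked materials; the architecture, data, and hyperparameters are arbitrary illustrative choices.

```python
# Minimal NumPy sketch: one-hidden-layer network trained by backpropagation
# and plain gradient descent on a toy regression task (fit y = sin(x)).
import numpy as np

rng = np.random.default_rng(0)

# Toy data on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# Parameters of a 1-16-1 network with tanh activation
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(2000):
    # Forward pass
    z1 = X @ W1 + b1          # pre-activations, shape (n, 16)
    a1 = np.tanh(z1)          # hidden activations
    pred = a1 @ W2 + b2       # network output, shape (n, 1)

    # Mean squared error loss
    residual = pred - y
    loss = np.mean(residual ** 2)

    # Backward pass (reverse-mode differentiation written out by hand)
    n = X.shape[0]
    d_pred = 2.0 * residual / n            # dL/d pred
    dW2 = a1.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_a1 = d_pred @ W2.T
    d_z1 = d_a1 * (1.0 - a1 ** 2)          # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final training loss: {loss:.4f}")
```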
Material: optimization and machine learning
- Optimization Methods for Large-Scale Machine Learning, survey paper by Bottou, Curtis, and Nocedal (see also the short SGD sketch after this list)
- Home page of Julien Mairal with many related talks
- Artificial Intelligence Summer School, July 2018, Grenoble
- The mathematics behind Deep Learning
- The Connection Between Applied Mathematics and Deep Learning
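In the spirit of the finite-sum setting used in the Bottou/Curtis/Nocedal survey, the following minimal Python sketch runs plain SGD with a diminishing step size on a toy least-squares problem. The problem data and step-size schedule are illustrative assumptions, not taken from the paper or the lecture.

```python
# Minimal SGD sketch for the least-squares problem
#   min_w (1/n) * sum_i (a_i^T w - b_i)^2.
# Problem data, step sizes, and iteration count are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 10
A = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
b = A @ w_true + 0.01 * rng.normal(size=n)   # slightly noisy targets

w = np.zeros(d)
for k in range(1, 5001):
    i = rng.integers(n)                      # draw one sample index uniformly
    grad_i = 2.0 * (A[i] @ w - b[i]) * A[i]  # gradient of the i-th summand
    alpha = 1.0 / (0.01 * k + 100.0)         # diminishing step size
    w -= alpha * grad_i

print("distance to w_true:", np.linalg.norm(w - w_true))
```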
Material: AI and the future of mankind
- Spiegel Online article referring to the study The Future of Jobs
- Zeit Online Podcast with Richard Socher (in German)
- Zeit Online Podcast with Yuval Harari
Material: hands on
- Python Data Science Handbook
- Harvard Data Science Course
- Scikit Learn (a minimal usage example follows after this list)
- Scikit Learn Test Datasets
- Scikit Learn Algorithm Cheat Sheet
- Facebook research projects
- Google AI resources
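As a possible starting point for the hands-on material, a minimal scikit-learn workflow (fit a classifier on one of the built-in test datasets and evaluate it) could look as follows; the choice of dataset and estimator is arbitrary.

```python
# Minimal scikit-learn example: train and evaluate a classifier on a
# built-in toy dataset. Dataset and model choice are illustrative only.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load the digits dataset and hold out a test split
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Logistic regression trained with an iterative (L-BFGS) solver
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```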
Questions?
Feel free to send me an email with general questions: