Lecturer: Prof. Wang Jian-Sheng
Schedule/Venue: Tuesday LT20/Friday LT28, 2:00-4:00.
Final exam: 4 May 2026, 1:00pm.
Reference books: "Deep Learning: Foundations and Concepts", C. M. Bishop; "Deep Learning", Goodfellow, Bengio, and Courville; "Machine Learning", Lindholm et al.; "Dive into Deep Learning", at d2l.ai, by A. Zhang et al.; "Neural Networks", Haykin.
Official Syllabus: This course presents the mathematical and computational foundations of machine learning, with an emphasis on deep learning networks, preparing students with sufficient background for more advanced topics such as AI in physics or any of the other sciences. The learning outcomes include familiarity with the Python programming environment for machine learning, a deeper understanding of the building blocks of neural networks, and numerical training algorithms for machine learning. The course will draw on applications in science as examples to illustrate the concepts of deep learning.
Course Outline:
Week 1: 13, 16 Jan, biological neurons and artificial neural networks
Week 2: 20, 23 Jan, Python, NumPy, PyTorch
Week 3: 27, 30 Jan, linear regression, maximum likelihood, cross-validation, homework 1 due
Week 4: 3, 6 Feb, perceptrons, feedforward network, stochastic gradient descent and other optimization algorithms
Week 5: 10, 13 Feb, backpropagation
Week 6: 20 Feb, regularization (17 Feb is Chinese New Year), homework 2 due
Recess week, no classes
Week 7: 3, 6 Mar, physics-informed neural network (PINN) (midterm test this week on Friday, 6 March)
Week 8: 10, 13 Mar, convolutional network (CNN)
Week 9: 17, 20 Mar, recurrent neural network (RNN), long short-term memory (LSTM), homework 3 due
Week 10: 24, 27 Mar, encoder-decoder, attention, transformer
Week 11: 31 Mar, generative model, diffusion model (3 Apr is Good Friday)
Week 12: 7, 10 Apr,
Week 13: 14, 17 Apr, unsupervised learning, Boltzmann machine, homework 4 due
Homework problem sets are on Canvas under Files. Upload completed homework as a PDF on Canvas under Assignments.
Example Jupyter notebook codes: autograd; least squares; least squares (using torch.nn and SGD); ...
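As a flavor of what the least-squares notebooks cover, here is a minimal sketch (not the course's actual notebook code) that fits a line y = w*x + b by gradient descent on the mean-squared error, using plain NumPy; the torch.nn version follows the same pattern, with autograd computing the gradients instead of the hand-derived expressions below. The data, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise (toy example, not course data).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(50)

# Fit y = w*x + b by gradient descent on MSE = mean((w*x + b - y)**2).
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    err = w * x + b - y
    # Hand-derived gradients of the MSE with respect to w and b.
    grad_w = 2.0 * np.mean(err * x)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to the true slope 2 and intercept 1
```

In the torch.nn variant, the same loop replaces the explicit gradient formulas with `loss.backward()` and an `optim.SGD` step, which is exactly the autograd machinery covered in Week 2.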