- Overview
- Scientific machine learning with and without data
- Machine learning practices
- Hello world example: handwritten digit recognition
- Programming frameworks, hardware, and workflow
- A hitchhiker’s guide to deep learning
- The four pillars: data, model, loss function, and optimization
- Deep learning primitives: CNN, GNN, and transformer
- Symmetries in machine learning
- Invariant and equivariant neural networks
- DeepMD, Euclidean equivariant GNN, Tensor field networks
- Permutation symmetry and quantum wavefunctions
- Differentiable programming
- The engine of deep learning: automatic differentiation on computation graphs
- Differentiable DFT/MD/Tensor networks/..., and why they are useful
- Generative models I
- A dictionary of generative models and statistical physics
- Boltzmann machines
- Autoregressive models
- Variational autoencoders
- Generative models II
- Normalizing flows
- Diffusion models
- Applications of generative models to many-body problems
- The Universe as a generative model
- Wrap up
- AI for science: why now?
Title image generated by Stable Diffusion with the prompt: "a tile image for the course on 'Machine learning for physicists', eye-catching, artist style with sci-fi feeling". (Yes, "tile" instead of "title" :P)
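The outline above calls automatic differentiation on computation graphs "the engine of deep learning". As a minimal sketch of that idea (not taken from any course material — the `Dual` class and `derivative` helper are illustrative names), here is forward-mode autodiff with dual numbers, where each value carries its derivative through the computation:

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Each Dual carries a value and the derivative of that value w.r.t.
# the input variable; arithmetic ops propagate both by the chain rule.

class Dual:
    """A number paired with its derivative."""
    def __init__(self, value, grad=0.0):
        self.value = value
        self.grad = grad

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.grad + other.grad)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u v)' = u' v + u v'
        return Dual(self.value * other.value,
                    self.grad * other.value + self.value * other.grad)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).grad


# d/dx (x^2 + 3x) at x = 2 gives 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Deep learning frameworks use the reverse-mode counterpart of this construction (backpropagation), which computes gradients with respect to many parameters at once, but the chain-rule bookkeeping per graph node is the same.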