(Left to Right): Avalanche activity cascades in a sandpile automaton; a vortex street formed by flow past a cylinder; and Turing patterns in a reaction-diffusion model. All simulations are from the course homeworks; a higher-resolution video may be viewed here.
Computational Physics
Summary
Materials for UT Austin’s graduate computational physics course, taught by William Gilpin.
This course aims to provide a broad survey of computational methods that are particularly relevant to modern physics research. We will cover efficient algorithm design and performance analysis, traditional numerical recipes such as integration and matrix manipulation, and emerging methods in data analysis and machine learning. Our goal by the end of the class is to feel comfortable approaching diverse, open-ended computational problems that arise during research, and to be ready to design and share new algorithms with the broader research community.
The class website is located here. If you are enrolled in the course at UT, the syllabus and calendar are here.
Contents
Many links below direct to Google Colaboratory and can be run in-browser without any installation, as long as you are signed into a Google account. To download the raw source files, please refer to the GitHub repository.
Homework Assignments
 HW1: The sandpile cellular automaton and directed percolation. Covers recursion, runtime scaling, vectorization
 HW2: Linear dynamical systems and decomposing a chaotic flow. Covers numerical linear algebra, optimization, and unsupervised learning
 HW3: Turing patterns and phase separation. Covers numerical integration; finite-difference and spectral methods
 HW4: Predicting turbulence with operator methods. Covers supervised learning, time series forecasting, and ridge, kernel, and logistic regression
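As a flavor of the vectorization covered in HW1, one synchronous update of the sandpile automaton can be written with NumPy array operations instead of per-site loops. This is only an illustrative sketch, not code from the assignment; the function name and grid size are invented for the example:

```python
import numpy as np

def topple_once(grid, threshold=4):
    """One synchronous toppling sweep of a Bak-Tang-Wiesenfeld sandpile:
    every site holding >= threshold grains sheds one grain to each of its
    four neighbors; grains pushed past the boundary are lost."""
    unstable = grid >= threshold          # boolean mask of sites that topple
    new = grid - threshold * unstable     # each toppling site loses 4 grains
    new[1:, :] += unstable[:-1, :]        # one grain to the site below
    new[:-1, :] += unstable[1:, :]        # ... to the site above
    new[:, 1:] += unstable[:, :-1]        # ... to the right
    new[:, :-1] += unstable[:, 1:]        # ... to the left
    return new

grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = 4                            # one unstable site in the center
grid = topple_once(grid)
print(grid)                               # center empties; its 4 neighbors each gain a grain
```

Because every site is updated by the same shifted-mask arithmetic, the sweep costs a few whole-array operations regardless of how many sites topple, which is the kind of runtime-scaling win the homework explores.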
Lecture Slides

Lecture 1: Python syntax for Scientific Computing
[Live Notebook] 
Lecture 1b: Object-oriented programming to find first-passage times of Brownian motion
[Live Notebook] [vid1] [vid2] 
Lecture 1c: Vectorization, arrays, and the Mandelbrot set
[Live Notebook] 
Lecture 2: Runtime complexity, convolutions, and the continuous Game of Life
[Raw Notebook] 
Lecture 3: Finding the Feigenbaum constant with recursion and dynamic programming
[Raw Notebook] 
Lecture 4: Detecting the onset of turbulence with the Fast Fourier Transform
[Raw Notebook] 
Lecture 5: Condition Number and the irreversibility of chaos
[Raw Notebook] 
Lecture 6b: Probing collaborator graphs with LU matrix inversion
[Live Notebook] 
Lecture 7: Spectral graph theory and the QR eigenvalue algorithm
[Live Notebook] 
Lecture 9: Krylov subspace methods & Conjugate gradient methods
[Live Notebook] 
Lecture 11a: Multivariate Optimization and Potential Flows
[Live Notebook] 
Lecture 11b: Evolving Cellular Automata with Genetic Algorithms
[Live Notebook] 
Lecture 11c: Monte Carlo methods and Hard Sphere Packing
[Live Notebook] 
Lecture 12: Numerical Integration and predicting chaos
[Live Notebook] 
Lecture 13: Variable step integration, symplectic and stochastic systems
[Live Notebook] 
Lecture 15: Diffusion, relaxation, and instability
[Live Notebook] 
Lecture 16: Shocks, solitons, and hyperbolic partial differential equations
[Live Notebook] 
Lecture 17: Spectral solving using the Dedalus Python Package

Lecture 19: Classification, Logistic Regression, and phases of matter

Lecture 20: Overfitting, the bias-variance tradeoff, and double descent
[Live Notebook] 
Lecture 22: Time series representation, featurizing chaos, kernel methods
[Live Notebook] 
Lecture 23: Gaussian mixtures, expectation-maximization, and super-resolution microscopy
[Live Notebook] 
Lecture 24: Predicting the Reynolds number of turbulence with deep learning
[Live Notebook] 
Lecture 25: Types of neural networks; symmetries in physical systems

Lecture 26: Training neural networks with backpropagation
[Live Notebook]
Notes
Laboratory Exercises
 Lab 1: Getting started with Python
 Lab 2: git, GitHub, and GitHub Pages
 Lab 3: Documentation and Formatting
 Lab 4: Automatically creating online documentation with Sphinx
 Lab 5: Unit Testing
 Lab 6: Structuring an Open-Source Repository
Example Final Projects
 Quantum Reinforcement Learning with the Grover method
 Modelling the contractile dynamics of muscle
 Tight binding and Anderson localization on complex graphs
 Neural System Identification by Training Recurrent Neural Networks
 Assimilating a realistic neuron model onto a reduced-order model
 Testing particle phenomenology beyond the Standard Model with Bayesian classification
 Monte Carlo sampling for many-body systems
Usage and improvements
If you are teaching a similar course, please feel free to use any or all of these materials. If you have any suggestions for improvements or find any errors, I would very much appreciate any feedback.
For errors or typos, please consider opening an issue or submitting corrections as pull requests on GitHub.
For students, course-related questions are best posted on GitHub as Discussions or Issues on the course repository; for other matters, I can be reached via email.
Requirements
We will primarily use Python 3 with the following packages:
 numpy
 matplotlib
 scipy
 scikit-learn
 jupyter
For projects and other parts of the class, you might also need:
 ipykernel
 scikit-image
 umap-learn
 statsmodels
 pytorch
 jax
 numba
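If you are working locally rather than in Colab, one way to set up the environment is with pip (a sketch only; conda works equally well, the course does not pin versions, and GPU builds of PyTorch and JAX may require platform-specific instructions):

```shell
# Core packages used throughout the course
python -m pip install numpy matplotlib scipy scikit-learn jupyter

# Optional extras for projects and later lectures
# (note: PyTorch installs under the package name "torch")
python -m pip install ipykernel scikit-image umap-learn statsmodels torch jax numba
```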
Attributions
Portions of the material in this course are adapted from or inspired by other open-source classes, including Pankaj Mehta’s Machine Learning for Physics course, Chris Rycroft’s Numerical Recipes course, Volodymyr Kuleshov’s Applied Machine Learning course, Fei-Fei Li’s Deep Learning for Computer Vision course, Lorena Barba’s CFD course, and Jim Crutchfield’s Nonlinear Dynamics course.