Neural Operators are mappings between (discretized) function spaces, for example from the initial condition (IC) of a PDE to its solution at a later point in time. Fourier Neural Operators (FNOs) realize such mappings with a spectral convolution, which gives them their multiscale property. Let's code a simple example in JAX: https://github.com/Ceyron/machine-learning-and-simulation/blob/main/english/neural_operators/simple_FNO_in_JAX.ipynb
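
Below is a minimal sketch (not the notebook's exact code) of what such a 1D spectral convolution could look like in JAX: transform to Fourier space, keep only the lowest modes, mix channels with learned complex weights, and transform back. The function name `spectral_conv_1d` and all parameter names/shapes are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def spectral_conv_1d(x, weights_real, weights_imag, n_modes):
    """Illustrative 1D spectral convolution: FFT -> truncate to the lowest
    n_modes -> complex channel mixing -> inverse FFT.
    x has shape (in_channels, n_points)."""
    n_points = x.shape[-1]
    # Real FFT along the spatial axis: (in_channels, n_points//2 + 1) complex coefficients
    x_hat = jnp.fft.rfft(x, axis=-1)
    # Keep only the lowest n_modes Fourier modes (the truncation that discards
    # high-frequency content and decouples the layer from the grid resolution)
    x_hat_trunc = x_hat[:, :n_modes]
    # Mix channels per retained mode with learned complex weights of shape
    # (out_channels, in_channels, n_modes)
    weights = weights_real + 1j * weights_imag
    out_hat_trunc = jnp.einsum("oim,im->om", weights, x_hat_trunc)
    # Zero-pad the discarded high modes before transforming back to real space
    out_hat = jnp.zeros(
        (weights.shape[0], n_points // 2 + 1), dtype=out_hat_trunc.dtype
    )
    out_hat = out_hat.at[:, :n_modes].set(out_hat_trunc)
    return jnp.fft.irfft(out_hat, n=n_points, axis=-1)

# Example usage with random (untrained) parameters, purely for shape checking
key = jax.random.PRNGKey(0)
in_ch, out_ch, modes, n = 2, 2, 12, 64
k1, k2, k3 = jax.random.split(key, 3)
w_re = 0.02 * jax.random.normal(k1, (out_ch, in_ch, modes))
w_im = 0.02 * jax.random.normal(k2, (out_ch, in_ch, modes))
u = jax.random.normal(k3, (in_ch, n))
v = spectral_conv_1d(u, w_re, w_im, modes)  # shape (out_ch, n)
```

In a full FNO, such a spectral convolution would be combined with a pointwise linear bypass and a nonlinearity inside each Fourier layer, with lifting and projection layers around the stack; see the notebook linked above for the complete walkthrough.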
-------
👉 This educational series is supported by the world leaders in integrating machine learning and artificial intelligence with simulation and scientific computing, Pasteur Labs and Institute for Simulation Intelligence. Check out https://simulation.science/ for more on their pursuit of 'Nobel-Turing' technologies (https://arxiv.org/abs/2112.03235), and for partnership or career opportunities.
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler
💸 : If you want to support my work on the channel, you can become a patron on Patreon here: https://www.patreon.com/MLsim
🪙: Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/FelixMKoehler
-------
Timestamps:
00:00 Intro
01:09 What are Neural Operators?
03:11 About FNOs and their multiscale property
05:05 About Spectral Convolutions
09:17 A "Fourier Layer"
10:18 Stacking Layers with Lifting & Projection
11:01 Our Example: Solving the 1d Burgers equation
12:04 Minor technicalities
13:07 Installing and Importing packages
14:02 Obtaining the dataset and reading it in
15:44 Plot and Discussion of the dataset
17:51 Prepare training & test data
22:23 Implementing Spectral Convolution
34:25 Implementing a Fourier Layer/Block
37:48 Implementing the full FNO
43:14 A simple dataloader in JAX
44:18 Loss Function & Training Loop
52:34 Visualize loss history
53:31 Test prediction with trained FNO
57:13 Zero-Shot superresolution
59:59 Compute error as reported in FNO paper
01:03:45 Summary
01:05:46 Outro