Autoregressive ResNet for Kuramoto-Sivashinsky (KS) in JAX
Let's train a convolutional Residual Network (ResNet) to become an autoregressive emulator (a neural operator) for the Kuramoto-Sivashinsky equation using the JAX deep learning framework. Here is the code: https://github.com/Ceyron/machine-learning-and-simulation/blob/main/english/neural_operators/simple_resnet_for_ks_in_jax.ipynb

-------

👉 This educational series is supported by the world leaders in integrating machine learning and artificial intelligence with simulation and scientific computing, Pasteur Labs and Institute for Simulation Intelligence. Check out https://simulation.science/ for more on their pursuit of 'Nobel-Turing' technologies (https://arxiv.org/abs/2112.03235), and for partnership or career opportunities.

-------

📝 : Check out the GitHub repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler

💸 : If you want to support my work on the channel, you can become a patron here: https://www.patreon.com/MLsim

🪙 : Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/FelixMKoehler

-------

Timestamps:
00:00:00 Intro
00:01:59 About the KS equation
00:03:30 Imports
00:03:54 Defining Constants
00:05:58 Reference Simulator (ETDRK2)
00:07:32 Drawing Initial Conditions (ICs)
00:11:02 Autoregressive Rollout of Reference Trajectories
00:14:38 Visualizing a Trajectory
00:16:44 Pre-Processing the Train Trajectories
00:17:56 Redoing Data Generation for Test Trajectories
00:19:54 Slicing Two-Snapshot Windows out of Train Trajectories
00:25:30 Implementing the ResNet Architecture
00:28:45 Implementing a Res-Block
00:40:31 Training Loop
00:50:59 Training Loss History
00:52:23 Error Rollout against Test Trajectories
00:59:35 Correlation Rollout against Test Trajectories
01:05:15 Sample Prediction Trajectory
01:07:24 Comparing the Spectrum after Decorrelation
01:18:00 Outro
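To give a flavor of what the video builds, here is a minimal, hedged JAX sketch of the two core ideas: a convolutional residual block with periodic (circular) padding, matching the KS equation's periodic boundary conditions, and an autoregressive rollout via `jax.lax.scan`. The helper names (`periodic_conv1d`, `res_block`, `rollout`) and the single-channel, hand-rolled convolution are illustrative assumptions for brevity; the notebook's actual architecture (channel counts, kernel sizes, framework) may differ.

```python
import jax
import jax.numpy as jnp


def periodic_conv1d(u, kernel):
    # Circular padding before a "valid" convolution emulates the
    # periodic boundary conditions of the KS domain.
    pad = kernel.shape[0] // 2
    u_padded = jnp.concatenate([u[-pad:], u, u[:pad]])
    return jnp.convolve(u_padded, kernel, mode="valid")


def res_block(u, k_in, k_out):
    # conv -> nonlinearity -> conv, plus the identity skip connection
    h = jnp.tanh(periodic_conv1d(u, k_in))
    return u + periodic_conv1d(h, k_out)


def resnet_step(u, params):
    # One network evaluation: maps the state at time t to time t + dt.
    for k_in, k_out in params:
        u = res_block(u, k_in, k_out)
    return u


def rollout(u0, params, n_steps):
    # Autoregressive rollout: feed each prediction back in as the
    # next input; scan returns the whole trajectory.
    def step(u, _):
        u_next = resnet_step(u, params)
        return u_next, u_next

    _, trajectory = jax.lax.scan(step, u0, None, length=n_steps)
    return trajectory


# Tiny usage example with random state and small fixed kernels
key = jax.random.PRNGKey(0)
u0 = jax.random.normal(key, (64,))
params = [(jnp.full((5,), 0.01), jnp.full((5,), 0.01)) for _ in range(2)]
traj = rollout(u0, params, n_steps=10)
print(traj.shape)  # (10, 64): 10 predicted snapshots of a 64-point grid
```

In training, one would instead compare `resnet_step(u_t, params)` against the reference simulator's `u_{t+dt}` on the sliced two-snapshot windows, and only use the long rollout at evaluation time.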