Welcome to JAX-in-Cell Discussions! #1
Hi, this is Mufei, a postdoc from Oxford. Thank you for sharing this interesting PIC code. I am pretty curious about how this Python-based code can be combined with ML. Best wishes
Hi @LPI-MF, thank you for reaching out! The connection with ML is differentiability. A differentiable forward model such as JAX-in-Cell can be coupled to machine learning by placing the simulator directly in the training loop: automatic differentiation gives us gradients of task-relevant objectives (diagnostics and losses) with respect to inputs, parameters, and embedded model components, so gradients can propagate end-to-end through an entire workflow. We can then do gradient-based optimization and learn over high-dimensional parameter spaces without the instability and cost of finite-difference derivatives. This enables tasks such as parameter inference and calibration from experimental diagnostics, data assimilation, optimal design of actuators, geometry, and profiles, model-based control, and end-to-end training of hybrid physics–ML components such as closures or subgrid models, all while optimizing the actual objective rather than proxy quantities.
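
To make that concrete, here is a minimal sketch in JAX. It is deliberately not the JAX-in-Cell API: the `push` step, the field parameter `E0`, and the mean-velocity diagnostic are toy stand-ins. The point is only the pattern: because the whole forward model is written in JAX, `jax.grad` returns the exact gradient of the diagnostic with respect to the field by differentiating through every simulation step.

```python
import jax
import jax.numpy as jnp

# A minimal sketch (not the JAX-in-Cell API): a toy "simulator" that
# pushes particles in a uniform electric field E0, followed by a
# diagnostic loss. Because every step is written in JAX, jax.grad
# differentiates through the whole simulation, no finite differences.

DT = 0.01  # time step (arbitrary units, chosen for illustration)

def push(carry, _):
    x, v, E0 = carry
    v = v + DT * E0            # acceleration from the field (q/m = 1)
    x = x + DT * v             # position update
    return (x, v, E0), None

def simulate(E0, x0, v0, steps=100):
    (x, v, _), _ = jax.lax.scan(push, (x0, v0, E0), None, length=steps)
    return x, v

def loss(E0):
    # Task-relevant objective: match a target mean-velocity diagnostic.
    x0 = jnp.linspace(0.0, 1.0, 64)
    v0 = jnp.zeros(64)
    _, v = simulate(E0, x0, v0)
    return (jnp.mean(v) - 0.5) ** 2

grad_loss = jax.jit(jax.grad(loss))   # d(loss)/d(E0) through all 100 steps

E0 = 0.1
for _ in range(50):                   # plain gradient descent on the field
    E0 = E0 - 0.25 * grad_loss(E0)
```

Swapping the toy push for a real PIC step leaves the pattern unchanged: whatever parameters feed the forward model can be optimized directly against whatever diagnostic closes the loop.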
👋 Welcome!
We’re using Discussions as a place to connect with other members of our community. We hope that you:
- Ask questions you’re wondering about.
- Share ideas.
- Engage with other community members.
- Welcome others and are open-minded. Remember that this is a community we build together 💪.
To get started, comment below with an introduction of yourself and tell us about what you do with this community.