DeepXDE is a library for scientific machine learning. Use DeepXDE if you need a deep learning library that
- solves forward and inverse partial differential equations (PDEs) via physics-informed neural networks (PINNs),
- solves forward and inverse integro-differential equations (IDEs) via PINNs,
- solves forward and inverse fractional partial differential equations (fPDEs) via fractional PINNs (fPINNs),
- approximates nonlinear operators via deep operator network (DeepONet),
- approximates functions from multi-fidelity data via multi-fidelity NN (MFNN),
- approximates functions from a dataset with/without constraints.
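The physics-informed loss underlying the PINN approach above can be illustrated with a minimal sketch. This is a toy, not DeepXDE code: the PDE u''(x) = -sin(x) on [0, pi] with u(0) = u(pi) = 0 is assumed, a plain candidate function stands in for the neural network, and central finite differences stand in for the automatic differentiation a real PINN uses.

```python
# Toy illustration of a physics-informed loss (not DeepXDE code).
# PDE: u''(x) = -sin(x) on [0, pi], u(0) = u(pi) = 0; exact solution u = sin.
# A candidate function replaces the neural network, and finite differences
# replace automatic differentiation.
import math

def pde_rhs(x):
    return -math.sin(x)

def physics_loss(u, n=50, h=1e-4):
    """Mean squared PDE residual u''(x) - f(x) plus a boundary penalty."""
    xs = [math.pi * (i + 0.5) / n for i in range(n)]  # interior collocation points
    residuals = [
        (u(x + h) - 2 * u(x) + u(x - h)) / h**2 - pde_rhs(x)  # u'' via finite differences
        for x in xs
    ]
    interior = sum(r * r for r in residuals) / n
    boundary = u(0.0) ** 2 + u(math.pi) ** 2  # Dirichlet BC penalty
    return interior + boundary

# The exact solution drives the loss to (numerically) zero; a wrong candidate does not.
loss_exact = physics_loss(math.sin)
loss_wrong = physics_loss(lambda x: x * (math.pi - x))
```

Training a PINN amounts to minimizing such a loss over the network parameters, with the collocation points playing the role of "training data" sampled from the domain.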
DeepXDE supports three tensor libraries as backends: TensorFlow 1.x (tensorflow.compat.v1 in TensorFlow 2.x), TensorFlow 2.x, and PyTorch.
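The backend is chosen before DeepXDE is imported, for example through the DDE_BACKEND environment variable (DeepXDE also supports a config file; the snippet below only sets the variable and leaves the actual import commented out so it runs without DeepXDE installed):

```python
# Select the DeepXDE backend via an environment variable.
# This must happen before "import deepxde"; the import itself is
# commented out so this snippet is self-contained.
import os

os.environ["DDE_BACKEND"] = "pytorch"  # or "tensorflow", "tensorflow.compat.v1"
# import deepxde as dde  # would now use the PyTorch backend
```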
Papers on algorithms
- Solving PDEs and IDEs via PINN: SIAM Rev.
- Solving fPDEs via fPINN: SIAM J. Sci. Comput.
- Solving stochastic PDEs via NN-arbitrary polynomial chaos (NN-aPC): J. Comput. Phys.
- Solving inverse design/topology optimization via hPINN: arXiv
- Learning nonlinear operators via DeepONet: Nat. Mach. Intell., J. Comput. Phys., J. Comput. Phys.
- Learning from multi-fidelity data via MFNN: J. Comput. Phys., PNAS
Besides the algorithms above, DeepXDE supports many features:
- complex domain geometries without the tyranny of mesh generation. The primitive geometries are interval, triangle, rectangle, polygon, disk, cuboid, and sphere. Other geometries can be constructed as constructive solid geometry (CSG) using three boolean operations: union, difference, and intersection.
- multi-physics, i.e., (time-dependent) coupled PDEs.
- 5 types of boundary conditions (BCs): Dirichlet, Neumann, Robin, periodic, and a general BC, which can be defined on an arbitrary domain or on a point set.
- different neural networks, such as (stacked/unstacked) fully connected neural networks, residual neural networks, and (spatio-temporal) multi-scale Fourier feature networks.
- 6 sampling methods: uniform, pseudorandom, Latin hypercube sampling, Halton sequence, Hammersley sequence, and Sobol sequence. The training points can be kept fixed during training or resampled every given number of iterations.
- conveniently saving the model during training and loading a trained model.
- uncertainty quantification using dropout.
- many different (weighted) losses, optimizers, learning rate schedules, metrics, etc.
- callbacks to monitor the internal states and statistics of the model during training, such as early stopping.
- compact user code that closely resembles the mathematical formulation.
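The CSG construction in the feature list can be sketched with plain point-membership predicates. This is a toy illustration, not DeepXDE's geometry classes (which additionally sample points and handle boundaries), but it shows how the three boolean operations compose primitive shapes; the `rectangle`/`disk` helper names here are made up for the example.

```python
# Toy constructive solid geometry (CSG) via point-membership predicates.
# Not DeepXDE's API: its geometry classes also sample training points and
# handle boundaries; this only demonstrates the three boolean operations.
class Geometry:
    def __init__(self, inside):
        self.inside = inside  # inside: (x, y) -> bool

    def __or__(self, other):   # union
        return Geometry(lambda x, y: self.inside(x, y) or other.inside(x, y))

    def __and__(self, other):  # intersection
        return Geometry(lambda x, y: self.inside(x, y) and other.inside(x, y))

    def __sub__(self, other):  # difference
        return Geometry(lambda x, y: self.inside(x, y) and not other.inside(x, y))

def rectangle(x0, y0, x1, y1):
    return Geometry(lambda x, y: x0 <= x <= x1 and y0 <= y <= y1)

def disk(cx, cy, r):
    return Geometry(lambda x, y: (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2)

# A rectangular plate with a circular hole, built as a CSG difference.
plate_with_hole = rectangle(0, 0, 2, 1) - disk(1, 0.5, 0.25)
in_plate = plate_with_hole.inside(0.1, 0.1)  # inside rectangle, outside hole
in_hole = plate_with_hole.inside(1.0, 0.5)   # center of the hole: excluded
```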
All the components of DeepXDE are loosely coupled, and thus DeepXDE is well-structured and highly configurable. It is easy to customize DeepXDE to meet new demands.
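As an example of the low-discrepancy sampling methods listed above, a Halton sequence can be generated in a few lines. This is a standalone sketch, not DeepXDE's implementation (its samplers live in the deepxde.data.sampler module documented below):

```python
# Standalone sketch of a Halton sequence (not DeepXDE's implementation).
# The 1D Halton sequence in base b is the radical inverse of i = 1, 2, 3, ...;
# a d-dimensional sequence pairs radical inverses in distinct prime bases.
def radical_inverse(i, base):
    """Reflect the base-`base` digits of i about the radix point."""
    result, f = 0.0, 1.0 / base
    while i > 0:
        result += f * (i % base)
        i //= base
        f /= base
    return result

def halton(n, bases=(2, 3)):
    """First n points of the Halton sequence in the unit square."""
    return [tuple(radical_inverse(i, b) for b in bases) for i in range(1, n + 1)]

points = halton(4)
# The first point is (0.5, 1/3): i = 1 reads as "0.1" in base 2 and base 3.
```

Unlike pseudorandom sampling, consecutive Halton points fill the domain evenly, which is why such sequences are popular choices for collocation points.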
If you are looking for information on a specific function, class or method, this part of the documentation is for you.
- deepxde.data.constraint module
- deepxde.data.data module
- deepxde.data.dataset module
- deepxde.data.fpde module
- deepxde.data.func_constraint module
- deepxde.data.function module
- deepxde.data.helper module
- deepxde.data.ide module
- deepxde.data.mf module
- deepxde.data.pde module
- deepxde.data.sampler module
- deepxde.data.triple module