DeepXDE is a library for scientific machine learning and physics-informed learning. DeepXDE includes algorithms such as the physics-informed neural network (PINN), the deep operator network (DeepONet), and the multifidelity neural network (MFNN).

DeepXDE supports five tensor libraries as backends: TensorFlow 1.x (tensorflow.compat.v1 in TensorFlow 2.x), TensorFlow 2.x, PyTorch, JAX, and PaddlePaddle. To select one, see Working with different backends.
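As a minimal sketch of backend selection, the backend can be chosen through the `DDE_BACKEND` environment variable, which DeepXDE reads at import time (the import itself is commented out here so the sketch runs without DeepXDE installed):

```python
import os

# Choose the backend BEFORE importing DeepXDE; the library reads the
# DDE_BACKEND environment variable at import time.
os.environ["DDE_BACKEND"] = "pytorch"  # e.g. "tensorflow", "jax", "paddle"

# import deepxde as dde  # would now initialize with the PyTorch backend
print(os.environ["DDE_BACKEND"])
```

The same choice can also be made persistently via DeepXDE's configuration mechanisms described in Working with different backends.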

Documentation: ReadTheDocs

[Figures: physics-informed neural network (PINN), deep operator network (DeepONet), multifidelity neural network (MFNN), and supported backends]


DeepXDE has implemented many algorithms as shown above and supports many features:

  • enables user code to be compact, closely resembling the mathematical formulation.

  • complex domain geometries without the tyranny of mesh generation. The primitive geometries are interval, triangle, rectangle, polygon, disk, ellipse, star-shaped, cuboid, sphere, hypercube, and hypersphere. Other geometries can be constructed with constructive solid geometry (CSG) using three Boolean operations: union, difference, and intersection. DeepXDE also supports a geometry represented by a point cloud.

  • 5 types of boundary conditions (BCs): Dirichlet, Neumann, Robin, periodic, and a general BC, which can be defined on an arbitrary domain or on a point set; and approximate distance functions for hard constraints.

  • 3 automatic differentiation (AD) methods to compute derivatives: reverse mode (i.e., backpropagation), forward mode, and zero coordinate shift (ZCS).

  • different neural networks: fully connected neural network (FNN), stacked FNN, residual neural network, (spatio-temporal) multi-scale Fourier feature networks, etc.

  • many sampling methods: uniform, pseudorandom, Latin hypercube sampling, Halton sequence, Hammersley sequence, and Sobol sequence. The training points can remain fixed during training or be resampled (adaptively) every certain number of iterations.

  • 4 function spaces: power series, Chebyshev polynomials, and Gaussian random fields (1D and 2D).

  • data-parallel training on multiple GPUs.

  • different optimizers: Adam, L-BFGS, etc.

  • conveniently save the model during training and load a trained model.

  • callbacks to monitor the internal states and statistics of the model during training: early stopping, etc.

  • uncertainty quantification using dropout.

  • support for float16, float32, and float64 precision.

  • many other useful features: different (weighted) losses, learning rate schedules, metrics, etc.
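The three Boolean operations behind the CSG geometries listed above can be illustrated with a small standalone sketch (plain Python for clarity, not the DeepXDE API): each geometry is a membership predicate, and union, difference, and intersection combine predicates.

```python
import math

# Each "geometry" is a predicate: point (x, y) -> bool.
def disk(cx, cy, r):
    return lambda x, y: math.hypot(x - cx, y - cy) <= r

def rectangle(x0, y0, x1, y1):
    return lambda x, y: x0 <= x <= x1 and y0 <= y <= y1

# The three Boolean CSG operations on membership predicates.
def union(a, b):        return lambda x, y: a(x, y) or b(x, y)
def difference(a, b):   return lambda x, y: a(x, y) and not b(x, y)
def intersection(a, b): return lambda x, y: a(x, y) and b(x, y)

# A rectangle with a circular hole, a typical CSG construction.
geom = difference(rectangle(0, 0, 2, 1), disk(1, 0.5, 0.25))

print(geom(0.1, 0.1))  # True: inside the rectangle, outside the hole
print(geom(1.0, 0.5))  # False: center of the hole
```

In DeepXDE itself, CSG geometries are built from the primitive geometry classes rather than from raw predicates, but the set semantics are the same.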
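The approximate distance functions mentioned for hard BC constraints can be sketched in one dimension (a conceptual illustration with hypothetical boundary data, not DeepXDE code): the solution is composed as u(x) = g(x) + phi(x) * N(x), where phi vanishes on the boundary, so the Dirichlet condition is satisfied exactly no matter what the network outputs.

```python
# Hard Dirichlet constraints via an approximate distance function (ADF).
def g(x):
    # Boundary data extended into the domain: u(0) = 0, u(1) = 1.
    return x

def phi(x):
    # ADF for the interval [0, 1]: exactly zero at x = 0 and x = 1.
    return x * (1.0 - x)

def N(x):
    # Stand-in for a neural network output (any function works here).
    return 3.7 * x + 0.2

def u(x):
    # Composed solution: matches g on the boundary by construction.
    return g(x) + phi(x) * N(x)

print(u(0.0))  # 0.0 -> u(0) = 0 holds exactly
print(u(1.0))  # 1.0 -> u(1) = 1 holds exactly
```

Because the constraint holds by construction, no boundary loss term is needed for it during training.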
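The difference between the reverse- and forward-mode AD options above can be made concrete with a minimal forward-mode sketch using dual numbers (a standalone illustration of the idea, not DeepXDE's implementation): each value carries its derivative along, so one forward pass yields f(x) and f'(x) together.

```python
# Forward-mode AD sketch: a dual number stores (value, derivative).
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

y = f(Dual(2.0, 1.0))  # seed the derivative dx/dx = 1
print(y.val)  # 12.0
print(y.dot)  # 14.0
```

Reverse mode (backpropagation) instead records the computation and sweeps backward, which is cheaper when one scalar output depends on many inputs; forward mode wins in the opposite regime.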
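One of the low-discrepancy samplers listed above, the Halton sequence, is simple enough to sketch directly (plain Python via the radical inverse, with one prime base per dimension; a conceptual illustration, not DeepXDE's sampler):

```python
# Halton sequence: the i-th coordinate in base b is the radical inverse,
# i.e. the base-b digits of i mirrored across the decimal point.
def radical_inverse(i, base):
    result, f = 0.0, 1.0 / base
    while i > 0:
        result += (i % base) * f
        i //= base
        f /= base
    return result

def halton(n, bases=(2, 3)):
    # n points in [0, 1)^d with d = len(bases); indices start at 1.
    return [tuple(radical_inverse(i, b) for b in bases)
            for i in range(1, n + 1)]

for p in halton(4):
    print(p)
# Base 2 yields 1/2, 1/4, 3/4, 1/8, ...; base 3 yields 1/3, 2/3, 1/9, ...
```

Unlike pseudorandom sampling, consecutive points fill the domain evenly, which tends to reduce the variance of the PDE residual estimate for a fixed number of training points.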

All components of DeepXDE are loosely coupled, making the library well structured and highly configurable. It is easy to customize DeepXDE to meet new demands.

User guide

API reference

If you are looking for information on a specific function, class or method, this part of the documentation is for you.


Indices and tables