deepxde.nn.tensorflow

deepxde.nn.tensorflow.deeponet module

class deepxde.nn.tensorflow.deeponet.DeepONet(*args, **kwargs)[source]

Bases: NN

Deep operator network.

Lu et al. Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nat Mach Intell, 2021.

Parameters:
  • layer_sizes_branch – A list of integers as the width of a fully connected network, or (dim, f) where dim is the input dimension and f is a network function. The width of the last layer in the branch and trunk net should be the same for all strategies except “split_branch” and “split_trunk”.

  • layer_sizes_trunk (list) – A list of integers as the width of a fully connected network.

  • activation – If activation is a string, then the same activation is used in both trunk and branch nets. If activation is a dict, then the trunk net uses the activation activation[“trunk”], and the branch net uses activation[“branch”].

  • num_outputs (integer) – Number of outputs. In case of multiple outputs, i.e., num_outputs > 1, multi_output_strategy below should be set.

  • multi_output_strategy (str or None) –

None, “independent”, “split_both”, “split_branch” or “split_trunk”. Required when num_outputs > 1.

    • None

    Classical implementation of DeepONet with a single output. Cannot be used with num_outputs > 1.

    • independent

    Use num_outputs independent DeepONets, and each DeepONet outputs only one function.

    • split_both

    Split the outputs of both the branch net and the trunk net into num_outputs groups, and then the kth group outputs the kth solution.

    • split_branch

    Split the branch net and share the trunk net. The width of the last layer in the branch net should be equal to the one in the trunk net multiplied by the number of outputs.

    • split_trunk

    Split the trunk net and share the branch net. The width of the last layer in the trunk net should be equal to the one in the branch net multiplied by the number of outputs.

build_branch_net(layer_sizes_branch)[source]
build_trunk_net(layer_sizes_trunk)[source]
call(inputs, training=False)[source]
static concatenate_outputs(ys)[source]
merge_branch_trunk(x_func, x_loc, index)[source]
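The core DeepONet operation merges the branch and trunk encodings by a dot product over their shared last dimension, which is why the two nets must end with the same width (except under the “split” strategies). A minimal NumPy sketch of this merge — an illustration of the idea, not the library's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

p = 40  # shared width of the last branch and trunk layers
n = 8   # number of (input function, evaluation point) pairs

# b[i] stands in for branch(v_i): the encoding of the i-th input function.
# t[i] stands in for trunk(y_i): the encoding of the i-th evaluation point.
b = rng.standard_normal((n, p))
t = rng.standard_normal((n, p))

# G(v_i)(y_i) ~ sum_k b[i, k] * t[i, k]  (a bias term is often added as well)
outputs = np.einsum("ik,ik->i", b, t)[:, None]
print(outputs.shape)  # (8, 1)
```

Under “split_branch”/“split_trunk”, the wider net's last layer is sliced into num_outputs groups and each group is merged with the shared net in the same way.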
class deepxde.nn.tensorflow.deeponet.DeepONetCartesianProd(*args, **kwargs)[source]

Bases: NN

Deep operator network for dataset in the format of Cartesian product.

Parameters:
  • layer_sizes_branch – A list of integers as the width of a fully connected network, or (dim, f) where dim is the input dimension and f is a network function. The width of the last layer in the branch and trunk net should be the same for all strategies except “split_branch” and “split_trunk”.

  • layer_sizes_trunk (list) – A list of integers as the width of a fully connected network.

  • activation – If activation is a string, then the same activation is used in both trunk and branch nets. If activation is a dict, then the trunk net uses the activation activation[“trunk”], and the branch net uses activation[“branch”].

  • num_outputs (integer) – Number of outputs. In case of multiple outputs, i.e., num_outputs > 1, multi_output_strategy below should be set.

  • multi_output_strategy (str or None) –

None, “independent”, “split_both”, “split_branch” or “split_trunk”. Required when num_outputs > 1.

    • None

    Classical implementation of DeepONet with a single output. Cannot be used with num_outputs > 1.

    • independent

    Use num_outputs independent DeepONets, and each DeepONet outputs only one function.

    • split_both

    Split the outputs of both the branch net and the trunk net into num_outputs groups, and then the kth group outputs the kth solution.

    • split_branch

    Split the branch net and share the trunk net. The width of the last layer in the branch net should be equal to the one in the trunk net multiplied by the number of outputs.

    • split_trunk

    Split the trunk net and share the branch net. The width of the last layer in the trunk net should be equal to the one in the branch net multiplied by the number of outputs.

build_branch_net(layer_sizes_branch)[source]
build_trunk_net(layer_sizes_trunk)[source]
call(inputs, training=False)[source]
static concatenate_outputs(ys)[source]
merge_branch_trunk(x_func, x_loc, index)[source]
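In the Cartesian-product format, the branch net receives N input functions and the trunk net receives P locations, and every function is evaluated at every location, so the merge produces an (N, P) output instead of pairing rows one-to-one. A NumPy sketch of the resulting shapes (illustrative, not the library code):

```python
import numpy as np

rng = np.random.default_rng(0)

p = 40       # shared last-layer width of branch and trunk
n_func = 5   # number of input functions (branch batch size)
n_loc = 7    # number of evaluation points (trunk batch size)

b = rng.standard_normal((n_func, p))  # stands in for branch(v_1..v_N)
t = rng.standard_normal((n_loc, p))   # stands in for trunk(y_1..y_P)

# Every function evaluated at every location: outputs[i, j] ~ G(v_i)(y_j)
outputs = np.einsum("ik,jk->ij", b, t)
print(outputs.shape)  # (5, 7)
```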
class deepxde.nn.tensorflow.deeponet.PODDeepONet(*args, **kwargs)[source]

Bases: NN

Deep operator network with proper orthogonal decomposition (POD) for dataset in the format of Cartesian product.

Parameters:
  • pod_basis – POD basis used in the trunk net.

  • layer_sizes_branch – A list of integers as the width of a fully connected network, or (dim, f) where dim is the input dimension and f is a network function. The width of the last layer in the branch and trunk net should be equal.

  • activation – If activation is a string, then the same activation is used in both trunk and branch nets. If activation is a dict, then the trunk net uses the activation activation[“trunk”], and the branch net uses activation[“branch”].

  • layer_sizes_trunk (list) – A list of integers as the width of a fully connected network. If None, then only use POD basis as the trunk net.

References

L. Lu, X. Meng, S. Cai, Z. Mao, S. Goswami, Z. Zhang, & G. E. Karniadakis. A comprehensive and fair comparison of two neural operators (with practical extensions) based on FAIR data. arXiv preprint arXiv:2111.05512, 2021.

call(inputs, training=False)[source]
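With POD, the precomputed basis plays the role of the trunk net: each output function is a linear combination of the POD modes, and the branch net predicts the combination coefficients. A NumPy sketch of that reconstruction, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)

n_modes = 8     # number of POD modes kept
n_points = 100  # grid points per output function
n_func = 4      # number of input functions

# Columns stand in for POD modes evaluated on the output grid.
pod_basis = rng.standard_normal((n_points, n_modes))
# Rows stand in for branch-net outputs: one coefficient per mode.
coeffs = rng.standard_normal((n_func, n_modes))

# Each output function = coefficient-weighted sum of the POD modes.
outputs = coeffs @ pod_basis.T
print(outputs.shape)  # (4, 100)
```

When layer_sizes_trunk is given, the trunk net's outputs are used alongside the POD basis; when it is None, the basis alone serves as the trunk.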

deepxde.nn.tensorflow.fnn module

class deepxde.nn.tensorflow.fnn.FNN(*args, **kwargs)[source]

Bases: NN

Fully-connected neural network.

call(inputs, training=False)[source]
class deepxde.nn.tensorflow.fnn.PFNN(*args, **kwargs)[source]

Bases: NN

Parallel fully-connected neural network that uses independent sub-networks for each network output.

Parameters:

layer_sizes – A nested list defining the architecture of the neural network (how the layers are connected). If layer_sizes[i] is an int, it represents one layer shared by all the outputs; if layer_sizes[i] is a list, it represents len(layer_sizes[i]) sub-layers, each of which is used exclusively by one output. Note that len(layer_sizes[i]) should equal the number of outputs. Each number specifies the number of neurons in that layer.

call(inputs, training=False)[source]
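The nested layer_sizes convention can be pictured as shared layers feeding independent per-output sub-networks. A minimal NumPy sketch of the semantics of a layer_sizes value like [2, 16, [8, 8], 1] (random weights, tanh everywhere — a simplification, not the library's forward pass):

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, n_out):
    """One dense layer with freshly drawn random weights and tanh."""
    w = rng.standard_normal((x.shape[-1], n_out))
    return np.tanh(x @ w)

# layer_sizes = [2, 16, [8, 8], 1]:
# input dim 2, one layer of 16 shared by all outputs, then one
# private layer of 8 per output, then one scalar per sub-network.
x = rng.standard_normal((5, 2))
shared = dense(x, 16)            # int entry: shared by both outputs
y1 = dense(dense(shared, 8), 1)  # list entry: sub-network for output 1
y2 = dense(dense(shared, 8), 1)  # list entry: sub-network for output 2
y = np.concatenate([y1, y2], axis=1)
print(y.shape)  # (5, 2)
```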

deepxde.nn.tensorflow.nn module

class deepxde.nn.tensorflow.nn.NN(*args, **kwargs)[source]

Bases: Model

Base class for all neural network modules.

apply_feature_transform(transform)[source]

Compute the features by applying a transform to the network inputs, i.e., features = transform(inputs). Then, outputs = network(features).

apply_output_transform(transform)[source]

Apply a transform to the network outputs, i.e., outputs = transform(inputs, outputs).
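Together, the two transforms wrap the network as outputs = output_transform(inputs, network(feature_transform(inputs))). A plain-Python sketch of that composition, where net, feature_transform, and output_transform are illustrative stand-ins (e.g., an output transform that hard-enforces a zero value at x = 0):

```python
import numpy as np

def net(features):
    """Stand-in for the neural network."""
    return features.sum(axis=-1, keepdims=True)

def feature_transform(inputs):
    # Example: append sin(x) as an extra input feature.
    return np.concatenate([inputs, np.sin(inputs)], axis=-1)

def output_transform(inputs, outputs):
    # Example: multiply by x so the output is exactly 0 at x = 0.
    return inputs * outputs

x = np.linspace(0.0, 1.0, 5)[:, None]
y = output_transform(x, net(feature_transform(x)))
print(y.shape)  # (5, 1); y[0] is exactly 0
```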

property auxiliary_vars

Any additional variables needed.

Type:

Tensors

num_trainable_parameters()[source]

Return the number of trainable parameters of the NN.