deepxde.optimizers.tensorflow_compat_v1 package

Submodules

deepxde.optimizers.tensorflow_compat_v1.optimizers module

deepxde.optimizers.tensorflow_compat_v1.optimizers.get(loss, optimizer, learning_rate=None, decay=None)[source]

Retrieves an Optimizer instance.
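For orientation, a minimal usage sketch. The string name "adam" and the exact type of the returned training object are assumptions following the conventions of the tensorflow.compat.v1 backend, not guarantees from this docstring:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # this backend runs in graph mode

from deepxde.optimizers.tensorflow_compat_v1.optimizers import get

x = tf.Variable([3.0, 4.0])
loss = tf.reduce_sum(tf.square(x))  # scalar loss tensor

# "adam" is assumed to be an accepted optimizer name; gradient-based
# optimizers require a learning rate.
train_op = get(loss, "adam", learning_rate=1e-3)
```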

deepxde.optimizers.tensorflow_compat_v1.optimizers.is_external_optimizer(optimizer)[source]
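No docstring is provided; by the conventions of this backend, the predicate distinguishes SciPy-driven optimizers (handled through the scipy_optimizer module below) from native TensorFlow optimizers. A minimal sketch, with the specific accepted names treated as assumptions:

```python
from deepxde.optimizers.tensorflow_compat_v1.optimizers import is_external_optimizer

# "L-BFGS-B" (SciPy-driven) and "adam" (native TensorFlow) are assumed names.
print(is_external_optimizer("L-BFGS-B"))  # expected: True
print(is_external_optimizer("adam"))      # expected: False
```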

deepxde.optimizers.tensorflow_compat_v1.scipy_optimizer module

TensorFlow interface for SciPy optimizers.

class deepxde.optimizers.tensorflow_compat_v1.scipy_optimizer.ExternalOptimizerInterface(loss, var_list=None, equalities=None, inequalities=None, var_to_bounds=None, **optimizer_kwargs)[source]

Bases: object

Base class for interfaces with external optimization algorithms. Subclass this and implement _minimize in order to wrap a new optimization algorithm. ExternalOptimizerInterface should not be instantiated directly; instead use a concrete subclass such as ScipyOptimizerInterface.

minimize(session=None, feed_dict=None, fetches=None, step_callback=None, loss_callback=None, **run_kwargs)[source]

Minimize a scalar Tensor. Variables subject to optimization are updated in-place at the end of optimization. Note that this method does not just return a minimization Op, unlike Optimizer.minimize(); instead it actually performs minimization by executing commands to control a Session.

Parameters:
  • session – A Session instance.

  • feed_dict – A feed dict to be passed to calls to session.run.

  • fetches – A list of Tensors to fetch and supply to loss_callback as positional arguments.

  • step_callback – A function to be called at each optimization step; arguments are the current values of all optimization variables flattened into a single vector.

  • loss_callback – A function to be called every time the loss and gradients are computed, with evaluated fetches supplied as positional arguments.

  • **run_kwargs – kwargs to pass to session.run.
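To illustrate the callback hooks, a minimal sketch using the ScipyOptimizerInterface subclass documented below; the printed values are for demonstration only:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

from deepxde.optimizers.tensorflow_compat_v1.scipy_optimizer import ScipyOptimizerInterface

vector = tf.Variable([7.0, 7.0], name="vector")
loss = tf.reduce_sum(tf.square(vector))
optimizer = ScipyOptimizerInterface(loss, options={"maxiter": 100})

with tf.compat.v1.Session() as session:
    session.run(tf.compat.v1.global_variables_initializer())
    optimizer.minimize(
        session,
        fetches=[loss],
        # Called at each step with all variables flattened into one vector.
        step_callback=lambda xk: print("step, flattened vars:", xk),
        # Called with the evaluated fetches as positional arguments.
        loss_callback=lambda loss_value: print("loss:", loss_value),
    )
```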

class deepxde.optimizers.tensorflow_compat_v1.scipy_optimizer.ScipyOptimizerInterface(loss, var_list=None, equalities=None, inequalities=None, var_to_bounds=None, **optimizer_kwargs)[source]

Bases: ExternalOptimizerInterface

Wrapper allowing scipy.optimize.minimize to drive a tf.compat.v1.Session.

Example:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # Session-based execution requires graph mode

vector = tf.Variable([7., 7.], name='vector')

# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))

optimizer = ScipyOptimizerInterface(loss, options={'maxiter': 100})

with tf.compat.v1.Session() as session:
    session.run(tf.compat.v1.global_variables_initializer())
    optimizer.minimize(session)

# The value of vector should now be [0., 0.].
```

Example with simple bound constraints:

```python
import numpy as np

vector = tf.Variable([7., 7.], name='vector')

# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))

optimizer = ScipyOptimizerInterface(
    loss, var_to_bounds={vector: ([1, 2], np.inf)})

with tf.compat.v1.Session() as session:
    session.run(tf.compat.v1.global_variables_initializer())
    optimizer.minimize(session)

# The value of vector should now be [1., 2.].
```

Example with more complicated constraints:

```python
vector = tf.Variable([7., 7.], name='vector')

# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))

# Ensure the vector's y component is = 1.
equalities = [vector[1] - 1.]
# Ensure the vector's x component is >= 1.
inequalities = [vector[0] - 1.]

# Our default SciPy optimization algorithm, L-BFGS-B, does not support
# general constraints. Thus we use SLSQP instead.
optimizer = ScipyOptimizerInterface(
    loss, equalities=equalities, inequalities=inequalities, method='SLSQP')

with tf.compat.v1.Session() as session:
    session.run(tf.compat.v1.global_variables_initializer())
    optimizer.minimize(session)

# The value of vector should now be [1., 1.].
```

deepxde.optimizers.tensorflow_compat_v1.tfp_optimizer module

Module contents

deepxde.optimizers.tensorflow_compat_v1.get(loss, optimizer, learning_rate=None, decay=None)[source]

Retrieves an Optimizer instance.

deepxde.optimizers.tensorflow_compat_v1.is_external_optimizer(optimizer)[source]
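The package re-exports these two helpers. A minimal sketch of how they combine, assuming that external names dispatch to the SciPy-driven interface while native names yield a training Op:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

import deepxde.optimizers.tensorflow_compat_v1 as optimizers

x = tf.Variable([1.0, 2.0])
loss = tf.reduce_sum(tf.square(x))

name = "L-BFGS-B"  # assumed optimizer name
if optimizers.is_external_optimizer(name):
    # Assumed to return a SciPy-driven interface that minimizes in-place.
    opt = optimizers.get(loss, name)
    with tf.compat.v1.Session() as session:
        session.run(tf.compat.v1.global_variables_initializer())
        opt.minimize(session)
else:
    # Native optimizers require a learning rate and return a training Op.
    train_op = optimizers.get(loss, name, learning_rate=1e-3)
```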