tf.contrib.keras.optimizers.Adam
class tf.contrib.keras.optimizers.Adam
Defined in tensorflow/contrib/keras/python/keras/optimizers.py.
Adam optimizer.
Default parameters follow those provided in the original paper.
Arguments:
  lr: float >= 0. Learning rate.
  beta_1: float, 0 < beta < 1. Generally close to 1.
  beta_2: float, 0 < beta < 1. Generally close to 1.
  epsilon: float >= 0. Fuzz factor.
  decay: float >= 0. Learning rate decay over each update.
References:
  - Adam - A Method for Stochastic Optimization
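The per-parameter update that the arguments above control can be sketched in plain Python. This is an illustrative single-scalar version of the Adam rule from the paper (with the optional time-based decay applied to the learning rate), not the library implementation:

```python
import math

def adam_step(param, grad, m, v, t,
              lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0):
    """One Adam update for a single scalar parameter (illustrative sketch,
    not the TensorFlow implementation). t is the 1-based iteration count."""
    # Optional time-based learning-rate decay over each update.
    lr_t = lr * (1.0 / (1.0 + decay * (t - 1)))
    # Biased first- and second-moment estimates of the gradient.
    m = beta_1 * m + (1.0 - beta_1) * grad
    v = beta_2 * v + (1.0 - beta_2) * grad * grad
    # Bias-corrected step size.
    lr_t = lr_t * math.sqrt(1.0 - beta_2 ** t) / (1.0 - beta_1 ** t)
    # epsilon is the fuzz factor guarding against division by zero.
    param = param - lr_t * m / (math.sqrt(v) + epsilon)
    return param, m, v
```

With the defaults, the very first step moves the parameter by approximately lr in the direction opposite the gradient, since the bias correction cancels the moment initialization at zero.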
Methods
__init__
__init__(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0, **kwargs)
from_config
from_config(cls, config)
get_config
get_config()
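get_config and from_config form a serialization round trip: get_config returns a plain Python dict of constructor arguments, and the from_config classmethod rebuilds an equivalent optimizer from such a dict. A minimal sketch of that pattern, using a hypothetical stand-in class rather than the TensorFlow implementation:

```python
class AdamLike(object):
    """Stand-in illustrating the get_config/from_config round trip
    (hypothetical class, not tf.contrib.keras.optimizers.Adam)."""

    def __init__(self, lr=0.001, beta_1=0.9, beta_2=0.999,
                 epsilon=1e-08, decay=0.0):
        self.lr = lr
        self.beta_1 = beta_1
        self.beta_2 = beta_2
        self.epsilon = epsilon
        self.decay = decay

    def get_config(self):
        # A plain dict of constructor arguments, suitable for JSON.
        return {'lr': self.lr, 'beta_1': self.beta_1, 'beta_2': self.beta_2,
                'epsilon': self.epsilon, 'decay': self.decay}

    @classmethod
    def from_config(cls, config):
        # Rebuild an equivalent optimizer from a config dict.
        return cls(**config)

opt = AdamLike(lr=0.01)
clone = AdamLike.from_config(opt.get_config())
```

The clone carries the same hyperparameters as the original, which is how Keras restores optimizer settings when deserializing a model.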
get_gradients
get_gradients(loss, params)
get_updates
get_updates(params, constraints, loss)
get_weights
get_weights()
Returns the current value of the weights of the optimizer.
Returns:
A list of numpy arrays.
set_weights
set_weights(weights)
Sets the weights of the optimizer from Numpy arrays.
Should only be called after computing the gradients (otherwise the optimizer has no weights).
Arguments:
weights: a list of Numpy arrays. The number of arrays and their shapes must match the number and shapes of the weights of the optimizer (i.e. it should match the output of `get_weights`).
Raises:
ValueError: in case of incompatible weight shapes.
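Together, get_weights and set_weights move the optimizer's internal state in and out as Numpy arrays, e.g. to carry it between model instances. The contract described above, including the shape check that raises ValueError, can be sketched with a hypothetical stand-in class (not the TensorFlow implementation; assumes NumPy is available):

```python
import numpy as np

class OptimizerStateLike(object):
    """Stand-in showing the get_weights/set_weights contract
    (hypothetical class, not the TensorFlow implementation)."""

    def __init__(self):
        # Two state slots of shape (3,), e.g. moment estimates
        # that Adam would allocate for a 3-element parameter.
        self._weights = [np.zeros(3), np.zeros(3)]

    def get_weights(self):
        # Return the current values as a list of Numpy arrays.
        return [w.copy() for w in self._weights]

    def set_weights(self, weights):
        # The incoming list must match in length and per-array shape.
        if len(weights) != len(self._weights):
            raise ValueError('expected %d arrays' % len(self._weights))
        for ours, theirs in zip(self._weights, weights):
            if ours.shape != np.asarray(theirs).shape:
                raise ValueError('incompatible weight shapes')
        self._weights = [np.asarray(w).copy() for w in weights]

state = OptimizerStateLike()
state.set_weights([np.ones(3), np.full(3, 2.0)])
```

Passing an array of the wrong shape (say, length 4 instead of 3) raises ValueError, mirroring the behavior documented above.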
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/keras/optimizers/Adam