tensorflow::ops::ResourceApplyAdam

#include <training_ops.h>

Update '*var' according to the Adam algorithm.

Summary

$$\text{lr}_t := \text{learning\_rate} \cdot \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)$$
$$m_t := \beta_1 \cdot m_{t-1} + (1 - \beta_1) \cdot g$$
$$v_t := \beta_2 \cdot v_{t-1} + (1 - \beta_2) \cdot g \cdot g$$
$$\text{var} := \text{var} - \text{lr}_t \cdot m_t / (\sqrt{v_t} + \epsilon)$$
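A minimal usage sketch, assuming the TensorFlow C++ client API (Scope, ClientSession, VarHandleOp, AssignVariableOp); the shapes, hyperparameter values, and initial values below are illustrative assumptions, not part of this op's documentation:

#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/resource_variable_ops.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/cc/ops/training_ops.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope scope = Scope::NewRootScope();

  // Resource handles for the trainable variable and the Adam accumulators.
  auto var = VarHandleOp(scope, DT_FLOAT, PartialTensorShape({2}));
  auto m = VarHandleOp(scope, DT_FLOAT, PartialTensorShape({2}));
  auto v = VarHandleOp(scope, DT_FLOAT, PartialTensorShape({2}));

  auto init_var = AssignVariableOp(scope, var, Const(scope, {1.0f, 2.0f}));
  auto init_m = AssignVariableOp(scope, m, Const(scope, {0.0f, 0.0f}));
  auto init_v = AssignVariableOp(scope, v, Const(scope, {0.0f, 0.0f}));

  // All hyperparameters are scalars; beta1_power and beta2_power hold
  // beta1^t and beta2^t for the current step t (here t = 1, illustratively).
  auto update = ResourceApplyAdam(
      scope, var, m, v,
      /*beta1_power=*/Const(scope, 0.9f),
      /*beta2_power=*/Const(scope, 0.999f),
      /*lr=*/Const(scope, 0.001f),
      /*beta1=*/Const(scope, 0.9f),
      /*beta2=*/Const(scope, 0.999f),
      /*epsilon=*/Const(scope, 1e-8f),
      /*grad=*/Const(scope, {0.1f, -0.1f}));

  ClientSession session(scope);
  // Initialize the variables, then run one in-place Adam update.
  TF_CHECK_OK(session.Run({}, {}, {init_var, init_m, init_v}, nullptr));
  TF_CHECK_OK(session.Run({}, {}, {update}, nullptr));
  return 0;
}

The op mutates var, m, and v in place, so it is run for its side effect rather than for a fetched output.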

Arguments:

  • scope: A Scope object.
  • var: Should be from a Variable().
  • m: Should be from a Variable().
  • v: Should be from a Variable().
  • beta1_power: Must be a scalar.
  • beta2_power: Must be a scalar.
  • lr: Scaling factor. Must be a scalar.
  • beta1: Momentum factor. Must be a scalar.
  • beta2: Momentum factor. Must be a scalar.
  • epsilon: Ridge term. Must be a scalar.
  • grad: The gradient.

Optional attributes (see Attrs):