Artificial Intelligence - TensorFlow


Lesson #1097 - TensorFlow Optimizers

An optimizer is an extended class that includes added information for training a specific model. The optimizer class is initialized with the given parameters, but it is important to remember that no Tensor is needed. Optimizers are used to improve the speed and performance of training a specific model.

The base optimizer of TensorFlow is tf.train.Optimizer.

This class is defined in tensorflow/python/training/optimizer.py.

Following are some optimizers in TensorFlow −

  • Stochastic Gradient Descent
  • Stochastic Gradient Descent with gradient clipping
  • Momentum
  • Nesterov momentum
  • Adagrad
  • Adadelta
  • RMSProp
  • Adam
  • Adamax
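In the TensorFlow 1.x API, most of these are available as classes under tf.train. A minimal sketch of instantiating them is shown below; the learning rates are illustrative, not recommendations, and Adamax lives in tf.keras.optimizers rather than tf.train.

```python
import tensorflow.compat.v1 as tf  # TF 1.x style API

# Each of these is a subclass of the base class tf.train.Optimizer.
sgd_opt      = tf.train.GradientDescentOptimizer(learning_rate=0.01)
momentum_opt = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9)
nesterov_opt = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9,
                                          use_nesterov=True)
adagrad_opt  = tf.train.AdagradOptimizer(learning_rate=0.01)
adadelta_opt = tf.train.AdadeltaOptimizer()
rmsprop_opt  = tf.train.RMSPropOptimizer(learning_rate=0.001)
adam_opt     = tf.train.AdamOptimizer(learning_rate=0.001)

print(isinstance(adam_opt, tf.train.Optimizer))
```

Gradient clipping is not a separate class: the gradients are clipped (for example with tf.clip_by_value) between compute_gradients() and apply_gradients().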

We will focus on Stochastic Gradient Descent. An illustration of creating an optimizer for it is shown below −

    import numpy as np
    import tensorflow as tf

    def sgd(cost, params, lr = np.float32(0.01)):
        g_params = tf.gradients(cost, params)
        updates = []
        for param, g_param in zip(params, g_params):
            updates.append(param.assign(param - lr * g_param))
        return updates
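As a quick sanity check, the update ops returned by sgd can be run in a session to minimise a tiny quadratic cost. This is a self-contained sketch using the TF 1.x graph API; the cost function and learning rate are illustrative choices, not part of the lesson.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # tf.gradients requires graph mode

def sgd(cost, params, lr=np.float32(0.01)):
    g_params = tf.gradients(cost, params)
    updates = []
    for param, g_param in zip(params, g_params):
        updates.append(param.assign(param - lr * g_param))
    return updates

# Minimise cost = (w - 3)^2, whose minimum is at w = 3.
w = tf.Variable(0.0)
cost = tf.square(w - 3.0)
updates = sgd(cost, [w], lr=np.float32(0.1))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(updates)   # apply one SGD step to every parameter
    final_w = sess.run(w)

print(final_w)  # converges towards 3.0
```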

The basic parameters are defined within this function. In the subsequent chapter, we will focus on Gradient Descent Optimization with an implementation of optimizers.
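For comparison, the built-in tf.train.GradientDescentOptimizer performs the same kind of update through its minimize() method, without assembling the assign ops by hand. A sketch with an illustrative quadratic cost:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

w = tf.Variable(5.0)
cost = tf.square(w - 3.0)

# minimize() computes the gradients and applies the update in one op.
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)
    final_w = sess.run(w)

print(final_w)  # converges towards 3.0
```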

Does TensorFlow work on Mac M1?
Yes. TensorFlow runs with GPU support on Apple Silicon Macs, including the M1 Pro, via the tensorflow-metal PluggableDevice plugin −

    pip install tensorflow tensorflow-metal
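After installation, the GPU devices visible to TensorFlow can be listed to confirm that the Metal plugin was picked up. On a machine without a supported GPU the list is simply empty, so the snippet is safe to run anywhere.

```python
import tensorflow as tf

# On an M1 Pro with tensorflow-metal installed this list should
# contain one GPU entry; elsewhere it may be empty.
gpus = tf.config.list_physical_devices('GPU')
print(gpus)
```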

What is tf.zeros?
tf.zeros creates a tensor with every element set to zero.

Syntax: tf.zeros(shape, dataType)

Parameters −
  • shape − the shape of the tensor to generate.
  • dataType − the element type of the resulting tensor; it can be 'float32', 'int32' or 'bool'.
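A small example of the call; the shape and dtype are illustrative choices.

```python
import tensorflow as tf

# A 2x3 tensor of int32 zeros.
z = tf.zeros([2, 3], dtype=tf.int32)

print(z.shape)          # (2, 3)
print(z.numpy().sum())  # 0 -- every element is zero
```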