Artificial Intelligence - Keras

Lesson #899 - Keras Modules


As we learned earlier, Keras modules contain pre-defined classes, functions and variables that are useful for deep learning algorithms. Let us learn about the modules provided by Keras in this chapter.

Available modules

Below is the list of modules available in Keras; a short sketch showing several of them working together follows the list −
  • initializers − Provides a list of initializer functions.
  • regularizers − Provides a list of regularizer functions.
  • constraints − Provides a list of constraint functions.
  • activations − Provides a list of activation functions.
  • losses − Provides a list of loss functions.
  • metrics − Provides a list of metric functions. We can learn about it in detail in the Model Training chapter.
  • optimizers − Provides a list of optimizer functions. We can learn about it in detail in the Model Training chapter.
  • callbacks − Provides a list of callback functions. We can use them during the training process to print intermediate data as well as to stop the training itself (EarlyStopping method) based on some condition.
  • text processing − Provides functions to convert text into NumPy arrays suitable for machine learning. We can use it in the data preparation phase of machine learning.
  • image processing − Provides functions to convert images into NumPy arrays suitable for machine learning. We can use it in the data preparation phase of machine learning.
  • sequence processing − Provides functions to generate time-based data from the given input data. We can use it in the data preparation phase of machine learning.
  • backend − Provides functions of the backend libraries like TensorFlow and Theano.
  • utilities − Provides lots of utility functions useful in deep learning.
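
As an illustrative sketch only (the layer sizes, hyper-parameter values and random data below are assumptions, not values from this lesson), the following snippet shows how several of these modules fit together when defining and training a small model −

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras import initializers, regularizers, constraints, optimizers, losses, metrics
from keras.callbacks import EarlyStopping

# A single Dense layer wired up with an initializer, a regularizer,
# a constraint and an activation from the corresponding modules.
model = Sequential()
model.add(Dense(
   16,
   input_shape = (8,),
   activation = 'relu',                                             # activations module (string shortcut)
   kernel_initializer = initializers.RandomNormal(stddev = 0.05),   # initializers module
   kernel_regularizer = regularizers.l2(0.01),                      # regularizers module
   kernel_constraint = constraints.max_norm(2.0)                    # constraints module
))
model.add(Dense(1, activation = 'sigmoid'))

# Loss, optimizer and metric picked from the losses, optimizers and metrics modules.
model.compile(
   optimizer = optimizers.Adam(),
   loss = losses.binary_crossentropy,
   metrics = [metrics.binary_accuracy]
)

# EarlyStopping callback stops training once the monitored value stops improving.
early_stop = EarlyStopping(monitor = 'val_loss', patience = 3)

# Illustrative random data only.
x = np.random.random((100, 8))
y = np.random.randint(2, size = (100, 1))
model.fit(x, y, epochs = 10, validation_split = 0.2, callbacks = [early_stop])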

Backend module

The backend module is used for Keras backend operations. By default, Keras runs on top of the TensorFlow backend. If you want, you can switch to other backends like Theano or CNTK. The default backend configuration is defined inside your root directory under the .keras/keras.json file.
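A freshly installed Keras typically writes default settings along these lines into .keras/keras.json (the exact values on your machine may differ) −

{
   "image_data_format": "channels_last",
   "epsilon": 1e-07,
   "floatx": "float32",
   "backend": "tensorflow"
}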

The Keras backend module can be imported using the below code −
>>> from keras import backend as k

If we are using the default backend TensorFlow, then the below functions return TensorFlow-based information as shown below −
>>> k.backend()
'tensorflow'
>>> k.epsilon()
1e-07
>>> k.image_data_format()
'channels_last'
>>> k.floatx()
'float32'
Let us see some of the significant backend functions used for data analysis in brief −

get_uid()

It is the identifier for the default graph. It is defined below −
>>> k.get_uid(prefix='')
1
>>> k.get_uid(prefix='')
2

reset_uids

It is used to reset the uid value.
>>> k.reset_uids()

Now, execute get_uid() again. It will be reset and change again to 1.
>>> k.get_uid(prefix='')
1

placeholder

It is used to instantiate a placeholder tensor. A simple placeholder holding a 3-D shape is shown below −
>>> data = k.placeholder(shape = (1,3,3))
>>> data
<tf.Tensor 'Placeholder:0' shape = (1, 3, 3) dtype = float32>

If you use int_shape(), it will show the shape.
>>> k.int_shape(data)
(1, 3, 3)

dot

It is used to multiply two tensors. Consider a and b are two tensors and c will be the outcome of multiplying a and b. Assume the shape of a is (4, 2) and the shape of b is (2, 3). It is defined below −
>>> a = k.placeholder(shape = (4,2))
>>> b = k.placeholder(shape = (2,3))
>>> c = k.dot(a,b)
>>> c
<tf.Tensor 'MatMul:0' shape = (4, 3) dtype = float32>

ones

It is used to initialize all values as one.
>>> res = k.ones(shape = (2,2))
# print the value
>>> k.eval(res)
array([[1., 1.],
       [1., 1.]], dtype = float32)

batch_dot

It is used to perform the product of two data in batches. The input dimension must be 2 or higher. It is shown below −
>>> a_batch = k.ones(shape = (2,3))
>>> b_batch = k.ones(shape = (3,2))
>>> c_batch = k.batch_dot(a_batch,b_batch)
>>> c_batch
<tf.Tensor 'ExpandDims:0' shape = (2, 1) dtype = float32>

variable

It is used to initialize a variable. Let us perform a simple transpose operation on this variable.
>>> data = k.variable([[10,20,30,40],[50,60,70,80]])
# variable initialized here
>>> result = k.transpose(data)
>>> print(result)
Tensor("transpose_6:0", shape = (4, 2), dtype = float32)
>>> print(k.eval(result))
[[10. 50.]
 [20. 60.]
 [30. 70.]
 [40. 80.]]

To access it from numpy −
>>> import numpy as np
>>> data = np.array([[10,20,30,40],[50,60,70,80]])
>>> print(np.transpose(data))
[[10 50]
 [20 60]
 [30 70]
 [40 80]]
>>> res = k.variable(value = data)
>>> print(res)
<tf.Variable 'Variable_7:0' shape = (2, 4) dtype = float32_ref>

is_sparse(tensor)

It is used to check whether the tensor is sparse or not.
>>> a = k.placeholder((2, 2), sparse=True)
>>> print(a)
SparseTensor(indices = Tensor("Placeholder_8:0", shape = (?, 2), dtype = int64),
   values = Tensor("Placeholder_7:0", shape = (?,), dtype = float32),
   dense_shape = Tensor("Const:0", shape = (2,), dtype = int64))
>>> print(k.is_sparse(a))
True

to_dense()

It is used to convert a sparse tensor into a dense one.
>>> b = k.to_dense(a)
>>> print(b)
Tensor("SparseToDense:0", shape = (2, 2), dtype = float32)
>>> print(k.is_sparse(b))
False

random_uniform_variable

It is used to initialize a variable using the uniform distribution concept.
k.random_uniform_variable(shape, low, high)

Here,
  • shape − denotes the rows and columns in the format of tuples.
  • low − lower bound of the uniform distribution.
  • high − upper bound of the uniform distribution.

Let us look at the below example usage −
>>> a = k.random_uniform_variable(shape = (2, 3), low = 0, high = 1)
>>> b = k.random_uniform_variable(shape = (3, 2), low = 0, high = 1)
>>> c = k.dot(a, b)
>>> k.int_shape(c)
(2, 2)

utils module

utils provides useful utility functions for deep learning. Some of the methods provided by the utils module are as follows −

HDF5Matrix

It is used to represent the input data in HDF5 format.
from keras.utils import HDF5Matrix
data = HDF5Matrix('data.hdf5', 'data')

to_categorical

It is used to convert a class vector into a binary class matrix.
>>> from keras.utils import to_categorical
>>> labels = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> to_categorical(labels)
array([[1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],
       [0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
       [0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
       [0., 0., 0., 0., 0., 0., 0., 0., 1., 0.],
       [0., 0., 0., 0., 0., 0., 0., 0., 0., 1.]], dtype = float32)

normalize

It is used to normalize the input data.
>>> from keras.utils import normalize
>>> normalize([1, 2, 3, 4, 5])
array([[0.13483997, 0.26967994, 0.40451992, 0.53935989, 0.67419986]])

print_summary

It is used to print the summary of the model.
from keras.utils import print_summary
print_summary(model)

plot_model

It is used to create the model representation in dot format and save it to a file.
from keras.utils import plot_model
plot_model(model, to_file = 'image.png')
