
Artificial Intelligence - Keras

Lesson #905 - Keras Customized Layer


Keras allows us to create our own customized layers. Once a new layer is created, it can be used in any model without any restriction. Let us learn how to create a new layer in this section.

Keras provides a base layer class, Layer, which can be sub-classed to create our own customized layer. Let us create a simple layer which finds its weights using a normal distribution and then, during training, does the basic computation of finding the sum of the products of the input and those weights (a dot product).

Step 1 - Import the necessary Module

First, import the necessary modules −
from keras import backend as K 
from keras.layers import Layer
Here,
  • backend is used to access the dot function.
  • Layer is the base class and we will be sub-classing it to create our layer.

Step 2 - Define a Layer Class

Let us create a new class, MyCustomLayer, by sub-classing the Layer class −

class MyCustomLayer(Layer): 
   ...

Step 3 - Initialize the Layer Class

Let us initialize our new class as specified below −

def __init__(self, output_dim, **kwargs): 
   self.output_dim = output_dim 
   super(MyCustomLayer, self).__init__(**kwargs)

Here,
  • Line 2 sets the output dimension.
  • Line 3 calls the base or super layer's init function.
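The pattern above can be sketched with a plain Python stand-in for the Layer base class (BaseLayer and the name keyword below are hypothetical, used only to show how **kwargs is forwarded to the base class):

```python
# Plain-Python sketch of the sub-classing pattern used above.
# BaseLayer is a hypothetical stand-in for keras.layers.Layer.
class BaseLayer:
   def __init__(self, **kwargs):
      self.config = kwargs   # base class consumes the remaining keyword arguments

class MyCustomLayer(BaseLayer):
   def __init__(self, output_dim, **kwargs):
      self.output_dim = output_dim                    # line 2: set the output dimension
      super(MyCustomLayer, self).__init__(**kwargs)   # line 3: call the base init

layer = MyCustomLayer(32, name = 'my_layer')
print(layer.output_dim)   # 32
print(layer.config)       # {'name': 'my_layer'}
```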

Step 4 - Implement the build Method

build is the main method and its only purpose is to build the layer properly. It can do anything related to the inner working of the layer. Once the custom functionality is done, we can call the base class build function. Our custom build function is as follows −

def build(self, input_shape): 
   self.kernel = self.add_weight(name = 'kernel', 
                                 shape = (input_shape[1], self.output_dim), 
                                 initializer = 'normal', trainable = True) 
   super(MyCustomLayer, self).build(input_shape)

Here,
  • Line 1 defines the build method with one argument, input_shape. The shape of the input data is referred to by input_shape.
  • Lines 2-4 create the weight corresponding to the input shape and set it in the kernel. It is our custom functionality of the layer. It creates the weight using the 'normal' initializer.
  • Line 5 calls the base class build method.
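What add_weight does here can be sketched with numpy as a stand-in (this is not the Keras API, just the same idea): draw an (input_dim, output_dim) matrix from a normal distribution.

```python
import numpy as np

# numpy sketch of the weight created in build() (a stand-in for add_weight):
# a kernel of shape (input_dim, output_dim) drawn from a normal distribution.
input_shape = (None, 16)   # (batch, features); batch size is unknown at build time
output_dim = 32

rng = np.random.default_rng(seed = 0)
kernel = rng.normal(size = (input_shape[1], output_dim))
print(kernel.shape)   # (16, 32)
```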

Step 5 - Implement the call Method

The call method does the actual work of the layer during the training process.

Our custom call method is as follows −

def call(self, input_data): 
   return K.dot(input_data, self.kernel)

Here,
  • Line 1 defines the call method with one argument, input_data. input_data is the input data for our layer.
  • Line 2 returns the dot product of the input data, input_data, and our layer's kernel, self.kernel.
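The computation done by call can be sketched with numpy standing in for K.dot (the shapes and values below are illustrative only):

```python
import numpy as np

# numpy sketch of what call() computes: a batched dot product
# between the input and the layer's kernel (a stand-in for K.dot).
input_data = np.ones((4, 16))       # batch of 4 samples, 16 features each
kernel = np.full((16, 32), 0.5)     # weights mapping 16 features to 32 units

output = input_data.dot(kernel)
print(output.shape)    # (4, 32)
print(output[0, 0])    # 8.0 (sum of 16 inputs of 1.0, each times weight 0.5)
```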

Step 6 - Implement the compute_output_shape Method

compute_output_shape lets Keras infer the output shape of the layer: the batch dimension is preserved and the feature dimension becomes output_dim −

def compute_output_shape(self, input_shape): 
   return (input_shape[0], self.output_dim)
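As a quick standalone sketch (plain Python, with output_dim fixed at 32 for illustration):

```python
# Standalone sketch of compute_output_shape: the batch dimension
# (input_shape[0]) is preserved and the feature dimension becomes output_dim.
def compute_output_shape(input_shape, output_dim = 32):
   return (input_shape[0], output_dim)

print(compute_output_shape((None, 16)))   # (None, 32)
```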
Implementing the build, call and compute_output_shape methods completes the customized layer. The final and complete code is as follows −

from keras import backend as K 
from keras.layers import Layer

class MyCustomLayer(Layer): 
   def __init__(self, output_dim, **kwargs): 
      self.output_dim = output_dim 
      super(MyCustomLayer, self).__init__(**kwargs) 
   
   def build(self, input_shape): 
      self.kernel = self.add_weight(name = 'kernel', 
                                    shape = (input_shape[1], self.output_dim), 
                                    initializer = 'normal', trainable = True) 
      super(MyCustomLayer, self).build(input_shape) # Be sure to call this at the end 
   
   def call(self, input_data): 
      return K.dot(input_data, self.kernel) 
   
   def compute_output_shape(self, input_shape): 
      return (input_shape[0], self.output_dim)

Using our customized layer

Let us create a simple model using our customized layer as specified below −

from keras.models import Sequential 
from keras.layers import Dense 

model = Sequential() 
model.add(MyCustomLayer(32, input_shape = (16,))) 
model.add(Dense(8, activation = 'softmax')) 
model.summary()
    Running the application will print the model summary as below −
Model: "sequential_1" 
_________________________________________________________________ 
Layer (type)                 Output Shape              Param #   
================================================================= 
my_custom_layer_1 (MyCustomL (None, 32)                512       
_________________________________________________________________ 
dense_1 (Dense)              (None, 8)                 264       
================================================================= 
Total params: 776 
Trainable params: 776 
Non-trainable params: 0 
_________________________________________________________________
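The parameter counts in the summary can be checked by hand: MyCustomLayer holds only a 16×32 kernel (it defines no bias), while Dense adds one bias per unit.

```python
# Quick check of the parameter counts shown in the model summary above.
custom_params = 16 * 32         # MyCustomLayer: kernel only, no bias
dense_params = 32 * 8 + 8       # Dense: kernel plus one bias per unit

print(custom_params)                  # 512
print(dense_params)                   # 264
print(custom_params + dense_params)   # 776
```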