Artificial Intelligence - Keras

Lesson Description

Lesson #956 - Keras: Time Series Prediction using LSTM RNN

In this chapter, let us write a simple Long Short Term Memory (LSTM) based RNN to do sequence analysis. A sequence is a set of values where each value corresponds to a particular instance of time. Let us consider a simple example of reading a sentence. Reading and understanding a sentence involves reading the words in the given order, trying to understand each word and its meaning in the given context, and finally understanding whether the sentence expresses a positive or negative sentiment.

Here, the words are treated as values: the first value corresponds to the first word, the second value to the second word, and so on, and the order is strictly maintained. Sequence analysis is used frequently in natural language processing, for example to find the sentiment of a given text.
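The idea of words as ordered values can be sketched in plain Python. The vocabulary below is a made-up toy example, not part of the IMDB dataset:

```python
# A minimal sketch (hypothetical toy vocabulary) of how a sentence becomes
# an ordered sequence of integer indices for sequence analysis.
vocabulary = {"the": 1, "movie": 2, "was": 3, "great": 4}

def encode(sentence):
    # Map each word to its index, preserving the word order exactly.
    return [vocabulary[word] for word in sentence.split()]

print(encode("the movie was great"))  # [1, 2, 3, 4]
```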

Let us create an LSTM model to analyze the IMDB movie reviews and find their positive/negative sentiment.

The model for the sequence analysis can be represented as below. The core features of the model are as follows −

  • Input layer using an Embedding layer with 128 features.
  • First layer, LSTM, consists of 128 units with dropout and recurrent dropout both set to 0.2.
  • Output layer, Dense, consists of 1 unit with the 'sigmoid' activation function.
  • Use binary_crossentropy as the loss function.
  • Use adam as the optimizer.
  • Use accuracy as the metric.
  • Use 32 as the batch size.
  • Use 15 as the number of epochs.
  • Use 80 as the maximum length of a review (in words).
  • Use 2000 as the maximum number of distinct words to keep.

    Step 1 Import the Modules

Let us import the necessary modules:
    from keras.preprocessing import sequence 
    from keras.models import Sequential 
    from keras.layers import Dense, Embedding 
    from keras.layers import LSTM 
    from keras.datasets import imdb

    Step 2 Load Data

Let us import the imdb dataset.
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words = 2000)
imdb is a dataset provided by Keras. It represents a collection of movies and their reviews.
num_words represents the maximum number of distinct words to keep in the reviews.
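Each review is loaded as a list of integer word indices, and words whose index is not among the num_words most frequent are replaced by an out-of-vocabulary index (2 by default in Keras). A plain-Python sketch of that capping, using a hypothetical encoded review:

```python
# Sketch of the num_words capping (assumption: Keras replaces rare word
# indices with oov_char = 2 when num_words is set).
def cap_vocabulary(review, num_words, oov_char=2):
    # Keep indices below num_words; replace the rest with the OOV index.
    return [idx if idx < num_words else oov_char for idx in review]

review = [5, 1999, 2000, 3500, 42]   # hypothetical encoded review
print(cap_vocabulary(review, 2000))  # [5, 1999, 2, 2, 42]
```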

    Step 3 Process the Data

Let us transform the dataset so that it can be fed into our model. The data can be transformed using the below code −
x_train = sequence.pad_sequences(x_train, maxlen=80)
x_test = sequence.pad_sequences(x_test, maxlen=80)
sequence.pad_sequences converts the list of input data of shape (data,) into a 2D NumPy array of shape (data, timesteps). Basically, it adds the timesteps concept to the given data: it pads or truncates every sequence to length maxlen.
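By default, pad_sequences pads and truncates at the beginning of each sequence with the value 0 (padding='pre', truncating='pre'). A minimal plain-Python sketch of that behaviour, not the actual Keras implementation:

```python
# Minimal re-implementation sketch of pad_sequences' default behaviour
# (assumption: 'pre' padding and 'pre' truncation, pad value 0).
def pad_sequence(seq, maxlen, value=0):
    if len(seq) >= maxlen:
        return seq[-maxlen:]                         # keep the last maxlen timesteps
    return [value] * (maxlen - len(seq)) + seq       # zero-pad on the left

print(pad_sequence([4, 7, 9], 5))            # [0, 0, 4, 7, 9]
print(pad_sequence([1, 2, 3, 4, 5, 6], 5))   # [2, 3, 4, 5, 6]
```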

    Step 4 Create the model

Let us create the model.
model = Sequential()
model.add(Embedding(2000, 128))
model.add(LSTM(128, dropout = 0.2, recurrent_dropout = 0.2))
model.add(Dense(1, activation = 'sigmoid'))
We have used an Embedding layer as the input layer and then added the LSTM layer. Finally, a Dense layer is used as the output layer.
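To get a feel for the model's size, the number of trainable parameters per layer can be computed by hand using the standard formulas, with the layer sizes chosen above:

```python
# Parameter counts for the layers defined above (standard formulas).
vocab, emb, units = 2000, 128, 128

embedding_params = vocab * emb                    # one 128-dim vector per word
# An LSTM has 4 gates, each with input, recurrent and bias weights.
lstm_params = 4 * ((emb + units) * units + units)
dense_params = units * 1 + 1                      # weights + bias

print(embedding_params)  # 256000
print(lstm_params)       # 131584
print(dense_params)      # 129
```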

    Step 5 Compile the model

Let us compile the model using the chosen loss function, optimizer, and metrics.
model.compile(loss = 'binary_crossentropy', 
   optimizer = 'adam', metrics = ['accuracy'])

    Step 6 Train the model

Let us train the model using the fit() method.
model.fit(
   x_train, y_train, 
   batch_size = 32, 
   epochs = 15, 
   validation_data = (x_test, y_test)
)

    Step 7 Evaluate the model

Let us evaluate the model using the test data.
score, acc = model.evaluate(x_test, y_test, batch_size = 32)
print('Test score:', score)
print('Test accuracy:', acc)
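Since the output layer is a single sigmoid unit, model.predict returns one probability per review. A common convention (an assumption here, not stated in the lesson) is to threshold the probability at 0.5 to obtain a positive/negative label:

```python
# Convert a sigmoid probability into a sentiment label (0.5 threshold
# is a conventional choice, not part of the model itself).
def to_label(probability, threshold=0.5):
    return "positive" if probability >= threshold else "negative"

print(to_label(0.91))  # positive
print(to_label(0.12))  # negative
```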