LSTM Layer

This module belongs to the category "LSTM Layer".

Description

LSTM layers, or Long Short-Term Memory layers, are similar to RNN layers, except that they can keep track of relevant information from far back in the input sequence.

Parameters

The LSTM module is mainly based on the TensorFlow Keras API. To set the model-building style, we can choose between: Automatic, Sequential, and Functional API.
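As a rough illustration of the Sequential and Functional styles, the sketch below builds the same small LSTM network both ways with standard Keras calls; the layer sizes and input shape are arbitrary examples, not values prescribed by the module.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sequential API: layers stacked linearly, one after another.
seq_model = keras.Sequential([
    layers.Input(shape=(10, 8)),   # 10 timesteps, 8 features per step (example values)
    layers.LSTM(16),               # returns only the final hidden state
    layers.Dense(1),
])

# Functional API: the same network, built by wiring tensors explicitly,
# which also allows branches and multiple inputs/outputs.
inputs = keras.Input(shape=(10, 8))
x = layers.LSTM(16)(inputs)
outputs = layers.Dense(1)(x)
func_model = keras.Model(inputs, outputs)
```

The Sequential form is shorter for a plain stack of layers; the Functional form becomes necessary once the architecture branches.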

Activation function

We may choose one from the several types of activation function included in the LSTM module's parameters:

  • Softmax

  • Exponential Linear Unit (ELU)

  • Scaled Exponential Linear Unit (SELU)

  • Softplus

  • Softsign

  • Rectified Linear Unit (ReLU)

  • Hyperbolic tangent

  • Sigmoid

  • Hard Sigmoid

  • Exponential (base e)

  • Identity function (Linear)

Dropout function

The Dropout function randomly zeroes a fraction of connections during training and is well known for mitigating overfitting.
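In the Keras LSTM layer, dropout is exposed through two parameters: `dropout` drops a fraction of the input connections and `recurrent_dropout` drops a fraction of the recurrent (state-to-state) connections; both are active only during training. A brief sketch with example rates:

```python
import numpy as np
from tensorflow.keras import layers

# 0.2 means 20% of the corresponding connections are dropped each
# training step; the rates here are illustrative, not recommended values.
lstm = layers.LSTM(32, dropout=0.2, recurrent_dropout=0.2)

out = lstm(np.zeros((2, 5, 8), dtype="float32"))
```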
