LSTM Layer

This module belongs to the category "LSTM Layer".

Description

LSTM layers, or Long Short-Term Memory layers, are similar to RNNs, except that they can keep track of relevant information from far back in a sequence.

Parameters

The LSTM module is mainly based on the TensorFlow Keras API. To set the type to use, we can select between automatic, sequential, and functional.
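As a rough illustration, here is what the sequential and functional styles look like in plain tf.keras; the layer sizes, timestep count, and feature count below are illustrative assumptions, not values prescribed by this module:

```python
import tensorflow as tf

# Sequential style: layers are stacked in a fixed order.
seq_model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),   # 10 timesteps, 8 features (assumed)
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])

# Functional style: layers are wired explicitly, allowing more complex graphs.
inputs = tf.keras.Input(shape=(10, 8))
x = tf.keras.layers.LSTM(32)(inputs)
outputs = tf.keras.layers.Dense(1)(x)
func_model = tf.keras.Model(inputs, outputs)
```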

Activation function

We may choose one of the several activation functions included in the LSTM module's parameters (see the sketch after this list):

  • Softmax

  • Exponential Linear Unit (ELU)

  • Scaled Exponential Linear Unit (SELU)

  • Softplus

  • Softsign

  • Rectified Linear Unit (ReLU)

  • Hyperbolic tangent

  • Sigmoid

  • Hard Sigmoid

  • Exponential (base e)

  • Identity function (Linear)
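As a minimal sketch of how an activation is selected in plain tf.keras (the unit count and input shape are illustrative assumptions):

```python
import tensorflow as tf

# "tanh" is the Keras default; any of the names above (e.g. "relu",
# "selu", "softsign") can be passed instead.
layer = tf.keras.layers.LSTM(32, activation="tanh")

x = tf.random.normal((4, 10, 8))  # batch of 4 sequences, 10 steps, 8 features
y = layer(x)
print(y.shape)  # (4, 32): one 32-unit output vector per sequence
```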

Dropout function

The dropout function should be used carefully. As this article states, it should be reserved for certain circumstances: "Generally, we only need to implement regularization when our network is at risk of overfitting. This can happen if a network is too big, if you train for too long, or if you don't have enough data."
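In plain tf.keras, dropout can be applied directly on an LSTM layer through its dropout and recurrent_dropout arguments; the fractions and shapes below are illustrative assumptions:

```python
import tensorflow as tf

layer = tf.keras.layers.LSTM(
    32,
    dropout=0.2,            # fraction of input units dropped during training
    recurrent_dropout=0.2,  # fraction of recurrent units dropped during training
)

x = tf.random.normal((4, 10, 8))
y = layer(x, training=True)  # dropout is only active when training=True
```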

The TensorFlow documentation describes other strategies for preventing underfitting or overfitting.
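One such strategy is early stopping, sketched below with tf.keras.callbacks.EarlyStopping; the model architecture and synthetic data are illustrative assumptions:

```python
import numpy as np
import tensorflow as tf

# Illustrative synthetic data: 64 sequences, 10 timesteps, 8 features.
x_train = np.random.rand(64, 10, 8).astype("float32")
y_train = np.random.rand(64, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Halt training once the validation loss stops improving, restoring the
# best weights seen so far; one way to avoid training for too long.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=3, restore_best_weights=True
)
model.fit(x_train, y_train, validation_split=0.2,
          epochs=50, callbacks=[early_stop], verbose=0)
```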
