LSTM Layer

This module belongs to the category "TensorFlow API".

Description

LSTM layers, or Long Short-Term Memory layers, are similar to RNN layers, except that they can keep track of relevant information from far back in a sequence.

Parameters

The LSTM module is mainly based on the TensorFlow - Keras API. To set the API type to use, we can choose between Automatic, Sequential and Functional.
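
As a rough illustration of the Sequential and Functional styles, here is a minimal sketch in plain TensorFlow/Keras (not SmartPredict's internal code), assuming a toy input of 10 time steps with 8 features per step:

```python
import tensorflow as tf

# Sequential API: layers are stacked in order.
sequential_model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),      # 10 time steps, 8 features per step
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])

# Functional API: layers are called on tensors, which allows more complex graphs.
inputs = tf.keras.Input(shape=(10, 8))
x = tf.keras.layers.LSTM(32)(inputs)
outputs = tf.keras.layers.Dense(1)(x)
functional_model = tf.keras.Model(inputs, outputs)
```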

Activation function

We may choose one of the several activation functions included in the LSTM module's parameters (a short Keras sketch follows the list):

  • Softmax

  • Exponential Linear Unit (ELU)

  • Scaled Exponential Linear Unit (SELU)

  • Softplus

  • Softsign

  • Rectified Linear Unit (ReLU)

  • Hyperbolic tangent

  • Sigmoid

  • Hard Sigmoid

  • Exponential (base e)

  • Identity function (Linear)
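
For reference, these names map onto the standard Keras activation identifiers. The sketch below is plain Keras code with illustrative unit counts, not SmartPredict defaults:

```python
import tensorflow as tf

# Each activation is selected by its standard Keras string identifier.
lstm_tanh = tf.keras.layers.LSTM(64, activation="tanh")      # hyperbolic tangent (the Keras default)
lstm_relu = tf.keras.layers.LSTM(64, activation="relu")      # Rectified Linear Unit
lstm_selu = tf.keras.layers.LSTM(64, activation="selu")      # Scaled Exponential Linear Unit
lstm_linear = tf.keras.layers.LSTM(64, activation="linear")  # identity function (linear)
```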

Dropout function

The Dropout function is well-known for mitigating overfitting, but it should be used carefully. As this article states, it should be reserved for certain circumstances: "Generally, we only need to implement regularization when our network is at risk of overfitting. This can happen if a network is too big, if you train for too long, or if you don’t have enough data."

The TensorFlow documentation describes other strategies for preventing underfitting or overfitting.
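
In plain Keras terms, the module's dropout setting corresponds to the LSTM layer's dropout and recurrent_dropout arguments. The sketch below uses illustrative values only, not recommendations:

```python
import tensorflow as tf

lstm_with_dropout = tf.keras.layers.LSTM(
    64,
    dropout=0.2,            # fraction of the input units dropped during training
    recurrent_dropout=0.2,  # fraction of the recurrent units dropped during training
)
```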
