Recurrent Neural Networks

This module belongs to the category "Deep learning algorithms".

Description

This module is used to instantiate a Recurrent Neural Network estimator based on the SmartPredict library.

Parameters

The Recurrent Neural Networks module

Optimization parameters

The available optimizers for the RNN are:

  • adam

  • rmsprop

  • adagrad

  • adamax

  • sgd

Other optimization parameters can also be set: Learning rate, Beta, Gradient Clipping by Norm or by Value, Epsilon, Momentum, and Rho (see the sketch below).
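The snippet below is a minimal sketch of how these optimization parameters typically map to optimizer arguments. It assumes a Keras-style backend for illustration; it is not the SmartPredict API itself, and the parameter names in the module UI may differ.

```python
from tensorflow.keras.optimizers import Adam, RMSprop, SGD

# Learning rate, beta coefficients, epsilon and clipping by norm for Adam.
adam = Adam(learning_rate=1e-3, beta_1=0.9, beta_2=0.999, epsilon=1e-7, clipnorm=1.0)

# Rho and clipping by value for RMSprop.
rmsprop = RMSprop(learning_rate=1e-3, rho=0.9, clipvalue=0.5)

# Momentum for plain SGD.
sgd = SGD(learning_rate=1e-2, momentum=0.9)
```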

Loss function

The loss function can be one of the following (see the mapping example after the list):

  • binary_xentropy

  • categorical_xentropy

  • sparse_categorical_xentropy

  • hinge

  • mae

  • cosine
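As a hedged illustration, the loss names listed above correspond to standard Keras loss identifiers; this mapping is an assumption for reference only and the labels shown in the module may differ slightly.

```python
# Assumed correspondence between the listed loss names and Keras identifiers.
losses = {
    "binary_xentropy": "binary_crossentropy",
    "categorical_xentropy": "categorical_crossentropy",
    "sparse_categorical_xentropy": "sparse_categorical_crossentropy",
    "hinge": "hinge",
    "mae": "mean_absolute_error",
    "cosine": "cosine_similarity",
}
```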

The module also contains a range of architecture parameters such as layer cells, time steps, and hidden recurrent units (see the sketch below).
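The following is a minimal sketch, again assuming a Keras-style backend, of how layer cells, time steps, and hidden recurrent units fit together with an optimizer and loss function. The variable names are illustrative, not SmartPredict parameter names.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

timesteps, n_features = 30, 8   # length of each input sequence and features per step
hidden_units = 64               # number of hidden recurrent units in the layer cell

model = Sequential([
    LSTM(hidden_units, input_shape=(timesteps, n_features)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```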
