Dense Layer

This module belongs to the 'TensorFlow API' category.

Description

The Dense Layer module adds a densely connected neural-network layer, which applies a weighted combination of its inputs followed by an activation function (output = activation(dot(input, kernel) + bias)).
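
Below is a minimal sketch of this computation using the standard TensorFlow Keras Dense layer; the unit count and input sizes are arbitrary examples.

```python
import tensorflow as tf

# A Dense layer computes: output = activation(dot(input, kernel) + bias)
layer = tf.keras.layers.Dense(units=4, activation="relu")

x = tf.ones((2, 3))   # batch of 2 samples, 3 features each
y = layer(x)          # shape (2, 4): 3 input features projected onto 4 units
print(y.shape)
```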

Parameters

The Dense Layer module exposes a rich set of parameters that can be configured as needed.

We can, for instance, set the TensorFlow Keras API to one of the following (a short sketch follows the list):

  • Automatic

  • Sequential

  • Functional
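
As a rough illustration, assuming the module maps onto the standard Keras model-building styles, the Sequential and Functional options correspond to the two sketches below (the layer sizes are arbitrary; how the Automatic option is resolved is not shown here).

```python
import tensorflow as tf

# Sequential API: layers stacked in order
sequential_model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Functional API: layers called explicitly on tensors
inputs = tf.keras.Input(shape=(3,))
hidden = tf.keras.layers.Dense(8, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)
functional_model = tf.keras.Model(inputs=inputs, outputs=outputs)
```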

Activation function

The activation function, in turn, can be one of the following (illustrated in the sketch after this list):

  • Sigmoid

  • Softmax

  • Exponential linear unit (ELU)

  • Scaled Exponential Linear Unit (SELU)

  • Softplus

  • Softsign

  • Rectified Linear Unit (ReLU)

  • Hyperbolic tangent

  • Hard sigmoid

  • Exponential (base e)

  • Identity function (Linear)
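
In plain Keras these choices correspond to the activation argument of the Dense layer; the sketch below shows a few of them with arbitrary layer sizes.

```python
import tensorflow as tf

relu_layer  = tf.keras.layers.Dense(16, activation="relu")     # Rectified Linear Unit
tanh_layer  = tf.keras.layers.Dense(16, activation="tanh")     # hyperbolic tangent
softmax_out = tf.keras.layers.Dense(10, activation="softmax")  # typical classification head
linear_out  = tf.keras.layers.Dense(1, activation=None)        # identity / linear (no activation)
```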

Input

As input specification we can choose among: input dimension, input shape, and batch input shape (see the example below).
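
Assuming these options map onto the usual tf.keras keyword arguments, the three ways of declaring the expected input look like this (feature and unit counts are arbitrary examples):

```python
import tensorflow as tf

by_dim   = tf.keras.layers.Dense(8, input_dim=3)                  # number of input features
by_shape = tf.keras.layers.Dense(8, input_shape=(3,))             # shape without the batch axis
by_batch = tf.keras.layers.Dense(8, batch_input_shape=(None, 3))  # shape including the batch axis
```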

Initialization

Both the kernel and the bias can be assigned an initialization scheme from the following (see the sketch after this list):

  • Zeros

  • Ones

  • Constant

  • Random Normal

  • Random Uniform

  • Truncated Normal

  • Variance Scaling

  • Orthogonal

  • Identity

  • LeCun Uniform

  • Glorot Normal

  • He Normal

  • LeCun Normal

  • He Uniform
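
As a sketch, assuming these options map onto the built-in Keras initializers, kernel and bias initialization can be set independently; both initializer objects and their string shortcuts are accepted.

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(
    32,
    kernel_initializer=tf.keras.initializers.GlorotNormal(),
    bias_initializer=tf.keras.initializers.Zeros(),
)

# String shortcuts work as well, e.g. "he_normal", "lecun_uniform", "orthogonal"
other = tf.keras.layers.Dense(32, kernel_initializer="he_normal", bias_initializer="ones")
```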

Regularizers are available for the kernel, the bias, and the activity (the layer output).

The activity regularizer, for instance, can be set to None, an L1 regularizer, or an L2 regularizer, as illustrated in the example below.
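
A minimal sketch, assuming the module exposes the standard Keras regularizer arguments (the penalty factors are arbitrary examples):

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(
    64,
    kernel_regularizer=tf.keras.regularizers.L2(1e-4),    # penalizes large weights
    bias_regularizer=tf.keras.regularizers.L1(1e-5),      # penalizes large biases
    activity_regularizer=tf.keras.regularizers.L2(1e-5),  # penalizes large layer outputs
)
```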
