# Dense Layer

## Description

The **Dense Layer** module in a neural network is a fully connected layer: it computes a weighted sum of its inputs, adds a bias, and applies an activation function.

## Parameters

The Dense Layer module exposes a rich set of parameters that can be configured at will.

![ There are many types of activation functions in the Dense Layer module. ](https://1833277725-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-Lyc1OXsKqB2S62LxsOR%2F-M1V6op4p7QlR3ozPwCx%2F-M1V7KFhuE9CUasYeJbz%2Fimage.png?alt=media\&token=ea6bf77f-1bf7-4dae-97ea-3c2902884db7)

We can, for instance, set the **TensorFlow Keras API** mode to one of:

* **Automatic**
* **Sequential**
* **Functional**
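As a sketch of the difference between the two explicit modes, here is the same two-layer network built both ways in Keras (the layer sizes are illustrative, not part of the module's defaults):

```python
import tensorflow as tf

# Sequential API: layers are stacked in a simple linear order.
sequential_model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Functional API: layers are called on tensors, which also allows
# non-linear topologies (branches, multiple inputs/outputs).
inputs = tf.keras.Input(shape=(8,))
hidden = tf.keras.layers.Dense(16, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)
functional_model = tf.keras.Model(inputs=inputs, outputs=outputs)
```

Both models are equivalent here; the Functional API only becomes necessary when the graph is not a plain stack.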

### Activation function

The **Activation function**, in turn, can be one of:

* **Sigmoid**
* **Softmax**
* **Exponential Linear Unit (ELU)**
* **Scaled Exponential Linear Unit (SELU)**
* **Softplus**
* **Softsign**
* **Rectified Linear Unit (ReLU)**
* **Hyperbolic tangent**
* **Hard sigmoid**
* **Exponential (base e)**
* **Identity function (Linear)**
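Whatever the choice, the layer computes `activation(x · W + b)`. A minimal NumPy sketch with the sigmoid activation (the weight values here are illustrative):

```python
import numpy as np

def dense(x, w, b, activation):
    """Fully connected layer: weighted sum of inputs, plus bias, then activation."""
    return activation(x @ w + b)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 2.0])       # input vector: 2 features
w = np.array([[0.5, -0.5],     # kernel: 2 inputs -> 2 units
              [0.25, 0.75]])
b = np.array([0.0, 0.1])       # bias: one value per unit
y = dense(x, w, b, sigmoid)    # output vector: 2 values, each in (0, 1)
```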

### Input

As an input specification, we can choose among: **input dimension**, **input shape**, and **batch input shape**.
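In Keras terms, the difference is whether the batch size is left free or fixed; a small sketch (the sizes are illustrative):

```python
import tensorflow as tf

# Input shape: shape of one sample (8 features); batch size left as None.
x_per_sample = tf.keras.Input(shape=(8,))
y_per_sample = tf.keras.layers.Dense(4)(x_per_sample)

# Batch input shape: the full shape including a fixed batch size of 32.
x_batched = tf.keras.Input(batch_shape=(32, 8))
y_batched = tf.keras.layers.Dense(4)(x_batched)
```

An input *dimension* of 8 is simply shorthand for the input shape `(8,)`.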

### Initialization

Both the **Kernel** and the **Bias** can be assigned an [initialization](http://tflearn.org/initializations/) from:

* **Zeros**
* **Ones**
* **Constant**
* **Random Normal**
* **Random Uniform**
* **Truncated Normal**
* **Variance Scaling**
* **Orthogonal**
* **Identity**
* **LeCun Uniform**
* **Glorot Normal**
* **He Normal**
* **LeCun Normal**
* **He Uniform**
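In Keras, these map onto the `kernel_initializer` and `bias_initializer` arguments of the Dense layer; a minimal sketch (the unit count and seed are illustrative):

```python
import tensorflow as tf

# Dense layer with an explicit weight and bias initialization:
# Glorot normal for the kernel, zeros for the bias.
layer = tf.keras.layers.Dense(
    4,
    kernel_initializer=tf.keras.initializers.GlorotNormal(seed=0),
    bias_initializer=tf.keras.initializers.Zeros(),
)
layer.build((None, 8))  # create the weights for 8 input features
```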

Regularizers are available for the **bias**, the **activity** (the layer's output), and the **kernel**.

The activity regularizer, for example, can be set to **None**, an **L1 regularizer**, or an **L2 regularizer**.
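All three regularizers can be combined on a single Dense layer; a Keras sketch (the penalty strengths of 0.01 are illustrative):

```python
import tensorflow as tf

# Dense layer with kernel, bias, and activity regularization.
# Each penalty is added to the model's loss during training.
layer = tf.keras.layers.Dense(
    4,
    kernel_regularizer=tf.keras.regularizers.L2(0.01),
    bias_regularizer=tf.keras.regularizers.L1(0.01),
    activity_regularizer=tf.keras.regularizers.L2(0.01),
)
out = layer(tf.ones((2, 8)))  # calling the layer registers the penalties
```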
