P-TELU: Parametric Tan Hyperbolic Linear Unit Activation for Deep Neural Networks
Format: Conference Proceeding
Language: English
Summary: This paper proposes a new activation function, namely, the Parametric Tan Hyperbolic Linear Unit (P-TELU), for deep neural networks. The work is inspired by two recently proposed functions: the Parametric RELU (P-RELU) and the Exponential Linear Unit (ELU). The specific design of P-TELU allows it to leverage two advantages: (1) the flexibility of tuning its parameters from the data distribution, similar to P-RELU, and (2) better noise robustness, similar to ELU. Owing to the larger gradient and earlier saturation of the hyperbolic tangent compared to the exponential function, the proposed activation allows a neuron to reach and exit the noise-robust deactivation state earlier and faster. The performance of the proposed function is evaluated on the CIFAR10 and CIFAR100 image datasets using two convolutional neural network (CNN) architectures: KerasNet, a small 6-layer CNN model, and a 76-layer ResNet architecture. Results demonstrate enhanced performance of the proposed activation function in comparison to existing activation functions. (See the illustrative sketch after this record.)
ISSN: 2473-9944
DOI: 10.1109/ICCVW.2017.119
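
The abstract does not spell out the exact formula, but its description suggests an activation that is the identity for positive inputs and a scaled hyperbolic tangent for negative inputs, with learnable non-negative parameters (tunable from the data, as in P-RELU). The sketch below is a minimal, assumption-based illustration of such a P-TELU-style activation in PyTorch; the class name `PTELU`, the parameters `alpha` and `beta`, and the per-channel parametrization are hypothetical choices, not taken from the paper.

```python
# Hypothetical sketch of a P-TELU-style activation, assuming the form
#   f(x) = x                    for x >= 0
#   f(x) = alpha * tanh(beta*x) for x <  0,  with learnable alpha, beta >= 0.
# The exact parametrization is an assumption inferred from the abstract.
import torch
import torch.nn as nn

class PTELU(nn.Module):
    def __init__(self, num_channels: int = 1, init_alpha: float = 1.0, init_beta: float = 1.0):
        super().__init__()
        # One (alpha, beta) pair per channel, mirroring how P-RELU learns per-channel slopes.
        self.alpha = nn.Parameter(torch.full((num_channels,), init_alpha))
        self.beta = nn.Parameter(torch.full((num_channels,), init_beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamp the parameters so the negative branch keeps a non-negative
        # saturation level and slope (assumed constraint, not stated in the abstract).
        alpha = self.alpha.clamp(min=0.0).view(1, -1, 1, 1)
        beta = self.beta.clamp(min=0.0).view(1, -1, 1, 1)
        return torch.where(x >= 0, x, alpha * torch.tanh(beta * x))

# Usage on a (batch, channels, height, width) feature map:
act = PTELU(num_channels=16)
y = act(torch.randn(4, 16, 32, 32))
```

A single shared (alpha, beta) pair would also fit the abstract's description; the per-channel variant is shown only because it parallels the channel-wise slopes of P-RELU.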