# ActivationFunctions using Custom Layers in Keras


Activation functions are an important area of deep learning research. Many new activation functions are being developed, including bio-inspired activations, purely mathematical activation functions, and others. Despite such advancements, we usually find ourselves using ReLU and LeakyReLU without considering the alternatives. In the following notebooks I showcase how easy (or difficult) it is to port an activation function using Custom Layers in Keras and TensorFlow!

Link to the main notebook: `Activations.ipynb`

## Implemented activations

- LeakyReLU
- Parametric ReLU (PReLU)
- ELU
- SELU
- Swish
- GELU
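
To give a flavor of the pattern, here is a minimal sketch of Swish as a custom Keras `Layer`. This is an illustrative example, not code from the notebook; it assumes a fixed (non-trainable) `beta` parameter:

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer


class Swish(Layer):
    """Swish activation: f(x) = x * sigmoid(beta * x), with beta fixed."""

    def __init__(self, beta=1.0, **kwargs):
        super().__init__(**kwargs)
        self.beta = beta

    def call(self, inputs):
        # Element-wise Swish; reduces to x * sigmoid(x) when beta == 1.
        return inputs * tf.sigmoid(self.beta * inputs)

    def get_config(self):
        # Include beta so the layer survives model save/load round-trips.
        config = super().get_config()
        config.update({"beta": self.beta})
        return config


# Drop the custom layer into a model like any built-in activation:
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64),
    Swish(beta=1.0),
    tf.keras.layers.Dense(10),
])
```

The other activations follow the same recipe: subclass `Layer`, implement the function in `call`, and (for parametric variants such as PReLU) register any learnable parameters with `add_weight` in `build`.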

## Structure

```
src
|
|-- Activations.ipynb
|-- utils
     |-- Utils.ipynb
     |-- utils.py

references
|
|-- Ref1
|-- Refn
```

## Usage

```sh
git clone https://github.com/Agrover112/ActivationFunctions.git
```
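
Then open `src/Activations.ipynb` (e.g., in Jupyter) to run the examples.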

## References