Development will resume in June.
Echo is a Python package containing mathematical backend algorithms used in machine learning. The full documentation for Echo is provided here.
The Echo-AI package provides implementations of promising mathematical algorithms that are missing from the most popular deep learning libraries: PyTorch, Keras, and TensorFlow.
The package contains implementations of the following activation functions (✅ - implemented, 🕑 - to be implemented soon, ⬜ - already implemented in the original deep learning package):
# | Function | PyTorch | TensorFlow-Keras | TensorFlow-Core |
---|---|---|---|---|
1 | Weighted Tanh | ✅ | ✅ | 🕑 |
2 | Swish | ✅ | ✅ | 🕑 |
3 | ESwish | ✅ | ✅ | 🕑 |
4 | Aria2 | ✅ | ✅ | 🕑 |
5 | ELiSH | ✅ | ✅ | 🕑 |
6 | HardELiSH | ✅ | ✅ | 🕑 |
7 | Mila | ✅ | ✅ | 🕑 |
8 | SineReLU | ✅ | ✅ | 🕑 |
9 | Flatten T-Swish | ✅ | ✅ | 🕑 |
10 | SQNL | ✅ | ✅ | 🕑 |
11 | ISRU | ✅ | ✅ | 🕑 |
12 | ISRLU | ✅ | ✅ | 🕑 |
13 | Bent's Identity | ✅ | ✅ | 🕑 |
14 | Soft Clipping | ✅ | ✅ | 🕑 |
15 | SReLU | ✅ | ✅ | 🕑 |
16 | BReLU | ✅ | ✅ | 🕑 |
17 | APL | ✅ | ✅ | 🕑 |
18 | Soft Exponential | ✅ | ✅ | 🕑 |
19 | Maxout | ✅ | ✅ | 🕑 |
20 | Mish | ✅ | ✅ | 🕑 |
21 | Beta Mish | ✅ | ✅ | 🕑 |
22 | RReLU | ⬜ | 🕑 | 🕑 |
23 | CELU | ⬜ | ✅ | 🕑 |
24 | HardTanh | ⬜ | ✅ | 🕑 |
25 | GLU | ⬜ | 🕑 | 🕑 |
26 | LogSigmoid | ⬜ | ✅ | 🕑 |
27 | TanhShrink | ⬜ | ✅ | 🕑 |
28 | HardShrink | ⬜ | ✅ | 🕑 |
29 | SoftShrink | ⬜ | ✅ | 🕑 |
30 | SoftMin | ⬜ | ✅ | 🕑 |
31 | LogSoftmax | ⬜ | ✅ | 🕑 |
32 | Gumbel-Softmax | ⬜ | 🕑 | 🕑 |
33 | LeCun's Tanh | ✅ | ✅ | 🕑 |
34 | TaLU | 🕑 | ✅ | 🕑 |
35 | SiLU | ✅ | ✅ | 🕑 |
36 | GELU | 🕑 | 🕑 | 🕑 |
37 | NReLU | 🕑 | 🕑 | 🕑 |
38 | CReLU | 🕑 | ✅ | 🕑 |
39 | ProbAct | 🕑 | 🕑 | 🕑 |
40 | Noisy Activation Function | 🕑 | 🕑 | 🕑 |
41 | NLReLU | ✅ | ✅ | 🕑 |
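As a reference for what these functions compute, Swish from the table above is defined as Swish(x) = x · sigmoid(βx), where β = 1 recovers SiLU. Below is a minimal pure-Python sketch for sanity-checking values; the `swish` and `sigmoid` names here are illustrative helpers, not the echoAI API.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def swish(x: float, beta: float = 1.0) -> float:
    """Swish activation: x * sigmoid(beta * x); beta = 1 gives SiLU."""
    return x * sigmoid(beta * x)
```

For large positive inputs Swish approaches the identity, and for large negative inputs it approaches zero, which is the smooth ReLU-like behavior the table's ReLU variants share.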
The repository has the following structure:

```
echoAI/                          # main package directory
├── Activation/                  # sub-package with activation function implementations
│   ├── Torch/                   # implementations for PyTorch
│   │   ├── functional.py        # activation function implementations
│   │   ├── weightedTanh.py      # wrapper class for the Weighted Tanh activation
│   │   └── ...                  # further PyTorch activation function wrappers
│   └── TF_Keras/                # implementations for TensorFlow-Keras
│       └── custom_activation.py # activation function implementations
└── __init__.py
Observations/                    # other assets
docs/                            # Sphinx documentation folder
LICENSE                          # license file
README.md
setup.py                         # package setup file
Scripts/                         # Black and Flake8 automated test scripts
Smoke_tests/                     # demonstration scripts for activation function usage
Unit_tests/                      # unit test scripts
```
To install the echoAI package from PyPI, run:

```
$ pip install echoAI
```
Sample scripts are provided in the Smoke_tests folder.

PyTorch:

```python
# import PyTorch
import torch

# import the activation function from echoAI
from echoAI.Activation.Torch.mish import Mish

# apply the activation function
mish = Mish()
t = torch.tensor(0.1)
t_mish = mish(t)
```
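For a quick sanity check of what the Mish layer computes, here is a pure-Python reference using the published definition Mish(x) = x · tanh(softplus(x)), with softplus(x) = ln(1 + eˣ). The `mish_ref` name is an illustrative helper, not part of echoAI; its output should closely match the echoAI `Mish` layer applied to the same scalar.

```python
import math

def mish_ref(x: float) -> float:
    """Reference Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)."""
    softplus = math.log1p(math.exp(x))
    return x * math.tanh(softplus)
```

Like Swish, Mish is smooth, non-monotonic, near-identity for large positive inputs, and bounded below for negative inputs.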
TensorFlow-Keras:

```python
# import TensorFlow
import tensorflow as tf
from tensorflow.keras import layers

# import the activation function from echoAI
from echoAI.Activation.TF_Keras.custom_activation import Mish

model = tf.keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),  # e.g. 28x28 images -> 784 features
    layers.Dense(128),
    Mish(),  # use the activation function as a layer
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax')])

# Compile the model (a classification loss matches the softmax output;
# assumes integer class labels)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Fit the model (X_train, y_train: your training data)
model.fit(x=X_train, y=y_train, epochs=3, batch_size=128)
```