A project template to simplify building and training deep learning models using Keras.
This template lets you quickly build and train deep learning models, with checkpointing and TensorBoard visualization included.
To use the template, run:
python main.py -c [path to configuration file]
A simple model for the MNIST dataset is available to test the template. To run the demo project:
python main.py -c configs/simple_mnist_config.json
tensorboard --logdir=experiments/simple_mnist/logs
This template also supports reporting to Comet.ml, which lets you track your hyperparameters, metrics, graphs, dependencies and more, including real-time metrics.
Add your API key in the configuration file:
For example: "comet_api_key": "your key here"
Here's how it looks after you start training:
You can also link your Github repository to your comet.ml project for full version control.
├── main.py - here's an example of main that is responsible for the whole pipeline.
│
│
├── base - this folder contains the abstract classes of the project components
│ ├── base_data_loader.py - this file contains the abstract class of the data loader.
│ ├── base_model.py - this file contains the abstract class of the model.
│ └── base_train.py - this file contains the abstract class of the trainer.
│
│
├── model - this folder contains the models of your project.
│ └── simple_mnist_model.py
│
│
├── trainer - this folder contains the trainers of your project.
│ └── simple_mnist_trainer.py
│
│
├── data_loader - this folder contains the data loaders of your project.
│ └── simple_mnist_data_loader.py
│
│
├── configs - this folder contains the experiment and model configs of your project.
│ └── simple_mnist_config.json
│
│
├── datasets - this folder might contain the datasets of your project.
│
│
└── utils - this folder contains any utils you need.
├── config.py - util functions for parsing the config files.
├── dirs.py - util functions for creating directories.
└── utils.py - util functions for parsing arguments.
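The abstract classes in base/ might look roughly like the stdlib-only sketch below. The class and method names here follow the file names in the tree above, but the actual signatures in the repository may differ:

```python
# Hypothetical sketch of the three abstract base classes in base/.
# Each component receives the parsed config and leaves the real work
# to subclasses.
class BaseDataLoader:
    def __init__(self, config):
        self.config = config

    def get_train_data(self):
        raise NotImplementedError

    def get_test_data(self):
        raise NotImplementedError


class BaseModel:
    def __init__(self, config):
        self.config = config
        self.model = None  # the compiled Keras model is stored here

    def build_model(self):
        raise NotImplementedError


class BaseTrain:
    def __init__(self, model, data, config):
        self.model = model
        self.data = data
        self.config = config

    def train(self):
        raise NotImplementedError
```

Concrete components subclass these and override the abstract methods.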
To add a new model, you need to:
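For illustration, a model class following this pattern might look like the sketch below. BaseModel is redefined inline so the snippet is self-contained (in the template you would import it from base/), and the Keras-specific parts are only indicated in comments:

```python
# Minimal inline stand-in for the template's BaseModel.
class BaseModel:
    def __init__(self, config):
        self.config = config
        self.model = None

    def build_model(self):
        raise NotImplementedError


class SimpleMnistModel(BaseModel):
    def __init__(self, config):
        super().__init__(config)
        self.build_model()

    def build_model(self):
        # Placeholder for the real Keras code, e.g.:
        #   self.model = keras.Sequential([...])
        #   self.model.compile(optimizer="adam",
        #                      loss="sparse_categorical_crossentropy")
        self.model = "compiled-keras-model"
```

The key idea is that the constructor calls build_model, so creating the object yields a ready-to-train model.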
To add a new trainer, you need to:
Note: to add functionality after each training epoch, such as saving checkpoints or writing TensorBoard logs, use Keras callbacks.
Note: you can use fit_generator instead of fit to generate batches of data on the fly, instead of loading the whole dataset at once.
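A trainer wiring up per-epoch callbacks might be structured like this. The sketch is stdlib-only so it runs anywhere; the commented lines show where the real Keras callbacks (keras.callbacks.ModelCheckpoint and keras.callbacks.TensorBoard) would be appended, and the config keys named in the comments are assumptions:

```python
# Hypothetical trainer sketch: callbacks are collected in __init__
# and passed to fit() during training.
class SimpleMnistTrainer:
    def __init__(self, model, data, config):
        self.model = model
        self.data = data
        self.config = config
        self.callbacks = []
        self.init_callbacks()

    def init_callbacks(self):
        # In real code, e.g.:
        #   self.callbacks.append(keras.callbacks.ModelCheckpoint(
        #       filepath=self.config["checkpoint_dir"] + "/weights.{epoch:02d}.hdf5",
        #       save_best_only=True))
        #   self.callbacks.append(keras.callbacks.TensorBoard(
        #       log_dir=self.config["tensorboard_log_dir"]))
        self.callbacks.append("model_checkpoint")  # placeholder
        self.callbacks.append("tensorboard")       # placeholder

    def train(self):
        # A real trainer would call:
        #   self.model.fit(x, y, epochs=self.config["num_epochs"],
        #                  callbacks=self.callbacks)
        return self.callbacks
```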
To add a new data loader, you need to:
Note: you can also use a different design in which the data loader class exposes a get_next_batch function, so the reader fetches batches from your dataset on demand.
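The get_next_batch alternative mentioned in the note might look like the sketch below. The class name and the toy in-memory "dataset" are illustrative only; a real loader would read batches from disk:

```python
# Hypothetical batch-reading data loader: only one batch is
# materialized at a time instead of the whole dataset.
class BatchedDataLoader:
    def __init__(self, config):
        self.config = config
        self.dataset = list(range(10))  # stand-in for samples on disk
        self.position = 0

    def get_next_batch(self):
        batch_size = self.config["batch_size"]
        batch = self.dataset[self.position:self.position + batch_size]
        self.position += batch_size  # advance the read cursor
        return batch
```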
You need to define a .json file that contains your experiment and model configurations such as the experiment name, the batch size, and the number of epochs.
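For example, a configuration file might look like the fragment below. The key names are illustrative; check configs/simple_mnist_config.json for the exact keys the template expects:

```json
{
  "exp_name": "simple_mnist",
  "num_epochs": 10,
  "batch_size": 32,
  "comet_api_key": "your key here"
}
```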
The main script is responsible for building the whole pipeline.
We can now load models without having to explicitly create an instance of each class. Look at:
python from_config.py -c configs/simple_mnist_from_config.json
python from_config.py -c configs/conv_mnist_from_config.json
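A plausible sketch of the mechanism behind from_config.py is shown below: classes named as dotted strings in the config are imported and instantiated with importlib, so no class has to be referenced explicitly in code. The helper name and the inline example are assumptions, not the template's actual implementation:

```python
import importlib


def create_instance(dotted_name, *args):
    """Import 'package.module.ClassName' given as a string and instantiate it."""
    module_name, class_name = dotted_name.rsplit(".", 1)
    cls = getattr(importlib.import_module(module_name), class_name)
    return cls(*args)


# Example: build an object purely from a string, the way a
# config-driven pipeline would build its data loader, model, and trainer.
counter = create_instance("collections.Counter", "mississippi")
```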
Create a command-line tool for Keras project scaffolding, where the user defines a data loader, a model, and a trainer, then runs the tool to generate the whole project. (This is partly done now, since each of these components can be loaded from the config.)
Contributions are welcome, including improvements to the template and the example projects.
This project template is based on MrGemy95's Tensorflow Project Template.
Thanks to my colleagues Mahmoud Khaled, Ahmed Waleed, and Ahmed El-Gammal, who worked on the initial project that spawned this template.