Real-Time Style Transfer with Keras

[Brick nuke gif]

This is an attempt at implementing something like Real-Time Style Transfer with the Keras framework.
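
The actual training code lives in rtst.py; purely for orientation, here is a minimal, hedged sketch of the general technique this kind of project implements: a small feed-forward transform network is trained so that its output matches the content features of its input and the Gram-matrix (style) statistics of a fixed texture image, both measured with a frozen VGG16. The sketch is written against modern tf.keras for illustration only; the layer choices, loss weights, and preprocessing below are assumptions, not what rtst.py actually does.

```python
import tensorflow as tf
from tensorflow.keras import Model, layers
from tensorflow.keras.applications import VGG16, vgg16

# Which VGG16 feature maps to use; these particular layers are an assumption.
STYLE_LAYERS = ["block1_conv2", "block2_conv2", "block3_conv3"]
CONTENT_LAYER = "block4_conv3"


def build_transform_net():
    """Tiny stand-in for the feed-forward image transform network."""
    inp = layers.Input(shape=(256, 256, 3))
    x = layers.Conv2D(32, 9, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    out = layers.Conv2D(3, 9, padding="same", activation="sigmoid")(x)
    return Model(inp, out)


def build_feature_net():
    """Frozen VGG16 exposing the chosen style and content feature maps."""
    vgg = VGG16(include_top=False, weights="imagenet")  # or load a local weights.h5
    vgg.trainable = False
    outs = [vgg.get_layer(n).output for n in STYLE_LAYERS + [CONTENT_LAYER]]
    return Model(vgg.input, outs)


def gram(feats):
    """Gram matrix of a batch of feature maps, normalised by spatial size."""
    _, h, w, _ = tf.unstack(tf.cast(tf.shape(feats), tf.float32))
    return tf.linalg.einsum("bijc,bijd->bcd", feats, feats) / (h * w)


transform_net = build_transform_net()
feature_net = build_feature_net()
optimizer = tf.keras.optimizers.Adam(1e-3)


@tf.function
def train_step(content_batch, style_grams, style_weight=1e-2, content_weight=1.0):
    """One optimisation step; images are float tensors in [0, 1]."""
    with tf.GradientTape() as tape:
        generated = transform_net(content_batch)
        gen_feats = feature_net(vgg16.preprocess_input(generated * 255.0))
        ref_feats = feature_net(vgg16.preprocess_input(content_batch * 255.0))
        style_loss = tf.add_n(
            [tf.reduce_mean(tf.square(gram(f) - g))
             for f, g in zip(gen_feats[:-1], style_grams)])
        content_loss = tf.reduce_mean(tf.square(gen_feats[-1] - ref_feats[-1]))
        loss = style_weight * style_loss + content_weight * content_loss
    grads = tape.gradient(loss, transform_net.trainable_variables)
    optimizer.apply_gradients(zip(grads, transform_net.trainable_variables))
    return loss
```

In this setup you'd compute the texture image's Gram matrices once before training (e.g. `style_grams = [gram(f) for f in feature_net(vgg16.preprocess_input(style_img * 255.0))[:-1]]`) and then call `train_step` on batches of 256×256 training images. At render time only the small transform network runs, which is what makes the approach real-time.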

Install

Usage

After installation you'll find train-rtst.sh, render-rtst.sh, and rtst.py on your path. The shell scripts are thin wrappers around rtst.py that demonstrate usage and add a little convenience. There's also a script, rtst-download-training-images.sh, that downloads a small batch of images randomly selected from a subset of ImageNet 2012.

Examples

There's an examples folder. Here's one of the examples:

Train a brick texturizer:

    ./make-example-texturizer.sh bricks0 path/to/training/images path/to/evaluation/images path/to/vgg16/weights.h5

Texturize a gif with that brick texturizer:

    VGG_WEIGHTS=/path/to/vgg.h5 ./texturize-gif.sh path/to/your.gif bricks0 out/bricks0gif

Differences from the paper