DeepX

DeepX is a deep learning library designed with flexibility and succinctness in mind. The key aspect is a point-free shorthand for describing your neural network architecture.

DeepX supports Tensorflow, PyTorch, and Jax. DeepX is still in alpha, so the API, names, and design are likely to change. I'm very open to feedback and suggestions, so shoot me an email if you are interested in contributing.

Installation

$ pip install deepx

Quickstart

The first step in building a network is to define your model: the input-output structure of the network. Let's consider the task of classifying MNIST with a multilayer perceptron (MLP).

>>> from deepx import nn
>>> net = nn.Relu(200) >> nn.Relu(200) >> nn.Softmax(10)

Our model behaves like a function.

import tensorflow as tf
net(tf.ones((10, 784)))  # forward pass; returns a [10, 10] batch of class probabilities

To get the weights out of the network, we can just say:

net.get_parameters()
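Those parameters can be plugged straight into your favorite optimizer. Here is a minimal training-step sketch, assuming the Tensorflow backend and that get_parameters() returns a list of tf.Variables (both assumptions, not shown in the examples above):

import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(1e-3)

def train_step(x, y):
    # Forward pass: net(...) produces class probabilities of shape [batch, 10].
    with tf.GradientTape() as tape:
        probs = net(x)
        loss = tf.reduce_mean(
            tf.keras.losses.sparse_categorical_crossentropy(y, probs))
    # Assumption: get_parameters() returns the network's tf.Variable weights.
    params = net.get_parameters()
    grads = tape.gradient(loss, params)
    optimizer.apply_gradients(zip(grads, params))
    return loss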

We can also use a convolutional neural network for classification and it'll work exactly the same!

net = (nn.Reshape([28, 28, 1])
        >> nn.Conv([2, 2, 64])
        >> nn.Conv([2, 2, 32])
        >> nn.Conv([2, 2, 16])
        >> nn.Flatten() >> nn.Relu(200) >> nn.Relu(200) >> nn.Softmax(10))
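Because the pipeline starts with Reshape, the same flat MNIST batches work unchanged (a small usage sketch, again assuming the Tensorflow backend):

import tensorflow as tf
# Each 784-vector is reshaped to a 28x28x1 image before the convolutions.
net(tf.ones((10, 784)))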

That's it, we're done!

Keras

DeepX also lets you use Keras layers with the same >> composition format. All layers in tf.keras.layers are wrapped in the deepx.keras package, so you can compose them like any other DeepX layer.

import deepx.keras as nn
net = (
    nn.Conv2D(64, (5, 5), padding='same') >> nn.ReLU() >> nn.MaxPooling2D(padding='same')
    >> nn.Conv2D(64, (5, 5), padding='same') >> nn.ReLU() >> nn.MaxPooling2D(padding='same')
    >> nn.Flatten() >> nn.Dense(1024)
    >> nn.ReLU() >> nn.Dense(10) >> nn.Softmax()
)
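As with the native layers, the composed network is callable. A quick sanity check, assuming it accepts standard NHWC image batches:

import tensorflow as tf
# 10 grayscale 28x28 images in, 10 class probabilities per image out.
net(tf.ones((10, 28, 28, 1)))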

These layers only work with the Tensorflow backend, since Tensorflow is the only backend Keras and DeepX share.

Distributions

DeepX also wraps the distributions that ship with Tensorflow Probability and PyTorch. This lets you write probabilistic neural networks (like those in a VAE) very easily.

# L is the latent dimensionality, D the data dimensionality.
decoder = nn.Relu(L, 500) >> nn.Relu(500) >> layers.Bernoulli(D)
encoder = nn.Relu(D, 500) >> nn.Relu(500) >> layers.Gaussian(L)
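A sketch of how these pieces could fit together in a VAE, assuming the Tensorflow Probability backend and that a network ending in a distribution layer returns a distribution object with sample() and log_prob() (both assumptions about the wrapper API, not confirmed above):

import tensorflow as tf

L, D = 2, 784                 # example latent and data dimensions
x = tf.ones((10, D))          # a batch of (binarized) MNIST images

q_z = encoder(x)              # assumed: a Gaussian distribution over latent codes
z = q_z.sample()              # latent sample
p_x = decoder(z)              # assumed: a Bernoulli distribution over pixels
recon = p_x.log_prob(x)       # reconstruction term of the ELBO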