Kaggle Notebooks allow users to run a Python Notebook in the cloud against our competitions and datasets without having to download data or set up their environment.
This repository includes our Dockerfiles for building the CPU-only and GPU images that run Python Notebooks on Kaggle.

Our Python Docker images are stored on Google Container Registry at:

* CPU-only: gcr.io/kaggle-images/python
* GPU: gcr.io/kaggle-gpu-images/python
Note: The base image for the GPU image is our CPU-only image. The gpu.Dockerfile adds a few extra layers to install GPU-related libraries and packages (cuda, libcudnn, pycuda, etc.) and reinstalls packages that need specific GPU builds (torch, tensorflow and a few more).
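Concretely, that layering means gpu.Dockerfile starts `FROM` the CPU image. The snippet below is only an illustrative sketch of that relationship, not the contents of the real gpu.Dockerfile (which pins a specific base tag and installs many more packages):

```dockerfile
# Illustrative sketch only; see gpu.Dockerfile for the real build.
FROM gcr.io/kaggle-images/python:latest

# ...extra layers installing CUDA, libcudnn, pycuda, and GPU builds of
# torch/tensorflow would go here...
```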
To get started with this image, read our guide to using it yourself, or browse Kaggle Notebooks for ideas.
First, evaluate whether installing the package yourself in your own notebooks suits your needs. See the guide.
If the first step above doesn't work for your use case, open an issue or a pull request.
```sh
./build
```

Flags:

* `--gpu` to build an image for GPU.
* `--use-cache` for faster iterative builds.

A suite of tests can be found under the `/tests` folder. You can run the tests using this command:
```sh
./test
```

Flags:

* `--gpu` to test the GPU image.

For the CPU-only image:
```sh
# Run the image built locally:
docker run --rm -it kaggle/python-build /bin/bash

# Run the pre-built image from gcr.io:
docker run --rm -it gcr.io/kaggle-images/python /bin/bash
```
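The same image also works for one-off, non-interactive commands. As a sketch (the guard around the `docker run` is only there so the snippet skips cleanly on machines where Docker or the image isn't available):

```sh
# Run a single command in the CPU image instead of an interactive shell:
# here, print the preinstalled numpy version and exit.
if command -v docker >/dev/null 2>&1 \
   && docker image inspect gcr.io/kaggle-images/python >/dev/null 2>&1; then
  docker run --rm gcr.io/kaggle-images/python \
    python -c "import numpy; print(numpy.__version__)"
else
  echo "image not available locally; skipping example"
fi
```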
For the GPU image:
```sh
# Run the image built locally:
docker run --runtime nvidia --rm -it kaggle/python-gpu-build /bin/bash

# Run the pre-built image from gcr.io:
docker run --runtime nvidia --rm -it gcr.io/kaggle-gpu-images/python /bin/bash
```
To ensure your container can access the GPU, follow the instructions posted here.
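Once that's set up, a quick sanity check is to run `nvidia-smi` inside the container (a sketch; it's guarded so it skips cleanly where Docker, the image, or the NVIDIA runtime isn't available):

```sh
# nvidia-smi inside the container should list your GPU if passthrough works.
if command -v docker >/dev/null 2>&1 \
   && docker image inspect gcr.io/kaggle-gpu-images/python >/dev/null 2>&1; then
  docker run --runtime nvidia --rm gcr.io/kaggle-gpu-images/python nvidia-smi
else
  echo "GPU image not available locally; skipping check"
fi
```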
A custom pre-built Tensorflow wheel is used in our images. The Dockerfile and the instructions for building Tensorflow from source can be found in the tensorflow-whl/ folder.