You, Me and Docker

Docker is a powerful tool that allows developers to create, deploy, and run applications in containers. One of the benefits of using Docker is the ability to create reproducible environments that can easily be moved between machines and development setups.

In this guide, we will:

  1. Create a requirements.txt file
  2. Create a Dockerfile for deep learning environments using the requirements.txt
  3. Build and run the Docker image and use it to launch a Jupyter Notebook.
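
Before starting, it helps to confirm that Docker is installed and, if you want GPU support, that the NVIDIA driver is working on the host. The --gpus flag used later additionally requires the NVIDIA Container Toolkit on the host; the two commands below are just a quick sanity check:

docker --version
nvidia-smi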

 

Step 0: Create folders and associated files

The following command creates a new folder called dockerLearning and creates two empty files inside it, Dockerfile and requirements.txt.

$ mkdir dockerLearning && \
 cd dockerLearning && \
 touch Dockerfile && \
 touch requirements.txt

The following should be your resulting folder structure.

dockerLearning
            |
            |-Dockerfile
            |-requirements.txt
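
Since the command above also changes into the new folder, a quick listing should show both (still empty) files:

ls
# Dockerfile  requirements.txt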

 

Step 1: Create a requirements.txt file

Inside the requirements.txt file at the root of your project, list your Python dependencies. Here is an example of a requirements.txt file that includes PyTorch and Jupyter Notebook:

# this should be inside your requirements.txt
torch
jupyter
numpy
torchvision
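
If you want the build to be reproducible, you can pin versions instead of always pulling the latest releases from PyPI. The pins below are only illustrative (chosen to match the PyTorch 1.13 base image used in the next step); adjust them to your project:

# pinned variant of requirements.txt (versions are illustrative)
torch==1.13.0
torchvision==0.14.0
jupyter==1.0.0
numpy==1.23.5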

 

Step 2: Create a Dockerfile

Here is an example of a Dockerfile with the instructions needed to build an image containing PyTorch and Jupyter Notebook. This file is what the docker build command in Step 3 consumes. You can read more about each of these instructions in the official Dockerfile reference.

# This line pulls an already-built image from a repository hosted by PyTorch,
# which is specifically built and optimized for deep learning use.
FROM pytorch/pytorch:1.13.0-cuda11.6-cudnn8-runtime

# This step installs system packages with apt-get, the Ubuntu/Debian package manager used by the base image
RUN apt-get update && apt-get install -y \
    build-essential \
    libssl-dev \
    libgl1-mesa-dev \
    libffi-dev \
    python3-dev \
    python3-pip \
    && ln -sf /usr/bin/python3 /usr/bin/python &&\
    rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

# This copies the requirements.txt from the host to inside the docker image.
COPY requirements.txt .

# This step installs the Python packages listed in the copied requirements.txt inside the image.
RUN pip3 install -r requirements.txt && \
    pip3 install opencv-python

# Expose port 8888 so that we can reach the notebook server from the browser
EXPOSE 8888

# The default command: start the Jupyter notebook server when the container runs
CMD ["jupyter", "notebook", "--port=8888", "--no-browser", "--ip=0.0.0.0", "--allow-root"]

 

Step 3: Let’s build the image

The following command builds the image from the Dockerfile. Make sure you run it from the project's root directory; the trailing dot tells Docker to use the current directory as the build context.

docker build -t myimagename .
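
Once the build finishes, you can confirm that the image exists (and see how large it turned out) with:

docker image ls myimagename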

Once the above command is successful, you can run your built image with the following command. The --gpus all flag gives the container access to all of the host's GPUs, and -p 8888:8888 maps the container's port 8888 to port 8888 on the host machine. You should then be able to access Jupyter Notebook by navigating to http://localhost:8888 in your web browser, using the login token that the notebook server prints in the container logs.

docker run --gpus all -p 8888:8888 myimagename:latest
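
Before opening the notebook, you can also do a quick sanity check that the GPU is visible inside the container by overriding the default command for a one-off run (this assumes the NVIDIA Container Toolkit is installed on the host):

docker run --gpus all --rm myimagename:latest python -c "import torch; print(torch.cuda.is_available())"

If this prints True, PyTorch inside the container can see your GPU.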

That's it. In the next post, I will try to wrap the whole thing into a docker-compose file and walk you through how to mount a host directory into the container, which helps keep the data you create inside the container persistent.
