Dockerizing your application makes things like development, deployment, distribution, and continuous integration much simpler. Today, we will look at how to run your Node.js application in a Docker container, using Docker Compose to further facilitate the process. See last week’s post, How to Dockerize Django and Postgres, if you are interested in doing the same for your Django application.
The end result of this tutorial will work for any Node.js application, but for demonstration purposes, I will use a generic React app created by create-react-app.
Creating the Docker Image
We will start with the Docker image. Create a Dockerfile at the root of your project’s directory. The end result should look like this:

```dockerfile
FROM node:8.15.0-jessie
WORKDIR /home/node/app
COPY package*.json ./
RUN npm install
COPY . ./
EXPOSE 3000
CMD ["npm", "start"]
```
Let’s break this down one block at a time.
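The first instruction, taken from the Dockerfile above:

```dockerfile
FROM node:8.15.0-jessie
```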
We are extending the official Node.js Docker image, hosted on Docker Hub. We pin a specific version of it so that the base image will not change unless we update this Dockerfile.
```dockerfile
WORKDIR /home/node/app
COPY package*.json ./
RUN npm install
```
We set our working directory to /home/node/app, copy only package.json and package-lock.json, then run npm install. We copy only these two files here to optimize for the Docker cache when building the image. COPY instructions (as well as some other instructions) create layers in a Docker image. Docker caches these layers, and invalidates a COPY layer when the files it copies have changed. Invalidating a layer means all layers below it are also invalidated. Since npm install is a fairly expensive and time-consuming task, we want to make sure any layers above it are as unlikely to be invalidated as possible. With this setup, npm install will only need to run again if package.json or package-lock.json is modified.
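To see why the instruction order matters, compare a naive Dockerfile (a hypothetical counter-example, not part of this project) that copies all of the source code before installing dependencies:

```dockerfile
# Anti-pattern: any code change invalidates the COPY layer,
# which also invalidates the RUN npm install layer below it,
# forcing a full dependency reinstall on every build.
FROM node:8.15.0-jessie
WORKDIR /home/node/app
COPY . ./
RUN npm install
```

With the package*.json files copied first, as in our Dockerfile, editing application code leaves the npm install layer cached.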
```dockerfile
COPY . ./
```
Here, we copy the rest of the code into the working directory, including node_modules, so that we can run the application. This introduces two problems.

- node_modules from the host system is copied into the container. This is not what we want, as a node_modules directory created on one operating system does not always work when copied to a different one. This also overwrites the node_modules that npm install created in the image.
- During development, we will have to rebuild the image each time code is changed in order to see the results.

More on these issues later.
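Next in the Dockerfile shown above:

```dockerfile
EXPOSE 3000
```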
This exposes port 3000, which is where the React development server listens by default. Update this to match your own application.
```dockerfile
CMD ["npm", "start"]
```
On container start,
npm start will be run in order to start the Node.js server.
Docker allows us to define a .dockerignore file in the same directory as our Dockerfile in order to ignore certain files or directories when using COPY. We will take advantage of this to exclude node_modules and npm-debug.log from our container. Add a .dockerignore at the same level as your Dockerfile with the following contents:
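A minimal .dockerignore for this setup, assuming we only need to exclude the two items discussed above:

```
node_modules
npm-debug.log
```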
COPY . ./ in our
Dockerfile will ignore these files.
Using Docker Compose
To make building our image and running our container much easier, we will configure Docker Compose. Create
docker-compose.yml at the same level as your
Dockerfile, with the following contents:
```yaml
version: "3"

volumes:
  blog_node_modules:

services:
  app:
    build: .
    user: "node"
    volumes:
      - ./:/home/node/app
      - blog_node_modules:/home/node/app/node_modules/
    ports:
      - "3000:3000"
```
Now, let’s break down the important bits.
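First, the top-level volumes key from the file above:

```yaml
volumes:
  blog_node_modules:
```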
This defines a named Docker volume. Giving a volume a name is not required, but it makes managing them easier. We will use this in a bit.
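Next, the build setting for the app service:

```yaml
services:
  app:
    build: .
```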
This defines the build context, which is also where Docker Compose looks for the Dockerfile when the two live in the same directory. In my case, this is the directory where docker-compose up is run, i.e. the project’s root directory.
```yaml
volumes:
  - ./:/home/node/app
  - blog_node_modules:/home/node/app/node_modules/
```
This part is really important. This defines two volumes (actually a bind mount and a volume). The first is a bind mount from the current directory on the build host to the
/home/node/app directory in the container (which we set as the working directory). This essentially shares a directory on your machine with the container, which makes it so that a code change on the host machine triggers a hot-deploy on the container without requiring a rebuild of the image.
The problem with creating this bind mount is that our old nemesis, node_modules, is being copied to our container again. The solution is to mount the named volume (blog_node_modules, which we defined earlier) at /home/node/app/node_modules/. Mounting a volume in this way preserves the image’s node_modules and prevents it from being overwritten by the host’s version. It also persists the container’s node_modules even when the container is removed and a new one is created. If you want to better understand why this works, read this StackOverflow post.
You may wonder why we bother copying the code into the image in the Dockerfile if we just end up getting the same result by creating this bind mount. The answer is mobility. The Docker image we created previously stands on its own, meaning you can run your application with only it; without needing this Docker Compose configuration. The extra configuration only makes things easier, especially during development.
```yaml
ports:
  - "3000:3000"
```
Lastly, this section maps port 3000 in the container to port 3000 on the host machine.
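The mapping is host:container, so if port 3000 is already in use on your machine you could, for example (this variant is not part of the original setup), expose the app on a different host port:

```yaml
ports:
  - "8080:3000"
```

The container still serves on port 3000; you would browse to localhost:8080 instead.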
Using your Application
Now that we’ve Dockerized our Node.js application, it’s time to go over how to use it.
Run docker-compose up to start the app. You can then press Ctrl-C to stop the container. Use docker-compose down to remove the container. See the full docker-compose documentation here.
The containers, volumes, etc. that are created by Docker Compose can also be managed with Docker commands. See the reference for those here.
Installing New Packages
New packages are not automatically installed in your Docker container. When installing a new package, follow these steps.

- Install the package on your host machine with npm install, or add the package to your package.json.
- Run docker-compose down -v to remove all containers and volumes.
- Run docker-compose up --build to rebuild the images and recreate the volumes.
Alternatively, if you don’t want to remove any containers or volumes, you can run bash on your running container and install the packages that way. Use these steps.

- Install the package (as before), making sure it is listed in package.json.
- Start an interactive bash session as root with this command: docker exec -it -u root <your_container_name> bash.
- Run npm install on your container.
As always, example code is located on GitHub.