Using Docker as a basic Node development environment

For the past few days, I’ve been working on a small NodeJS project. Node is a very powerful and fast JavaScript runtime environment, and it’s gotten a lot of attention these last few years as a credible back-end for Web infrastructures - so I decided to try it out. As a Django developer, I’m used to a framework that automatically sets your project up and running - whereas Node does the bare minimum for the developer, and you have to set up everything around your program yourself: database, session management… So I decided to set all of that up myself, and to wrap it in a Docker image.

 Docker: the whys

During my time at Weezevent, we were using Docker environments to manage our API servers. Last week, the new project I’ve been telling you about was a cool opportunity for me to dust off the Docker knowledge I had acquired and to create my own Docker environment. Now, for those who have only heard about it: what is Docker? According to its Wikipedia article, Docker automates the deployment of applications inside software containers.

Let me give you a use case: you have this web server project that you built in NodeJS, with a MySQL database and an InfluxDB database to collect user data. You first build and configure it on your own machine for development purposes - so it works great there. Once you have a working project, you decide to upload it to a remote server to make it publicly accessible. But that means you have to install and configure a MySQL server and an InfluxDB database on that server too - which can sometimes take 5 minutes, but can easily take you an hour if you run into an unexpected problem… And imagine - after all that configuration, your app still won’t run, and after a long debugging session you realize it’s because your server was running an old version of NodeJS that doesn’t have some of the APIs you call!

This situation can get even worse if you involve more developers in the equation: suppose one of your co-workers has an old, legacy version of MySQL that’s not compatible with the queries you wrote, but that he can’t update because it is a dependency of one of his own projects…

Using Docker as a back-end developer lets you easily contain your development environment: with a few Docker configuration files, you can declare what software you need and provide a few configuration scripts that will execute every time the Docker image builds, so everyone gets the same environment - one that will not conflict with their own personal setup.

 Setting up Docker for Node

Docker revolves around the concept of images: every image is a more-or-less empty Unix environment that contains a piece of software. Since we’re talking about containing… You guessed it: we’re going to create our own image to contain our application. But we are going to do so by using other images as a base.

First step: setting up your own Dockerfile. A Dockerfile is a command file that specifies what steps Docker must take to build a working image for your app. Let’s see what it looks like in practice:

# Start from the official Node 4 image, which ships with node and npm
FROM node:4-onbuild
# Create a directory for our code and copy everything into it
RUN mkdir /code
ADD . /code
WORKDIR /code
# Install our dependencies, plus supervisor to manage the Node process
RUN npm install
RUN npm install -g supervisor

What does our Dockerfile do? The 1st line says “we’re going to create our image by importing the node:4-onbuild image”. Docker is connected to an image marketplace, the Docker Hub, where most systems developers publish their own Docker images. These are usually bare systems containing only what you need. You can actually publish your own images on the Docker Hub too, if you build a system that you want to share with other developers or admins. Since I imported a Node image here, you can check out the Node image page: it lists all the different Node versions you can download as an image. The reason we’re using a Node image is that this way, we know for sure node and npm are preinstalled. Plus, we know exactly which version will be installed - version 4!
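
If you want to see this Hub interaction for yourself, you can fetch the base image manually before building anything - docker build does this automatically when it reads the FROM line, so this is just a sketch:

# Download the Node 4 image from the Docker Hub
docker pull node:4-onbuild
# List the images now available on your machine
docker images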

The 2nd and 3rd lines say “create a directory to store my code, and put all my code in this directory”. We’ll see later how we can change that from “copying my code” to “linking my folder to my container”.

The 4th, 5th and 6th lines should be pretty clear to any Node developer out there. We’ll use supervisor to manage our Node process once it’s ready to go, so we install it globally on our system as well.

Once you’re done writing your Dockerfile, you can build it using docker build, using CLI parameters to give it a shiny new name. Ta-da, you’ve built your first Docker image, and your code now runs inside its own container rather than directly on your machine! Once built, you can run your code with docker run.
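
In practice, that boils down to two commands - here’s a minimal sketch, where mynodeapp is just a placeholder name for the image:

# Build an image from the Dockerfile in the current directory, tagged "mynodeapp"
docker build -t mynodeapp .
# Start a container from that image, mapping the app's port to the host
docker run -p 5000:5000 mynodeapp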

 Wiring images: Docker Compose

That’s cool, but I’m going to need PostgreSQL/MySQL/MariaDB/MongoDB/… for my app. You haven’t talked about this yet!

Indeed! If you’ve been following along, your first thought might be to add the dependencies to the Dockerfile, by making the container run apt-get install mysql-server. But that’s kind of heavy: you’ll also have to run apt-get update, plus you’ll have to configure your whole database within the Dockerfile… Although that works, Docker actually has a built-in tool to separate software into different containers: Docker Compose. Using a docker-compose.yml configuration file, you’re able to tell Docker “Okay, now that I’ve built my image, I want it to interact with this PostgreSQL image, and this Redis image”. Your configuration for these other images is stored in your Docker Compose file. If you want your PostgreSQL image to name the database a certain way, as configured in your NodeJS files, you just pass it on as an environment variable in your docker-compose.yml file!

This is what my docker-compose.yml file looked like for a NodeJS/PostgreSQL/Redis project:

version: '2'
services:
  web:
    build: .
    command: supervisor index.js
    ports:
      - "5000:5000"
    volumes:
      - .:/code
    links:
      - redis
      - db
  redis:
    image: redis
  db:
    image: postgres:latest
    environment:
      - POSTGRES_DB=mynewappdatabase

This docker-compose.yml configuration file, version 2 as specified, tells the Docker engine to build and run the specified Docker images - including the one from the folder we’re in, which we’ve called web, and the other images we import from the Hub, redis and postgres:latest. Once built, it will execute the command supervisor index.js. It also tells the Docker container to expose port 5000, which is the one I’m using for development purposes. We’re also mounting all our files as a volume, with volumes: so if you modify your code, the change instantly shows up in your container - and if you modify a file within your container, it is applied to your file system.

A sidenote on supervisor: supervisor is an npm package that launches your Node app for you and manages it. Say your Node app crashes: supervisor restarts it. You modify your Node app’s sources? supervisor kills and restarts the app for you. It’s really useful, both in development and in production, and acts as a python manage.py runserver for the Django maniac that I am. Plus, it’s super easy to use: just install it with npm install -g supervisor, and then launch your Node apps with supervisor instead of node.
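
Concretely, the switch looks like this - index.js being whatever your app’s entry point is:

# Without supervisor:
node index.js
# With supervisor - same app, but it now restarts on crashes and code changes:
supervisor index.js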

The interesting part comes with the links section. It lets the Docker engine know that our web app wants to interact with the Postgres and Redis images we’re importing - so Docker puts all the containers on the same network, and web can interact with redis and db (the container running the Postgres image). How does it actually work, inside the Node app? It’s actually really simple.

// pg-promise exposes a factory: calling it with a connection string
// returns a database object
var postgres = require('pg-promise')()

// Connection to the PG database - "db" is the hostname of our Postgres container
var db = postgres('postgres://postgres:@db:5432/mynewappdatabase')

From my JS code, the database is accessible at the db host! As a bonus: since I defined the default database name in my docker-compose.yml file, I even know the name of the database I want to connect to. Now, to launch everything, you just have to enter docker-compose up, which will automatically rebuild the images if necessary and launch the specified commands!
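
To check that the wiring works, you can fire a trivial query through that db object - a minimal sketch using pg-promise’s any method (the query itself is just a placeholder):

// Run a trivial query against the db container to verify the connection
db.any('SELECT 1 AS ok')
  .then(function (rows) {
    console.log('Connected to Postgres!', rows)
  })
  .catch(function (err) {
    console.error('Could not reach the database:', err)
  })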

To connect to your server, just head to localhost:5000 - the port we mapped to the host in our docker-compose.yml file.
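
Or, from a terminal:

# The web container maps port 5000 to the host, so this hits your Node app
curl http://localhost:5000/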

 More about Docker

Docker is a fast-evolving tool with a huge community revolving around it - so there are a lot of new features on the way to help you as a developer.

 