Jenkins pipeline setup for a Docker-based Node.js application

Jenkins has come a long way and remains my go-to tool for continuous integration needs. In this article, we will configure Jenkins from scratch and create a declarative pipeline. We will also build Docker images and push them to Docker Hub.

Setup Jenkins

Let’s set up Jenkins with Blue Ocean. For this, let’s use the dockerized Jenkins image: jenkinsci/blueocean
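The command below is a sketch of how such an instance can be started; the container name and the jenkins-data volume name are illustrative choices, not requirements:

```shell
docker run -d --name jenkins \
  -u root \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins-data:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkinsci/blueocean
```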

Few pointers:

  • In the above snippet, while starting Jenkins we mount docker.sock into the container. This enables Docker-in-Docker style builds, where each build runs inside an agent with the declared image. We will get into the details as we proceed.

Github integration

On starting a new instance, Jenkins Blue Ocean provides a setup flow that walks us through GitHub integration. If Blue Ocean is not used, we can achieve the same result under Manage Jenkins by configuring the GitHub integration plugin.

  • The user configuring Jenkins in this step should have owner privileges on the repository.
  • If it is an organizational GitHub repository, by granting OAuth-based access to Jenkins we can select specific repositories or include all of them.
  • By selecting all repositories, every existing repository, and any new one going forward, will automatically be accessible to Jenkins.

Google login for Jenkins

More and more organizations have Google-account-based organizational email. It is very helpful to have Jenkins login backed by Google accounts while restricting access to the organizational domain.

We can use the Jenkins Google Login plugin to achieve this. To use this plugin, you must obtain OAuth 2.0 credentials from the Google Developers Console. These don’t need to belong to a special account, or even one associated with the domain you want to restrict logins to.

Instructions to create the Client ID and Secret:

  1. Login to the Google Developers Console
  2. Create a new project
  3. Under APIs & Services -> Credentials, Create credentials, OAuth client ID
  4. The application type should be “Web Application”
  5. The authorized redirect URLs should contain ${JENKINS_ROOT_URL}/securityRealm/finishLogin
  6. Enter the created Client ID and secret in the Security Realm Configuration
Jenkins configuration for Google integration

Authorization based on user / groups

Under Configure Global Security we can find Authorization controls. This helps to restrict user access.

  • Give authenticated users (anyone who can log in) the appropriate controls
  • Remove all access for the anonymous user
  • Add specific users or groups to provide admin access to control Jenkins itself.
sample authorization screen

Jenkinsfile based build pipeline

Having a Jenkinsfile, which is checked into source control, provides a number of immediate benefits:

  • Code review/iteration on the Pipeline
  • Audit trail for the Pipeline
  • Single source of truth for the Pipeline, which can be viewed and edited by multiple members of the project.

A Jenkinsfile can be written in different ways: Scripted or Declarative.

I personally prefer Declarative Pipelines since:

  • Declarative Pipeline is a relatively recent addition to Jenkins Pipeline that presents a more simplified and opinionated syntax on top of the Pipeline sub-systems.
  • Scripted Pipelines can become hard to manage, since each developer may have a different style of achieving the same objective.

Jenkinsfile — getting started
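Below is a minimal sketch of such a Jenkinsfile. The image name, credentials ID, and Slack messages are placeholders, and it assumes the Docker Pipeline and Slack Notification plugins are installed:

```groovy
pipeline {
    // run build/test steps inside a Node.js container
    agent {
        docker { image 'node:16-alpine' }
    }
    environment {
        IMAGE_NAME = 'myorg/my-node-app'   // hypothetical image name
    }
    stages {
        stage('Install') {
            steps {
                sh 'npm ci'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
        stage('Build & Push Image') {
            agent any   // needs access to the Docker daemon
            steps {
                script {
                    def image = docker.build("${IMAGE_NAME}:${env.BUILD_NUMBER}")
                    // empty url defaults to Docker Hub
                    withDockerRegistry([credentialsId: 'myCredentialsId', url: '']) {
                        image.push()
                        image.push('latest')
                    }
                }
            }
        }
    }
    post {
        success {
            slackSend(color: 'good', message: "Build #${env.BUILD_NUMBER} succeeded")
        }
        failure {
            slackSend(color: 'danger', message: "Build #${env.BUILD_NUMBER} failed")
        }
    }
}
```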

The above Jenkinsfile shows an example for a Node.js project.

Few pointers:

  • It showcases a Docker-based agent in which the build stages/steps run
  • builds a Docker image inside an agent
  • pushes the Docker image to a Docker registry
  • If we are using a private Docker registry, withDockerRegistry([credentialsId: 'myCredentialsId', url: 'your-docker-registry-url']) is very useful. If we leave the url empty, it defaults to Docker Hub.

Notification of pipeline results

  • The post block in a Jenkinsfile (as shown above) gives us the ability to hook into the results of the pipeline. Think of it like a finally block: irrespective of the result, this block gets called
  • The above code snippet shows a Slack integration. We could also send email notifications or use other notification channels.

Multi branch pipelines

Jenkins now supports multibranch pipelines. New pipelines are created dynamically for any branch that has a Jenkinsfile at the root of the repository.

This lets the team keep adding feature branches, with each feature branch immediately getting all the Jenkins pipelines.

Shared libraries to reduce duplicated logic

If we have multiple microservices, the content of the Jenkinsfile often gets duplicated. If a change is needed in the flow, for example a change in the Docker registry, we may end up updating all the microservice repos.

It’s very helpful to have a shared-library repo and extract out the common logic. Jenkins supports shared libraries, and we can write Groovy-based implementations that are shared across pipelines.
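As a sketch, a shared library can expose a global step in its vars/ directory; the library name, step name, image name, and credentials ID below are all hypothetical:

```groovy
// vars/buildAndPush.groovy in the shared-library repo
def call(String imageName) {
    // build the image and push it, tagged with the build number
    def image = docker.build("${imageName}:${env.BUILD_NUMBER}")
    withDockerRegistry([credentialsId: 'myCredentialsId', url: '']) {
        image.push()
    }
}
```

A microservice's Jenkinsfile can then stay very thin:

```groovy
@Library('my-shared-library') _
pipeline {
    agent any
    stages {
        stage('Publish') {
            steps {
                buildAndPush('myorg/my-node-app')
            }
        }
    }
}
```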

This option can be found in Jenkins here : Manage Jenkins » Configure System » Global Pipeline Libraries

Global variables

Pulling common variables out of the Jenkinsfile and managing them from Jenkins pays huge dividends as the system matures. It also avoids the pain of updating multiple repositories when a generic variable changes. Most teams use Jenkins credentials for secure data like passwords, but pulling other constants into Jenkins’ global variables is also a huge benefit.

To find this setting: Manage Jenkins > Configure System, under Global properties, Environment variables. These are key-value pairs. Once initialized, we can access them in a Jenkinsfile as env.key1
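For example, assuming a hypothetical global environment variable named DOCKER_REGISTRY_URL has been defined there, a Jenkinsfile can read it like this:

```groovy
pipeline {
    agent any
    stages {
        stage('Show globals') {
            steps {
                // DOCKER_REGISTRY_URL is a hypothetical key defined under
                // Manage Jenkins > Configure System > Global properties
                echo "Pushing to ${env.DOCKER_REGISTRY_URL}"
            }
        }
    }
}
```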

Conclusion

Hope this was helpful to anyone trying to set up Jenkins and get Docker-based build/publish flows working quickly.
