Microservices and DevOps Using Java, Spring Boot, Git Flow, Jenkins, and Docker

Introduction

In this article, I will create a simple microservice using Java and the Spring framework and also create a DevOps pipeline using Jenkins and Docker.

Note: It is assumed that the reader has a background in Java and web technologies. Spring, Jenkins, Java, Git, and Docker introductions are not covered. 

I will cover the following points in order:

- The Microservice
- Required Software
- DevOps Pipeline
- Conclusion

The Microservice

The microservice application can be cloned from GitHub using the following URL:

https://github.com/Microservices-DevOps/person.git 

The Resource Tier

The entity is called Person and contains a name, an email, and an identifier. The service we are developing manages the Person entity.

package com.myapp.sample.model;

import java.io.Serializable;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.validation.constraints.Email;
import javax.validation.constraints.NotNull;


@Entity
@Table(name = "person")
public class Person implements Serializable{

  private static final long serialVersionUID = 7401548380514451401L;

  public Person() {}

  @Id
  @GeneratedValue(strategy = GenerationType.IDENTITY)
  Long id;

  @Column(name = "name")
  String name;

  @NotNull
  @Email
  @Column(name = "email")
  String email;

  public Long getId() {
    return id;
  }

  public void setId(Long id) {
    this.id = id;
  }

  public String getName() {
    return name;
  }

  public void setName(String name) {
    this.name = name;
  }

  public String getEmail() {
    return email;
  }

  public void setEmail(String email) {
    this.email = email;
  }

  @Override
  public int hashCode() {
    final int prime = 31;
    int result = 1;
    result = prime * result + ((email == null) ? 0 : email.hashCode());
    result = prime * result + ((id == null) ? 0 : id.hashCode());
    result = prime * result + ((name == null) ? 0 : name.hashCode());
    return result;
  }

  @Override
  public boolean equals(Object obj) {
    if (this == obj)
      return true;
    if (obj == null)
      return false;
    if (getClass() != obj.getClass())
      return false;
    Person other = (Person) obj;
    if (email == null) {
      if (other.email != null)
        return false;
    } else if (!email.equals(other.email))
      return false;
    if (id == null) {
      if (other.id != null)
        return false;
    } else if (!id.equals(other.id))
      return false;
    if (name == null) {
      if (other.name != null)
        return false;
    } else if (!name.equals(other.name))
      return false;
    return true;
  }

  @Override
  public String toString() {
    return "Person [id=" + id + ", name=" + name + ", email=" + email + "]";
  }
}

The entity tier is tested using normal CRUD operations: we check that the entity can be persisted, queried, and updated.

package com.myapp.sample.model;

import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;
import org.springframework.test.context.junit4.SpringRunner;
import java.util.List;

@RunWith(SpringRunner.class)
@DataJpaTest
public class PersonTest {

    @Autowired
    private TestEntityManager entityManager;

    @Before
    public void setUp() {
        List<Person> list = entityManager.getEntityManager().createQuery("from Person", Person.class).getResultList();
        for(Person person:list) {
            entityManager.remove(person);
        }
    }

    @Test
    public void testCRUD()
    {
        Person p1 = new Person();
        p1.setName("test person 1");
        p1.setEmail("test@person1.com");

        entityManager.persist(p1);

        List<Person> list = entityManager.getEntityManager().createQuery("from Person", Person.class).getResultList();
        Assert.assertEquals(1L, list.size());

        Person p2 = list.get(0);
        Assert.assertEquals("test person 1", p2.getName());
        Assert.assertEquals("test@person1.com", p2.getEmail());

        Assert.assertEquals(p2.hashCode(), p2.hashCode());
        Assert.assertTrue(p2.equals(p2));
    }
}

The Repository Tier

The repository is implemented automatically by Spring Boot. The PagingAndSortingRepository interface extends CrudRepository, adding methods to retrieve entities using the pagination and sorting abstraction. There is no need to write tests for these methods: the implementations are generated by Spring Data and are exercised by the tests of the higher tiers, so coverage for this interface is effectively complete.

package com.myapp.sample.repositories;

import com.myapp.sample.model.Person;
import org.springframework.data.repository.PagingAndSortingRepository;
import org.springframework.data.rest.core.annotation.RestResource;

@RestResource(exported=false)
public interface PersonRepository extends PagingAndSortingRepository<Person, Long> {

}

The Business Service Tier

The PersonService interface exposes three of the CRUD operations supported by the repository tier: save, find by ID, and find all.

package com.myapp.sample.service;

import com.myapp.sample.model.Person;
import java.util.List;

public interface PersonService {
  public List<Person> getAll();

  public Person save(Person p);

  public Person findById(Long id);
}

The implementation of the PersonService interface delegates to the repository and is where any required business logic would be added. The business service is not tested separately here; it is exercised by the REST API tests.

package com.myapp.sample.service;

import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

import com.myapp.sample.model.Person;
import com.myapp.sample.repositories.PersonRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;


@Service
public class PersonServiceImpl implements PersonService {

  @Autowired
  PersonRepository personRepository;

  @Override
  public List<Person> getAll() {
    List<Person> personList = new ArrayList<>();
    personRepository.findAll().forEach(personList::add);
    return personList;
  }

  @Override
  public Person save(Person p) {
    return personRepository.save(p);
  }

  @Override
  public Person findById(Long id) {
    Optional<Person> dbPerson = personRepository.findById(id);
    return dbPerson.orElse(null);
  }
}

The REST API Tier

The REST APIs are exposed by delegating calls to the business service tier, which in turn calls the repository tier.

package com.myapp.sample.controller;

import java.util.List;

import com.myapp.sample.model.Person;
import com.myapp.sample.service.PersonService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PersonController {

  @Autowired
  PersonService personService;

  @PostMapping(path = "/api/person")
  public ResponseEntity<Person> register(@RequestBody Person p) {
    return ResponseEntity.ok(personService.save(p));
  }

  @GetMapping(path = "/api/person")
  public ResponseEntity<List<Person>> getAllPersons() {
    return ResponseEntity.ok(personService.getAll());
  }

  @GetMapping(path = "/api/person/{person-id}")
  public ResponseEntity<Person> getPersonById(@PathVariable(name = "person-id", required = true) Long personId) {
    Person person = personService.findById(personId);
    if (person != null) {
      return ResponseEntity.ok(person);
    }
    return ResponseEntity.notFound().build();
  }
}

The REST API tier is tested using a Spring Boot test. It is a good idea to perform these tests against H2 or another in-memory database and to test the main implementation against a standard RDBMS.

package com.myapp.sample.controller;

import com.myapp.sample.model.Person;
import com.myapp.sample.repositories.PersonRepository;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.*;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import java.util.List;

@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class PersonControllerTest {

  @Autowired
  private TestRestTemplate template;

  @Autowired
  PersonRepository personRepository;

  @Before
  public void setup() throws Exception {
    personRepository.deleteAll();
  }

  @Test
  public void testRegister() throws Exception {
    HttpEntity<Object> person = getHttpEntity(
        "{\"name\": \"test 1\", \"email\": \"test10000000000001@gmail.com\"}");
    ResponseEntity<Person> response = template.postForEntity(
        "/api/person", person, Person.class);
    Assert.assertEquals("test 1", response.getBody().getName());
    Assert.assertEquals(200,response.getStatusCode().value());
  }

  @Test
  public void testGetAllPersons() throws Exception {
    HttpEntity<Object> person = getHttpEntity(
            "{\"name\": \"test 1\", \"email\": \"test10000000000001@gmail.com\"}");
    ResponseEntity<Person> response = template.postForEntity(
            "/api/person", person, Person.class);

    ParameterizedTypeReference<List<Person>> responseType = new ParameterizedTypeReference<List<Person>>(){};
    ResponseEntity<List<Person>> response2 = template.exchange("/api/person", HttpMethod.GET, null, responseType);
    Assert.assertEquals(1, response2.getBody().size());
    Assert.assertEquals("test 1", response2.getBody().get(0).getName());
    Assert.assertEquals(200,response2.getStatusCode().value());
  }

  @Test
  public void testGetPersonById() throws Exception {
    HttpEntity<Object> person = getHttpEntity(
            "{\"name\": \"test 1\", \"email\": \"test10000000000001@gmail.com\"}");
    ResponseEntity<Person> response = template.postForEntity(
            "/api/person", person, Person.class);

    ParameterizedTypeReference<List<Person>> responseType = new ParameterizedTypeReference<List<Person>>(){};
    ResponseEntity<List<Person>> response2 = template.exchange("/api/person", HttpMethod.GET, null, responseType);

    Long id = response2.getBody().get(0).getId();
    ResponseEntity<Person> response3 = template.getForEntity(
            "/api/person/" + id, Person.class);
    Assert.assertEquals("test 1", response3.getBody().getName());
    Assert.assertEquals(200,response3.getStatusCode().value());
  }

  @Test
  public void testGetPersonNotFound() throws Exception {
    ResponseEntity<Person> response3 = template.getForEntity(
            "/api/person/1", Person.class);
    Assert.assertEquals(404,response3.getStatusCode().value());
  }

  private HttpEntity<Object> getHttpEntity(Object body) {
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_JSON);
    return new HttpEntity<Object>(body, headers);
  }
}

So that is essentially the Java code used in developing the microservice. 

Required Software

We now turn to the software required for managing the microservice.

Java

In this example, I have used Java version 8. It is recommended to use version 8 because Jenkins runs on Java 8. 

Git

Install the latest Git version. Since I have a Windows machine, I have used Git for Windows 2.21.0 (64-bit). 

Docker

Install the latest version of Docker. Since I have a Windows 8 machine, I have used Docker Toolbox for Windows. 

MySQL

Install the MySQL 5.7 Docker image using the following commands:

docker pull mysql:5.7
docker run -d --name mysql -e MYSQL_DATABASE=person -e MYSQL_ROOT_PASSWORD=<root_password> -p 3306:3306 mysql:5.7

The docker-machine ip command returns the IP address of the Docker machine; substitute this address into the resources\application.properties file.
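For reference, the datasource settings in application.properties might look like the following. The property keys are standard Spring Boot properties, but the host address, database name, and password shown here are illustrative and should match your own docker-machine and MySQL setup:

```properties
# Illustrative values: replace the host with your docker-machine IP
# and the password with the MYSQL_ROOT_PASSWORD you chose above.
spring.datasource.url=jdbc:mysql://192.168.99.100:3306/person
spring.datasource.username=root
spring.datasource.password=<root_password>
spring.jpa.hibernate.ddl-auto=update
```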

Jenkins

Install the Jenkins Blue Ocean release using the following commands:

docker pull jenkinsci/blueocean
docker run -u root --rm -d -p 8080:8080 -p 50000:50000 -v jenkins-data:/var/jenkins_home -v /var/run/docker.sock:/var/run/docker.sock jenkinsci/blueocean

DevOps Pipeline

We now take a look at the DevOps pipeline used to build and deploy the microservice from the Git repository. Before we examine the pipeline, it is worth spending a few minutes on Git Flow.

Git Flow

Git Flow is a branching model for Git. It mainly consists of the master branch (which mirrors the production code), a develop branch (the main branch for development), release branches (for preparing releases from develop), and feature branches (where developers do their work). When a developer completes a feature, a pull request is created for the team lead to review and merge into develop. After a release branch is created, bug fixes go into that branch and are merged back into develop and master once the code stabilizes. In this model, tags for production releases are created from the master branch.

You can look at a visual description here: https://datasift.github.io/gitflow/IntroducingGitFlow.html
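The branch lifecycle described above can be sketched with plain git commands. The branch names and version tag below are illustrative, and a real workflow would also involve a remote and pull requests:

```shell
# Git Flow lifecycle sketch (branch names and tag are illustrative).
set -e
REPO=$(mktemp -d)
cd "$REPO"
git init -q
git symbolic-ref HEAD refs/heads/master   # master mirrors production
git config user.email dev@example.com
git config user.name "Dev"
echo 'initial' > README.md
git add . && git commit -qm 'initial commit'

git checkout -qb develop                  # main development branch
git checkout -qb feature/person-api       # a developer works on a feature
echo 'feature work' >> README.md
git commit -qam 'add person api'

git checkout -q develop                   # reviewed and merged via pull request
git merge -q --no-ff -m 'merge feature/person-api' feature/person-api

git checkout -qb release/1.0.0            # stabilize the release; bug fixes land here
git checkout -q master
git merge -q --no-ff -m 'release 1.0.0' release/1.0.0
git tag 1.0.0                             # tag from master for production
git checkout -q develop                   # release fixes flow back into develop
git merge -m 'merge release/1.0.0' release/1.0.0
git checkout -q master
```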

Jenkinsfile

It is necessary to create a multi-branch pipeline in Jenkins. This allows Git Flow to be handled by Jenkins automatically: the Jenkinsfile is picked up by Jenkins once the link to the Git repository is provided. Stages are configured so that the pipeline can be followed visually when a build is triggered by a check-in. In this example, we check the code using PMD, CheckStyle, and FindBugs; you are welcome to try a more mature tool, such as SonarQube, in their place. In the pipeline, we build the Docker image in one step and run it in another, so the testing environment container is updated whenever a change occurs on the master branch. When a tag is created, the production environment is updated with an image tagged with the tag name, such as 1.0.0. You can extend this example with a separate Jenkinsfile and Dockerfile for production, which would be needed once the image is promoted to production.

#!/usr/bin/env groovy

pipeline {
    agent any

    triggers {
        pollSCM('*/15 * * * *')
    }

    options { disableConcurrentBuilds() }

    stages {
        stage('Permissions') {
            steps {
                sh 'chmod 775 *'
            }
        }

        stage('Cleanup') {
            steps {
                sh './gradlew --no-daemon clean'
            }
        }

        stage('Check Style, FindBugs, PMD') {
            steps {
                sh './gradlew --no-daemon checkstyleMain checkstyleTest findbugsMain findbugsTest pmdMain pmdTest cpdCheck'
            }
            post {
                always {
                    step([
                        $class         : 'FindBugsPublisher',
                        pattern        : 'build/reports/findbugs/*.xml',
                        canRunOnFailed : true
                    ])
                    step([
                        $class         : 'PmdPublisher',
                        pattern        : 'build/reports/pmd/*.xml',
                        canRunOnFailed : true
                    ])
                    step([
                        $class         : 'CheckStylePublisher',
                        pattern        : 'build/reports/checkstyle/*.xml',
                        canRunOnFailed : true
                    ])
                }
            }
        }

        stage('Test') {
            steps {
                sh './gradlew --no-daemon check'
            }
            post {
                always {
                    junit 'build/test-results/test/*.xml'
                }
            }
        }

        stage('Build') {
            steps {
                sh './gradlew --no-daemon build'
            }
        }

        stage('Update Docker UAT image') {
            when { branch "master" }
            steps {
                sh '''
                    docker login -u "<userid>" -p "<password>"
                    docker build --no-cache -t person .
                    docker tag person:latest amritendudockerhub/person:latest
                    docker push amritendudockerhub/person:latest
                    docker rmi person:latest
                '''
            }
        }

        stage('Update UAT container') {
            when { branch "master" }
            steps {
                sh '''
                    docker login -u "<userid>" -p "<password>"
                    docker pull amritendudockerhub/person:latest
                    docker stop person
                    docker rm person
                    docker run -p 9090:9090 --name person -t -d amritendudockerhub/person
                    docker rmi -f $(docker images -q --filter dangling=true)
                '''
            }
        }

        stage('Release Docker image') {
            when { buildingTag() }
            steps {
                sh '''
                    docker login -u "<userid>" -p "<password>"
                    docker build --no-cache -t person .
                    docker tag person:latest amritendudockerhub/person:${TAG_NAME}
                    docker push amritendudockerhub/person:${TAG_NAME}
                    docker rmi $(docker images -f "dangling=true" -q)
                '''
            }
        }
    }
}

You can look at a visual description here: https://jenkins.io/doc/tutorials/build-a-multibranch-pipeline-project/
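The docker build steps in the pipeline assume a Dockerfile at the repository root. The article does not show one, but a minimal sketch might look like the following; the jar path and exposed port are assumptions based on the Gradle build and the port 9090 used in the pipeline:

```dockerfile
# Minimal sketch: adjust the jar path to your actual Gradle build output.
FROM openjdk:8-jdk-alpine
COPY build/libs/person.jar app.jar
EXPOSE 9090
ENTRYPOINT ["java", "-jar", "/app.jar"]
```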

Conclusion

This example could be improved by using Kubernetes for deployment, but a complete pipeline can also be built with Docker alone, and that was the aim of this article.
