[JAVA] Easy microservices with Spark Framework!

1. What is Spark Framework?

It's a very simple web application framework, described on the Official Site (http://sparkjava.com/) as follows:

> Spark - A micro framework for creating web applications in Kotlin and Java 8 with minimal effort

It has 7,199 stars on GitHub (as of 3/4/2018), so it seems to be a fairly widely used framework.

Its main feature is that a web application can be implemented very easily using lambda expressions and static methods. Below is a sample from the official documentation.

HelloWorld.java


import static spark.Spark.*;

public class HelloWorld {
    public static void main(String[] args) {
        get("/hello", (req, res) -> "Hello World");
    }
}

Run it as a normal Java application with a main method. After the application starts, open http://localhost:4567/hello in a web browser and you will see Hello World. It's very simple! In this article, I would like to build a TODO application as a REST service using Spark Framework.
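You can also check the endpoint from the command line, for example with curl:

curl http://localhost:4567/hello
Hello World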

2. Todo app specifications

This time we will build a simple TODO management service. A TODO is represented by the following class, and CRUD operations on it are exposed as the REST API summarized in the table below.

Todo.java


package com.example.spark.demo;

import java.io.Serializable;
import java.util.Date;

public class Todo implements Serializable {

    private static final long serialVersionUID = 1L;
    
    private String todoId;
    private String todoTitle;
    private Date createdAt;
    private boolean finished;
    
    // The constructor, setters, and getters were omitted in the original article;
    // below is a minimal sketch, inferred from how Todo is used elsewhere in
    // this sample (the toString format is an arbitrary choice).
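    public Todo() {
    }

    public Todo(String todoId, String todoTitle, Date createdAt, boolean finished) {
        this.todoId = todoId;
        this.todoTitle = todoTitle;
        this.createdAt = createdAt;
        this.finished = finished;
    }

    public String getTodoId() { return todoId; }
    public void setTodoId(String todoId) { this.todoId = todoId; }
    public String getTodoTitle() { return todoTitle; }
    public void setTodoTitle(String todoTitle) { this.todoTitle = todoTitle; }
    public Date getCreatedAt() { return createdAt; }
    public void setCreatedAt(Date createdAt) { this.createdAt = createdAt; }
    public boolean isFinished() { return finished; }
    public void setFinished(boolean finished) { this.finished = finished; }

    @Override
    public String toString() {
        return "Todo [todoId=" + todoId + ", todoTitle=" + todoTitle
                + ", createdAt=" + createdAt + ", finished=" + finished + "]";
    }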
}
| No. | Path | HTTP method | Description |
|:---|:---|:---|:---|
| 1 | /api/todo | POST | Creates a TODO from the sent data |
| 2 | /api/todo/:todoId | GET | Gets the TODO specified by todoId |
| 3 | /api/todo/:todoId | PUT | Updates the TODO specified by todoId |
| 4 | /api/todo/:todoId | DELETE | Deletes the TODO specified by todoId |

3. Create a project

First, create a blank project with mvn.

Command example on Windows


mvn archetype:generate ^
-DinteractiveMode=false ^
-DarchetypeArtifactId=maven-archetype-quickstart ^
-DgroupId=com.example.spark.demo ^
-DartifactId=spark-demo

After creating the blank project, add the libraries used in this article to pom.xml. We will use GSON to convert to and from JSON; other JSON libraries would work just as well, but since the official Spark Framework documentation uses GSON, I decided to use it here too.

pom.xml


    <!-- add spark framework -->
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>2.7.1</version>
    </dependency>
    <!-- add gson -->
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>2.8.2</version>
    </dependency>

After adding the libraries to pom.xml, run a trial build with the following command to download them. If BUILD SUCCESS is displayed, everything is OK.

Trial build command


mvn package -Dmaven.test.skip=true

4. Source code

4.1. Application class

App.java


package com.example.spark.demo;

import static spark.Spark.*;

/**
 *★ Point 1
 * spark demo app
 */
public class App {
    //★ Point 1
    public static void main(String[] args) {
        //★ Point 2
        // initialize
        initialize();
        //★ Point 3
        // define api
        TodoApi.api();
        // omitted
    }

    //★ Point 2
    private static void initialize() {
        // server port
        port(8090);
        // static files
        staticFiles.location("/public");
        // connection pool
        // maxThreads, minThreads, timeOutMillis
        threadPool(8, 2, 30000);
    }
}

**★ Point 1** A Spark Framework application is implemented as a normal Java application that is run from a main method.

**★ Point 2** The initial configuration of the application is extracted into the `initialize` method. This time, I made the following three settings, which are likely to be needed frequently.

The `port` method changes the server port from the default 4567 to 8090.

Publishing static files is not necessary for a plain REST service, but Spark can also be used as a simple web server, so I will show how to publish static files under the classpath. Specify the directory to publish with the `staticFiles.location` method.

With the sample settings, accessing http://localhost:8090/css/styles.css with a web browser returns the /spark-demo/src/main/resources/public/css/styles.css file.

The `threadPool` method configures the server's thread pool. The arguments are, in order, the maximum number of threads, the minimum number of threads, and the idle timeout in milliseconds.

**★ Point 3** The Web API is defined in a separate class, with maintainability in mind for when APIs are added or changed.

4.2. Web API class

TodoApi.java


package com.example.spark.demo;

import static spark.Spark.*;

import java.util.HashMap;
import java.util.Map;

/**
 *★ Point 4
 * Web API for TODO
 */
public class TodoApi {
    //★ Point 4
    public static void api() {
        //★ Point 5
        TodoService todoService = new TodoService();
        JsonTransformer jsonTransformer = new JsonTransformer();

        //★ Point 6
        path("/api", () -> {
            path("/todo", () -> {
                post("", (request, response) -> {
                    String json = request.body();
                    Todo todo = jsonTransformer.fromJson(json, Todo.class);
                    return todoService.create(todo);
                }, jsonTransformer);
                get("/:todoId", (request, response) -> {
                    return todoService.find(request.params(":todoId"));
                }, jsonTransformer);
                put("/:todoId", (request, response) -> {
                    String json = request.body();
                    Todo todo = jsonTransformer.fromJson(json, Todo.class);
                    todo.setTodoId(request.params(":todoId"));
                    return todoService.update(todo);
                }, jsonTransformer);
                delete("/:todoId", (request, response) -> {
                    todoService.delete(request.params(":todoId"));
                    return success();
                }, jsonTransformer);
            });
            //★ Point 7
            // set response-type to all request of under '/api'
            after("/*", (request, response) -> {
                response.type("application/json;charset=UTF-8");
            });
        });
    }

    private static Map<String, String> success() {
        Map<String, String> map = new HashMap<String, String>();
        map.put("result", "success!");
        return map;
    }
}

**★ Point 4** The Web API handling is defined in a plain class. Since the Spark Framework API consists largely of static methods, I decided to expose the definition as a static `api` method this time.

**★ Point 5** Create instances of the business logic (TodoService) and format conversion (JsonTransformer) classes. Both are described later.

**★ Point 6** This is the heart of this article. Routes are grouped hierarchically with the `path` method, and each HTTP method is mapped to a lambda via `post`, `get`, `put`, and `delete`. Path parameters such as `:todoId` are read with `request.params`, the request body is read with `request.body()`, and `jsonTransformer` is passed as the last argument so that each result is rendered as JSON.

See the official documentation on Routes (http://sparkjava.com/documentation#routes) and Request (http://sparkjava.com/documentation#request) for details.

**★ Point 7** You can run extra processing before and after a Web API using Spark Framework's filter feature. In the sample, I defined an `after` filter that sets `application/json;charset=UTF-8` as the `Content-Type` response header for every request whose path is under `/api`.

In addition to the `after` filter, there are also `before` and `afterAfter` filters; a small example is shown below. For more information on filters, see the official documentation (http://sparkjava.com/documentation#filters).
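For illustration (this is not part of the sample app, and the log format is an arbitrary choice), a `before` filter could log every API request like this:

before("/api/*", (request, response) -> {
    // log the HTTP method and path of each request under /api
    System.out.println(request.requestMethod() + " " + request.pathInfo());
});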

4.3. Format conversion class

JsonTransformer.java


package com.example.spark.demo;

import com.google.gson.Gson;
import spark.ResponseTransformer;

//★ Point 8
public class JsonTransformer implements ResponseTransformer {

    private Gson gson = new Gson();

    //★ Point 8
    @Override
    public String render(Object model) throws Exception {
        return gson.toJson(model);
    }

    //★ Point 9
    public <T> T fromJson(String json, Class<T> classOfT) {
        return gson.fromJson(json, classOfT);
    }
}

**★ Point 8** This class implements the spark.ResponseTransformer interface. The purpose of this interface is to convert the result of an API method (the value returned by the lambda expressions at point 6) into the String that is written to the HTTP response. The conversion is implemented in the overridden `render` method. This time, the code from the official documentation is used as-is: the result is converted to JSON using GSON.

**★ Point 9** As the name ResponseTransformer suggests, this class is originally intended for converting response data, but I decided to put the JSON-to-object conversion for requests here as well (simply because I wanted to reuse the Gson instance).

Incidentally, Spark Framework has no RequestTransformer for converting request data.

4.4. Business logic class

TodoService.java


package com.example.spark.demo;

import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

//★ Point 10
public class TodoService {

    private Map<String, Todo> store = new HashMap<String, Todo>();

    public Todo find(String todoId) {
        return store.get(todoId);
    }

    public void delete(String todoId) {
        store.remove(todoId);
        System.out.println("delete todoId : " + todoId);
    }
    
    public Todo update(Todo todo) {
        Todo updatedTodo = store.get(todo.getTodoId());
        if (updatedTodo != null) {
            updatedTodo.setTodoTitle(todo.getTodoTitle());
            updatedTodo.setFinished(todo.isFinished());
        }
        return updatedTodo;
    }

    public Todo create(Todo todo) {
        String todoId = UUID.randomUUID().toString();
        Todo registeredTodo = new Todo(todoId, todo.getTodoTitle(), new Date(),
                false);
        store.put(todoId, registeredTodo);
        System.out.println("registeredTodo : " + registeredTodo);
        return registeredTodo;
    }
}

**★ Point 10** This implements the business logic of the TODO application. It is only a dummy implementation, as it does not use any Spark Framework features: instead of accessing a database, it performs in-memory CRUD against a Map. (Note that Spark serves requests from a thread pool, so a ConcurrentHashMap would be a safer choice in real code.)

5. Create an executable jar

Since this is a microservice, I would like to keep execution simple by building it as a single executable jar file (a so-called uber jar) containing all of its dependencies.

pom.xml


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.spark.demo</groupId>
  <artifactId>spark-demo</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>spark-demo</name>
  <url>http://maven.apache.org</url>
  <dependencies>
    <!-- add spark framework -->
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>2.7.1</version>
    </dependency>
    <!-- add gson -->
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>2.8.2</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <!-- java8 -->
  <properties>
    <java.version>1.8</java.version>
    <maven.compiler.target>${java.version}</maven.compiler.target>
    <maven.compiler.source>${java.version}</maven.compiler.source>
  </properties>
  <!-- add for executable jar -->
  <build>
    <plugins>
      <plugin>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
            </execution>
          </executions>
          <configuration>
            <archive>
              <manifest>
                <addClasspath>true</addClasspath>
                <mainClass>com.example.spark.demo.App</mainClass>
              </manifest>
            </archive>
            <descriptorRefs>
              <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
          </configuration>
      </plugin>
    </plugins>
  </build>
</project>

Once you have an executable jar file, run it with java -jar.

C:\tmp\spark\spark-demo>java -jar target/spark-demo-1.0-SNAPSHOT-jar-with-dependencies.jar
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
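The SLF4J messages are just a warning that no logger binding is on the classpath; the server itself is running. You can now exercise the API, for example with curl (the <todoId> placeholder stands for the id returned by the POST request):

curl -X POST -H "Content-Type: application/json" -d "{\"todoTitle\":\"Buy milk\"}" http://localhost:8090/api/todo
curl http://localhost:8090/api/todo/<todoId>
curl -X PUT -H "Content-Type: application/json" -d "{\"todoTitle\":\"Buy milk\",\"finished\":true}" http://localhost:8090/api/todo/<todoId>
curl -X DELETE http://localhost:8090/api/todo/<todoId>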

6. Finally

This time, I explained how to implement a microservice with Spark Framework. It is very simple and easy to implement, and since it can be built as a single executable jar file, it is also easy to release. The sample omits input validation, but in practice it is a necessary step: java-json-tools/json-schema-validator is recommended when validating the input as JSON, and Bean Validation when validating it as Java objects. A small sketch of the latter follows.
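As a minimal sketch (assuming a Bean Validation implementation such as hibernate-validator has been added to pom.xml; the constraints and class names are illustrative, not part of the sample app):

TodoValidationExample.java

package com.example.spark.demo;

import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

public class TodoValidationExample {

    // an illustrative request object with constraints on the title
    public static class TodoRequest {
        @NotNull
        @Size(min = 1, max = 100)
        public String todoTitle;
    }

    public static void main(String[] args) {
        Validator validator =
                Validation.buildDefaultValidatorFactory().getValidator();

        TodoRequest request = new TodoRequest(); // todoTitle is null here
        Set<ConstraintViolation<TodoRequest>> violations =
                validator.validate(request);
        // prints e.g. "todoTitle must not be null"
        violations.forEach(v ->
                System.out.println(v.getPropertyPath() + " " + v.getMessage()));
    }
}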
