Playing With GraalVM

Recently, I started playing with GraalVM. What is GraalVM? To quote their website directly:

GraalVM is a universal virtual machine for running applications written in JavaScript, Python, Ruby, R, JVM-based languages like Java, Scala, Kotlin, Clojure, and LLVM-based languages such as C and C++.

GraalVM removes the isolation between programming languages and enables interoperability in a shared runtime.

In layman's terms, GraalVM allows languages such as JavaScript, Python, Ruby, and R, and JVM-based languages like Java, Scala, Kotlin, and Clojure, to talk to each other, removing the isolation between them.
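
To make that concrete, here is a minimal sketch using GraalVM's polyglot API (org.graalvm.polyglot) to evaluate a JavaScript expression from Java. It assumes you are running on a GraalVM distribution with the JavaScript engine available; the class name is made up for illustration and is not taken from any project mentioned in this post:

import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

public class PolyglotDemo {
    public static void main(String[] args) {
        // Context gives access to the guest languages installed in this GraalVM.
        // It is AutoCloseable, so try-with-resources tears it down cleanly.
        try (Context context = Context.create()) {
            // Evaluate a JavaScript expression from Java and read the result back as an int.
            Value result = context.eval("js", "6 * 7");
            System.out.println("JavaScript says: " + result.asInt());
        }
    }
}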

But, as a Java/JVM platform developer, I am more interested in GraalVM's ability to ahead-of-time (AOT) compile Java code so that a Java application can run as a native executable:

"GraalVM Native Image allows you to ahead-of-time compile Java code to a standalone executable, called a  native image."
"native-image is a utility that processes all the classes of your application and their dependencies, including those from the JDK. It statically analyses these classes to determine which classes and methods are reachable and used during application execution. Then, it passes all this reachable code as the input to the GraalVM compiler, which ahead-of-time compiles it to the native binary."

The key benefits of AOT-compiling a Java application to a native binary are:

  1. Speed of startup
  2. Reduced memory footprint

For example, here's a Micronaut application: https://github.com/RaviH/graal-micronaut
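
I won't reproduce the repository's code here, but to give an idea of what is being benchmarked below, a Micronaut controller for an endpoint like /simple might look roughly like this (the class name and response text are hypothetical, not copied from the repo):

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

@Controller("/simple")
public class SimpleController {

    // Return a small, constant payload so the benchmark measures
    // framework overhead rather than application logic.
    @Get
    public String simple() {
        return "Hello from Micronaut";
    }
}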

When I run the jar, it takes about 1.3 seconds to start:

 $ java -jar target/graal-micronaut-0.1.jar
 18:30:22.244 [main] INFO  io.micronaut.runtime.Micronaut - Startup completed in 1320ms. Server Running: http://localhost:8080


But when I build and run the native image, it starts in 29 ms (vs. 1.3 s), roughly 45 times faster.

# rhasija ~/dev/projects/tmp/graal-micronaut on git:master x  C:130
$ native-image -jar target/graal-micronaut-0.1.jar
Build on Server(pid: 5608, port: 50229)
[graal-micronaut:5608]    classlist:   3,788.29 ms
[graal-micronaut:5608]        (cap):   1,547.58 ms
[graal-micronaut:5608]        setup:   2,041.59 ms
[graal-micronaut:5608]   (typeflow):  27,169.03 ms
[graal-micronaut:5608]    (objects):  32,584.30 ms
[graal-micronaut:5608]   (features):   1,562.28 ms
[graal-micronaut:5608]     analysis:  63,945.36 ms
[graal-micronaut:5608]     universe:   1,593.43 ms
[graal-micronaut:5608]      (parse):   4,368.97 ms
[graal-micronaut:5608]     (inline):   9,400.59 ms
[graal-micronaut:5608]    (compile):  36,820.75 ms
[graal-micronaut:5608]      compile:  53,848.37 ms
[graal-micronaut:5608]        image:   4,955.53 ms
[graal-micronaut:5608]        write:   1,971.52 ms
[graal-micronaut:5608]      [total]: 132,379.50 ms

# rhasija ~/dev/projects/tmp/graal-micronaut on git:master x
$ ./graal-micronaut
18:36:13.776 [main] INFO  io.micronaut.runtime.Micronaut - Startup completed in 29ms. Server Running: http://localhost:8080


Also, the native image used 14 MB of memory (vs. 387 MB for the Java-based app), under 4 percent of the memory taken by the Java app. This is awesome.

Micronaut Java app (top output; note the 387M resident memory):
13068 java 0.1 00:03.00 23 1 84 387M 0B 0B 13068 5096

Native image (top output; 14M resident):
13136 graal-micron 0.0 00:00.03 3 1 33 14M 0B 0B 13136 5096

Below are the performance metrics for the /simple endpoint, generated with bombardier (100,000 requests, 100 connections):

For the GraalVM native binary:

$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=========================================================] 100.00% 18s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      5308.20    1376.18    9776.58
  Latency       18.70ms    23.93ms   305.78ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    16.31MB/s


$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=============================================================================================] 100.00% 19s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      5164.61    1269.64   11265.72
  Latency       19.22ms    19.82ms   249.97ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    15.87MB/s


$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=============================================================================================] 100.00% 19s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      5041.83    1283.07   10297.43
  Latency       19.66ms    22.36ms   310.20ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    15.51MB/s


For the Java application (running the jar on the JVM):

$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=============================================================================================] 100.00% 17s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      5629.80    1057.21   13696.04
  Latency       17.70ms     8.91ms   182.28ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    17.15MB/s

$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=============================================================================================] 100.00% 14s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      7034.21     592.21   15347.18
  Latency       14.22ms     2.91ms    65.69ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    21.45MB/s


# rhasija ~/dev/projects/tmp/graal-micronaut on git:master x
$ bombardier -n 100000 -c 100 "http://localhost:8080/simple"
Bombarding http://localhost:8080/simple with 100000 request(s) using 100 connection(s)
 100000 / 100000 [=============================================================================================] 100.00% 14s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      7060.21    2076.69   61322.98
  Latency       14.31ms     2.79ms    57.63ms
  HTTP codes:
    1xx - 0, 2xx - 100000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    21.31MB/s


Under sustained load, the Java application performed consistently better: higher requests per second, lower latency, and higher throughput.

Environment details:

$ mvn -v
Apache Maven 3.6.0 (97c98ec64a1fdfee7767ce5ffb20918da4f719f3; 2018-10-24T11:41:47-07:00)
Maven home: /Users/rhasija/.sdkman/candidates/maven/current
Java version: 1.8.0_202, vendor: Oracle Corporation, runtime: /Users/rhasija/.sdkman/candidates/java/1.0.0-rc-16-grl/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.13.6", arch: "x86_64", family: "mac"


To get more insight into these results, I asked the GraalVM folks, and Oleg Selajev got back with this:

"Native Image does offer superior startup/memory overhead doesn't it? In general, peak performance will benefit more from the powerful JIT compiler, so it's expected that running for a longer period of time or under a heavy load will perform better in the JVM mode."

"Also, if you use GraalVM Enterprise, you can provide --pgo-instrument to build an instrumented binary, and use a profile from that to build (with --pgo) a better performing native image.

Bottom Line

I could be wrong, but this is what I have learned: GraalVM native images are awesome for quick-starting applications/lambdas that do a job and shut down right after, or for applications where peak performance is a non-issue. But for long-running applications that need to perform under load, running Java on the JVM (because of the JIT compiler) is still better. This might change in the future, but for now, this seems to be the case.

Thoughts/feedback? Let me know in the comments below.
