Understanding Java JIT Compilation with JITWatch, Part 1
by Ben Evans
Published July 2014
A primer on JIT compilation in Java HotSpot VM
Oracle’s Java HotSpot VM is equipped with a highly advanced just-in-time (JIT) compiler. This means that the class files (which are compiled from Java source code) are further compiled at runtime, and they can be turned into highly optimized machine code. This optimized code runs extremely fast, usually as fast as (and, in certain cases, faster than) compiled C/C++ code.
The JIT compiler is, therefore, one of the most important parts of Java HotSpot VM, and yet many Java developers do not know much about it or how to check that their applications work well with the JIT compiler.
Fortunately, a new open source tool called JITWatch is being developed to give developers much better insight into how the JIT compiler treats their code. For most effective use, the JITWatch tool relies on developers already understanding the basic mechanisms and terminology of JIT compilation.
This article provides a basic primer on JIT compilation as it happens in Java HotSpot VM. We’ll discuss how to switch on simple logging for the JIT compiler and some of the most common (and important) JIT compilation techniques that modern Java HotSpot VM versions use. Then we’ll talk about the more-detailed logging options available (these are the options that JITWatch makes use of). This will pave the way for a full introduction to JITWatch in Part 2 of this series.
Let’s kick off with a few fundamentals about JIT compilation as it is done in Java HotSpot VM.
Basic JIT Compilation
Java HotSpot VM automatically monitors which methods are being executed. Once a method has become eligible (by meeting some criteria, such as being called often), it is scheduled for compilation into machine code, and it is then known as a hot method. The compilation into machine code happens on a separate JVM thread and will not interrupt the execution of the program. In fact, even while the compiler thread is compiling a hot method, the Java Virtual Machine (JVM) will keep on using the original, interpreted version of the method until the compiled version is ready.
To learn more about the JIT compilation process, see “Understanding the Java HotSpot VM Code Cache,” and “Introduction to JIT Compilation in Java HotSpot VM.”
The first step to understanding how JIT compilation in Java HotSpot VM is affecting your code is to see which of your methods are getting compiled. Fortunately, this is very easy to do; it only requires you to add the -XX:+PrintCompilation flag to the script you use to start your Java processes.
Note: The resulting log of compilation events will end up in the standard log (that is, the standard output), and there is currently no way to redirect the entries to another file.
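To see this in action, a tiny program along the following lines should be enough to produce some output (the class name PrintCompilationDemo and the isPrime() method are purely illustrative; the exact set of methods that gets compiled will vary with your JVM version and hardware):

// Run with: java -XX:+PrintCompilation PrintCompilationDemo
public class PrintCompilationDemo {

    // A small method that is called very frequently, so it should
    // become "hot" and appear in the PrintCompilation output.
    static boolean isPrime(int n) {
        if (n < 2) {
            return false;
        }
        for (int i = 2; i * i <= n; i++) {
            if (n % i == 0) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        int count = 0;
        for (int i = 0; i < 1000000; i++) {
            if (isPrime(i)) {
                count++;
            }
        }
        System.out.println("Primes found: " + count);
    }
}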
The exact format of the PrintCompilation flag’s log entries varies between different Java versions; here are some examples of the log formats from different versions. Listing 1 shows an example log from JDK 6.
JDK 6:
22 java.util.HashMap::getEntry (79 bytes)
23 s! sun.misc.URLClassPath::getLoader (136 bytes)
Listing 1
In the JDK 6 form of the PrintCompilation flag’s log entries, the first number corresponds to the compilation ID. This ID essentially tracks an individual method as it is compiled, optimized, and possibly deoptimized again.
Thereafter follow some flags that indicate properties of the method; for example, the s indicates the method is synchronized, and the ! indicates the method has exception handlers.
Next comes the name of the method—in fully qualified form—followed by the number of bytes of bytecode contained in the method being compiled. There is a minor annoyance: the method signatures are not printed out in the output.
Listing 2 shows an example log from JDK 7 onward. The big change in the JDK 7 form of the logs is that the first column is now the time—in milliseconds since the JVM started—at which the compilation occurred. Otherwise, the other fields are essentially the same as with JDK 6.
JDK 7 onwards:
31 1 java.lang.String::hashCode (67 bytes)
Listing 2
There are a number of excellent posts on the subject of reading PrintCompilation output; those by Stephen Colebourne and Chris Vest, for example, are both highly recommended.
If you’re using a different JVM language, such as Scala or Groovy, then you should be aware that those languages’ compilers might alter (mangle) the names of methods and add or remove methods as part of their process for producing class files.
Some JIT Compilation Techniques
One of the most common JIT compilation techniques used by Java HotSpot VM is inlining, which is the practice of substituting the body of a method into the places where that method is called. Inlining saves the cost of calling the method; no new stack frames need to be created. By default, Java HotSpot VM will try to inline methods that contain fewer than 35 bytes of JVM bytecode.
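As a rough sketch of the kind of code this benefits, the accessors in the following (purely illustrative) class each compile to only a few bytes of bytecode, comfortably under the default threshold, so the JIT compiler will usually substitute their bodies directly into a hot caller such as distanceSquared():

public class Point {
    private final int x;
    private final int y;

    Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    // Tiny accessors like these are typical inlining candidates.
    int getX() { return x; }
    int getY() { return y; }

    static int distanceSquared(Point a, Point b) {
        // If inlining succeeds, these calls are replaced by direct
        // field reads, with no method-call overhead at all.
        int dx = a.getX() - b.getX();
        int dy = a.getY() - b.getY();
        return dx * dx + dy * dy;
    }
}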
Another common optimization that Java HotSpot VM makes is monomorphic dispatch, which relies on the observed fact that, usually, there aren’t paths through a method that cause an object reference to be of one type most of the time but of another type at other times.
You might think that having different types via different code paths would be ruled out by Java’s static typing, but remember that an instance of a subtype is always a valid instance of a supertype (this principle is known as the Liskov substitution principle, after Barbara Liskov). This situation means that there could be two paths into a method—for example, one that passes an instance of a supertype and one that passes an instance of a subtype—which would be legal by the rules of Java’s static typing (and does occur in practice).
In the usual case (the monomorphic case), however, having different, path-dependent types does not happen. So we know the exact method definition that will be called when a method is invoked on the passed object, and there is no need to check which override is actually being used. This means the JIT compiler can eliminate the overhead of the virtual method lookup and emit optimized machine code that is often faster than an equivalent C++ call (because in the C++ case, the virtual lookup cannot easily be eliminated).
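As an illustrative sketch (the Shape and Circle types below are hypothetical), consider a call site that, at runtime, only ever sees one concrete implementation:

interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius;

    Circle(double radius) {
        this.radius = radius;
    }

    public double area() {
        return Math.PI * radius * radius;
    }
}

class AreaSummer {
    // If every Shape passed in here is, in practice, a Circle, the
    // s.area() call site is monomorphic: the JIT compiler can assume
    // Circle::area is the only possible target, skip the virtual
    // lookup, and typically inline the method body (guarded by a
    // cheap check that falls back to deoptimization if a different
    // Shape implementation ever turns up).
    static double totalArea(Shape[] shapes) {
        double total = 0.0;
        for (Shape s : shapes) {
            total += s.area();
        }
        return total;
    }
}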
Java HotSpot VM uses many other techniques to optimize the code that JIT compilation produces. Loop optimization, type sharpening, dead-code elimination, and intrinsics are just some of the other ways that Java HotSpot VM tries to optimize code as much as it can. Techniques are frequently layered one on top of another, so that once one optimization has been applied, the compiler might be able to see more optimizations that can be performed.
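For instance, the following sketch (with hypothetical names) hints at how two of these techniques can combine: once square() has been inlined into hotLoop(), the compiler can see that the result is never used and the computation has no side effects, so it becomes a candidate for dead-code elimination (which, incidentally, is why naive microbenchmarks can report misleadingly fast times):

public class DeadCodeDemo {

    static int square(int x) {
        return x * x;
    }

    static void hotLoop() {
        for (int i = 0; i < 1000000; i++) {
            // After inlining, this unused, side-effect-free computation
            // may be removed entirely by dead-code elimination.
            square(i);
        }
    }
}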
Compilation Modes
Inside Java HotSpot VM, there are actually two separate JIT compiler modes, which are known as C1 and C2. C1 is used for applications where quick startup and rock-solid optimization are required; GUI applications are often good candidates for this compiler. C2, on the other hand, was originally intended for long-running, predominantly server-side applications. Prior to some of the later Java SE 7 releases, these two modes were available using the -client and -server switches, respectively.
The two compiler modes use different techniques for JIT compilation, and they can output very different machine code for the same Java method. Modern Java applications, however, can usually make use of both compilation modes. To take advantage of this fact, starting with some of the later Java SE 7 releases, a new feature called tiered compilation became available. This feature uses the C1 compiler mode at the start to provide better startup performance. Once the application is properly warmed up, the C2 compiler mode takes over to provide more-aggressive optimizations and, usually, better performance. With the arrival of Java SE 8, tiered compilation is now the default behavior.
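If you want to experiment with the modes yourself, tiered compilation can be toggled explicitly with the -XX:+TieredCompilation and -XX:-TieredCompilation switches (MyApplication below is a placeholder for your own main class); with tiered compilation disabled on a server-class machine, only the C2 compiler is used:

java -XX:+TieredCompilation MyApplication
java -XX:-TieredCompilation MyApplication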
Java HotSpot VM has the ability to produce a more detailed log of compilation events. Let’s move on to see how to enable the production of such a log.
Full Logging of JIT Compilation
The switch for enabling full logging is -XX:+LogCompilation, and it must be preceded by the option -XX:+UnlockDiagnosticVMOptions. Using the -XX:+LogCompilation switch produces a separate log file, hotspot_pid<PID>.log, in the startup directory. To change the location of the file, use -XX:LogFile=<path to file>.
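Putting those switches together, a typical invocation might look like the following (MyApplication and compilation.log are placeholders):

java -XX:+UnlockDiagnosticVMOptions -XX:+LogCompilation -XX:LogFile=compilation.log MyApplication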
The output log is a large XML file (often comprising dozens or hundreds of megabytes), containing a high level of detail about the decisions the Java HotSpot VM compilers made. This log contains a lot more information than the simple format we discussed above.
Listing 3 is a sample entry from the detailed compilation log, and it contains a lot of detail about the compilation decisions that Java HotSpot VM made when compiling the method (in this case, String::hashCode()). However, the format is complex and difficult to work with. This presents a barrier for many developers, which means that they can’t use the detailed logs to understand their applications. Fortunately, help is at hand.
<nmethod compile_id='2' compiler='C1' level='3'
entry='0x00000001023fe240' size='1224'
address='0x00000001023fe0d0' relocation_offset='288'
insts_offset='368' stub_offset='880' scopes_data_offset='1032'
scopes_pcs_offset='1104' dependencies_offset='1200'
nul_chk_table_offset='1208'
method='java/lang/String hashCode ()I' bytes='55' count='512'
backedge_count='8218' iicount='512' stamp='0.350'/>
Listing 3
In Part 2 of this series, we will introduce JITWatch, a new open source tool that can consume the detailed compilation logs and provide simple, graphical visualizations of many aspects of JIT compilation. You can download JITWatch from GitHub, which is where continued development of the tool takes place.
Conclusion
In this article, we have introduced some of the basic concepts of JIT compilation as deployed in Java HotSpot VM. We have illustrated the flags needed to produce compilation log output—both the compact format and the more extensive XML output. In doing so, we have paved the way to discuss a new visualization tool in Part 2 of this series.
Ben Evans (@kittylyst) is tech fellow and founder at jClarity, an organizer for the London Java Community (LJC), and a member of the Java SE/EE Executive Committee.