Every few years, when a new version of Java is released, the speakers at JavaOne tout the new language constructs and APIs, and laud the benefits. Meanwhile, excited developers line up, eager to use the new features. It’s a rosy picture—except for the fact that most developers are charged with maintaining and enhancing existing applications, not creating new ones from scratch.

You can use late binding to attempt to access a new API when your application also needs to run on older versions of Java that don’t support that API. For example, let’s say that you want to use the java.util.stream.LongStream class introduced in Java 8, and you want to use LongStream’s anyMatch(LongPredicate) method, but the application has to run on Java 7. You could create a helper class as follows:
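The helper class itself isn't shown here, so below is a minimal sketch of the reflection-based approach described: the class looks up LongStream.anyMatch by name so that neither LongStream nor LongPredicate ever appears in the source. The NotImplementedException class and the field and variable names are my own stand-ins, not part of any standard API.

```java
import java.lang.reflect.Method;

// Thrown when the helper is used on a JVM that lacks the Java 8 API.
class NotImplementedException extends RuntimeException {
    NotImplementedException(String msg) { super(msg); }
}

class LongStreamHelper {
    private static final Method ANY_MATCH;

    static {
        Method m;
        try {
            // Look up LongStream.anyMatch(LongPredicate) by name, so this
            // class still loads on Java 7, where neither type exists.
            Class<?> longStream = Class.forName("java.util.stream.LongStream");
            Class<?> longPredicate = Class.forName("java.util.function.LongPredicate");
            m = longStream.getMethod("anyMatch", longPredicate);
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            m = null; // running on Java 7 or earlier
        }
        ANY_MATCH = m;
    }

    // Parameters are typed as Object because LongStream and LongPredicate
    // cannot be mentioned in source that must also compile for Java 7.
    static boolean anyMatch(Object theLongStream, Object thePredicate) {
        if (ANY_MATCH == null) {
            throw new NotImplementedException("LongStream is not available on this JVM");
        }
        try {
            return (Boolean) ANY_MATCH.invoke(theLongStream, thePredicate);
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }
}
```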

You can shoehorn new platform features into legacy applications that need to be backward-compatible. Specifically, there are ways for you to take advantage of new APIs. It can get a little ugly, however.

So is there anything in the upcoming Java 9 release that’s aimed at developers working on legacy Java applications? Is there anything that makes your life easier, while at the same time allowing you to use the exciting new features that are coming out next year? Fortunately, the answer is yes.

When new releases come out, legacy developers feel like kids with their noses pressed up against the window of the candy store: They’re not allowed in, and that can be disappointing and frustrating.

This leads to problems when developers want to use a new feature. Do you like the idea of using default interface methods in your code? You’re out of luck if your application needs to run on Java 7 or earlier. Want to use the java.util.concurrent.ThreadLocalRandom class to generate pseudo-random numbers in a multi-threaded application? That's a no-go if your application needs to run on Java 6 or earlier, since the class didn't arrive until Java 7.

Most applications, particularly commercially sold ones, need to be backward-compatible with earlier versions of Java, which won’t support those new, whiz-bang features. And, finally, most customers and end users, particularly those in enterprises, are cautious about adopting the newly announced Java platform, preferring to wait until they’re confident that the new platform is solid.

Why is this ugly? Well, it can get extremely complicated and tedious when there are lots of APIs you want to access. (In fact, it’s tedious already, with a single API.) It’s also not type safe, since you can’t actually mention LongStream or LongPredicate in your code. Finally, it’s much less efficient, because of the overhead of the reflection, and the extra try-catch blocks. So, while you can do this, it’s not much fun, and it’s error-prone if you're not careful.

Instead of calling theLongStream.anyMatch(thePredicate), as you would in Java 8, you can call LongStreamHelper.anyMatch(theLongStream, thePredicate) in any version of Java. If you’re running on Java 8, it’ll work, but if you’re running on Java 7, it’ll throw a NotImplementedException.

There are ways to make this a little simpler, or more general, or more efficient, but you get the idea.

Let’s say that Java 9 comes out, and you rewrite classes A and B to use some new Java 9-specific features. Later, Java 10 comes out and you rewrite class A again to use Java 10’s new features. At the same time, the application should still work with Java 8. The new multi-release JAR file looks like this:
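Following JEP 238's example, the layout would be along these lines (the versioned entries live under META-INF/versions; the class names are the JEP's placeholders):

```
jar root
  - A.class
  - B.class
  - C.class
  - D.class
  - META-INF
     - versions
        - 9
           - A.class
           - B.class
        - 10
           - A.class
```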

JEP 238, the Java enhancement proposal that specifies multi-release JAR files, gives a simple example. Consider a JAR file containing four classes that will work in Java 8 or earlier:
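The starting point is an ordinary JAR file, with a layout along these lines (class names are the JEP's placeholders):

```
jar root
  - A.class
  - B.class
  - C.class
  - D.class
```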

If you’re running Java 8 or earlier, however, the JVM doesn’t know about this special nook. It will ignore it, and only run the classes in the regular part of the JAR file. When Java 10 comes out, it will offer another nook specifically for classes using new Java 10 features, and so forth.

Multi-release JAR files look just like old-fashioned JAR files, with one crucial addition: There’s a new nook in the JAR file where you can put classes that use the latest Java 9 features. If you’re running Java 9, the JVM recognizes this nook, uses the classes in that nook, and ignores any classes of the same name in the regular part of the JAR file.

Until recently, there hasn’t been a good way to use the latest Java features while still allowing the application to run on earlier versions of Java that don't support those features. Java 9 provides a way to do this for both new APIs and new Java language constructs: It's called multi-release JAR files.

While you can access new APIs and still have your code remain backward-compatible, you can’t do this for new language constructs. For example, let’s say that you want to use lambdas in code that also needs to run on Java 7. You’re out of luck. The Java compiler will not let you specify a source compliance level higher than its target compliance level. So, if you set a source compliance level of 1.8 (i.e., Java 8) and a target compliance level of 1.7 (Java 7), the compiler will not let you proceed.

That means that you should still be able to run your modularized JAR files on Java 8 and earlier, assuming that they’re otherwise compatible with that earlier version of Java. Also note that module-info.class files can be placed, with restrictions, in the versioned areas of multi-release JAR files.

The information in module-info.class is visible only when the JVM is looking for it, which means that modularized JAR files are treated like ordinary JAR files when running on older versions of Java, as long as the rest of the code has been compiled to target that earlier version. (Strictly speaking, you'd need to cheat a little and still compile module-info.class itself to target Java 9, but that's doable.)

First, a JAR file becomes modularized (and becomes a module) when it contains a file module-info.class (compiled from module-info.java) at the JAR file root. module-info.java contains metadata specifying the name of the module, which packages are exported (i.e., made visible to the outside), which modules the current module requires, and some other information.
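As a sketch, a module-info.java for a hypothetical module might look like this (the module and package names are invented for illustration):

```java
// module-info.java, at the root of the module's source tree
module com.example.billing {
    // modules this module depends on
    requires java.sql;

    // only this package is visible to code outside the module;
    // every other package stays encapsulated
    exports com.example.billing.api;
}
```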

Modularization is a good thing, and developers should try to modularize their new code wherever possible, even if the rest of the legacy application is not (yet) modularized. Fortunately, the modularization specification makes this easy.

The entire module system is large and complex, and a complete discussion is beyond the scope of this article. (Here's a good, in-depth explanation.) Rather, I'll concentrate on aspects of modularization that support legacy application developers.

The second goal of modularization is to specify which modules are required by which other modules, and to ensure that all necessary modules are present before the application executes. In this sense, modules are stronger than the traditional classpath mechanism, since classpaths are not checked ahead of time, and errors due to missing classes only occur when the classes are actually needed. That means that an incorrect classpath might be discovered only after an application has been run for a long time, or after it has run many times.

The Java 9 module system (also known as Project Jigsaw) is undoubtedly the biggest change in Java 9. One goal of modularization is to strengthen Java’s encapsulation mechanism so that the developer can specify which APIs are exposed to other components, and can count on the JVM to enforce the encapsulation. Modularization’s encapsulation is stronger than that provided by the public/protected/private access modifiers of classes and class members.

So, if you want to use the new Java 9 ProcessBuilder API in your application while still allowing your application to run under Java 8, just put the new versions of your classes that use ProcessBuilder in the META-INF/versions/9 section of the JAR file, while leaving the old versions of the classes that don’t use ProcessBuilder in the default section of the JAR file. It’s a straightforward way to use the new features of Java 9 while maintaining backward compatibility.
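The Java 9 jar tool can build such a JAR directly with its --release option, which also sets the required Multi-Release: true attribute in the manifest. As a sketch, assuming the Java 8-compatible classes were compiled into build/java8 and the Java 9 variants into build/java9 (hypothetical directory names):

```shell
# Base classes go in the JAR root; the --release 9 classes go
# under META-INF/versions/9.
jar --create --file app.jar -C build/java8 . \
    --release 9 -C build/java9 .
```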

When you run it using Java 10, both META-INF/versions branches are used; specifically, the Java 10 version of A, the Java 9 version of B, and the default versions of C and D are used.

When you run it using Java 9, the classes under META-INF/versions/9 are used instead of the original classes A and B, while the classes in META-INF/versions/10 are ignored.

When you run this JAR file on a Java 8 JVM, it ignores the META-INF/versions section, since it doesn’t know anything about it and isn’t looking for it. Only the original classes A, B, C and D are used.

In Java 9, there is both a classpath and a module path. The classpath works like it always has. If a modularized JAR file is placed in the classpath, it’s treated just like any other JAR file. This means that if you’ve modularized a JAR file, but are not ready to have your application treat it as a module, you can put it in the classpath, and it will work as it always has. Your legacy code should be able to handle it just fine.

Also, note that the collection of all JAR files in the classpath is considered to be part of a single unnamed module. The unnamed module is considered a regular module, but it exports everything to other modules, and it can access all other modules. This means that, if you have a Java application that’s modularized, but have some old libraries that haven’t been modularized yet (and perhaps never will be), you can just put those libraries in the classpath and everything will just work.

Java 9 contains a module path that works alongside the classpath. Using the modules in the module path, the JVM can check, both at compile time and at run time, that all necessary modules are present, and can report an error if any are missing. All JAR files in the classpath, as members of the unnamed module, are accessible to the modules in the module path and vice versa.
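Concretely, both the compiler and the launcher take a module path. For example, with a hypothetical module com.example.app whose main class is com.example.app.Main (names invented for illustration):

```shell
# Compile all modules under src/ into mods/ (one directory per module)
javac --module-source-path src -d mods $(find src -name '*.java')

# Run, naming the initial module and its main class; the JVM verifies
# up front that every 'requires' in the module graph can be satisfied
java --module-path mods --module com.example.app/com.example.app.Main
```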

It’s easy to migrate a JAR file from the classpath to the module path to get the advantages of modularization: Add a module-info.class file to the JAR file, then move the modularized JAR file to the module path. The newly minted module can still access all the classpath JAR files that have been left behind, because they’re part of the unnamed module, and everything is accessible.

It’s also possible that you might not want to modularize a JAR file, or that the JAR file belongs to someone else, so you can’t modularize it yourself. In that case, you can still put the JAR file into the module path; it becomes an automatic module.

An automatic module is considered a module even though it doesn’t have a module-info.class file. The module’s name is derived from the name of the JAR file containing it (for example, commons-lang3-3.4.jar would become the automatic module commons.lang3), and the module can be explicitly required by other modules. It automatically exports all of its publicly accessible APIs, and reads (that is, requires) every other named module, as well as the unnamed module.

This means that it’s possible to make an unmodularized classpath JAR file into a module with no work at all: Legacy JAR files become modules automatically, albeit without some of the information needed to determine whether all required modules are really there, or to determine what is missing.

Not every unmodularized JAR file can be moved to the module path and made an automatic module. There is a rule that a package can only be part of one named module. So if a package is in more than one JAR file, then only one of the JAR files containing that package can be made into an automatic module; the others can be left in the classpath, where they remain part of the unnamed module.

The mechanism I've described sounds complicated, but it’s really quite simple. All it really means is that you can leave your old JAR files in the classpath or you can move them to the module path. You can modularize them or you can leave them unmodularized. And once your old JAR files are modularized, you can leave them in the classpath or put them in the module path.

In most cases, everything should just work as before. Your legacy JAR files should be at home in the new module system. The more you modularize, the more dependency information can be checked, and missing modules and APIs will be detected far earlier in the development cycle, possibly saving you a lot of work.

DIY Java 9: The modular JDK and jlink

One problem with legacy Java applications is that the end user might not be using the right Java environment. One way to guarantee that the Java application will run is to supply the Java environment with the application. Java allows the creation of a private, or redistributable, JRE, which may be distributed with the application. The JDK/JRE installation comes with instructions on how to create a private JRE. Typically, you take the JRE file hierarchy that’s installed with the JDK, keep the required files, and retain only those optional files whose functionality your application will need.

The process is a bit of a hassle: You need to maintain the installation file hierarchy, you must be careful not to leave out any files and directories that you might need, and, although leaving in extra files does no functional harm, you don’t want to include anything you don’t need, since it takes up unnecessary space. Either mistake is easy to make.

So why not let the JDK do the job for you?

With Java 9, it’s now possible to create a self-contained environment with your application, and anything it needs to run. There's no need to worry that the wrong Java environment is on the user’s machine, and no need to worry that you’ve created the private JRE incorrectly.
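This is the job of jlink, the new linker tool in Java 9: it assembles a trimmed-down runtime image containing only the platform modules (and your own modules) that the application actually needs. A sketch, assuming a hypothetical application module com.example.app compiled into mods/:

```shell
# $JAVA_HOME/jmods holds the JDK's own modules; jlink resolves
# com.example.app's 'requires' graph and copies only what's needed
jlink --module-path "$JAVA_HOME/jmods:mods" \
      --add-modules com.example.app \
      --launcher app=com.example.app/com.example.app.Main \
      --output appimage

# The result is a self-contained directory with its own bin/java
appimage/bin/app
```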