
Using Maven2 to build projects which use JNI code

This page documents my experiences with supporting projects which use JNI code, and describes the solution I developed and the tools I use, in the hope that it may save somebody else this pain.


I work in telecoms, which means that we have quite a lot of projects which have a greater or lesser quantity of native code, either because they interface to the operating system at a low level, or simply as a way of dealing with the real-time requirements of our software. As time has gone by, we've built up a reasonably complex hierarchy of applications with native code which depend on libraries with native code, and some of the libraries even export native-level symbols to application native code.

Our investigations with converting JNI-free projects to a Maven2 build process were extremely positive, and it therefore soon became desirable to convert all of our projects, including those with JNI requirements, to Maven2. This led to the following requirements for the build process.


  1. The build/release process should match as closely as possible that of a non-JNI project - checkout followed by 'mvn package', etc.
  2. As this functionality will be common to many projects, long incantations of plugin configuration in each pom are unacceptable; it must be possible to either factor everything out to a common parent pom, or just to have sensible defaults which build everything. For example, it should be possible to make use of a library using JNI code - and have it work for unit tests, assemblies, etc - just by adding it to your <dependencies/> element.
  3. It should be possible to run an application with as little extraneous scripting as possible. Essentially, this means: unzip assembly; run java -jar on application jar.
  4. It should be possible to call functions in one JNI library from another. Typically this means having the library's include files and dll available at compile time, and having the library available for dynamic linking at runtime.
  5. The build process must be portable from one platform to another. I'm happy to require that building for Windows requires Cygwin, but it should be possible to have builds for different platforms alongside each other in the repository, and the right build be chosen when building.
  6. Our legacy build process must continue to work until the whole company can be migrated to maven2.

For bonus points:

  1. During the development cycle, it shouldn't be necessary to run the entire maven build cycle for each change. The user's IDE can be configured to build the Java side, so we need a means of quickly building the native side. This also offers an easier upgrade path from our legacy build system, fwiw.

Possible solutions

There are obviously many ways to skin this particular cat; I'll discuss a few of the options.


The first place to look is obviously to see if anybody else has solved this problem. I therefore made a start with the native-maven-plugin. This, however, has a number of problems, which means that it totally fails to meet my requirements.

  • The showstopper is that the native code and the java code are separate projects. This means that it's impossible to get things to build in the right order, because we have to do the following.
    1. compile java source to class files
    2. run javah on class files
    3. compile and link native code
    4. run unit tests for java code

Some messing about with a very complex project structure is possible, but it's ugly and very hard to set up for each of many projects.

  • If you build a library which has a JNI component, making use of it is very complex. Essentially, maven2 downloads the jni library to the depths of your local repository, where it then can't be found by a System.load().
    • update: this can be worked around by judicious use of a recent maven-dependencies-plugin build.
  • Again, when you depend on a library with a JNI component, you need complex incantations in your pom, to depend on both the jar, and a separate profile for each target platform to depend on the native library.
  • Building the native code always requires running the full maven build.

JNI library as an attached artifact

The next possibility considered was to build the native library as an 'attached artifact' - in much the same way as a javadoc or source jar can be attached to the main artifact. This solved some of the problems, especially the problem with build order; depending on a library with jni code was still a nightmare, however.

JNI library archived within the jar

The solution I ended up using was to store the compiled jni library in the jar alongside the class files.

This means either cross-compiling for all possible architectures, or more simply, having a different jar for each architecture. The latter fits quite well with our setup, where almost all of our machines are Linux-i386, with a smattering of win32 boxes.

Sadly System.load() can't cope with loading libraries from within a jar, so we'll therefore need a custom loader which extracts the library to a temporary file at runtime; this is obviously achievable, however.

Freehep-NAR plugin

Since I originally wrote this, the freehep project have been working on porting their nar plugin to Maven 2. As of the time of writing (13 October 2006), it's still in Alpha - but I think this may represent a more polished alternative to this problem than my solution. I'd certainly suggest that anybody thinking about this problem takes a look at it.


For an example of a project using this implementation, you may like to download simple-native-example. That directory contains source archives, as well as binaries compiled for Linux-i386 and javadocs, and may provide helpful examples of the concepts below.


The general idea here, as mentioned above, is that we will build our JNI library into a jar, alongside the class files. That means that we end up with a different jar for each system architecture we are targeting; to distinguish between the different versions, we actually put a system identifier into the names of the artifacts we produce. More on this below.

Directory structure

We need a place to put native code. This should naturally be src/main/native in the standard directory layout; for a more complex project it may be appropriate to further split this into src and include.

The JNI library should be built straight into a subdirectory of ${project.build.directory}/classes (ie, target/classes); by doing so, it will automatically get built into the project jar, as well as being in the right place when running the project from an IDE which just uses the .class files straight out of target/classes. I decided to put my JNI libraries in META-INF/lib, but this is obviously not set in stone.

Building the JNI

The first thing we need to do (after creating the .java files, with suitable native methods, of course), is to build the JNI library. I chose to do this using a Makefile, and a maven-antrun-plugin execution, as this makes it reusable between different build environments, and is easily run independently of Maven. It's also an easy way of making sure the right files, and only the right files, get recompiled.

The Makefile needs to:

  • use javah to create .h files from the .class files in the outputDirectory;
  • use gcc to compile the C source into object files;
  • link the object files into the dynamic library.
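A minimal Makefile along these lines might look like the following sketch; the library name (libmystuff.so), the class list, and the relative paths are assumptions you would adapt to your own project:

```makefile
# Sketch: builds libmystuff.so from the .c files in src/main/native,
# straight into target/classes/META-INF/lib. Names are illustrative.
JAVA_HOME ?= /usr/lib/jvm/default-java
CLASSES    = com.example.MyStuff
OUTDIR     = ../../../target/classes
LIBDIR     = $(OUTDIR)/META-INF/lib
SRCS       = $(wildcard *.c)
OBJS       = $(SRCS:.c=.o)
CFLAGS    += -fPIC -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/linux

all: $(LIBDIR)/libmystuff.so

# Generate the JNI .h files from the compiled classes
.PHONY: jni-headers
jni-headers:
	javah -d . -classpath $(OUTDIR) $(CLASSES)

%.o: %.c jni-headers
	$(CC) $(CFLAGS) -c -o $@ $<

$(LIBDIR)/libmystuff.so: $(OBJS)
	mkdir -p $(LIBDIR)
	$(CC) -shared -o $@ $(OBJS)
```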

We now need make to run after the .class files are built; I did this by attaching an execution to the process-classes phase:
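Such an execution might look like the following pom fragment (the directory is an assumption; adjust it to wherever your Makefile lives):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>build-jni</id>
      <phase>process-classes</phase>
      <configuration>
        <tasks>
          <!-- Run make in the native source directory, after javac -->
          <exec executable="make" dir="${basedir}/src/main/native"
                failonerror="true"/>
        </tasks>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```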

Library loader

We now have our JNI library on the class path, so we need a way of loading it. I created a separate project which extracts JNI libraries from the class path, then loads them; this is added as a dependency to the pom, obviously.

To use it, I simply call com.wapmx.nativeutils.jniloader.NativeLoader.loadLibrary(libname). More information is in the javadoc for NativeLoader.

I generally prefer to wrap such things in a try/catch block, as follows:
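For instance, something along these lines (the class and library name "mystuff" are placeholders for your own; this assumes the mx-native-loader dependency on the class path):

```java
import com.wapmx.nativeutils.jniloader.NativeLoader;

public class MyStuff {
    static {
        try {
            // Extracts the JNI library from the class path to a
            // temporary file, then System.load()s it
            NativeLoader.loadLibrary("mystuff");
        } catch (Throwable e) {
            e.printStackTrace();
            throw new ExceptionInInitializerError(e);
        }
    }

    public static native int doSomething();
}
```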

We should now be at the point where our junit tests work from maven; a mvn test should work! It should also work fine from an IDE.

The artifactId

As I mentioned earlier, we need to be able to keep builds for multiple architectures alongside one another in the repository, so we need to distinguish between different builds somehow. We do this by putting an identifier into the artifactId.

It turns out that we have to distinguish between features of a system which are hard to determine programmatically - for example, the libc and libstdc++ versions used by the system. We therefore first create a system property, ${mx.sysinfo}, which encodes this information. This needs to be set in `/etc/mavenrc` or `.mavenrc`: for example, on an i386 Linux system using glibc version 2.3 and libstdc++ version 6, we might have:
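Something like the following (the exact discriminator string is only an illustration, per the note below):

```shell
# in /etc/mavenrc or ~/.mavenrc
MAVEN_OPTS="$MAVEN_OPTS -Dmx.sysinfo=Linux-i386-glibc2.3-libstdc++6"
```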

The precise value you use doesn't really matter (unless you want to share your artifacts outside your organisation), nor even the name of the property, provided you are consistent and the values you choose adequately distinguish between architectures.

It's then simple to use this in your pom; where previously you might have <project>...<artifactId>my-artifact</artifactId>...</project>, add a suffix to the artifactId thus: <project>...<artifactId>my-artifact-${mx.sysinfo}</artifactId>...</project>. Similarly, whenever we refer to that artifact, we again add -${mx.sysinfo} to the names of the artifactIds.


Releasing the project is the same as normal - mvn release:prepare followed by mvn release:perform - except that you need to do the perform step on each platform you wish to support. To repeat a release, you can do:
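The release plugin can perform a release again straight from the tag; the SCM URL and version below are of course placeholders for your own:

```shell
mvn release:perform \
  -DconnectionUrl=scm:svn:http://svn.example.com/repos/my-project/tags/my-artifact-1.0
```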


It really is as simple as that. You should now be able to use your JNI-enabled project exactly as you would any other project - with the one proviso that, if you depend on it from another project, you must put a -${mx.sysinfo} on the end of the artifactId. For example:
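A dependency element might therefore look like this (groupId and version are placeholders):

```xml
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <!-- note the ${mx.sysinfo} suffix on the artifactId -->
    <artifactId>my-artifact-${mx.sysinfo}</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>
```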


The rest of this page isn't relevant to a single JNI project - it describes some extra knobs and whistles I've added in order to handle our more extensive use of JNI.

Functions exported at the native level

Sometimes it's useful for a JNI library to export symbols for other JNI libraries to use. For example, we might want to create a "utilities" library with functions such as JNU_ThrowByName.

We therefore end up with a "library" project and an "application" project; an example of such a pair is available for download, along with its source.

Building an "includes" zip

We need to make the .h and .so/.dll files from the library project available when compiling the application project. To do this, I've configured the pom of the library project to build a zip file containing the includes files; until MDEP-47 is fixed, the .so also has to go in the zip file (if we had MDEP-47, we could extract it from the jar).

Here's an example assembly descriptor (which can be packaged into the shared build-scripts artifact as below):
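The following is a sketch of such a descriptor, assuming the .h files live in src/main/native and the built library in target/classes/META-INF/lib as described above:

```xml
<assembly>
  <id>include</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <!-- the library's public header files -->
    <fileSet>
      <directory>src/main/native</directory>
      <outputDirectory>include</outputDirectory>
      <includes>
        <include>*.h</include>
      </includes>
    </fileSet>
    <!-- the built dynamic library (see the MDEP-47 note above) -->
    <fileSet>
      <directory>target/classes/META-INF/lib</directory>
      <outputDirectory>lib</outputDirectory>
      <includes>
        <include>*.so</include>
        <include>*.dll</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>
```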


And a pom snippet to use it (which can be factored out to a common parent pom):
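Along these lines, assuming the descriptor is saved as src/main/assembly/include.xml:

```xml
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>include-zip</id>
      <phase>package</phase>
      <goals>
        <!-- builds the zip and attaches it to the main artifact -->
        <goal>attached</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/main/assembly/include.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>
```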


Unpacking includes and library archives

Now we need to make the build for the application project unpack the includes zip, before trying to compile the native code. We can use the dependencies plugin for this:
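For example (groupId, artifactId and version are placeholders; the classifier matches the assembly descriptor's id):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-includes</id>
      <!-- before the native code is compiled -->
      <phase>generate-sources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>my-library-${mx.sysinfo}</artifactId>
            <version>1.0</version>
            <classifier>include</classifier>
            <type>zip</type>
            <outputDirectory>${project.build.directory}/extracted</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```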


We thus end up with target/extracted/include, and target/extracted/lib; the Makefile can now be set up to use those two paths at appropriate points.

Ensuring libraries are extracted in the right order

At runtime, we must ensure that our library project's JNI is extracted (and loaded) before we try to load the application project's JNI - otherwise the dynamic linker won't be able to find the library. To this end, I have this in a static initialiser in the application:
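A sketch of such an initialiser, with placeholder library names ("mylibrary" for the library project's JNI, "myapplication" for our own):

```java
import com.wapmx.nativeutils.jniloader.NativeLoader;

public class Application {
    static {
        try {
            // Extract and load the library project's JNI first, so the
            // dynamic linker can resolve its symbols when ours is loaded
            NativeLoader.loadLibrary("mylibrary");
            NativeLoader.loadLibrary("myapplication");
        } catch (Throwable e) {
            throw new ExceptionInInitializerError(e);
        }
    }
}
```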

Making the OS dynamic linker work

The operating system must be able to find the dynamic library from the library project at runtime for the application project; for Linux, this means that you have to set LD_LIBRARY_PATH to include wherever the native-loader extracts the libraries to. If we don't, we'll get a cannot open shared object file error.

For example, to make this work when running the tests in the application project, we can do:
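A surefire configuration along these lines should do it; "tmplib" is an arbitrary relative directory name, and java.library.tmpdir tells the native-loader where to extract to:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- extract libraries to ./tmplib ... -->
    <argLine>-Djava.library.tmpdir=tmplib</argLine>
    <!-- ... and let the OS dynamic linker find them there -->
    <environmentVariables>
      <LD_LIBRARY_PATH>tmplib</LD_LIBRARY_PATH>
    </environmentVariables>
  </configuration>
</plugin>
```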

You obviously also need to make java.library.tmpdir and LD_LIBRARY_PATH agree if running java directly. A similar trick is no doubt possible in Win32 (with PATH instead of LD_LIBRARY_PATH), but I haven't tested that.

Also note that, on glibc 2.3.6 at least, LD_LIBRARY_PATH is parsed at startup, and any non-existent absolute directories ignored; so unless your tmplib exists when you start, you'll still get an error. It's probably easiest just to keep LD_LIBRARY_PATH relative.

Shared build scripts

We can make use of the unpacking of includes zips to factor out the complicated parts of the Makefile to a single Makefile which is used by all JNI projects. The shared build-scripts artifact mentioned above provides a demonstration of how this might be done.


A previous version of this wiki page suggested putting the platform discriminator in the jar's classifier; I no longer recommend this as it causes problems with snapshot dependencies, and requires a hideous fudge to set a classifier on the main artifact of a build.

Contacting me

If you find this information useful, or you have problems with it, please let me know. I can be contacted at
