Automatic checks for vulnerabilities in Java project dependencies

Java aktuell published an article I wrote on a topic from my work at TRIOLOGY GmbH.

You can find an English version on the TRIOLOGY Blog: Automatic checks for vulnerabilities in Java project dependencies. The article shows an approach to keeping your Java project dependencies free of known vulnerabilities (e.g. CVEs) using OWASP Dependency-Check with Jenkins and Maven. There is also an example project on GitHub.

The original article PDF (in German) is available for download here: Automatisierte Überprüfung von Sicherheitslücken in Abhängigkeiten  von Java-Projekten.

TRIOLOGY also published a short Q&A on the article, which can be found here.

Building GitHub projects with Jenkins, Maven and SonarQube 5.2 on OpenShift

Time for an update of the post Building GitHub projects with Jenkins, Maven and SonarQube 4.1.1 on OpenShift, because SonarQube 5.2 is out: it’s the first version since 4.1.1 that can be run on OpenShift. That means SonarQube 5, including Elasticsearch and many other new features, is now available on OpenShift!
Interested? Then let’s see how to set up SonarQube on OpenShift.

  • If you’re starting from scratch, just skip to this section.
  • If you got a running instance of SonarQube
    • make sure to back up your instance before you continue:
      rhc snapshot save -a <application name> --filepath <backup destination>
      or
      ssh <UID>@<application name>-<yourAccount>.rhcloud.com 'snapshot' > sonar.tar.gz
    • Then pull the git repository just like in step 2,
    • wait until the app has started and visit
      https://sonar-<yourAccount>.rhcloud.com/setup

      SonarQube will update its database during the process.

    • If you followed this post to set up your SonarQube instance and therefore use an SSH tunnel to access the SonarQube database, note that you can now get rid of this workaround. As of SonarQube 5.2, analyses can be run without direct access to the database.
      That is, you can also remove the database connection from the configuration of the SonarQube plugin in Jenkins.

Install new SonarQube instance

To install SonarQube 5.2, execute the following steps on your machine:

  1. rhc app create sonar diy-0.1 postgresql-9.2

    Make sure to remember the login and passwords!

  2. git rm -r diy .openshift misc README.md
    git remote add upstream -m master https://github.com/schnatterer/openshift-sonarqube.git
    git pull -s recursive -X theirs upstream master
    git push
    
  3. Login to your SonarQube instance at
    http://sonar-<yourAccount>.rhcloud.com/

    Note that the initial setup may take some minutes. So be patient.
    The default login and passwords are admin / admin.
    You might want to change the password right away!

Basic installation Jenkins

Basically, the following is an updated (and a lot simpler) version of my post about SonarQube 4.1.1.

  1. Create Jenkins app
    rhc app create jenkins jenkins-1
  2. Install Plugins
    Browse to Update Center

    https://jenkins-<yourAccount>.rhcloud.com/pluginManager/advanced

    and hit Check Now (as described here).
    Then go to the Available tab and install

    1. Sonar Plugin,
    2. GitHub plugin,
    3. embeddable-build-status (if you’d like to include those nifty build badges in your README.md).

    Then hit Install without restart or Download and install after restart. If necessary, you can restart your app anytime like so

    rhc app restart -a jenkins
  3. Set up maven settings.xml to a writable location.
    • SSH to Jenkins
      mkdir $OPENSHIFT_DATA_DIR/.m2
      echo -e "<settings><localRepository>$OPENSHIFT_DATA_DIR/.m2</localRepository></settings>" > $OPENSHIFT_DATA_DIR/.m2/settings.xml
      
    • Browse to Configure System
      https://jenkins-<yourAccount>.rhcloud.com/configure

      Default settings provider: Settings file in file system
      File path=$OPENSHIFT_DATA_DIR/.m2/settings.xml
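
To see what this writes, the two commands can be replayed locally; in this sketch a temporary directory stands in for $OPENSHIFT_DATA_DIR:

```shell
# Simulate $OPENSHIFT_DATA_DIR with a temporary directory
OPENSHIFT_DATA_DIR=$(mktemp -d)

# Same commands as on the Jenkins gear
mkdir $OPENSHIFT_DATA_DIR/.m2
echo -e "<settings><localRepository>$OPENSHIFT_DATA_DIR/.m2</localRepository></settings>" > $OPENSHIFT_DATA_DIR/.m2/settings.xml

# Maven runs using --settings with this file will cache artifacts in the data dir
cat $OPENSHIFT_DATA_DIR/.m2/settings.xml
```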

  4. Either see my post on how to introduce a dedicated slave node to this setup or
    set up the Jenkins master to run its own builds as follows (not recommended on small gears, as you might run out of memory pretty fast during builds):
    Go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure

    and set
    # of executors: 1

  5. Setup the sonar plugin (the following is based on SonarQube Plugin 2.3 for Jenkins)
    On the Jenkins frontend, go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure
    • Global properties,
      tick Environment variables
      Click Add
      name=SONAR_USER_HOME
      value=$OPENSHIFT_DATA_DIR
      See here for more information.
    • Setup the Runner:
      Navigate to SonarQube Runner
      Click Add SonarQube Runner
      Name=<be creative>
    • Then set up the plugin itself
      Navigate to SonarQube
      tick Enable injection of SonarQube server configuration as build environment variables
      and set the following
      Name=<be creative>
      Server URL:

      http://sonar-<yourAccount>.rhcloud.com/

      Sonar account login: admin
      Sonar account password: <your pw> (default: admin)

    • Hit Save

Configure build for a repository

Now let’s set up our first build.

  1. Go to
    https://jenkins-<yourAccount>.rhcloud.com/view/All/newJob

    Item name: <your Project name>
    Build a free-style software project (unfortunately, Maven projects do not work due to OpenShift’s restrictions).
    Hit OK

  2. On the next Screen
    GitHub project:

    https://github.com/<your user>/<your repo>/

    Source Code Management:

    https://github.com/<your user>/<your repo>.git

    Branch Specifier (blank for 'any'): origin/master
    Build Triggers: Tick  Build when a change is pushed to GitHub
    Build Environment: Tick Prepare SonarQube Scanner environment
    Build | Execute Shell

    cd $WORKSPACE
    # Start the actual build
    mvn clean package  $SONAR_MAVEN_GOAL --settings $OPENSHIFT_DATA_DIR/.m2/settings.xml -Dsonar.host.url=$SONAR_HOST_URL
    
  3. I’d also recommend the following actions
    Post-build Actions| Add post-build action| Publish JUnit test result report
    Test report XMLs=target/surefire-reports/TEST-*.xml
    Post-build Actions| Add post-build action| E-mail Notification
    Recipients=<your email address>
  4. Hit Apply.
  5. Finally, press Save and start your first build. Check Jenkins console output for errors. If everything succeeds, you should see the result of the project’s analysis on SonarQube’s dashboard.
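
The test report pattern from step 3 can be verified locally; this sketch fakes a surefire output directory and lists what `target/surefire-reports/TEST-*.xml` matches:

```shell
# Simulate a workspace with typical surefire report names
cd "$(mktemp -d)"
mkdir -p target/surefire-reports
touch target/surefire-reports/TEST-com.example.FooTest.xml
touch target/surefire-reports/TEST-com.example.BarTest.xml

# This is what the TEST-*.xml pattern resolves to
ls target/surefire-reports/TEST-*.xml
```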

Building GitHub projects on Jenkins slaves on OpenShift

This post showed how to build GitHub projects with Jenkins, Maven and SonarQube 4 on OpenShift. For starters, it used the Jenkins master node for running build jobs. However, when running on a small gear, the master node might run out of memory pretty fast, resulting in a reboot of the node during builds.

In order to resolve this issue, there are two options:

  • limiting the memory of the build or
  • running the build on a slave node.

As spawning additional nodes is easy in a PaaS context such as OpenShift and provides better performance than running builds with little memory, the slave solution seems to be the better approach.

This post shows how.

  1. Create a new DIY app as a slave node (a how-to can be found here), and name the node e.g. slave
  2. Create node in Jenkins
    1. Go to Jenkins web UI and create new node:
      https://jenkins-<yourAccount>.rhcloud.com/computer/new
    2. Set the following values:
      Remote FS root: the app-root/data folder on the slave. Typically this is /var/lib/openshift/<slave's UID>/app-root/data/jenkins; you can find out by SSHing to the slave node and calling

      echo $OPENSHIFT_DATA_DIR/jenkins

      Labels: some label to use within builds to refer to the node, e.g. OS Slave #1
      Host: the slave’s hostname, e.g. slave-<youraccount>.rhcloud.com

    3. Add Credentials
      username: <slave's UID>
      Private Key File: path to a private key file that is authorized for your OpenShift account. In the first post this path was used: /var/lib/openshift/<UID of your jenkins>/app-root/data/git-ssh/id_rsa. Note: $OPENSHIFT_DATA_DIR seems not to work here.
      BTW: You can change the credentials any time later via this URL

      https://jenkins-<yourAccount>.rhcloud.com/credentials/
  3. Prepare slave node: create the same environment as on the master in the first post
    1. Create folder structure
      mkdir $OPENSHIFT_DATA_DIR/jenkins
      mkdir $OPENSHIFT_DATA_DIR/.m2
      echo -e "&lt;settings&gt;&lt;localRepository&gt;$OPENSHIFT_DATA_DIR/.m2&lt;/localRepository&gt;&lt;/settings&gt;" &gt; $OPENSHIFT_DATA_DIR/.m2/settings.xml
      
    2. Copy SSH directory from master to same directory on slave, e.g.
      scp -rp -i $OPENSHIFT_DATA_DIR/.ssh $OPENSHIFT_DATA_DIR/.ssh <slave's UID>@slave-<your account>.rhcloud.com:app-root/data/.ssh
    3. As the different cartridges (jenkins and DIY) have different environment variables for their local IP addresses ($OPENSHIFT_JENKINS_IP vs $OPENSHIFT_DIY_IP) we’ll have to improvise at this point. There are two options: Either
      1. Replace all occurrences of $OPENSHIFT_JENKINS_IP
        In all builds and in

        https://jenkins-<yourAccount>.rhcloud.com/configure

        Sonar | Sonar installations
        Database URL: jdbc:postgresql://$OPENSHIFT_DIY_IP:15555/sonarqube
        or

      2. Create an $OPENSHIFT_JENKINS_IP environment variable on your slave machine
        rhc env set OPENSHIFT_JENKINS_IP=<value of $OPENSHIFT_DIY_IP> -a slave

        You can find out the value of $OPENSHIFT_DIY_IP by SSHing to the slave and executing

        echo $OPENSHIFT_DIY_IP 
      3. I’d love to hear suggestions that do better 😉
  4. Adapt Build
    The easiest way is not to use the master at all.
    To do so, go to

    https://jenkins-<yourAccount>.rhcloud.com/configure

    and set # of executors to 0.
    Hit Apply

  5. Limit memory usage.
    Make sure the slave does not run out of memory (which leads to a restart of the node):
    Global properties | Environment variables
    name: MAVEN_OPTS
    value: -Xmx512m
    Hit Save.
  6. Now run a build. It should run on the slave and hopefully succeed 🙂

See also Libor Krzyzanek’s Blog: Jenkins on Openshift wi… | JBoss Developer

Building GitHub projects with Jenkins, Maven and SonarQube 4.1.1 on OpenShift

Basic installation SonarQube

There are different community-driven sonar cartridges around. There is

  • this one, which is based on a Tomcat cartridge and provides SonarQube 3.x, and
  • that one, which comes with SonarQube 4.0.
  • The most up-to-date and flexible one is this, though. It downloads a specific version of SonarQube with each build. At the moment it works with version 4.1.1. I’m still working on getting SonarQube 5 to run on OpenShift, but haven’t succeeded yet.

There is also a tutorial that shows how to install SonarQube 3.1.1, and it contains general thoughts on how to bypass OpenShift’s restrictions.

Anyway, to install SonarQube 4.1.1 execute the following steps on your machine:

  1. rhc app create sonar diy-0.1 postgresql-9.2

    Make sure to remember the login and passwords!

  2. git rm -r diy .openshift misc README.md
    git remote add upstream -m master https://github.com/worldline/openshift-sonarqube.git
    git pull -s recursive -X theirs upstream master
    git push
    
  3. Login to your SonarQube instance at
    http://sonar-<yourAccount>.rhcloud.com/

    The default login and passwords are admin / admin.
    You might want to change the password right away!

Basic installation Jenkins

A lot of information within this paragraph was taken from here.

  1. Create Jenkins gear with Git-SSH
    rhc create-app jenkins  jenkins-1  "https://cartreflect-claytondev.rhcloud.com/reflect?github=majecek/openshift-community-git-ssh"
  2. Authorize your Jenkins node to communicate with other gears (and with your Git repository)
    Generate SSH key for your Jenkins node
    SSH to the jenkins node

    ssh-keygen -t rsa -b 4096 -f $OPENSHIFT_DATA_DIR/git-ssh/id_rsa
  3. Add the key to your OpenShift, either
    • via web console
      In SSH console

      cat $OPENSHIFT_DATA_DIR/git-ssh/id_rsa.pub

      then copy and paste the output into web console
      or

    • via rhc
      Download the public key (id_rsa.pub) to your host (e.g. by SFTP) and use the

      rhc sshkey add

      command to authorize the public keys for your OpenShift account.
      If you plan on accessing a private repo or want to allow Jenkins to commit to your repo (e.g. for generating releases with the maven release plugin), you should also add the key to your repo account. See GitHub Help.
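
The key generation used above can be tried out locally; in this sketch a temporary directory stands in for $OPENSHIFT_DATA_DIR/git-ssh:

```shell
# Generate a passphrase-less RSA key pair, as done on the Jenkins gear
# (a temporary directory is used here instead of $OPENSHIFT_DATA_DIR/git-ssh)
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -b 4096 -N "" -q -f "$KEYDIR/id_rsa"

# This public key is what gets pasted into the web console
# or added via `rhc sshkey add`
cat "$KEYDIR/id_rsa.pub"
```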

  4. Install Plugins
    Browse to Update Center

    https://jenkins-<yourAccount>.rhcloud.com/pluginManager/advanced

    and hit Check Now (as described here).
    Then go to the Available tab and install

    1. Sonar Plugin,
    2. GitHub plugin,
    3. embeddable-build-status (if you’d like to include those nifty build badges in your README.md).

    While you’re at it, you might as well update the already installed plugins in the Updates tab.
    Then hit Install without restart or Download and install after restart. If necessary, you can restart your app like so

    rhc app restart -a jenkins
  5. Set up maven settings.xml to a writable location.
    • SSH to Jenkins
      mkdir $OPENSHIFT_DATA_DIR/.m2
      echo -e "<settings><localRepository>$OPENSHIFT_DATA_DIR/.m2</localRepository></settings>" > $OPENSHIFT_DATA_DIR/.m2/settings.xml
      
    • Browse to Configure System
      https://jenkins-<yourAccount>.rhcloud.com/configure

      Default settings provider: Settings file in file system
      File path=$OPENSHIFT_DATA_DIR/.m2/settings.xml

  6. Set up main Jenkins node as slave (easy to set up and doesn’t need extra gears).
    Go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure

    and set
    # of executors: 1
    As an alternative, you could also use another gear as dedicated Jenkins slave. To do so, follow the steps described here.

    [EDIT 2015-08-09: As it turned out, memory is too low to run the jenkins master and builds on one node. See my second post on how to introduce a dedicated slave node to this setup]

  7. Setup sonar plugin
    On the Jenkins frontend, go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure
    • Global properties,
      tick Environment variables
      Click Add
      name=SONAR_USER_HOME
      value=$OPENSHIFT_DATA_DIR
      See here for more information.
    • Then set up the plugin itself
      Navigate to Sonar, Sonar installations and set the following
      Name=<be creative>
      Server URL:

      http://sonar-<yourAccount>.rhcloud.com/

      Sonar account login: admin
      Sonar account password: <your pw>, default: admin
      Database URL: jdbc:postgresql://$OPENSHIFT_JENKINS_IP:15555/sonar
      Database login: The admin account that was returned when you first created the sonar application
      Database password: The password that was returned when you first created the sonar application

    • Hit Save

Configure build for a repository

Now let’s set up our first build.

  1. Go to
    https://jenkins-<yourAccount>.rhcloud.com/view/All/newJob

    Item name: <your Project name>
    Build a free-style software project (Unfortunately, Maven projects do not work due to OpenShift’s restrictions.)
    Hit OK

  2. On the next Screen
    GitHub project:

    https://github.com/<your user>/<your repo>/

    Source Code Management:

    https://github.com/<your user>/<your repo>.git

    Branch Specifier (blank for ‘any’): origin/master
    Build Triggers: Tick: Build when a change is pushed to GitHub
    Build | Execute Shell

    cd $WORKSPACE
    # Start the actual build
    mvn clean compile test package
    

    Post-build Actions | Add post-build action | Sonar

  3. I’d also recommend the following actions
    Post-build Actions | Add post-build action | Publish JUnit test result report
    Test report XMLs=target/surefire-reports/TEST-*.xml
    Post-build Actions | Add post-build action | E-mail Notification
    Recipients=<your email address>
  4. Hit Apply.
  5. That’s it for the basic build set up. Now for the fun part: We need to find a way for Jenkins to reach sonar’s database.
    We’ll use an SSH tunnel for that.
    Build | Add build step | Execute Shell
    Now enter the following:

    # Make sure Tunnel for Sonar is open
    # Find out IP and port of DB
    OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT=$(ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com '(echo `printenv OPENSHIFT_POSTGRESQL_DB_HOST`:`printenv OPENSHIFT_POSTGRESQL_DB_PORT`)')
    # Open tunnel to DB
    BUILD_ID=dontKillMe nohup ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" -L $OPENSHIFT_JENKINS_IP:15555:$OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT -N <UID>@sonar-<yourAccount>.rhcloud.com &
    

    This will tunnel requests from your Jenkins’ local port 15555 via SSH to your sonar gear, which forwards them to its local PostgreSQL database.
    What is missing is a script that explicitly closes the tunnel. But for now I’m just happy that everything is up and running. The tunnel will eventually be closed after a timeout. Let me know if you have any ideas how to improve the tunnel handling.
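
One possible way to close the tunnel explicitly (a sketch, not from the original setup): record the PID of the background ssh process and kill it in a later build step. Here a `sleep` stands in for the ssh command so the mechanism can be tried locally:

```shell
PIDFILE=$(mktemp)

# Stand-in for the nohup'd ssh tunnel; in the real build step this would be
# the `ssh ... -N <UID>@sonar-<yourAccount>.rhcloud.com` call shown above
nohup sleep 300 >/dev/null 2>&1 &
echo $! > "$PIDFILE"

# A later build step (or post-build action) can then close the tunnel explicitly
kill "$(cat "$PIDFILE")"
```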

  6. Finally, press Save and you’re almost good to go.
  7. Before running your first build you should SSH to your Jenkins once more and
    ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com

    so the sonar node is added to the list of known hosts.

Shutting down JUnit tests “gracefully” in eclipse

Motivation

I started working on an integration testing project recently. We’re using an integrated container (provided by Cargo) to host the application under test and then have Selenium WebDriver connect to the server via the browser.

In this environment, I bumped into a problem that I didn’t consider a problem at first: every time I click the terminate button during a test run in eclipse, it just kills the process: no @After or onTearDown() methods are invoked, nor is a shutdown hook called.
This leaves me with the problem that when the JVM that runs the test is killed, no cleanup is performed: the server isn’t stopped, the browser isn’t closed and the database isn’t cleaned up.
As a consequence, I have to do this cleanup manually after every test that I terminate, which, at least during active development, is about every single one, as these tests unfortunately aren’t very fast. Doing so about 100 times a day really started to make me angry.

So I spent some time looking for a solution that terminates JUnit tests within eclipse more gently, but had no luck. A couple of people even filed bugs at eclipse over the last years, not a single one of which ever got fixed.

I just wonder why eclipse can’t simply call @After/onTearDown() when terminating JUnit tests.

Anyway, after getting inspired by a couple of stackoverflow posts, I put together my own rather simple “solution”, which is more of a workaround, really.
Still, I think it might be worth being posted here.

Solution overview

My approach includes having a separate thread listening to standard in, waiting for a specific “signal” – a string whose occurrence initiates the “soft” termination. Just like SIGINT on unix-like systems. Once the signal is received, the console listener calls System.exit() and leaves the cleanup to a shutdown hook.

The shutdown is realized as a user-defined Thread, which is registered as the JVM’s shutdown hook. This thread is executed by the runtime after System.exit() is invoked.
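
For illustration only (not part of the original code): the same idea expressed in shell, where a trap handler plays the role of the JVM shutdown hook and runs cleanup when the process exits:

```shell
# A marker file name; its existence will prove the cleanup handler ran
CLEANUP_MARKER=$(mktemp -u)

# Analogous to Runtime.getRuntime().addShutdownHook(...): the EXIT trap
# runs when the (sub)shell terminates, even via an explicit `exit`
(
  trap 'touch "$CLEANUP_MARKER"' EXIT
  exit 0   # corresponds to System.exit(0) in the console listener thread
)

# The marker file exists, so the cleanup handler was executed
ls "$CLEANUP_MARKER"
```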

Speaking code

Sounds complex? Maybe a couple of lines of code illustrate the mechanism.

You can also find the following classes as a part of a demo-project from my github. So pull it and try it out for yourself!
The following class realizes the shutdown mechanisms mentioned above.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.log4j.Logger;

public class JUnitShutdown {
	/** Log4j logger. */
	private Logger log = Logger.getLogger(this.getClass());

	/** User defined shutdown hook thread. */
	private Thread shutdownHookThread;
	/**
	 * The "signal" string, whose input on the console initiates the shutdown of
	 * the test.
	 */
	private String exitSignalString;
	/** Listen for the signal only if the test is run in debug mode. */
	private boolean isDebugOnly;

	/**
	 * Creates an instance of the shutdownhook that listens to the console for
	 * an <code>exitSignal</code> and executes <code>shutdownHook</code> when
	 * the signal is received.
	 *
	 * @param exitSignal
	 *            the signal the leads to exiting the test.
	 * @param isUsedInDebugOnly
	 *            if <code>true</code>, <code>exitSignal</code> is only
	 *            evaluated if test is run in debug mode
	 * @param shutdownHook
	 *            the thread that is executed when <code>exitSignal</code> is
	 *            received.
	 */
	public JUnitShutdown(final String exitSignal,
			final boolean isUsedInDebugOnly, final Thread shutdownHook) {
		shutdownHookThread = shutdownHook;
		exitSignalString = exitSignal;
		this.isDebugOnly = isUsedInDebugOnly;
		initShutdownHook();
	}

	/**
	 * Allows for cleanup before test cancellation by listening to the console
	 * for a specific exitSignal. On this signal registers a shutdown hook who
	 * performs the cleanup in separate thread.
	 */
	private void initShutdownHook() {
		if (isDebugOnly
				&& java.lang.management.ManagementFactory.getRuntimeMXBean()
						.getInputArguments().toString()
						.contains("-agentlib:jdwp")) {
			return;
		}

		/* Start thread which listens to system.in */
		Thread consoleListener = new Thread() {
			@Override
			public void run() {
				BufferedReader bufferReader = null;
				try {
					bufferReader = new BufferedReader(new InputStreamReader(
							System.in));
					/* Read from system.in */
					while (!bufferReader.readLine().equals(exitSignalString)) {
						doNothing();
					}

					// Add shutdown hook that performs cleanup
					Runtime.getRuntime().addShutdownHook(shutdownHookThread);

					log.debug("Received exit signal \"" + exitSignalString
							+ "\". Shutting down test.");
					System.exit(0);
				} catch (IOException e) {
					log.debug("Error reading from console", e);
				}
			}

			/**
			 * Is not doing a thing.
			 */
			private void doNothing() {
			}
		};
		consoleListener.start();
	}

}

Example

Now what to do with JUnitShutdown?
The following rather pointless JUnit test keeps your machine busy for a while (like, forever). At the beginning it opens an exemplary external resource (a file), which is closed and then deleted on cleanup. That is, entering “q” in your console during test execution results in closing and then deleting the resource; hitting the terminate button in eclipse, however, will cause the file to remain on the system.

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

import org.apache.log4j.Logger;
import org.junit.After;
import org.junit.Test;

public class SomeLongRunningTest {
	/** Log4j logger. */
	private Logger log = Logger.getLogger(this.getClass());
	/**
	 * An exemplary resource which is not deleted when the test gets terminated,
	 * but gets deleted, when using {@link JUnitShutdown}.
	 */
	private static final String SOME_FILE = "some.file";
	/**
	 * An exemplary resource which is not closed when the test gets terminated,
	 * but gets closed, when using {@link JUnitShutdown}.
	 */
	private BufferedWriter someResource = null;

	/** A value that controls how many values are output during the test. */
	private static final int SOME_DIVISOR = 100000000;

	/**
	 * The shutdown hook listening to the exit signal.
	 */
	@SuppressWarnings("unused")
	private JUnitShutdown shutdownHook = new JUnitShutdown("q", false,
			new Thread("testShutdownHook") {
				public void run() {
					cleanupResources("shutdown hook");
				}
			});

	/**
	 * Some exemplary test.
	 *
	 * @throws IOException
	 *             some error
	 */
	@Test
	public void testSomething() throws IOException {
		try {
			someResource = new BufferedWriter(new FileWriter(
					new File(SOME_FILE), false));

			doSomethingExpensive();

		} finally {
			cleanupResources("testSomething()");
		}
	}

	/**
	 * Keeps the machine busy forever.
	 *
	 * @throws IOException
	 *             something went wrong during writing to file
	 */
	private void doSomethingExpensive() throws IOException {
		int i = 0;
		while (true) {
			if (++i % SOME_DIVISOR == 0) {
				log.debug(i);
				someResource.write(i);
				someResource.newLine();
			}
		}
	}

	/**
	 * This method is only called when the test ends without being killed.
	 */
	@After
	public void onTearDown() {
		cleanupResources("onTearDown()");
	}

	/**
	 * Closes {@link #someResource} and deletes {@link #SOME_FILE}.
	 *
	 * @param caller
	 *            the caller of this method for logging purpose only.
	 */
	private void cleanupResources(final String caller) {
		log.debug("cleanupResources() called by " + caller);
		if (someResource != null) {
			try {
				someResource.close();
			} catch (IOException e) {
				log.error("Unable to close resource", e);
			}
		}
		File f = new File(SOME_FILE);
		if (f.exists() && !f.isDirectory()) {
			f.delete();
		}
	}
}

Running this test and entering “q” followed by enter will produce log output such as this:

2012-11-20 22:54:14,248 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 100000000
2012-11-20 22:54:15,260 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 200000000
q2012-11-20 22:54:16,175 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 300000000

2012-11-20 22:54:16,505 [Thread-0] DEBUG JUnitShutdown  info.schnatterer.test.shutdown.JUnitShutdown$1.run(JUnitShutdown.java:83) Received exit signal "q". Shutting down test.
2012-11-20 22:54:16,544 [testShutdownHook] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.cleanupResources(SomeLongRunningTest.java:100) cleanupResources() called by shutdown hook

More advanced shutdowns

Where to go next?
I kept the examples above rather simple to make my point. Still, I’m sure that there is a lot you could extend or improve.
For instance, in the project mentioned above, I extended the mechanism, so now I have two signals:

  • “q” stops the server, closes the browser and deletes all the test data from the database.
  • “q!” (does that sound familiar?) is a bit faster, it omits the database-related stuff (as the data is set up at the beginning of each test run anyway).

I have to admit “q!” improves productivity tremendously 🙂

You may also have noticed that this mechanism is not limited to use with JUnit. I first considered implementing it using JUnit rules, but then found out that it would be easier and more generic not to do so.

Let me know if you have any ideas on how to further improve this mechanism.

Generating and customizing JUnit and Code Coverage reports with Ant

Recently, I had a hard time finding a solution for automatic JUnit and code coverage report generation within an Ant build. Is nobody using good old Ant anymore?
Maybe generating these reports using Maven is a piece of cake (I don’t know), but as there are still people who are using (or have to use) Ant, I suppose my solution might be interesting to at least some of them.

Solution overview

  • JUnit 4.1
  • JUnit reports (HTML): Apache Ant 1.7.1 (included in Eclipse 3.7.2 helios) / Ant 1.8.3.
  • JUnit reports (PDF): junitpdfreport 1.0
  • Code Coverage and reports: JaCoCo 0.5.6

JUnit tasks in Ant

JUnit tests can be executed using the Ant task junit.
The test results are output as XML files. This structured data can then be transformed into different formats for visualization. Eclipse, for instance, uses JUnit’s XML output to create the notorious green and red bars. Of course, it’s also possible to generate a report from the “raw” test results. For this purpose, Ant provides the task junitreport, which generates an HTML report. You can even choose whether you prefer the report as a single HTML file (e.g. if you intend to attach it to an email) or using frames (which is more comfortable to read). If you are interested in generating PDF, have a look at the third-party task junitpdfreport.
Both tasks use XSL to transform the XML files, so they can easily be customized.

Measuring Code coverage with Ant

One of the most popular tools for measuring code coverage is called EMMA. There is also a plugin that integrates EMMA into eclipse: EclEmma. Does that have anything to do with creating reports using Ant? Yeah! The EclEmma guys created a code coverage library, JaCoCo, which features Ant tasks for measuring code coverage and for creating reports that visualize the results. How nice is that!

 

Measuring the code coverage with Ant and JaCoCo is as easy as it can be: just wrap whatever JUnit tests you want to know the code coverage of in the jacoco:coverage task, but don’t forget to declare the jacoco namespace in the build.xml file:

<project name="AntTestReporting-test" basedir="." default="all-test" xmlns:jacoco="antlib:org.jacoco.ant">
	<!-- ... -->

	<!-- Java Code Coverage -->
	<taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml">
		<classpath path="lib/ant/jacoco/lib/jacocoant.jar" />
	</taskdef>

	<!-- ... -->

	<!-- Run all tests -->
	<jacoco:coverage destfile="${test.data.dir}/jacoco.exec">
		<junit printsummary="true" haltonfailure="false" fork="yes" forkmode="once">
			<jvmarg value="${xms}" />
			<jvmarg value="${xmx}" />
			<!-- <jvmarg value="${log4j.config}" /> -->
			<classpath refid="classpath.test" />
			<formatter type="xml" />
			<batchtest todir="${test.data.dir}">
				<fileset dir="${build.dir}">
					<!-- Exclude inner classes -->
					<exclude name="**/*$*.class" />
					<include name="**/*Test.class" />
				</fileset>
			</batchtest>
		</junit>
	</jacoco:coverage>
</project>

Note: why fork the tests? Forking allows passing JVM arguments to the tests, for example a special configuration of the logging framework for testing purposes only. Also, increasing the initial heap size (xms) as well as the maximum heap size (xmx) of the JVM that runs the tests, in combination with running all tests in the same forked JVM (forkmode="once"), reduced the time to perform the unit tests drastically (at least for me).
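
As a side note, the effect of the inner-class exclude pattern can be illustrated on the file level. This sketch (not part of the build) creates class files named as javac would and filters them the way `**/*$*.class` does:

```shell
# Simulate a build output directory; javac names inner classes with a '$'
cd "$(mktemp -d)"
mkdir -p build/com/example
touch build/com/example/FooTest.class
touch 'build/com/example/FooTest$1.class'
touch 'build/com/example/FooTest$Inner.class'

# File-level equivalent of Ant's <exclude name="**/*$*.class"/>:
# keep only top-level classes
find build -name '*.class' ! -name '*$*.class'
# -> build/com/example/FooTest.class
```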

Let’s generate some reports!

Now, for the actual generation of reports, we need to import the Ant build file for the PDF reports (if necessary). Then we can go ahead and generate our reports.

Note that if you’re not interested in customizing the reports, you can leave out the styledir parameters; they aren’t mandatory.

For demonstration purposes, the code below generates three different kinds of JUnit reports:

  • HTML without frames (single file),
  • HTML with frames (multiple files) and
  • PDF

Usually, you might just need one of them.

<!-- PDF-Reports for JUnit -->
<import file="lib/ant/junitpdfreport/build-junitpdfreport.xml" />

<!-- Generate HTML report
- junit-noframes.html -> Single page HTML-report
- index.html -> HTML-report using frames (several files, but more comfortable to read)-->
<junitreport todir="${test.data.dir}">
	<fileset dir="${test.data.dir}">
		<include name="TEST-*.xml" />
	</fileset>
	<report styledir="test/etc/junitreport" format="noframes" todir="${test.reports.dir}" />
	<report styledir="test/etc/junitreport" format="frames" todir="${test.reports.dir}" />
</junitreport>

<!-- Generate PDF report -->
<junitpdfreport todir="${test.reports.dir}" styledir="../../../test/etc/junitpdfreport/default">
	<fileset dir="${test.data.dir}">
		<include name="TEST-*.xml" />
	</fileset>
</junitpdfreport>

<!-- Generate Code Coverage report
See: http://www.eclemma.org/jacoco/trunk/doc/ant.html -->
<jacoco:report>
	<executiondata>
		<file file="${test.data.dir}/jacoco.exec" />
	</executiondata>

	<structure name="AntTestReporting">
		<classfiles>
			<fileset dir="${build.dir}">
				<include name="**/*.class" />
				<!-- Exclude classes necessary for testing only from the code coverage report-->
				<exclude name="**/*Test*.class" />
				<!-- Exclude inner classes -->
				<exclude name="**/*$*.class" />
			</fileset>
		</classfiles>
	</structure>

	<html destdir="${coverage.reports.dir}" />
</jacoco:report>

Customizing reports

So far, our Ant build creates uniform-looking reports. But what if we want them to look different? Or maybe we want to add some more information during the build? All this can be done using different XSL stylesheets.


To prove the point, we will generate a timestamp during the build and include it in the title of the JUnit HTML and PDF reports.


The JUnit XML files contain all Ant properties defined during the build process, so we will first create our timestamp and then extract it from the XML during the generation of the reports.

<!-- Create the time stamp -->
<tstamp>
	<format property="lastUpdated" pattern="yyyy-MM-dd HH:mm:ss" />
</tstamp>
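The XML formatter then writes the build properties into the test result files. A shortened, purely illustrative TEST-*.xml file (class name and values are made up) might contain a section like this:

```
<testsuite name="com.some.pckge.SomeClassTest" tests="3" failures="0" errors="0" time="0.042">
	<properties>
		<!-- All properties of the build, including our custom timestamp -->
		<property name="lastUpdated" value="2012-05-01 12:34:56" />
		<!-- ... further properties ... -->
	</properties>
	<!-- ... testcase elements ... -->
</testsuite>
```

This is the structure the XPath expression in the stylesheet below selects the timestamp from.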

What is left to do is adapting the XSL sheets. In the code examples above, we have already explicitly specified a path to the XSL files (the styledir attribute). A good way to start customizing is to modify the sheet provided by Ant, for example this one, which creates the multi-document HTML report with frames. The same applies to the customization of the single-document HTML and PDF reports.

Finally, let’s add the timestamp to the title:

junit-frames.xsl

<xsl:param name="TITLE">Unit Test Results. Build time: <xsl:value-of select="//property[@name='lastUpdated']/@value"/></xsl:param>

… and that’s it!

Result

What do the reports look like? I created a small demo project and generated the following reports:

Sources

To complete the picture, here’s the complete build script. Actually, the script is split into two files to keep things simpler:

  • build.xml – contains the “normal” build
  • build-test.xml – uses build.xml to execute the build and then builds and executes the tests, finally generating the reports.

You can pull the demo project mentioned above from my GitHub repository. It’s an Eclipse project, but you should also be able to compile it with Ant from the command line, like this:
ant -f build-test.xml

BTW: Beware of this issue. I ran into this problem on Windows 7 with both JDK6_32 and JDK7_03. I got rid of it by using Ant 1.8.3. To run an external copy of Ant from Eclipse, all you have to do is execute the launch config located in the archive under launch/build-test-external.launch by right-clicking on it > Run As > build-test-external.

build.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>

<project name="AntTestReporting" basedir="." default="jar">

	<!-- general -->
	<property name="DEBUG" value="true" />
	<property name="VERBOSE" value="true" />
	<property name="TARGET" value="1.6" />

	<!-- folder -->
	<property name="build.dir" value="bin" />
	<property name="src" value="src" />
	<property name="lib" value="lib" />

	<!-- classpath -->
	<path id="classpath">
		<fileset dir="${lib}">
			<include name="**/*.jar" />
		</fileset>
	</path>

	<!-- targets -->
	<target name="clean">
		<delete dir="${build.dir}" />
	</target>

	<target name="compile" depends="clean">
		<mkdir dir="${build.dir}" />
		<mkdir dir="${build.dir}/build" />

		<!-- Create the time stamp -->
		<tstamp>
			<format property="lastUpdated" pattern="yyyy-MM-dd HH:mm:ss" />
		</tstamp>

		<javac target="${TARGET}" debug="${DEBUG}" verbose="${VERBOSE}" classpathref="classpath" optimize="true" destdir="${build.dir}">
			<src path="${src}" />
		</javac>

		<fileset id="srcFiles" dir="${src}">
		   	<exclude name="**/*.java"/>
			<exclude name="**/*.html"/>
			<include name="**/*.*" />
		</fileset>

		<copy todir="${build.dir}">
			<fileset refid="srcFiles"/>
		</copy>

	</target>

	<!-- <target name="jar" depends="compile, test">  -->
	<target name="jar" depends="compile">

		<jar jarfile="${build.dir}/build/${ant.project.name}.jar" basedir="${build.dir}">
			<manifest>
				<attribute name="Build-Time" value="${lastUpdated}" />
				<attribute name="Main-Class" value="com.some.pckge.SomeClass"/>
			</manifest>
		</jar>

		<!-- Remove contents of build dir after packaging -->
		<!-- <delete>
		   <fileset dir="${build.dir}">
		   	<include name="**/*.*" />
		   </fileset>
		</delete> -->
	</target>

</project>

build-test.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>
<project name="AntTestReporting-test" basedir="." default="all-test" xmlns:jacoco="antlib:org.jacoco.ant">

	<import file="build.xml" />

	<!-- PDF-Reports for JUnit -->
	<import file="lib/ant/junitpdfreport/build-junitpdfreport.xml" />

	<!-- Java Code Coverage -->
	<taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml">
		<classpath path="lib/ant/jacoco/lib/jacocoant.jar" />
	</taskdef>

	<property name="test.build.dir" location="bin/test" />
	<property name="test.src" location="test" />
	<property name="test.data.dir" location="${test.build.dir}/testResults" />
	<property name="reports.dir" location="${test.build.dir}/reports" />
	<property name="test.reports.dir" location="${reports.dir}/junit" />
	<property name="coverage.reports.dir" location="${reports.dir}/coverage" />

	<property name="xms" value="-Xms256m" />
	<property name="xmx" value="-Xmx1024m" />
	<!-- <property name="log4j.config" value="-Dlog4j.configuration=file:/${base.dir}/test/log4j-test.properties" /> -->

	<path id="classpath.test">
		<pathelement location="${build.dir}" />
		<fileset dir="${lib}">
			<include name="**/*.jar" />
		</fileset>
	</path>

	<target name="compile-test" depends="compile">
		<mkdir dir="${test.build.dir}" />

		<javac destdir="${build.dir}" srcdir="${test.src}" includeantruntime="false">
			<classpath refid="classpath.test" />
		</javac>

		<fileset id="srcFiles" dir="${test.src}">
			<exclude name="**/*.java" />
			<exclude name="**/*.html" />
			<exclude name="**/*.xsl" />
			<include name="**/*.*" />
		</fileset>

		<copy todir="${test.build.dir}">
			<fileset refid="srcFiles" />
		</copy>
	</target>

	<target name="clean-compile-test">
		<delete>
			<fileset dir="${test.build.dir}" includes="**/*.*" />
		</delete>
	</target>

	<target name="test" depends="compile-test">
		<mkdir dir="${test.data.dir}" />

		<!-- Run all tests -->
		<jacoco:coverage destfile="${test.data.dir}/jacoco.exec">
			<junit printsummary="true" haltonfailure="false" fork="yes" forkmode="once">
				<jvmarg value="${xms}" />
				<jvmarg value="${xmx}" />
				<!-- <jvmarg value="${log4j.config}" /> -->
				<classpath refid="classpath.test" />
				<formatter type="xml" />
				<batchtest todir="${test.data.dir}">
					<fileset dir="${build.dir}">
						<!-- Exclude inner classes -->
						<exclude name="**/*$*.class" />
						<include name="**/*Test.class" />
					</fileset>
				</batchtest>
			</junit>
		</jacoco:coverage>

		<!-- Generate HTML report
			- junit-noframes.html -> Single page HTML-report
			- index.html -> HTML-report using frames (several files, but more comfortable to read)-->
		<junitreport todir="${test.data.dir}">
			<fileset dir="${test.data.dir}">
				<include name="TEST-*.xml" />
			</fileset>
			<report styledir="test/etc/junitreport" format="noframes" todir="${test.reports.dir}" />
			<report styledir="test/etc/junitreport" format="frames" todir="${test.reports.dir}" />
		</junitreport>

		<!-- Generate PDF report -->
		<junitpdfreport todir="${test.reports.dir}" styledir="../../../test/etc/junitpdfreport/default">
			<fileset dir="${test.data.dir}">
				<include name="TEST-*.xml" />
			</fileset>
		</junitpdfreport>

		<!-- Generate Code Coverage report
			See: http://www.eclemma.org/jacoco/trunk/doc/ant.html -->
		<jacoco:report>
			<executiondata>
				<file file="${test.data.dir}/jacoco.exec" />
			</executiondata>

			<structure name="AntTestReporting">
				<classfiles>
					<fileset dir="${build.dir}">
						<include name="**/*.class" />
						<!-- Exclude classes necessary for testing only from the code coverage report-->
						<exclude name="**/*Test*.class" />
						<!-- Exclude inner classes -->
						<exclude name="**/*$*.class" />
					</fileset>
				</classfiles>
			</structure>

			<html destdir="${coverage.reports.dir}" />
		</jacoco:report>
	</target>

	<target name="all-test" depends="test" />
</project>