Shutting down JUnit tests “gracefully” in eclipse

Motivation

I started working on an integration testing project recently. We’re using an integrated container (provided by cargo) to host the application under test and then have Selenium WebDriver connect to the server via a browser.

In this environment, I bumped into a problem that I didn’t consider to be one at first: Every time I click the terminate button during a test run in eclipse, it just kills the process – no @After or onTearDown() methods are invoked, nor is a shutdown hook called.
This leaves me with the problem that when the JVM that runs the test is killed, no cleanup is performed: The server isn’t stopped, the browser isn’t closed and the database isn’t cleaned up.
As a consequence, I have to do this manually after every test that I terminate, which, at least during active development, is about every single one, as these tests unfortunately don’t perform very well. Doing this about 100 times a day really started to make me angry.

So I spent some time looking for a solution that terminates JUnit tests within eclipse more gently, but no luck. A couple of people have even filed bugs at eclipse over the last years, not a single one of which ever got fixed.

I just wonder why eclipse can’t simply call @After/onTearDown() when terminating JUnit tests.

Anyway, after getting inspired by a couple of stackoverflow posts, I put together my own rather simple “solution”, which is more of a workaround, really.
Still, I think it might be worth being posted here.

Solution overview

My approach is to have a separate thread listening on standard input, waiting for a specific “signal” – a string whose occurrence initiates the “soft” termination, just like SIGINT on unix-like systems. Once the signal is received, the console listener calls System.exit() and leaves the cleanup to a shutdown hook.

The shutdown is realized as a user-defined Thread, which is registered as the JVM’s shutdown hook. This thread is executed by the runtime after System.exit() is invoked.
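The core of this is just the standard Runtime.addShutdownHook() API. Here is a minimal, self-contained sketch (class name and messages are made up for illustration) showing that a registered hook thread runs when the JVM terminates normally or via System.exit():

```java
public class ShutdownHookSketch {

	/**
	 * Registers a cleanup action that the JVM runs on System.exit() as well
	 * as on normal termination. Returns the hook thread, so it could be
	 * deregistered again via Runtime.removeShutdownHook().
	 */
	static Thread registerCleanup(Runnable cleanup) {
		Thread hook = new Thread(cleanup, "cleanupHook");
		Runtime.getRuntime().addShutdownHook(hook);
		return hook;
	}

	public static void main(String[] args) {
		registerCleanup(() -> System.out.println("cleanup runs here"));
		System.out.println("main done");
		/* After main returns (or System.exit() is called), the JVM runs the
		 * hook, so "cleanup runs here" is printed last. */
	}
}
```

Note that shutdown hooks do not run when the process is killed forcibly – which is exactly why eclipse’s terminate button bypasses them.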

Speaking code

Sounds complex? Maybe a couple of lines of code illustrate the mechanism.

You can also find the following classes as a part of a demo-project from my github. So pull it and try it out for yourself!
The following class implements the shutdown mechanism described above.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.log4j.Logger;

public class JUnitShutdown {
	/** Log4j logger. */
	private Logger log = Logger.getLogger(this.getClass());

	/** User defined shutdown hook thread. */
	private Thread shutdownHookThread;
	/**
	 * The "signal" string, whose input on the console initiates the shutdown of
	 * the test.
	 */
	private String exitSignalString;
	/** Listen for the signal only if the test is run in debug mode. */
	private boolean isDebugOnly;

	/**
	 * Creates an instance of the shutdownhook that listens to the console for
	 * an <code>exitSignal</code> and executes <code>shutdownHook</code> when
	 * the signal is received.
	 *
	 * @param exitSignal
	 *            the signal that leads to exiting the test.
	 * @param isUsedInDebugOnly
	 *            if <code>true</code>, <code>exitSignal</code> is only
	 *            evaluated if test is run in debug mode
	 * @param shutdownHook
	 *            the thread that is executed when <code>exitSignal</code> is
	 *            received.
	 */
	public JUnitShutdown(final String exitSignal,
			final boolean isUsedInDebugOnly, final Thread shutdownHook) {
		shutdownHookThread = shutdownHook;
		exitSignalString = exitSignal;
		this.isDebugOnly = isUsedInDebugOnly;
		initShutdownHook();
	}

	/**
	 * Allows for cleanup before test cancellation by listening to the console
	 * for a specific exitSignal. When the signal is received, a shutdown hook
	 * performs the cleanup in a separate thread.
	 */
	private void initShutdownHook() {
		/* If requested, only listen for the signal when running in debug mode */
		if (isDebugOnly
				&& !java.lang.management.ManagementFactory.getRuntimeMXBean()
						.getInputArguments().toString()
						.contains("-agentlib:jdwp")) {
			return;
		}

		/* Start thread which listens to system.in */
		Thread consoleListener = new Thread() {
			@Override
			public void run() {
				try {
					BufferedReader bufferReader = new BufferedReader(
							new InputStreamReader(System.in));
					/* Read from system.in until the exit signal arrives */
					String line;
					while ((line = bufferReader.readLine()) != null
							&& !line.equals(exitSignalString)) {
						/* Ignore any other input */
					}
					if (line == null) {
						/* Stream was closed without the signal; don't exit */
						return;
					}

					// Add shutdown hook that performs cleanup
					Runtime.getRuntime().addShutdownHook(shutdownHookThread);

					log.debug("Received exit signal \"" + exitSignalString
							+ "\". Shutting down test.");
					System.exit(0);
				} catch (IOException e) {
					log.debug("Error reading from console", e);
				}
			}
		};
		/* Daemon thread: don't keep the JVM alive once the tests finish */
		consoleListener.setDaemon(true);
		consoleListener.start();
	}

}

Example

Now what to do with JUnitShutdown?
The following rather senseless JUnit test keeps your machine busy for a while (like forever). At the beginning, it opens an exemplary external resource (a file), which is closed and then deleted on cleanup. That is, entering “q” in your console during test execution closes and then deletes the resource, whereas hitting the terminate button in eclipse will cause the file to remain on the system.

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

import org.apache.log4j.Logger;
import org.junit.After;
import org.junit.Test;

public class SomeLongRunningTest {
	/** Log4j logger. */
	private Logger log = Logger.getLogger(this.getClass());
	/**
	 * An exemplary resource which is not deleted when the test gets
	 * terminated, but does get deleted when using {@link JUnitShutdown}.
	 */
	private static final String SOME_FILE = "some.file";
	/**
	 * An exemplary resource which is not closed when the test gets terminated,
	 * but does get closed when using {@link JUnitShutdown}.
	 */
	private BufferedWriter someResource = null;

	/** A value that controls how many values are output during the test. */
	private static final int SOME_DIVISOR = 100000000;

	/**
	 * The shutdown hook listening to the exit signal.
	 */
	@SuppressWarnings("unused")
	private JUnitShutdown shutdownHook = new JUnitShutdown("q", false,
			new Thread("testShutdownHook") {
				public void run() {
					cleanupResources("shutdown hook");
				}
			});

	/**
	 * Some exemplary test.
	 *
	 * @throws IOException
	 *             some error
	 */
	@Test
	public void testSomething() throws IOException {
		try {
			someResource = new BufferedWriter(new FileWriter(
					new File(SOME_FILE), false));

			doSomethingExpensive();

		} finally {
			cleanupResources("testSomething()");
		}
	}

	/**
	 * Keeps the machine busy forever.
	 *
	 * @throws IOException
	 *             something went wrong during writing to file
	 */
	private void doSomethingExpensive() throws IOException {
		int i = 0;
		while (true) {
			if (++i % SOME_DIVISOR == 0) {
				log.debug(i);
				someResource.write(String.valueOf(i));
				someResource.newLine();
			}
		}
	}

	/**
	 * This method is only called when the test ends without being killed.
	 */
	@After
	public void onTearDown() {
		cleanupResources("onTearDown()");
	}

	/**
	 * Closes {@link #someResource} and deletes {@link #SOME_FILE}.
	 *
	 * @param caller
	 *            the caller of this method for logging purpose only.
	 */
	private void cleanupResources(final String caller) {
		log.debug("cleanupResources() called by " + caller);
		if (someResource != null) {
			try {
				someResource.close();
			} catch (IOException e) {
				log.error("Unable to close resource", e);
			}
		}
		File f = new File(SOME_FILE);
		if (f.exists() && !f.isDirectory()) {
			f.delete();
		}
	}
}

Running this test, typing “q”, and then pressing enter will produce log output such as this:

2012-11-20 22:54:14,248 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 100000000
2012-11-20 22:54:15,260 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 200000000
q2012-11-20 22:54:16,175 [main] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.doSomethingExpensive(SomeLongRunningTest.java:78) 300000000

2012-11-20 22:54:16,505 [Thread-0] DEBUG JUnitShutdown  info.schnatterer.test.shutdown.JUnitShutdown$1.run(JUnitShutdown.java:83) Received exit signal "q". Shutting down test.
2012-11-20 22:54:16,544 [testShutdownHook] DEBUG SomeLongRunningTest  info.schnatterer.test.sometest.SomeLongRunningTest.cleanupResources(SomeLongRunningTest.java:100) cleanupResources() called by shutdown hook

More advanced shutdowns

Where to go next?
I kept the examples above rather simple to make my point. Still, I’m sure that there is a lot you could extend or improve.
For instance, in the project mentioned above, I extended the mechanism, so now I have two signals:

  • “q” stops the server, closes the browser and deletes all the test data from the database.
  • “q!” (does that sound familiar?) is a bit faster, it omits the database-related stuff (as the data is set up at the beginning of each test run anyway).

I have to admit “q!” improves productivity tremendously 🙂
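One way to support several signals like that – a sketch of how I’d factor it, not the exact code from my project; the class name is made up – is to map each signal string to its own cleanup action and dispatch on whatever line arrives from the console listener:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MultiSignalDispatch {
	/** Maps each signal string to the cleanup action it triggers. */
	private final Map<String, Runnable> handlers = new LinkedHashMap<String, Runnable>();

	/** Registers a cleanup action for a signal string. */
	public MultiSignalDispatch on(String signal, Runnable cleanup) {
		handlers.put(signal, cleanup);
		return this;
	}

	/**
	 * Runs the cleanup registered for the given console line, if any.
	 * Returns true if the line matched a signal.
	 */
	public boolean dispatch(String line) {
		Runnable cleanup = handlers.get(line);
		if (cleanup == null) {
			return false;
		}
		cleanup.run();
		return true;
	}

	public static void main(String[] args) {
		MultiSignalDispatch signals = new MultiSignalDispatch()
				.on("q", () -> System.out.println("full cleanup: server, browser, database"))
				.on("q!", () -> System.out.println("fast cleanup: server and browser only"));
		signals.dispatch("q!"); // prints "fast cleanup: server and browser only"
	}
}
```

The console listener would then call dispatch() for every line it reads and exit once it returns true.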

You may also have noted that this mechanism is not limited to use with JUnit. I first considered implementing it using JUnit rules, but then found out that it would be easier and more generic not to.

Let me know if you have any ideas on how to further improve this mechanism.

Generating and customizing JUnit and Code Coverage reports with Ant

Recently, I had a hard time finding a solution for automatic JUnit and Code Coverage report generation within an Ant build. Nobody using good old Ant anymore?
Maybe generating these reports using Maven is a piece of cake (I don’t know), but as there are still people who are using (or have to use) Ant, I suppose my solution might be interesting for at least some of them.

Solution overview

  • JUnit 4.1
  • JUnit reports (HTML): Apache Ant 1.7.1 (included in Eclipse 3.7.2 indigo) / Ant 1.8.3.
  • JUnit reports (PDF): junitpdfreport 1.0
  • Code Coverage and reports: JaCoCo 0.5.6

JUnit tasks in Ant

JUnit tests can be executed using the Ant task junit.
The test results are output as XML files. This structured data can then be transformed into a different format for visualization. Eclipse, for instance, uses JUnit’s XML output to create the notorious green and red bars. Of course, it’s also possible to generate a report from the “raw” test results. For this purpose, Ant provides the task junitreport, which generates an HTML report. You can even choose whether you prefer the report as a single HTML file (e.g. if you intend to attach the report to an email) or using frames (which is more comfortable to read). If you are interested in generating PDF, you should have a look at the 3rd-party task junitpdfreport.
Both tasks use XSL to transform the XML files, so they can easily be customized.
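To get a feeling for what these tasks do under the hood, here is a tiny self-contained sketch (the XML snippet, the stylesheet, and the class name are made up for illustration) that transforms a JUnit-style result with the JDK’s built-in XSLT processor – exactly the kind of transformation junitreport performs with its much larger stylesheets:

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XslReportSketch {

	/** Applies the given XSL stylesheet to the given XML and returns the result. */
	static String transform(String xml, String xsl) throws Exception {
		Transformer transformer = TransformerFactory.newInstance()
				.newTransformer(new StreamSource(new StringReader(xsl)));
		StringWriter out = new StringWriter();
		transformer.transform(new StreamSource(new StringReader(xml)),
				new StreamResult(out));
		return out.toString();
	}

	public static void main(String[] args) throws Exception {
		/* A heavily trimmed-down JUnit result file */
		String xml = "<testsuite name='SomeLongRunningTest' tests='1' failures='0'/>";
		/* A minimal stylesheet that extracts the suite name, much like
		 * junitreport's stylesheets extract properties for the report title */
		String xsl = "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
				+ "<xsl:output method='text'/>"
				+ "<xsl:template match='/testsuite'>Report for <xsl:value-of select='@name'/></xsl:template>"
				+ "</xsl:stylesheet>";
		System.out.println(transform(xml, xsl)); // Report for SomeLongRunningTest
	}
}
```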

Measuring Code coverage with Ant

One of the most popular tools for measuring code coverage is called EMMA. There is also a plugin that integrates EMMA into eclipse: EclEmma. Does that have anything to do with creating reports using Ant? Yeah! The EclEmma guys created a code coverage library that features Ant tasks for measuring code coverage and for creating reports that visualize the results: JaCoCo. How nice is that!

Measuring code coverage with Ant and JaCoCo is as easy as it can be: Just wrap whatever JUnit tests you want to measure code coverage for in the jacoco:coverage task, but don’t forget to declare the jacoco namespace in the build.xml file:

<project name="AntTestReporting-test" basedir="." default="all-test" xmlns:jacoco="antlib:org.jacoco.ant">
	<!-- ... -->

	<!-- Java Code Coverage -->
	<taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml">
		<classpath path="lib/ant/jacoco/lib/jacocoant.jar" />
	</taskdef>

	<!-- ... -->

	<!-- Run all tests -->
	<jacoco:coverage destfile="${test.data.dir}/jacoco.exec">
		<junit printsummary="true" haltonfailure="false" fork="yes" forkmode="once">
			<jvmarg value="${xms}" />
			<jvmarg value="${xmx}" />
			<!-- <jvmarg value="${log4j.config}" /> -->
			<classpath refid="classpath.test" />
			<formatter type="xml" />
			<batchtest todir="${test.data.dir}">
				<fileset dir="${build.dir}">
					<!-- Exclude inner classes -->
					<exclude name="**/*$*.class" />
					<include name="**/*Test.class" />
				</fileset>
			</batchtest>
		</junit>
	</jacoco:coverage>
</project>

Note: Why fork the tests? This allows for passing JVM arguments to the tests, for example a special configuration of the logging framework for testing purposes only. Also, increasing the initial heap size (xms) as well as the maximum heap size (xmx) of the JVM that runs the tests, in combination with running all tests in the same (forked) JVM (forkmode=”once”), reduced the time to perform the unit tests drastically (at least for me).

Let’s generate some reports!

Now, for the actual generation of reports, we need to import the Ant build file for the PDF reports (if necessary). Then we can go ahead and generate our reports.

Note that if you’re not interested in customizing the reports, you can leave out the styledir parameters; they aren’t mandatory.

For demonstration purposes, this code generates three different kinds of JUnit reports:

  • HTML without frames (single file),
  • HTML with frames (multiple files) and
  • PDF

Usually, you might just need one of them.

<!-- PDF-Reports for JUnit -->
<import file="lib/ant/junitpdfreport/build-junitpdfreport.xml" />

<!-- Generate HTML report
- junit-noframes.html -> Single page HTML-report
- index.html -> HTML-report using frames (several files, but more comfortable to read)-->
<junitreport todir="${test.data.dir}">
	<fileset dir="${test.data.dir}">
		<include name="TEST-*.xml" />
	</fileset>
	<report styledir="test/etc/junitreport" format="noframes" todir="${test.reports.dir}" />
	<report styledir="test/etc/junitreport" format="frames" todir="${test.reports.dir}" />
</junitreport>

<!-- Generate PDF report -->
<junitpdfreport todir="${test.reports.dir}" styledir="../../../test/etc/junitpdfreport/default">
	<fileset dir="${test.data.dir}">
		<include name="TEST-*.xml" />
	</fileset>
</junitpdfreport>

<!-- Generate Code Coverage report
See: http://www.eclemma.org/jacoco/trunk/doc/ant.html -->
<jacoco:report>
	<executiondata>
		<file file="${test.data.dir}/jacoco.exec" />
	</executiondata>

	<structure name="AntTestReporting">
		<classfiles>
			<fileset dir="${build.dir}">
				<include name="**/*.class" />
				<!-- Exclude classes necessary for testing only from the code coverage report-->
				<exclude name="**/*Test*.class" />
				<!-- Exclude inner classes -->
				<exclude name="**/*$*.class" />
			</fileset>
		</classfiles>
	</structure>

	<html destdir="${coverage.reports.dir}" />
</jacoco:report>

Customizing reports

By now, our Ant build creates uniform-looking reports. But what if we want them to look different? Or maybe we want to add some more information during the build? All this can be done by using different XSL stylesheets.

To prove the point, we will generate a timestamp during the build and include it in the title of the JUnit HTML and PDF reports.

The JUnit XML files contain all Ant properties defined during the build process. So we will first create our timestamp and then extract it from the XML during the generation of the reports.

<!-- Create the time stamp -->
<tstamp>
	<format property="lastUpdated" pattern="yyyy-MM-dd HH:mm:ss" />
</tstamp>

What is left to do is adapt the XSL sheets. In the above code examples, we already explicitly specified a path to the XSL files (the styledir attribute). A good way to start customizing is to modify the sheet provided by Ant, for example this one. This sheet creates the multi-document HTML report with frames. The same applies to the customization of the single-document HTML and PDF reports.

Finally, let’s add the timestamp to the title:

junit-frames.xsl

<xsl:param name="TITLE">Unit Test Results. Build time: <xsl:value-of select="//property[@name='lastUpdated']/@value"/></xsl:param>

… and that’s it!

Result

What do the reports look like? I created a small demo project and generated the reports with it.

Sources

To complete the picture, here’s the complete build script. Actually, the script is split into two files to keep things simpler:

  • build.xml – contains the “normal” build
  • build-test.xml – uses build.xml to execute the build and then builds and executes the tests, finally generating the reports.

You can pull the demo-project mentioned above from my github repository. It’s an Eclipse project, but you should also be able to compile it with Ant from the command line, like this:
ant -f build-test.xml.

BTW: Beware of this issue. I ran into it on win7 with both JDK6_32 and JDK7_03, and got rid of it by using Ant 1.8.3. To run an external copy of Ant from eclipse, all you have to do is execute the launch config located in the archive under launch/build-test-external.launch by right-clicking on it > Run As > build-test-external.

build.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>

<project name="AntTestReporting" basedir="." default="jar">

	<!-- general -->
	<property name="DEBUG" value="true" />
	<property name="VERBOSE" value="true" />
	<property name="TARGET" value="1.6" />

	<!-- folder -->
	<property name="build.dir" value="bin" />
	<property name="src" value="src" />
	<property name="lib" value="lib" />

	<!-- classpath -->
	<path id="classpath">
		<fileset dir="${lib}">
			<include name="**/*.jar" />
		</fileset>
	</path>

	<!-- targets -->
	<target name="clean">
		<delete dir="${build.dir}" />
	</target>

	<target name="compile" depends="clean">
		<mkdir dir="${build.dir}" />
		<mkdir dir="${build.dir}/build" />

		<!-- Create the time stamp -->
		<tstamp>
			<format property="lastUpdated" pattern="yyyy-MM-dd HH:mm:ss" />
		</tstamp>

		<javac target="${TARGET}" debug="${DEBUG}" verbose="${VERBOSE}" classpathref="classpath" optimize="true" destdir="${build.dir}">
			<src path="${src}" />
		</javac>

		<fileset id="srcFiles" dir="${src}">
		   	<exclude name="**/*.java"/>
			<exclude name="**/*.html"/>
			<include name="**/*.*" />
		</fileset>

		<copy todir="${build.dir}">
			<fileset refid="srcFiles"/>
		</copy>

	</target>

	<!-- <target name="jar" depends="compile, test">  -->
	<target name="jar" depends="compile">

		<jar jarfile="${build.dir}/build/${ant.project.name}.jar" basedir="${build.dir}">
			<manifest>
				<attribute name="Build-Time" value="${lastUpdated}" />
				<attribute name="Main-Class" value="com.some.pckge.SomeClass"/>
			</manifest>
		</jar>

		<!-- Remove contents of build dir after packaging -->
		<!-- <delete>
		   <fileset dir="${build.dir}">
		   	<include name="**/*.*" />
		   </fileset>
		</delete> -->
	</target>

</project>

build-test.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>
<project name="AntTestReporting-test" basedir="." default="all-test" xmlns:jacoco="antlib:org.jacoco.ant">

	<import file="build.xml" />

	<!-- PDF-Reports for JUnit -->
	<import file="lib/ant/junitpdfreport/build-junitpdfreport.xml" />

	<!-- Java Code Coverage -->
	<taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml">
		<classpath path="lib/ant/jacoco/lib/jacocoant.jar" />
	</taskdef>

	<property name="test.build.dir" location="bin/test" />
	<property name="test.src" location="test" />
	<property name="test.data.dir" location="${test.build.dir}/testResults" />
	<property name="reports.dir" location="${test.build.dir}/reports" />
	<property name="test.reports.dir" location="${reports.dir}/junit" />
	<property name="coverage.reports.dir" location="${reports.dir}/coverage" />

	<property name="xms" value="-Xms256m" />
	<property name="xmx" value="-Xmx1024m" />
	<!-- <property name="log4j.config" value="-Dlog4j.configuration=file:/${base.dir}/test/log4j-test.properties" /> -->

	<path id="classpath.test">
		<pathelement location="${build.dir}" />
		<fileset dir="${lib}">
			<include name="**/*.jar" />
		</fileset>
	</path>

	<target name="compile-test" depends="compile">
		<mkdir dir="${test.build.dir}" />

		<javac destdir="${build.dir}" srcdir="${test.src}" includeantruntime="false">
			<classpath refid="classpath.test" />
		</javac>

		<fileset id="srcFiles" dir="${test.src}">
			<exclude name="**/*.java" />
			<exclude name="**/*.html" />
			<exclude name="**/*.xsl" />
			<include name="**/*.*" />
		</fileset>

		<copy todir="${test.build.dir}">
			<fileset refid="srcFiles" />
		</copy>
	</target>

	<target name="clean-compile-test">
		<delete>
			<fileset dir="${test.build.dir}" includes="**/*.*" />
		</delete>
	</target>

	<target name="test" depends="compile-test">
		<mkdir dir="${test.data.dir}" />

		<!-- Run all tests -->
		<jacoco:coverage destfile="${test.data.dir}/jacoco.exec">
			<junit printsummary="true" haltonfailure="false" fork="yes" forkmode="once">
				<jvmarg value="${xms}" />
				<jvmarg value="${xmx}" />
				<!-- <jvmarg value="${log4j.config}" /> -->
				<classpath refid="classpath.test" />
				<formatter type="xml" />
				<batchtest todir="${test.data.dir}">
					<fileset dir="${build.dir}">
						<!-- Exclude inner classes -->
						<exclude name="**/*$*.class" />
						<include name="**/*Test.class" />
					</fileset>
				</batchtest>
			</junit>
		</jacoco:coverage>

		<!-- Generate HTML report
			- junit-noframes.html -> Single page HTML-report
			- index.html -> HTML-report using frames (several files, but more comfortable to read)-->
		<junitreport todir="${test.data.dir}">
			<fileset dir="${test.data.dir}">
				<include name="TEST-*.xml" />
			</fileset>
			<report styledir="test/etc/junitreport" format="noframes" todir="${test.reports.dir}" />
			<report styledir="test/etc/junitreport" format="frames" todir="${test.reports.dir}" />
		</junitreport>

		<!-- Generate PDF report -->
		<junitpdfreport todir="${test.reports.dir}" styledir="../../../test/etc/junitpdfreport/default">
			<fileset dir="${test.data.dir}">
				<include name="TEST-*.xml" />
			</fileset>
		</junitpdfreport>

		<!-- Generate Code Coverage report
			See: http://www.eclemma.org/jacoco/trunk/doc/ant.html -->
		<jacoco:report>
			<executiondata>
				<file file="${test.data.dir}/jacoco.exec" />
			</executiondata>

			<structure name="AntTestReporting">
				<classfiles>
					<fileset dir="${build.dir}">
						<include name="**/*.class" />
						<!-- Exclude classes necessary for testing only from the code coverage report-->
						<exclude name="**/*Test*.class" />
						<!-- Exclude inner classes -->
						<exclude name="**/*$*.class" />
					</fileset>
				</classfiles>
			</structure>

			<html destdir="${coverage.reports.dir}" />
		</jacoco:report>
	</target>

	<target name="all-test" depends="test" />
</project>