Modernizing Android UIs part 2: design support library, switches, action buttons

After finishing the migration from ActionBarSherlock to appcompat described in the first post on modernizing Android UIs, it turns out there is even more to modernize in Android UIs.

Tabs and design support library

If you used tabs within the ActionBar, you might notice a warning after migrating to appcompat-v7 (API level 22), telling you that they are now deprecated.

In order to modernize those, you should use the design support library that was added to the Android SDK with API level 22. Similar to appcompat-v7, the design support library provides backports of Material Design components from Android Lollipop (5.x) for older versions of Android.

Here are the steps that rid you of the deprecation warning:

  • Add the design library to your Eclipse workspace and link it with your project in pretty much the same way as appcompat, as described in the first post. The library can be found at the following path: <sdkdir>/extras/android/support/design.
    Import it into Eclipse as Existing Android Code Into Workspace, change the build target to level 22, and link the appcompat project with it.
  • Add the following dependency to your Maven build:
    <dependency>
      <groupId>com.android.support</groupId>
      <artifactId>design</artifactId>
      <version>${android.compatibility.version}</version>
      <type>aar</type>
    </dependency>
    
  • Change your code to use the new TabLayout as described here. A complete example is the cheesesquare app: activity_main.xml, include_list_viewpager.xml, MainActivity.java. A minimal sketch of wiring a TabLayout to a ViewPager is shown right after this list.
  • Once you’re done with this and have the design support library up and running, you could modernize your app further by using more of the library’s features like navigation drawers, floating labels and buttons, snackbars, collapsing toolbars, etc. See this blog post for more features.
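
For orientation, here is a minimal Java sketch (not the complete cheesesquare code) of wiring a TabLayout to a ViewPager. The layout ids (tabs, viewpager), the page count and the empty placeholder fragments are purely illustrative, and AppCompatActivity assumes appcompat-v7 22.1+; with earlier versions you would extend ActionBarActivity instead.

	import android.os.Bundle;
	import android.support.design.widget.TabLayout;
	import android.support.v4.app.Fragment;
	import android.support.v4.app.FragmentPagerAdapter;
	import android.support.v4.view.ViewPager;
	import android.support.v7.app.AppCompatActivity;

	public class MainActivity extends AppCompatActivity {

		@Override
		protected void onCreate(Bundle savedInstanceState) {
			super.onCreate(savedInstanceState);
			setContentView(R.layout.activity_main);

			ViewPager viewPager = (ViewPager) findViewById(R.id.viewpager);
			viewPager.setAdapter(new FragmentPagerAdapter(getSupportFragmentManager()) {
				@Override
				public Fragment getItem(int position) {
					return new Fragment(); // placeholder; return your tab fragments here
				}

				@Override
				public int getCount() {
					return 3; // number of tabs/pages
				}

				@Override
				public CharSequence getPageTitle(int position) {
					return "Tab " + (position + 1); // becomes the tab label
				}
			});

			// Creates one tab per page and keeps tabs and pager in sync
			TabLayout tabLayout = (TabLayout) findViewById(R.id.tabs);
			tabLayout.setupWithViewPager(viewPager);
		}
	}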

The screenshots below show a before-after comparison – deprecated tabs vs. Material Design tabs.

Tabs: design support library, API-22

Tabs: AppCompat-v7, API-22, deprecated

Checkboxes to switches

Android API level 14 introduces the switch component, which, according to Google, should be used when there is only a single option to turn on or off. For API levels < 14 there is no such thing as a switch, so we’ll have to rely on checkboxes there. Here’s how to replace checkbox preferences with switches for devices running API level 14 and above (the Java code reading the preference stays the same either way, as sketched after the list):

  • In your preferences XML, replace CheckBoxPreference with SwitchPreference (see commit ab16eb1997dfba743abcd488b6b20de71b5c3ff0 for an example).
  • However, if you want to keep compatibility with API levels < 14, your only choice is to keep redundant copies of the same preferences.xml in

    • res/xml/ that contains the preferences with CheckBoxPreferences and
    • res/xml-v14/ that contains the same file with SwitchPreferences.
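
As mentioned above, the Java code reading the preference stays the same either way: CheckBoxPreference and SwitchPreference both persist a plain boolean under the same key. A minimal sketch (the key "notify" is just an example):

	import android.content.Context;
	import android.content.SharedPreferences;
	import android.preference.PreferenceManager;

	public class PreferenceReader {

		/** Works the same whether a CheckBoxPreference or a SwitchPreference wrote the value. */
		public static boolean isNotificationEnabled(Context context) {
			SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
			// false is the default if the preference has never been set
			return prefs.getBoolean("notify", false);
		}
	}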

Another before-after comparison is shown below – checkboxes vs. switches.

Preferences: AppCompat-v7, API-22, with checkboxes

Preferences: AppCompat-v7, API-22, with switches

Using Action Buttons

Action buttons are icons on the ActionBar that provide quick access to the most important actions, as long as there is enough room to display them. They could already be used with ActionBarSherlock, but if you still haven’t modernized them, now is the time 🙂

One reason for not using action buttons might be that you don’t have suitable icons. Here’s the solution: Google provides a huge number of Material Design icons under an open (CC-BY) license. They can be found on this site, or you can just clone their Git repository.

So it’s as easy as this:

  • Choose proper icons for your actions and copy them to your res/drawable-xyz folders
  • Add an icon attribute to the items in your menu.xml like so:
        <item
            android:id="@+id/action_refresh"
            android:icon="@drawable/ic_refresh_white_24dp"
            android:title="@string/action_refresh"
            app:showAsAction="ifRoom"/>

And that’s it. See the screenshots below; for a complete example, see the commit that realized this change.
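
To react to taps on the action button, inflate the menu and handle the item in your activity as usual. A minimal sketch (the menu resource name R.menu.main is just an example; the imports needed are android.view.Menu and android.view.MenuItem):

	@Override
	public boolean onCreateOptionsMenu(Menu menu) {
		// Inflates the menu XML shown above
		getMenuInflater().inflate(R.menu.main, menu);
		return true;
	}

	@Override
	public boolean onOptionsItemSelected(MenuItem item) {
		switch (item.getItemId()) {
		case R.id.action_refresh:
			// Trigger the actual refresh here
			return true;
		default:
			return super.onOptionsItemSelected(item);
		}
	}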

AppCompat-v7, API-22, ActionBar with menu

AppCompat-v7, API-22, ActionBar with action buttons


Modernizing Android UIs part 1: Migrating from ActionBarSherlock to Material Design

This post shows by example the steps necessary for migrating an Android application from ActionBarSherlock to Material Design (introduced in Android Lollipop/version 5.x/API levels 21/22), while keeping compatibility with at least Android Gingerbread (version 2.3/API level 9). It uses the appcompat-v7 library at API level 22. The app migrated in this example (nusic) is developed using Eclipse, Maven and RoboGuice.

Before-after comparison

Let’s begin with some before-after screenshots:

ActionBarSherlock

AppCompat-v7, API-22

Preferences: ActionBarSherlock

Preferences: AppCompat-v7, API-22

Basic migration (appcompat-v7)

The following lists the steps that were implemented in order to change the app as depicted in the screenshots above.

  • Update Android SDK Tools
  • Eclipse
    • Set up an Eclipse project for appcompat as described here.
    • Link your project with it. Right click on your project | Properties | Android | Library | Add
    • Remove ActionBarSherlock (same menu as above).
  • Set Up Maven Build
    • <repositories>
        <repository>
          <id>android</id>
          <url>file://${env.ANDROID_HOME}/extras/android/m2repository</url>
        </repository>
      </repositories>

    • <dependency>
        <groupId>com.android.support</groupId>
        <artifactId>appcompat-v7</artifactId>
        <version>${android.compatibility-v7.version}</version>
        <type>aar</type>
      </dependency>
      
  • Migrate from ActionBarSherlock to appcompat
    • styles.xml

      <!-- <style name="AppBaseTheme" parent="@style/Theme.Sherlock"> -->
       <style name="AppBaseTheme" parent="@style/Theme.AppCompat">
      
    • Replacing classes
      • RoboSherlockFragment -> RoboFragment
      • RoboSherlockFragmentActivity -> RoboActionBarActivity
      • RoboSherlockPreferenceActivity -> Write your own RoboAppCompatPreferenceActivity that looks like this (or as described here) but is derived from RoboPreferenceActivity (see here for the class that was used in the example).
        Then derive your preference activity from it, just as you derived it from RoboSherlockPreferenceActivity before. A condensed sketch of such a class follows this list.
    • Replacing methods
      • getSherlockActivity() -> getActivity()
      • getSupportMenuInflater() -> getMenuInflater()
    • Fixing imports
      • android.view.Menu
      • android.support.v7.app.ActionBar
    • Updating proguard.cfg
      • Remove actionbarsherlock
      • Add
        # support4, appcompat-v7, design support
        -dontwarn android.support.**
        -keep class android.support.** { *; }
        -keep interface android.support.** { *; }
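
For reference, here is a condensed sketch of such a preference activity, based on AppCompatDelegate (available since appcompat-v7 22.1). This is the well-known pattern from Google's samples, not the exact class from the example app; the Robo variant simply extends RoboPreferenceActivity instead of PreferenceActivity, the delegate wiring stays the same.

	import android.os.Bundle;
	import android.preference.PreferenceActivity;
	import android.support.v7.app.ActionBar;
	import android.support.v7.app.AppCompatDelegate;

	public abstract class AppCompatPreferenceActivity extends PreferenceActivity {

		private AppCompatDelegate delegate;

		@Override
		protected void onCreate(Bundle savedInstanceState) {
			// Install the AppCompat view factory before any views are inflated
			getDelegate().installViewFactory();
			getDelegate().onCreate(savedInstanceState);
			super.onCreate(savedInstanceState);
		}

		@Override
		protected void onPostCreate(Bundle savedInstanceState) {
			super.onPostCreate(savedInstanceState);
			getDelegate().onPostCreate(savedInstanceState);
		}

		/** Provides the support ActionBar, just like ActionBarActivity does. */
		public ActionBar getSupportActionBar() {
			return getDelegate().getSupportActionBar();
		}

		@Override
		public void setContentView(int layoutResID) {
			getDelegate().setContentView(layoutResID);
		}

		@Override
		protected void onTitleChanged(CharSequence title, int color) {
			super.onTitleChanged(title, color);
			getDelegate().setTitle(title);
		}

		@Override
		protected void onStop() {
			super.onStop();
			getDelegate().onStop();
		}

		@Override
		protected void onDestroy() {
			super.onDestroy();
			getDelegate().onDestroy();
		}

		private AppCompatDelegate getDelegate() {
			if (delegate == null) {
				// null callback: no need to react to support action mode events here
				delegate = AppCompatDelegate.create(this, null);
			}
			return delegate;
		}
	}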
        

For further info please see

Migrating from Songbird/Nightingale to iTunes

TL;DR: If you still use Songbird or Nightingale and want to migrate your music database to iTunes, start reading here.

Bye-bye Songbird, bye-bye Nightingale

Is anyone out there still using Nightingale or even Songbird? I started using Songbird in 2009 (or even earlier) and switched to Nightingale in 2013. All that time I loved the open source approach of both applications and even contributed a bit myself. However, there were also a lot of drawbacks, like performance issues, incompatibilities of addons after each new version, etc. Speaking of addons – the idea of a modular media player that is extensible just like Firefox or Thunderbird is wonderful. However, it seems to me there’s not much of a community left that releases addons for Nightingale. One of my favorites used to be MLyrics, which was last released in 2013 and doesn’t work properly anymore in the current version of Nightingale (at least for me). Still, there seems to be some development going on. Same goes for the core media player software itself: We all know that Songbird was discontinued in 2013 and the last release of Nightingale was published in January, 2014. Even though there also seems to be some development going on, I lost hope that there will be better usability at some point. So I finally decided with a heavy heart to move on. Nevertheless, I’d like to say thank you to all the Nightingale developers for their strong efforts to keep the dream of a real open source alternative to iTunes alive.

Is there a better alternative to Nightingale? I cannot properly answer that question. My reasons for migrating to iTunes are that it has been maintained by a huge company for years and it’s one of the most popular media players around. So hopefully, it will get along better with my rather huge media library in terms of performance. Plus, it is the only tool capable of feeding my iPod Nano 6 :-/

Technical approaches of migrating to iTunes

Leaving sentimentality behind – how to migrate from Songbird/Nightingale to iTunes?

One of the nice things about Songbird/Nightingale is their SQLite database. I had already worked with it before, when creating a playlist exporter for Songbird in Java. It’s always a good idea not to reinvent the wheel. So after extracting the database wrapper into a separate project – songbirdDbApi4j (right now, I really wonder why I named it like this 😮 ) – we’re halfway done with the Songbird to iTunes migration. Almost.

The other half – importing to iTunes – was a bit more challenging. iTunes stores its database in an XML file. So one approach is to access this XML directly, just like tools such as iTunesExport do. However, this file is generated by iTunes merely for the purpose of exporting; the actual database is stored in the ITL file. It would be possible to recreate the ITL from the XML, but this approach is not very convenient. So a different approach might be more suitable here: on Windows, iTunes offers a COM interface. It’s poorly documented, but fortunately the developers of COM4j implemented it as one of their sample projects. In order to ease the use of this API, I created a very basic Java wrapper for iTunes’ COM API (itunes4j).

Almost done! What’s missing is a bit of glue logic that reads all files, some of their attributes and all playlists from Songbird using songbirdDbApi4j and adds them to iTunes via itunes4j.

Migrating to iTunes

After a lot of empirical studies and nightly test migrations, I’m proud to present a tool for migrating from Songbird to iTunes: songbird2itunes. In case there happen to be any other Nightingale survivors out there who run Windows and would like to migrate their music database to iTunes, you might just give it a go!

I did my best to make it a resilient migration tool. Still, there might be errors. So:

  1. Double check if you really want to leave Nightingale behind
  2. Make sure to read the wiki first. If you’re sure you want to do this, start the migration as described there.
  3. When the migration is done, check if the statistics show any warnings.
  4. If so, look for WARN in the songbird2itunes.log and see if those are not critical for you.
  5. In case of error, please fix it and contribute 🙂
  6. Manually check your new iTunes library, making sure everything is as expected

Building GitHub projects with Jenkins, Maven and SonarQube 4.1.1 on OpenShift

Basic installation: SonarQube

There are different community-driven SonarQube cartridges around. There is

  • this one, which is based on a Tomcat cartridge and provides SonarQube 3.x, and
  • that one, which comes with SonarQube 4.0.
  • The most up-to-date and flexible one is this, though. It downloads a specific version of SonarQube with each build. At the moment it works with version 4.1.1. I’m still working on getting SonarQube 5 to run on OpenShift, but haven’t succeeded yet.

There also is a tutorial that shows how to install SonarQube 3.1.1. It also contains general thoughts on how to bypass OpenShift’s restrictions.

Anyway, to install SonarQube 4.1.1 execute the following steps on your machine:

  1. rhc app create sonar diy-0.1 postgresql-9.2

    Make sure to remember the login and passwords!

  2. git rm -r diy .openshift misc README.md
    git remote add upstream -m master https://github.com/worldline/openshift-sonarqube.git
    git pull -s recursive -X theirs upstream master
    git push

  3. Log in to your SonarQube instance at
    http://sonar-<yourAccount>.rhcloud.com/

    The default login and password are admin / admin.
    You might want to change the password right away!

Basic installation: Jenkins

A lot of information within this paragraph was taken from here.

  1. Create Jenkins gear with Git-SSH
    rhc create-app jenkins  jenkins-1  "https://cartreflect-claytondev.rhcloud.com/reflect?github=majecek/openshift-community-git-ssh"
  2. Authorize your Jenkins node to communicate with other gears (and with your Git repository).
    Generate an SSH key for your Jenkins node:
    SSH to the Jenkins node, then

    ssh-keygen -t rsa -b 4096 -f $OPENSHIFT_DATA_DIR/git-ssh/id_rsa
  3. Add the key to your OpenShift, either
    • via web console
      In SSH console

      cat $OPENSHIFT_DATA_DIR/git-ssh/id_rsa.pub

      then copy and paste the output into web console
      or

    • via rhc
      Download the public key (id_rsa.pub) to your host (e.g. by SFTP) and use the

      rhc sshkey add

      command to authorize the public keys for your OpenShift account.
      If you plan on accessing a private repo or want to allow Jenkins to commit to your repo (e.g. for generating releases with the Maven release plugin), you should also add the key to your repo account. See GitHub Help.

  4. Install Plugins
    Browse to Update Center

    https://jenkins-<yourAccount>.rhcloud.com/pluginManager/advanced

    and hit Check Now (as described here).
    Then go to the Available tab and install

    1. Sonar Plugin,
    2. GitHub plugin,
    3. embeddable-build-status (if you’d like to include those nifty build badges in your README.md).

    While you’re at it, you might as well update the already installed plugins in the Updates tab.
    Then hit Install without restart or Download and install after restart. If necessary, you can restart your app like so

    rhc app restart -a jenkins
  5. Set up Maven’s settings.xml in a writable location.
    • SSH to Jenkins
      mkdir $OPENSHIFT_DATA_DIR/.m2
      echo -e "<settings><localRepository>$OPENSHIFT_DATA_DIR/.m2</localRepository></settings>" > $OPENSHIFT_DATA_DIR/.m2/settings.xml
      
    • Browse to Configure System
      https://jenkins-<yourAccount>.rhcloud.com/configure

      Default settings provider: Settings file in file system
      File path=$OPENSHIFT_DATA_DIR/.m2/settings.xml

  6. Set up main Jenkins node as slave (easy to set up and doesn’t need extra gears).
    Go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure

    and set
    # of executors: 1
    As an alternative, you could also use another gear as dedicated Jenkins slave. To do so, follow the steps described here.

    [EDIT 2015-08-09: As it turned out, memory is too low to run the jenkins master and builds on one node. See my second post on how to introduce a dedicated slave node to this setup]

  7. Set up the Sonar plugin
    On the Jenkins frontend, go to Configure System

    https://jenkins-<yourAccount>.rhcloud.com/configure
    • Global properties,
      tick Environment variables
      Click Add
      name=SONAR_USER_HOME
      value=$OPENSHIFT_DATA_DIR
      See here for more information.
    • Then set up the plugin itself
      Navigate to Sonar, Sonar installations and set the following
      Name=<be creative>
      Server URL:

      http://sonar-<yourAccount>.rhcloud.com/

      Sonar account login: admin
      Sonar account password: <your pw>, default: admin
      Database URL: jdbc:postgresql://$OPENSHIFT_JENKINS_IP:15555/sonar
      Database login: The admin account that was returned when you first created the sonar application
      Database password: The password that was returned when you first created the sonar application

    • Hit Save

Configure build for a repository

Now let’s set up our first build.

  1. Go to
    https://jenkins-<yourAccount>.rhcloud.com/view/All/newJob

    Item name: <your Project name>
    Build a free-style software project (Unfortunately, Maven projects do not work due to OpenShift’s restrictions.)
    Hit OK

  2. On the next screen
    GitHub project:

    https://github.com/<your user>/<your repo>/

    Source Code Management:

    https://github.com/<your user>/<your repo>.git

    Branch Specifier (blank for ‘any’): origin/master
    Build Triggers: Tick: Build when a change is pushed to GitHub
    Build | Execute Shell

    cd $WORKSPACE
    # Start the actual build
    mvn clean compile test package
    

    Post-build Actions | Add post-build action | Sonar

  3. I’d also recommend the following actions
    Post-build Actions | Add post-build action | Publish JUnit test result report
    Test report XMLs=target/surefire-reports/TEST-*.xml
    Post-build Actions | Add post-build action | E-mail Notification
    Recipients=<your email address>
  4. Hit Apply.
  5. That’s it for the basic build setup. Now for the fun part: We need to find a way for Jenkins to reach Sonar’s database.
    We’ll use an SSH tunnel for that.
    Build | Add build step | Execute Shell
    Now enter the following:

    # Make sure Tunnel for Sonar is open
    # Find out IP and port of DB
    OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT=$(ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com '(echo `printenv OPENSHIFT_POSTGRESQL_DB_HOST`:`printenv OPENSHIFT_POSTGRESQL_DB_PORT`)')
    # Open tunnel to DB
    BUILD_ID=dontKillMe nohup ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" -L $OPENSHIFT_JENKINS_IP:15555:$OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT -N <UID>@sonar-<yourAccount>.rhcloud.com &
    

    This will tunnel requests from your Jenkins’ local port 15555 via SSH to your Sonar gear, which forwards them to its local PostgreSQL database.
    What is still missing is a script that explicitly closes the tunnel. But for now I’m just happy that everything is up and running. The tunnel will eventually be closed after a timeout. Let me know if you have any ideas on how to improve the tunnel handling.

  6. Finally, press Save and you’re almost good to go.
  7. Before running your first build you should SSH to your Jenkins once more and
    ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com

    so the Sonar node is added to the list of known hosts.

Maven: Create a simple build number

It’s always a good idea to have a means of integrating a build number into your application. It makes it easy to find out which exact versions are deployed on your stages or what version your users are running in case of third-level support.

The task of creating unique build numbers can easily be automated using Maven.

A build number can contain all kinds of information, like the application version, a sequential number, the SCM revision or a timestamp.

From my point of view, using the timestamp when creating a unique build number is essential: It is unique per se (it is unlikely that the same system is built twice in one second) and it is generated by Maven without depending on any plugins. In addition, it is easy to understand, like "Did operations really deploy the new version of the application, yet? No, the build number still shows that it’s the one from last week!".

So let’s get to business: How do you generate a build number within a JAR, and how do you read it? Does it also work with WARs?

JAR

As mentioned before, Maven provides a built-in mechanism for creating a timestamp. You can just customize the formatting and use it for your build number within your pom.xml like so:

	<properties>
		<maven.build.timestamp.format>yyyyMMddHHmmss</maven.build.timestamp.format>
		<buildNumber>v.${project.version} build ${maven.build.timestamp}</buildNumber>
	</properties>

Please note that using the property maven.build.timestamp directly within your filtered file is not possible due to this bug.

The build number must now be made accessible to your application. There are at least two solutions at hand: Write the build number to your manifest or create a properties file.

Manifest

pom.xml

	<build>
		<!-- Add build number to manifest -->
		<plugins>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-jar-plugin</artifactId>
				<configuration>
					<archive>
						<manifestEntries>
							<build>${buildNumber}</build>
						</manifestEntries>
						<manifest>
							<mainClass>info.schnatterer.Main</mainClass>
						</manifest>
					</archive>
				</configuration>
			</plugin>
		</plugins>
	</build>

And that’s how you can access it from your application:

	private String getBuildNumberFromManifest() throws IOException {
		InputStream manifestStream = getClass().getClassLoader()
				.getResourceAsStream("META-INF/MANIFEST.MF");
		if (manifestStream != null) {
			Manifest manifest = new Manifest(manifestStream);
			Attributes attributes = manifest.getMainAttributes();
			return attributes.getValue("build");
		}
		return null;
	}

Properties

Alternatively, use a properties file:

build.properties

buildNumber=${buildNumber}

pom.xml

	<build>
		<resources>
			<resource>
				<!-- Filter for build number -->
				<directory>src/main/resources</directory>
				<filtering>true</filtering>
				<includes>
					<include>build.properties</include>
				</includes>
			</resource>
		</resources>
	</build>

And that’s how you can access it from your application:

	private String getBuildNumberFromProps() throws IOException {
		String propertiesName = "/build.properties";

		InputStream propertiesStream = Main.class
				.getResourceAsStream(propertiesName);
		if (propertiesStream != null) {
			Properties pros = new Properties();
			pros.load(propertiesStream);

			return pros.getProperty("buildNumber");
		}
		return null;
	}

The output looks something like this:

v.0.0.1-SNAPSHOT build 20140225211328

Note: You might want to check the returned value for null or have the methods return empty Strings in order to avoid NullPointerExceptions or the word null on your GUI.
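
For example, a small wrapper along these lines (a hypothetical helper building on the methods above) keeps both out of the GUI:

	/** Never returns null: falls back to an empty string if no build number can be read. */
	private String getBuildNumberSafely() {
		try {
			String buildNumber = getBuildNumberFromProps();
			return buildNumber != null ? buildNumber : "";
		} catch (IOException e) {
			// A missing build number should never break the application
			return "";
		}
	}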

WAR

If you’re building a web application, you should also be able to use the manifest or the properties file as described above. Your Maven setup might be a bit different, though (see Maven War plugin – WAR Manifest Customization).

You have another alternative that makes creating the build number even easier: You can “stamp” the build number directly into interpreted documents like (X)HTML or js files.

For example when using JSF, you could write the build number directly into the footer template.

Add this to your page:

<h:outputText styleClass="version" value="${buildNumber}"/>

And this to your pom.xml

	<properties>
		<maven.build.timestamp.format>yyyy-MM-dd HH:mm:ss</maven.build.timestamp.format>
		<buildNumber>v.${project.version} build ${maven.build.timestamp}</buildNumber>
	</properties>

	<build>
		<plugins>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-war-plugin</artifactId>
				<version>2.3</version>
				<configuration>
					<!-- Filter for build number -->
					<webResources>
						<resource>
							<directory>src/main/webapp/WEB-INF/templates</directory>
							<targetPath>WEB-INF/templates</targetPath>
							<filtering>true</filtering>
							<includes>
								<include>yourFooterTemplate.xhtml</include>
							</includes>
						</resource>
					</webResources>
				</configuration>
			</plugin>
		</plugins>
	</build>

And that’s it. No parsing necessary in code, as Maven inserts the build number directly into your web page.

Further Reading

If you’re planning on creating a more complex build number (SCM, sequential numbers, etc.), I suggest you have a look at the Build Number Maven Plugin.

JSF: Displaying FacesMessages during render response phase

The problem

A controller runs into an error situation during the last phase of the JSF lifecycle (the RENDER_RESPONSE phase). In order to let the user know that something went wrong, a FacesMessage is added. However, the FacesMessage is not shown on the client. There may or may not be a warning in the log which says “FacesMessage(s) have been enqueued, but may not have been displayed”.

The cause

In the render response phase the component tree is traversed and each component is rendered (that is, HTML is generated). When an error occurs after the message component finished rendering, it is not possible to add another message to it.

Some solutions/workarounds

  1. Place the message component at the end of the component tree (that is, the page or template). That way, it’s most likely that the message widget has not been rendered at the time the content is rendered (which is when the error occurs and the FacesMessage is added). The FacesMessages are processed later, when the message widget is rendered.
    However, in most cases you don’t want to display your message at the end of the page. This can be solved with at least these two approaches:

    1. Use an overlay (for example PrimeFaces’ growl widget – <p:growl>), whose position on the rendered page is “independent” of its position in the component tree.
    2. Move the widget via CSS positioning (which is a bit of an ugly workaround).
  2. Try to validate the data in an earlier phase (a small bean sketch follows this list). For example, you could check the data before rendering via an event:
    <f:event type="preRenderView" listener="#{bean.initData}"/>
    

    Or (if you’re using Seam), you could use Seam page actions,  as shown in this thread.

  3. Save the messages in the session and trigger a server render (which can be done with ICEfaces). This can be realized with a PhaseListener as described here.
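
A minimal sketch of such a listener bean for approach #2 (a plain JSF 2 managed bean named "bean", matching the EL expression above; the data check itself is just a placeholder). Because the message is added before the response is rendered, it is displayed no matter where the message component sits:

	import javax.faces.application.FacesMessage;
	import javax.faces.bean.ManagedBean;
	import javax.faces.bean.RequestScoped;
	import javax.faces.context.FacesContext;
	import javax.faces.event.ComponentSystemEvent;

	@ManagedBean(name = "bean")
	@RequestScoped
	public class Bean {

		/** Called via <f:event type="preRenderView" .../> before the response is rendered. */
		public void initData(ComponentSystemEvent event) {
			if (!loadData()) {
				// Enqueued before rendering starts, so the message component will pick it up
				FacesContext.getCurrentInstance().addMessage(null,
						new FacesMessage(FacesMessage.SEVERITY_WARN,
								"Some data could not be loaded", null));
			}
		}

		private boolean loadData() {
			// Placeholder for the actual data access; returns false if data is missing
			return false;
		}
	}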

Which one to use?

This depends on your use case and the component framework you’re using.

Solution #3 should always be your last resort, as it is really heavyweight and requires additional HTTP requests.

#2 cannot be used in conjunction with some components, requires changing the UI logic and increases the complexity of your code.

Which leaves #1, the most pragmatic solution. However, it can’t be used when your FacesMessage should be displayed at a specific position within the page (e.g. at the top of the page).

An example

When rendering data using PrimeFaces’ DataTable with lazy loading (p:dataTable), the data is fetched from the database via the LazyDataModel at rendering time (render response phase). Once the data is fetched, it might turn out that some data is missing and the user should be informed about this fact. In order to do so, a FacesMessage would be the way to go. However, as described above, the message never gets rendered.
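
To make the scenario more concrete, here is a condensed, illustrative sketch of such a lazy model (assuming a PrimeFaces 5.x-style load() signature; the actual database query is omitted):

	import java.util.ArrayList;
	import java.util.List;
	import java.util.Map;

	import javax.faces.application.FacesMessage;
	import javax.faces.context.FacesContext;

	import org.primefaces.model.LazyDataModel;
	import org.primefaces.model.SortOrder;

	public class ReleaseLazyDataModel extends LazyDataModel<String> {

		@Override
		public List<String> load(int first, int pageSize, String sortField,
				SortOrder sortOrder, Map<String, Object> filters) {
			List<String> page = new ArrayList<String>();
			boolean someDataMissing = false;
			// ... query the database here, fill "page" and set "someDataMissing" ...

			if (someDataMissing) {
				// Runs during the render response phase, so this message only shows up
				// if the growl/messages component is rendered after the table (solution #1)
				FacesContext.getCurrentInstance().addMessage(null,
						new FacesMessage(FacesMessage.SEVERITY_WARN,
								"Some entries could not be loaded", null));
			}
			setRowCount(page.size());
			return page;
		}
	}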

As far as I know, fetching the data cannot be brought forward to an earlier phase.

As the growl widget was already used within the application as the way of displaying information to the user, the solution was as easy as moving the growl widgets to the end of the layout template. And voilà, the messages suddenly show up!

This solution was really easy to realize, but finding out about it took a painful amount of time. So hopefully, this post may speed up the process for anyone stuck with the same problem 🙂

Hibernate: Write SQL to a specific logfile (without additional framework)

This post shows a simple setup for logging Hibernate’s SQL statements, including all parameters, to a dedicated file, using only log4j and (of course) Hibernate.

SQL in Hibernate

For the purpose of seeing the magic behind an object-relational mapper (e.g. for debugging), it’s always useful to have a look at the actual SQL queries that are created.

Especially if you are working (or have to work) with something as useful as Hibernate’s Criteria API.

In Hibernate, the simplest way is to set the parameter hibernate.show_sql to true and Hibernate will log all SQL statements to system out.

Straight after setting this option, two things come to mind:

  • Even small applications can generate tons of SQL statements in your log and
  • the SQL statements contain only question marks instead of the actual parameters, which leaves them only partly useful.

I can’t count the number of SystemOut.log files I saw that were so flooded with Hibernate’s SQL drivel that the important log statements were nowhere to be found.

Using additional Frameworks

In order to solve these issues, the best approach might be to intercept the SQL statements on the JDBC level and then log them. This can be achieved using a framework like p6spy or log4jdbc. Those frameworks will log the complete SQL statements, including the actual parameter values instead of question marks, to a file.

However, those frameworks require a lot of configuration effort. In some cases they can’t be used at all, e.g. in enterprise software – log4jdbc doesn’t even support Maven (as of log4jdbc 4.1.2).

Using only Hibernate (and log4j)

A far simpler alternative is to quickly set up your logging framework (which should already be part of your application) to log the SQL statements, the values of the parameters and the values returned by the database to a separate file.

A configuration for log4j might look as follows:

log4j.properties

log4j.rootLogger=INFO, File, Console

log4j.logger.my.application=DEBUG, File

log4j.logger.org.hibernate=WARN, File, Console
log4j.additivity.org.hibernate=false
log4j.logger.org.hibernate.SQL=DEBUG, FileSql
log4j.additivity.org.hibernate.SQL=false
log4j.logger.org.hibernate.type=TRACE, FileSql
log4j.additivity.org.hibernate.type=false

log4j.appender.Console=org.apache.log4j.ConsoleAppender
log4j.appender.Console.layout=org.apache.log4j.PatternLayout
log4j.appender.Console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} %l - %m%n

log4j.appender.File=org.apache.log4j.DailyRollingFileAppender
log4j.appender.File.datePattern='.'yyyy-MM-dd
log4j.appender.File.file=logs/myApplication.log
log4j.appender.File.layout=org.apache.log4j.PatternLayout
log4j.appender.File.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} %l - %m%n

log4j.appender.FileSql=org.apache.log4j.DailyRollingFileAppender
log4j.appender.FileSql.datePattern='.'yyyy-MM-dd
log4j.appender.FileSql.file=logs/myApplication_sql.log
log4j.appender.FileSql.layout=org.apache.log4j.PatternLayout
log4j.appender.FileSql.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n

This results in all SQL statements being logged to myApplication_sql.log whereas all other application-related messages (including warnings issued by Hibernate) are written to myApplication.log.

The important part is the additivity parameter, which prevents the log messages from being propagated to the parent logger. That is, the SQL statements will not flood your system out.

This solution has only one drawback compared to the dedicated frameworks: the statements still contain the question marks. However, the org.hibernate.type logger writes additional log statements that contain the actual values.

Note: Setting the logger org.hibernate.SQL to DEBUG seems to be equivalent to setting the parameter hibernate.show_sql to true.

In addition, I’d always recommend setting the hibernate.format_sql parameter to true in order to make the SQL statements easier to read.
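
These parameters are usually set in hibernate.cfg.xml or persistence.xml; if you happen to bootstrap Hibernate programmatically, a minimal sketch (Hibernate 3.x/4.x-style Configuration, mappings omitted) looks like this:

	import org.hibernate.SessionFactory;
	import org.hibernate.cfg.Configuration;

	public class HibernateBootstrap {

		public static SessionFactory buildSessionFactory() {
			Configuration configuration = new Configuration()
					// Roughly equivalent to setting the org.hibernate.SQL logger to DEBUG
					.setProperty("hibernate.show_sql", "true")
					// Pretty-prints the statements, making the log much easier to read
					.setProperty("hibernate.format_sql", "true");
			// configure() reads hibernate.cfg.xml (mappings, connection settings, ...)
			return configuration.configure().buildSessionFactory();
		}
	}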

The result might look something like this

2013-04-16 21:18:47,684 DEBUG SQL -
    select
        table0_.id as id0_1_
    from
        app table0_
    where
        table0_.val=?
2013-04-16 21:18:47,684 TRACE BasicBinder - binding parameter [1] as [BIGINT] - 95
2013-04-16 21:18:47,684 TRACE BasicExtractor - Found [1] as column [id0_1_]

Conclusion

To conclude, the log4j-only approach can be set up in no time and provides all the necessary information for debugging (including the values returned by the database), whereas a dedicated JDBC logging framework requires more configuration effort but outputs the SQL statements more neatly.

Links

Once more, I found most of the information aggregated in these articles on Stack Overflow: here and there. Thanks, guys!