Earlier this year, while working on a project for TRIOLOGY GmbH, I once again used Maven to write a version name into an application, using the mechanism described in my post Maven: Create a simple build number. As a more sophisticated version name was required for this project, we extended it with a timestamp, SCM information (branch and commit) and a build number, and also created a special name for releases. You can find a how-to here – Version names with Maven: Creating the version name – which is the first part of a small series of blog posts on this topic.
The second part shows how the version name can be read from within the application. While writing the examples for the post, I wondered how many times I must have implemented reading a version name from a file in Java. Way too often! So I decided that this would be the very last time I had to do it, and extracted the logic into a small library: versionName, available on GitHub. What it does and how to use it is described in the second part of the post: Version names with Maven: Reading the version name.
Hopefully, this will be useful for someone else. Funny enough, in the new project I’m on, I’m about to reuse it once again. I’m glad I don’t have to write it again. Here’s to reusability 🍺
Make sure to back up your instance before you continue: rhc snapshot save -a <application name> --filepath <backup destination>
or ssh <UID>@<application name>-<yourAccount>.rhcloud.com 'snapshot' > sonar.tar.gz
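Should you need to roll back later, the snapshot can be restored analogously (a sketch; the syntax is assumed to mirror the save command above):
rhc snapshot restore -a <application name> --filepath <backup destination>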
SonarQube will update its database during the process.
If you followed this post to set up your SonarQube instance and therefore use an SSH tunnel to access the SonarQube database, note that you can now get rid of this workaround. As of SonarQube 5.2, analyses can be run without direct access to the database.
That is, you can also remove the database connection from the configuration of the SonarQube plugin in Jenkins.
Install new SonarQube instance
To install SonarQube 5.2, execute the following steps on your machine:
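A minimal sketch of these steps (the download URL and target paths are assumptions; your database settings and plugins need to be carried over from the old instance):
cd $OPENSHIFT_DATA_DIR
wget https://sonarsource.bintray.com/Distribution/sonarqube/sonarqube-5.2.zip
unzip sonarqube-5.2.zip
# copy conf/sonar.properties (database connection) and extensions/ from the old instance,
# then restart the app – SonarQube migrates the database on first start, as noted above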
Branch Specifier (blank for 'any'): origin/master
Build Triggers: tick Build when a change is pushed to GitHub
Build Environment: tick Prepare SonarQube Scanner environment
Build | Execute Shell
# Start the actual build
mvn clean package $SONAR_MAVEN_GOAL --settings $OPENSHIFT_DATA_DIR/.m2/settings.xml -Dsonar.host.url=$SONAR_HOST_URL
I’d also recommend the following actions:
Post-build Actions | Add post-build action | Publish JUnit test result report: Test report XMLs=target/surefire-reports/TEST-*.xml
Post-build Actions | Add post-build action | E-mail Notification: Recipients=<your email address>
Finally, press Save and start your first build. Check Jenkins’ console output for errors. If everything succeeds, you should see the result of the project’s analysis on SonarQube’s dashboard.
At the time of writing, OpenShift features Maven 3.0.4 and OpenJDK Server 1.7.0_85. Why would you want to change those? The best example is a Java 8 project to be built on Jenkins. Can we just advise Jenkins to download the newest Oracle JDK and be good to go? Nope, it’s not that simple on OpenShift! Jenkins does download the new JDK and sets the JAVA_HOME variable and the correct PATH, but Maven is always going to use the stock JDK. Why? Running this command provides the answer:
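(a stand-in for the original command, assuming the stock mvn is a shell wrapper script)
cat $(which mvn)  # look for hard-coded JAVA_HOME/M2_HOME exports in the output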
The stock Maven sets its own environment variables, which cannot be overridden by Jenkins!
So, in order to replace the JDK, we need to replace Maven first.
SSH to the machines where your builds are executed (e.g. your slave node). The following example shows what to do for Maven 3.3.3:
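In case the archive is not on the gear yet, download it first (the target folder is an assumption, chosen to match the paths used below):
mkdir -p $OPENSHIFT_DATA_DIR/maven && cd $OPENSHIFT_DATA_DIR/maven
wget https://archive.apache.org/dist/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.tar.gz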
tar -xvf apache-maven-3.3.3-bin.tar.gz
Edit Maven config
Add the following to the <settings> tag (replace <UID> with your OpenShift UID first)
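For example, an entry like the following could set a local repository inside the data directory (this concrete snippet is an assumption, not the original one):
<localRepository>/var/lib/openshift/<UID>/app-root/data/.m2/repository</localRepository>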
Set environment variables:
PATH=$OPENSHIFT_DATA_DIR/maven/apache-maven-3.3.3/bin:$PATH
M2_HOME=$OPENSHIFT_DATA_DIR/maven/apache-maven-3.3.3
And that’s it, your builds are now running on the custom Maven!
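To double-check from an SSH session, the new version can be queried directly (a quick sanity check, assuming the variables above are set in the session):
export M2_HOME=$OPENSHIFT_DATA_DIR/maven/apache-maven-3.3.3
export PATH=$M2_HOME/bin:$PATH
mvn -version  # should now report Apache Maven 3.3.3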
This allows for using a specific JDK in Jenkins. You could simply choose a specific JDK via the Jenkins console. This is convenient, but has one disadvantage: it takes a lot of memory (approx. 600 MB per JDK), because the JDK is stored twice – compressed in the cache to be sent to slaves, and uncompressed again to be used on the master. If you have enough memory, you’re done here.
However, in case you’re running a small gear with only 1 GB of memory, you might want to save a bit of your precious memory. The following example shows how to do so for JDK 8 update 51 build 16.
wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/8u51-b16/jdk-8u51-linux-x64.tar.gz
tar -xvf jdk-8u51-linux-x64.tar.gz
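The build can then be pointed at the unpacked JDK via environment variables (the directory name jdk1.8.0_51 is taken from the archive’s contents and assumed to live in $OPENSHIFT_DATA_DIR):
export JAVA_HOME=$OPENSHIFT_DATA_DIR/jdk1.8.0_51
export PATH=$JAVA_HOME/bin:$PATH
java -version  # should report 1.8.0_51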
This post showed how to build GitHub projects with Jenkins, Maven and SonarQube 4 on OpenShift. For starters, it used the Jenkins master node for running build jobs. However, when running on a small gear, the master node might run out of memory pretty fast, resulting in a reboot of the node during builds.
In order to resolve this issue, there are two options:
limiting the memory of the build, or
running the build on a slave node.
As spawning additional nodes is easy in a PaaS context such as OpenShift and provides better performance than running builds with little memory, the slave solution seems to be the better approach.
This post shows how.
Create a new DIY app as a slave node (a how-to can be found here) and name the node, e.g. slave
Set the following values: Remote FS root: the app-root/data folder on the slave. Typically this is /var/lib/openshift/<slave's UID>/app-root/data/jenkins; you can find out by SSHing to the slave node and calling
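echo $OPENSHIFT_DATA_DIR  # presumably; this variable points to the app-root/data folder in question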
Labels: some label to use within builds to refer to the node, e.g. OS Slave #1
Host: the slave’s hostname, e.g. slave-<youraccount>.rhcloud.com
Add Credentials:
username: <slave's UID>
Private Key File: path to a private key file that is authorized for your OpenShift account. In the first post this path was used: /var/lib/openshift/<UID of your jenkins>/app-root/data/git-ssh/id_rsa. Note: $OPENSHIFT_DATA_DIR does not seem to work here.
BTW: You can change the credentials any time later via this URL
As the different cartridges (Jenkins and DIY) have different environment variables for their local IP addresses ($OPENSHIFT_JENKINS_IP vs. $OPENSHIFT_DIY_IP), we’ll have to improvise at this point. There are two options: either
replace all occurrences of $OPENSHIFT_JENKINS_IP in all builds and in
The most up-to-date and flexible one is this, though. It downloads a specific version of SonarQube with each build. At the moment, it works with version 4.1.1. I’m still working on getting SonarQube 5 to run on OpenShift, but haven’t succeeded yet.
Download the public key (id_rsa.pub) to your host (e.g. by SFTP) and use the
rhc sshkey add
command to authorize the public key for your OpenShift account.
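For example (the key name and file path are placeholders):
rhc sshkey add jenkins ./id_rsa.pub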
If you plan on accessing a private repo or want to allow Jenkins to commit to your repo (e.g. for generating releases with the maven release plugin), you should also add the key to your repo account. See GitHub Help.
and hit Check Now (as described here).
Then go to the Available tab and install
embeddable-build-status (if you’d like to include those nifty build badges in your README.md).
While you’re at it, you might as well update the already installed plugins in the Updates tab.
Then hit Install without restart or Download and install after restart. If necessary, you can restart your app like so
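rhc app restart -a <application name>  # an assumption: restarting via the rhc client tools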
tick Environment variables
Click Add: name=SONAR_USER_HOME, value=$OPENSHIFT_DATA_DIR
See here for more information.
Then set up the plugin itself
Navigate to Sonar, Sonar installations and set the following:
Name=<be creative>
Server URL:
Sonar account login: admin
Sonar account password: <your pw>, default: admin
Database URL: jdbc:postgresql://$OPENSHIFT_JENKINS_IP:15555/sonar
Database login: the admin account that was returned when you first created the sonar application
Database password: the password that was returned when you first created the sonar application
I’d also recommend the following actions:
Post-build Actions | Add post-build action | Publish JUnit test result report: Test report XMLs=target/surefire-reports/TEST-*.xml
Post-build Actions | Add post-build action | E-mail Notification: Recipients=<your email address>
That’s it for the basic build setup. Now for the fun part: we need to find a way for Jenkins to reach sonar’s database.
We’ll use an SSH tunnel for that. Build | Add build step | Execute Shell
Now enter the following:
# Make sure Tunnel for Sonar is open
# Find out IP and port of DB
OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT=$(ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com '(echo `printenv OPENSHIFT_POSTGRESQL_DB_HOST`:`printenv OPENSHIFT_POSTGRESQL_DB_PORT`)')
# Open tunnel to DB
BUILD_ID=dontKillMe nohup ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" -L $OPENSHIFT_JENKINS_IP:15555:$OPENSHIFT_POSTGRESQL_DB_HOST_N_PORT -N <UID>@sonar-<yourAccount>.rhcloud.com &
This will tunnel requests from your Jenkins’ local port 15555 via SSH to your sonar gear, which forwards them to its local PostgreSQL database.
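To verify from within the build step that the tunnel is actually up, a quick TCP probe can help (a sketch using bash’s built-in /dev/tcp device; not part of the build step above):
sleep 2  # give the backgrounded ssh a moment to establish the tunnel
timeout 5 bash -c "cat < /dev/null > /dev/tcp/$OPENSHIFT_JENKINS_IP/15555" && echo "tunnel is up"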
What is missing is a script that explicitly closes the tunnel. But for now, I’m just happy that everything is up and running. The tunnel will eventually be closed after a timeout. Let me know if you have any ideas how to improve the tunnel handling.
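One rough idea (a sketch; it assumes this pattern matches only the tunnel process): kill the forwarding ssh after the analysis, e.g.
pkill -f "ssh.*-L $OPENSHIFT_JENKINS_IP:15555" || true  # terminate the port-forwarding ssh started above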
Finally, press Save and you’re almost good to go.
Before running your first build you should SSH to your Jenkins once more and
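connect to the sonar gear once, so that its host key ends up in the known_hosts file referenced by the build steps above (this final step is an assumption, derived from the UserKnownHostsFile option used there):
ssh -i $OPENSHIFT_DATA_DIR/git-ssh/id_rsa -o "UserKnownHostsFile=$OPENSHIFT_DATA_DIR/git-ssh/known_hosts" <UID>@sonar-<yourAccount>.rhcloud.com 'exit'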