Build System for XPages and OSGi plugins
We have only recently set up a build system for XPages / NSF / OSGi plugins. Here is a little summary of how it started, roughly how it is set up, and a few random notes that might help others hoping to do a similar thing. It's not perfect, but at least you know you are not alone if you are trying to do something like this! Any questions or suggested improvements, please comment!
How did it start?
Continuous Integration / Deployment has long seemed out of reach for Domino developers.
I saw a post on OpenNTF last year asking if anyone was interested in a project for an XPages build ecosystem. I thought that would be great! I think I posted an “I’m interested” comment and then I forgot about it.
Then I noticed a tweet from Martin Jinoch
running the “headless designer” to produce NSF from on-disk project. This could be a BIG THING!!
— Martin Jinoch (@mjinoch) February 6, 2014
I thought that Martin was playing around with something of his own creation! At the time I did not realise it was an IBM 9.0.1 feature. I then saw this from Martin Pradny…
And then this from Egor Margineanu…
Then, going through some slides from Connect 14, I found a session that outlined this new feature. It was only then that I fully understood it was a proper IBM feature for 9.0.1!
At this point I was very interested, but all of this still seemed like a far-off dream to me. Our team’s workload has increased due to a developer leaving, we are months behind on a project, and I am scrambling for any spare time I can get with two small children keeping me busy.
Also, we are developing our own OSGi plugins. We have been building and deploying these manually, since we didn’t yet know how to build them automatically, so there would be no point investigating continuous integration without being able to automatically build the OSGi plugins. I had a little look into Maven, but I wasn’t sure how it would fit with the way we currently develop, or whether it would play nicely with the Domino Debug Plugin (from the XPages SDK), which we rely on heavily for developing plugins. (If you are interested in this, see also Paul Withers’ post regarding the JVM for ExtLib development.)
Then I was looking into modifying the Drag n Drop sidebar to suit a project I’m working on. I watched Ryan Baxter’s series on plugin development, and in it he mentioned PDE Build for automatic building of plugins. So I investigated PDE Build and found that I could successfully build our OSGi plugins using headless Eclipse! This meant that I now had most of the pieces of the puzzle; one important piece still missing was Time!!!
I still needed to successfully build an NSF. I managed a quick test of building a simple On-Disk Project into an NSF, and it was successful! I fired off a couple of questions to the IBM email addresses at the end of the slide deck. Jonathan Roche answered them within the same day and included a link to the developer wiki for Headless Designer.
The Business Need
The key stakeholder of our current project had been making regular visits to check on our progress. Unfortunately he often shows up without warning, when our own development system is usually in a broken state. We have a user acceptance testing environment, but deploying to it is a manual process. We are already short of time, so we rarely update it unless we are actually deploying to production, so the testing environment is pretty useless, I guess.
Well, his frustration at not being able to test the system was reaching its boiling point, so we agreed it would be worthwhile to put the project aside temporarily and investigate continuous deployment of our latest stable branch to a testing environment. With this agreement, the final piece of the puzzle, Time, was found! A few days and late nights later, we have a working Continuous Integration system for XPages and OSGi plugins.
The System as it currently stands
Please note: setting up this build system was not trivial, and this post is not a ‘How-To’ post but more of an ‘It’s possible if you want to’ post. The system is working well for us and for the most part is ticking along; however, if the Domino Designer NSF build breaks, it sometimes needs a manual kick along.
So, as an overview, here is how our system currently works.
Developers develop OSGi plugins and NSFs on their own machines with a local Domino server. Each developer uses their own feature branch (we are using git-flow) of the Git repository.
When code is ready for testing, developers merge into ‘develop’ and push to GitHub.
Jenkins monitors GitHub for changes to the develop branch; when it sees changes, it runs a build.
When we set up the Jenkins job we gave it a ‘Build Script’: a list of steps that we want it to perform. We have used Apache Ant to define our steps; there are other options, but we are using Ant.
If the build script runs successfully, Jenkins deploys the result to our testing environment.
For production releases (the master branch) there is no automatic deployment, just the build step, which produces either an NSF or some plugins. These are still deployed manually; there is no reason they couldn’t be automatically deployed too, but this is just how we have it set up at the moment.
Here is the software used in the build system:
- Github for hosting code repositories
- Build server (is also our Testing Environment Domino Server)
- Windows 2003 Server
- Domino 9.0.1
- Notes 9.0.1
- Designer 9.0.1
- Eclipse Kepler (for building OSGi plugins), but technically you could use Designer as well
- Git 1.9.0 for Windows
- Jenkins 1.5
- Java JDK 7u51
- Apache Ant 1.9.3
- PowerShell 1.0, to run Egor Margineanu’s PowerShell script
I also added the Notes program directory to the PATH environment variable so that ‘designer.exe’ (or just ‘designer’) can be run from any directory, for launching headless Designer.
We already had a Windows 2003 Server set up with a Domino instance for our Testing Environment, so we decided to use this as the build server.
Building and Deploying
I will describe Building and Deploying separately.
A Build task is purely concerned with taking some Source Code (Java code, or an NSF On-Disk Project) and turning it into an artifact which can be deployed.
A deployment task is concerned with taking an artifact that has been successfully built and putting it into use somewhere, be it a testing or live environment. At present we are only automatically deploying to our testing environment.
General overview of build process
Here is a rough description of what happens in the build process. You create a Job in Jenkins which defines:
- When to run the build: automatically when there are new changes? Manually? Periodically?
- Where to get the latest source code from
- What steps to take to build: copy files? Compile something?
- What to do at the end: archive the results? Notify somebody?
Each Job has a ‘workspace’ directory of its own where most of the operations take place. The source code is checked out here, and at the end of the build you archive whatever artifacts you built (e.g. an NSF file or plugins) for safe keeping.
Each time you run a build, you get a sequential reference number for that attempted build.
Starting a Build
You can start a build job manually, you can schedule a build job, you can have a build job start when another finishes, and so on. You can also trigger a build with a REST call.
With the GitHub plugin, you can have your Jenkins server check GitHub for changes by polling it periodically; however, you can also configure it so that GitHub will notify your Jenkins server when there is new code to build. This is what we have done, although it requires your build server to be externally accessible.
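As an illustration of the REST trigger, the URL has a predictable shape. Everything here (host, port, job name, token) is made up for the example:

```shell
# Build the remote-trigger URL for a Jenkins job.
# Host, port, job name and token are all hypothetical.
JENKINS_URL="http://buildserver:8080"
JOB="Discussion-develop"
TRIGGER="${JENKINS_URL}/job/${JOB}/build?token=SECRET"
echo "$TRIGGER"
# A tool such as curl could then POST to this URL to queue a build:
# curl -X POST "$TRIGGER"
```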
Overview of Building an NSF
Building an NSF involves preparing an On-Disk Project and then launching Domino Designer in what is referred to as ‘Headless’ mode. It is not truly headless, however: headless usually means there is no GUI, but in this case the Domino Designer GUI is still launched and the window is minimized soon afterwards. Designer then carries on with importing the On-Disk Project, building the NSF, and shutting down.
For example, suppose we are building our ‘Discussion’ NSF with build number 15.
- Jenkins fetches the latest version of the repository from GitHub
- Jenkins checks out the branch that you are building (e.g. develop)
- Jenkins runs our Ant target ‘buildNSF’, which does the following:
- Copies our On-Disk Project to a temporary ‘WORKSPACE/odp_15’ directory. We rename the folder to include the build number to be doubly sure that Designer has never seen this project folder before; otherwise it may think it is the same project as a previous attempt.
- Copies the following files into the On-Disk Project, which set the desired Application Properties for the NSF to be built. This allows us to ignore them in the repository, which means each developer can put whatever they want in their own database.properties and it won’t affect the build.
- Builds the NSF Discussion_15.nsf into the Notes data directory, using headless Designer by running the PowerShell script.
- Jenkins then moves the successfully built Discussion_15.nsf from the Notes data directory to our Jenkins job workspace.
- As a post-build step, Jenkins archives our successfully built Discussion_15.nsf file.
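The unique-folder step above can be sketched in shell terms (our real system does this with Ant on Windows; BUILD_NUMBER is the variable Jenkins supplies to each build):

```shell
# Copy the on-disk project to a folder named after the build number,
# so Designer never sees a project folder name it has imported before.
BUILD_NUMBER=15                              # provided by Jenkins in a real build
mkdir -p on-disk-project
echo "dummy design element" > on-disk-project/element.txt
cp -r on-disk-project "odp_${BUILD_NUMBER}"
ls "odp_${BUILD_NUMBER}"
```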
Overview of Building an OSGi plugin
To perform an OSGi plugin build we use headless Eclipse, and this time it truly is headless: no GUI involved, command line only. It uses PDE Build, which is the exact same code that Eclipse uses when you build plugins within Eclipse.
As an overview, you provide PDE Build with a ‘feature’ to build; the feature can point to one or more plugin projects. You also give PDE Build a build.properties file which defines the target Java Runtime Environment, the target OSGi platform (e.g. the Domino plugins plus any plugins your project depends on), and compiler flags such as whether to include debugging information.
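As a reference point, a cut-down build.properties might look like the following. The property names are standard PDE Build ones, but the IDs, paths, and versions are invented for the example:

```properties
# Build a feature that lists our plugins (the ID is made up)
topLevelElementType = feature
topLevelElementId = com.example.ourplugins.feature

# PDE Build's working directory, and the Eclipse/Domino target platform
buildDirectory = C:/Jenkins/workspace/plugins-develop/buildDirectory
base = C:/BuildEclipse
baseLocation = ${base}/eclipse

# Target Java runtime and compiler flags
javacSource = 1.6
javacTarget = 1.6
javacDebugInfo = true

# Build for all platforms (OS, windowing system, architecture)
configs = *, *, *
```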
- Jenkins fetches the latest version of the repository from GitHub
- Jenkins checks out the branch that you are building
- Jenkins runs our Ant target ‘buildPlugins’, which:
- Creates a sub-directory ‘buildDirectory’ in the Jenkins job workspace, with sub-folders for features and plugins; this is the directory PDE Build will use as its working directory
- Copies the plugins to be built to the ‘buildDirectory\plugins’ directory
- Copies the feature which defines the plugins to be built to the ‘buildDirectory\features’ directory
- Runs the PDE Build script to build the OSGi plugins (this results in a zip file)
- Unzips the plugins and features to the Jenkins job workspace
- Copies the plugins and features from the Jenkins job workspace to a permanent Eclipse update site folder
- Generates a new site.xml in the update site directory
- As a post-build step, archives the zip file of the built plugins as build artifacts
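The ‘runs the PDE Build script’ step boils down to launching headless Eclipse with the antRunner application. As an Ant sketch (the launcher and PDE Build version suffixes, and the ${eclipseLocation} property, are examples only, not our exact setup):

```xml
<!-- Launch headless Eclipse and run PDE Build.
     Plugin version suffixes and paths below are examples only. -->
<java jar="${eclipseLocation}/plugins/org.eclipse.equinox.launcher_1.3.0.v20130327-1440.jar"
      fork="true" failonerror="true">
    <arg value="-application"/>
    <arg value="org.eclipse.ant.core.antRunner"/>
    <arg value="-buildfile"/>
    <arg value="${eclipseLocation}/plugins/org.eclipse.pde.build_3.8.2.v20130514-1028/scripts/build.xml"/>
    <!-- Directory containing our build.properties -->
    <arg value="-Dbuilder=${basedir}/buildConfig"/>
</java>
```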
Overview of Deploying an NSF to Testing Server
There is more than one way to do this, I suppose! It all depends on what your template inheritance structure might be.
At the moment it is very simple for us: it is simply a design replace, and no template inheritance needs to be set. For example, if we are updating the testing database TestingDiscussion.nsf:
- Copy the built NSF, e.g. Discussion_15.nsf, from where Jenkins archived it to a folder in the Domino data directory where we keep templates, e.g. ‘C:\Domino\Data\TestTemplates\Discussion_15.nsf’
- Run nconvert.exe -d TestingDiscussion.nsf * TestTemplates\Discussion_15.nsf to update our testing version of Discussion to the latest version
This is not a very flexible solution; it rather depends on the Domino server being on the same machine as the build server.
More complicated versions would involve moving the nsf to another machine, setting template inheritance, replicating, running design refresh etc.
I have written a small helper, deployHelper.exe, with the Notes C API, which sets template names and can send console commands to servers. I have recently discovered Java Native Access, so I plan to re-write this helper in Java; it could even be made into a Jenkins plugin, I guess 🙂
In any case, I think deploying the NSF could be a whole series of blog posts; it is 11pm here, so I will leave that for another day 🙂
Overview of Deploying Plugins to Testing Server
To deploy the plugins to the testing server, Jenkins currently just moves the plugins and features into the <DominoDataDir>\domino\workspace\applications\eclipse directory.
It then uses my deployHelper.exe, which sends a ‘restart task http’ console command to the server.
For deploying plugins to production, we still do this manually. I simply have a mapped drive letter (U:) to the update sites folder on the build server (this is where Jenkins puts the successfully built plugins). I then open the production UpdateSite.nsf and import from the local update site, U:\develop.
For other options / ideas for automatic deployment, check out the Open Eclipse Update Site project on OpenNTF. Karsten Lehmann modified the normal Update Site so that it too can import plugins headlessly, via an agent. You could then use my deployHelper to restart the server!
So you can see there is quite a lot involved in setting up a build server! This post really is just an overview. I am happy to do some more detailed posts on specific parts, so please post a comment / ask a question.
It is quite a bit of effort, and yet another learning curve. The benefit of doing it is that you know the build process will be the same every time, and it is as simple as pushing a branch to GitHub.
Returning to our business stakeholder mentioned earlier, instead of
‘Why is it always broken?’
it is now
‘Did you push the latest changes?’
and we are a little bit less stressed 🙂
P.S. I originally drafted this post a few months ago. Here are some random notes I wrote at the time; I thought I would leave them in!
I haven’t really used Ant before, but I really, really like it now.
If you haven’t used it before: basically, you define a set of ‘targets’ which perform tasks such as moving files, running programs, and running Java. A target is a bit like a ‘method’, really.
I used an Ant script to define all the steps in building the plugins/NSFs (except for fetching the source code, which Jenkins already does).
For example, here is my ‘target’ which performs the build of the NSF. You can see it calls the PowerShell executable, passing in arguments for the execution policy.
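A minimal sketch of such a target (the script name BuildNsf.ps1 and its arguments are placeholders standing in for Egor’s script and whatever parameters it takes):

```xml
<!-- Sketch only: the PowerShell script name and its arguments are assumed. -->
<property environment="env"/>
<target name="buildNSF">
    <exec executable="powershell" failonerror="true">
        <arg value="-ExecutionPolicy"/>
        <arg value="Bypass"/>
        <arg value="-File"/>
        <arg value="${basedir}/BuildNsf.ps1"/>
        <!-- hypothetical arguments: the ODP folder and the target NSF name -->
        <arg value="${basedir}/odp_${env.BUILD_NUMBER}"/>
        <arg value="Discussion_${env.BUILD_NUMBER}.nsf"/>
    </exec>
</target>
```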
The great part about Ant is that you can test and run all the build steps on your own computer.
Headless Designer Notes
Make sure the default user is not prompted for a password.
Make sure the ‘Binary DXL’ setting for source control is the same as whatever your developers use.
I think we set Designer to only automatically import from the On-Disk Project, and not to auto-export to disk.
Turn off replicate-on-startup and any other startup tasks that will slow Designer down.
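One more setting worth mentioning: as far as I recall, headless builds also have to be switched on in the client’s notes.ini before Designer will accept them:

```ini
DESIGNER_AUTO_ENABLED=true
```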
I had never actually used Jenkins before, but found it very easy to install and use. If you are struggling to find documentation on some part of Jenkins, look up Hudson, as they are almost the same system. E.g. the Hudson book seemed pretty comprehensive, and even though it wasn’t an exact match for Jenkins, it was enough information to set me in a general direction.
I used the ‘Manage Plugins’ section to download the Git Plugin. I did notice a couple of *failed to install* messages on the dependent Credentials plugin, but after a restart the Git Plugin worked fine.
Don’t install as a Windows Service
At first we had Jenkins installed as a Windows service; this is its default and recommended setting.
This was working fine for building OSGi plugins, but it was a problem for NSF building, as Headless Designer is not really headless: it still needs to launch the GUI (it just minimizes it after launch). If you run Jenkins as a service, it will fail to launch Designer properly. I think some others have had success running it as a service and ticking a setting about interacting with the desktop, but I still had trouble.
As such, we have to start Jenkins from the console. We do this by navigating to the Jenkins directory (C:\Jenkins in my case) and then executing
java -jar jenkins.war
When we originally started it as a service, it used our Jenkins install directory C:\Jenkins as its home directory for all configuration etc. We had set up a few jobs, installed some plugins and done some server configuration.
However, when started from the console, Jenkins uses ~/.jenkins as the home directory for configuration, resulting in a brand-new installation. To override this, we simply set a JENKINS_HOME environment variable to C:\Jenkins, and Jenkins then returned to using this as its home directory.
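As a sketch of that fix (the path is an example; on Windows the equivalent is setting JENKINS_HOME to C:\Jenkins as a system environment variable):

```shell
# Point Jenkins at an explicit home directory before launching it,
# so the console start uses the same configuration as before.
export JENKINS_HOME=/opt/jenkins     # example path; ours is C:\Jenkins
echo "$JENKINS_HOME"
# java -jar jenkins.war              # would now use $JENKINS_HOME for configuration
```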
Here are all the installed Jenkins plugins on our Jenkins server. Some of them may have been installed as dependencies, some of them may have been installed automatically; I can’t remember! If you want to know more about one, ask me.
- Ant Plugin 1.2 – runs ant build scripts
- Copy Artifact Plugin 1.30 – allows you to copy built artifacts from one job to another
- Credentials Plugin 1.10 – I think the GitHub plugin uses it to store your username/password
- CVS Plugin 2.11
- Email-ext plugin – more options for sending notification emails
- External Monitor Job Type Plugin 1.2 – I think this was installed as a dependency of another plugin
- Git Client Plugin 1.6.4 – dependency for Git Plugin
- Git Plugin 2.0.4 – uses Git!
- GitHub API Plugin 1.44 –
- Github OAuth Plugin 0.14
- Hudson PowerShell plugin 1.2 – not sure I actually needed it, as I call PowerShell from within Ant, but at one stage I used this plugin to do it
- Javadoc plugin 1.1 – generates javadoc for you
- LDAP Plugin 1.8
- Mailer 1.8
- Matrix Authorization Strategy Plugin 1.1 – permissions for jenkins users and jobs etc
- Maven Project Plugin 2.1 – we don’t use (yet ?)
- OWASP Markup Formatter Plugin 1.0 – ??
- PAM Authentication Plugin 1.1
- promoted builds plugin 2.17 – allows you to take a successful build and ‘promote’ it for deployment; we don’t do this yet, but it looks like it could be a good strategy, slightly different to the git-flow system
- SCM API Plugin 0.2
- SSH Credentials Plugin 1.6.1
- SSH Slaves plugin 1.6
- Subversion plugin 2.2
- Token Macro plugin 1.10
- Translation Assistance Plugin 1.11
- Windows Slaves Plugin 1.0