Our current Deployment setup: GitHub + Jenkins + BuildXPages

I recently received a question from Patrick Kwinten about whether I am still using my BuildXPages project, and whether I have involved Jenkins in the setup.

The answer is yes, I still use BuildXPages on a daily basis, and as for Jenkins, I have been using it for almost 5 years to build and deploy our XPages projects. I decided to keep the BuildXPages documentation ‘agnostic’ about the choice of build server, so as to focus just on the library and the tasks it can achieve. The point was that you should be able to use any build server you like, which is why Jenkins was not really mentioned in the documentation.

It has been more than a year since my last blog post, so I thought that instead of responding to Patrick directly, it would be a good excuse to finally put another blog post out there! Maybe it will spark me into action a bit more and I might make some more posts after; we’ll see. I have to be honest: the constant ‘XPages is dead’ mantra from here, there, and everywhere, combined with a recently developed addiction to online chess, has had a negative effect on my contributions to the community!

What is our Current Setup?

This will just be a rough overview of our current setup. You can see a more detailed post on our setup from about 5 years ago here. Many of the details are the same, but I will go over it quickly here in a “stream of consciousness” so I can get this blog post out quickly.

All our XPages related work is done in one monolithic repository, hosted as a private repository on GitHub. Initially we had multiple repositories, but it was just way too much overhead to be switching branches and merging across multiple repositories. Moving to a single repository simplified everything greatly, and I’m very glad we did.

Our repository consists of about 15+ different NSFs and 20+ custom Java plugins.

All the NSFs are edited in Domino Designer and synchronised to the ODP using the Team Development NSF to ODP sync, and of course they are using Swiper!

All our Java plugins are developed using the Eclipse IDE. They include a custom Java framework (which has no dependency on XPages, so it can run as background tasks etc.) that provides the concept of a custom ‘application’ with ‘services’, ‘settings’ and POJO model configurations. Among many other things, the framework manages mapping the models to NotesDocuments and back.
Our plugins then also contain a custom XPages library, which provides the additional layer of our framework that does depend on XPages. It contains all the XPages controls that we have developed, other extensions that we have made on top of the Extension Library (things like the Notes-based JDBC provider), and our custom renderers for the SmartAdmin wrapbootstrap theme (v3), which is the base theme for our applications.

In addition to these base framework plugins, we have specific plugins for each custom ‘application’. These provide the custom java code which defines the actual business applications that we deliver.

Branching strategy

We initially went with git-flow because it had a pretty diagram explaining it, and it was built directly into Sourcetree, our Git client of choice. However, it quickly became apparent that the git-flow branching model does not suit the way we deliver our product.

Instead we moved to the GitHub flow method, in which there is one ‘master’ branch and everything else just hangs off it. Read the link for a better explanation, but this is a much simpler strategy that suits a ‘continuous deployment’ style of code release. Git-flow, I believe, better supports the situation where you have to maintain multiple versions of your product and need to retain the ability to apply hotfixes and patches to different versions.

Branch ready for Production!

So I have been working away on my branch and I am ready to deploy! I create a pull request on GitHub, which gives us the chance to review the changes and decide if they are ready. If everything is good to go, I merge the pull request, and now my code has made it into the master branch, ready to deploy.

Build It!

As mentioned earlier, we use Jenkins to build and deploy our XPages applications and plugins. We have a Windows machine set up with Jenkins running, a ‘headless’ Domino Designer installation, and an Eclipse installation.

Our Jenkins server used to be publicly accessible, and therefore received a ‘ping’ from GitHub to automatically trigger a build whenever the master branch was updated. But since our Jenkins machine is now only accessible from within our private network, it no longer receives the GitHub notification. You could also set up the Jenkins server to ‘poll’ GitHub for changes; however, I have found it is not a problem to just trigger the production build manually, because I usually do it immediately after merging the pull request anyway.

Here is a screenshot of our Jenkins homepage. There are a bunch of jobs on there, but the truth is we only really use three of them: one for building and two for deploying.

The build job is called ‘Horizon’, as this is our name for our system. I set it up as a multibranch pipeline job because I thought I might want both a ‘production’ and a ‘staging’ build, but in truth we only ever do a production build, so the multibranch setup was unnecessary.

Looking at the page for the production branch build, you can see the Jenkins ‘pipeline’ is split into checkout, build plugins, and build NSFs stages.

Here you can see something that happens occasionally: the build fails when I introduce new properties to some XPages controls (which I only do maybe once a month). What happens is that the plugins build successfully with the new control properties, but the headless Domino Designer NSF build triggers immediately afterwards and does not have these new plugins installed, so the NSFs fail. What you can’t see is that after build #387, I manually re-installed the newly built XPages library into the headless Domino Designer and triggered another build. It usually returns to a successful build straight away, but here I got a random error on build #388 for some reason, and running the build again as #389 was successful. This is a little annoying, but most deployments don’t involve new control properties, so it is a rare occurrence.

Jenkins Pipeline file

The Jenkins ‘pipeline’ build is a newer way of defining the build. Here is the pipeline file that we are using. You can see it simply triggers two Ant tasks and reports the success or failure via email.
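For illustration, a minimal declarative Jenkinsfile along those lines might look like the following. This is a hedged sketch, not our exact file: the stage names mirror the pipeline stages mentioned above, but the Ant invocation, email address and mail configuration are placeholders.

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Multibranch pipeline: check out the branch this build is for
                checkout scm
            }
        }
        stage('Build Plugins') {
            steps {
                // Windows build machine, so 'bat' rather than 'sh'
                bat 'ant -f build.xml distplugins'
            }
        }
        stage('Build NSFs') {
            steps {
                bat 'ant -f build.xml buildNsfs'
            }
        }
    }
    post {
        failure {
            // Placeholder address; requires the Jenkins Mailer plugin
            mail to: 'dev-team@example.com',
                 subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                 body: "See ${env.BUILD_URL} for the console output."
        }
    }
}
```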

Ant Tasks

The two Ant tasks triggered from the Jenkinsfile (distplugins and buildNsfs) are defined in a custom build.xml script that is committed to the repository. This script makes use of the functionality of the BuildXPages project.

Building Plugins

The distplugins task is just used to sequence the other tasks in the script.

This shows the copyPlugins task, which simply copies the source of the plugins to a working directory in preparation for Eclipse PDE Build’s headless plugin/feature build.
This is the buildPlugins custom task, which calls the BuildXPages buildPlugins task; this is where Eclipse PDE Build does its thing.
After building the plugins, I unzip them, because PDE Build zips them up.
Lastly, I copy the newly built plugins to an ‘update site’ so I always have the latest plugins available in the same place on the file system. This is where I will deploy them from when we deploy.
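As a rough sketch, the sequencing described above could look something like this in build.xml. The target and property names below are illustrative, and the buildPlugins element stands in for the BuildXPages task of that name; its real attributes are documented in the BuildXPages project, so I am not reproducing them here.

```xml
<!-- distplugins just sequences the other targets via 'depends' -->
<target name="distplugins"
        depends="copyPlugins, buildPlugins, unzipPlugins, copyToUpdateSite"/>

<target name="copyPlugins">
    <!-- Stage the plugin source into a working dir for the headless PDE build -->
    <copy todir="${work.dir}/plugins">
        <fileset dir="${basedir}/plugins"/>
    </copy>
</target>

<target name="buildPlugins">
    <!-- BuildXPages task wrapping Eclipse PDE Build; attributes omitted -->
    <buildPlugins/>
</target>

<target name="unzipPlugins">
    <!-- PDE Build zips its output, so unzip it for the next step -->
    <unzip src="${work.dir}/build/plugins.zip" dest="${work.dir}/built"/>
</target>

<target name="copyToUpdateSite">
    <!-- Keep the latest plugins in a fixed location to deploy from later -->
    <copy todir="${updatesite.dir}">
        <fileset dir="${work.dir}/built"/>
    </copy>
</target>
```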

Building NSFs

The buildNsfs task coordinates the building of all the NSFs.

The ‘nsfbuild’ macro is defined within the same custom build.xml Ant script; it calls the ‘buildnsf’ task from the BuildXPages project.
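A sketch of what that macro arrangement might look like is below. The macro attribute names, ODP paths and template names are all placeholders, and the buildnsf element stands in for the BuildXPages task, whose real attribute list I am not reproducing here.

```xml
<!-- Illustrative macro; the real attributes belong to BuildXPages' buildnsf task -->
<macrodef name="nsfbuild">
    <attribute name="odp"/>
    <attribute name="template"/>
    <sequential>
        <buildnsf odp="@{odp}" targetdbname="@{template}"/>
    </sequential>
</macrodef>

<target name="buildNsfs">
    <!-- One line per NSF; names are placeholders -->
    <nsfbuild odp="nsf/app1/odp" template="app1_master.ntf"/>
    <nsfbuild odp="nsf/app2/odp" template="app2_master.ntf"/>
</target>
```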

If there are any errors in the NSF build, they will show in the console output of the Jenkins build.

If there were no errors during plugins and NSF building, then we get a green status and we are ready to deploy!

As you can see, our build takes about 4 minutes. Our NSFs build incrementally thanks to the Headless Designer plugin that comes with the BuildXPages project. Otherwise, a full build of every NSF would be performed every time, which would take much longer.

Deploying!

We have two Jenkins jobs for deploying: one to upload our plugins to the NSF update site, and another to run the design refresh of the templates on the production server.

Our servers are set to restart every morning at 6am, so we actually schedule these 2 tasks to run every morning at 5:15am for plugins and then 5:30am for templates refresh.

We used to just restart after the 1am design refresh but we had some people from other offices still using our server at that time, so 6am was a better time for our company.

Import Master Plugins

The ‘Import Master Plugins’ job is a custom Jenkins job that is not linked to any source repository; it simply contains one file, the Ant script.

The Ant script uses the BuildXPages importplugins task to import the plugins, from their final location in the earlier build job, into an update site NSF on the production server.
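Roughly, such a standalone script could be as small as the sketch below. The server name, file path and attribute names are placeholders; only the importplugins task itself comes from BuildXPages, and its real attributes are in that project’s documentation.

```xml
<project name="import-master-plugins" default="importMasterPlugins">
    <!-- Attribute names are illustrative; see the BuildXPages docs for the real ones -->
    <target name="importMasterPlugins">
        <importplugins
            updatesite="C:\builds\updatesite"
            server="Production/Acme"
            database="updatesite.nsf"/>
    </target>
</project>
```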

Refresh Data Templates

All the NSFs that are ‘built’ in the Headless Domino Designer have a template name with a suffix of ‘_master’ to indicate they are the master branch design. These NSFs are considered to be ‘local’ to the Jenkins machine.

All the production templates are set to inherit from their templates with suffix ‘_master’.

So for refreshing the production templates, we simply use the BuildXPages refreshdbdesign task to refresh from our ‘local’ templates to the production server.
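Conceptually, the refresh step boils down to one task call per template. In the sketch below, the target name, server name and database file names are placeholders around BuildXPages’ refreshdbdesign task, whose exact attributes I am not reproducing.

```xml
<target name="refreshTemplates">
    <!-- Refresh each production template from its local '_master' template -->
    <refreshdbdesign server="Production/Acme" database="app1.ntf"/>
    <refreshdbdesign server="Production/Acme" database="app2.ntf"/>
</target>
```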

Conclusion

Hopefully that was a good overview of how we are managing and deploying our XPages applications using BuildXPages + Jenkins. I’m sure I missed something here or there, so let me know if you have any questions!
