Gradle projects as Debian packages

Gradle is a great tool for setting up and building your Java projects. If you want to deliver them for Ubuntu or other Debian-based distributions, you should consider building .deb packages. Because Debian packaging has a quite steep learning curve, I want to give you a step-by-step guide to get you up to speed.

Prerequisites

You have a project that can be built by Gradle using the Gradle wrapper. In addition you have a Debian-based system where you can install and use the packaging utilities needed to create the package metadata and the final packages.

To prepare the Debian system you have to install some packages:

sudo apt install dh-make debhelper javahelper

Generating packaging infrastructure

First we have to generate all the files necessary to build full-fledged Debian packages. Fortunately, there is a tool for that called dh_make. To correctly prefill the maintainer name and e-mail address we have to set two environment variables. Of course, you could change them later…

export DEBFULLNAME="John Doe"
export DEBEMAIL="john.doe@company.net"
cd $project_root
dh_make --native -p $project_name-$version

Choose “indep binary” (“i”) as the type of package because Java is architecture independent. This will generate the debian directory containing all the files for creating .deb packages. You can safely ignore all of the files ending with .ex as they are examples for features like manpage generation, additional pre- and post-installation scripts and many other aspects.
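
As a rough orientation, the generated debian directory will contain something like the following (listing abbreviated; the exact set of files depends on your dh_make version):

changelog  compat  control  copyright  rules  source/
README.Debian  manpage.1.ex  postinst.ex  postrm.ex  preinst.ex  prerm.ex  ...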

We will concentrate on only two files that will allow us to build a nice basic package of our software:

  1. control
  2. rules

Adding metadata for our Java project

In the control file fill in all the properties that are relevant for your project. They will help your users understand what the package contains and whom to contact in case of problems. You should add the JRE to Depends, e.g.:

Depends: openjdk-8-jre, ${misc:Depends}

If you have other dependencies that can be resolved by packages of the distribution, add them there, too.
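
For orientation, a filled-in control file could look roughly like this sketch (package name, description and version constraints are placeholders, assuming the package is called my-project):

Source: my-project
Section: java
Priority: optional
Maintainer: John Doe <john.doe@company.net>
Build-Depends: debhelper (>= 9), javahelper, default-jdk
Standards-Version: 3.9.8

Package: my-project
Architecture: all
Depends: openjdk-8-jre, ${misc:Depends}
Description: My project server application
 A longer description of the package, indented
 by one space as required by the control file format.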

Define the rules for building our Java project

The most important file is the rules makefile which defines how our project is built and what the resulting package contents consist of. For this to work with Gradle we use the javahelper extension for debhelper and override some targets to tune the results. Key in all this is that the directory debian/$project_name/ contains a directory structure with all the files we want to install on the target machine. In our example we will put everything into the directory /opt/my_project.

#!/usr/bin/make -f
# -*- makefile -*-

# Uncomment this to turn on verbose mode.
#export DH_VERBOSE=1

%:
	dh $@ --with javahelper # use the javahelper extension

override_dh_auto_build:
	export GRADLE_USER_HOME="`pwd`/gradle"; \
	export GRADLE_OPTS="-Dorg.gradle.daemon=false -Xmx512m"; \
	./gradlew assemble; \
	./gradlew test

override_dh_auto_install:
	dh_auto_install
# here we can install additional files like an upstart configuration
	export UPSTART_TARGET_DIR=debian/my_project/etc/init/; \
	mkdir -p $${UPSTART_TARGET_DIR}; \
	install -m 644 debian/my_project.conf $${UPSTART_TARGET_DIR};

# additional install target of javahelper
override_jh_installlibs:
	LIB_DIR="debian/my_project/opt/my_project/lib"; \
	mkdir -p $${LIB_DIR}; \
	install lib/*.jar $${LIB_DIR}; \
	install build/libs/*.jar $${LIB_DIR};
	BIN_DIR="debian/my_project/opt/my_project/bin"; \
	mkdir -p $${BIN_DIR}; \
	install build/scripts/my_project_start_script.sh $${BIN_DIR};

Most of the above should be self-explanatory. Here are some things that cost me some time and that I found noteworthy:

  • Newer Gradle versions use a lot of memory and try to start a daemon, which does not help you on your build slaves (if you are using a continuous integration system)
  • The rules file is in GNU make syntax and executes each command in a separate shell. So you have to make sure everything is on “one line” if you want to access environment variables, for example. This is achieved by using \ as a continuation character.
  • You have to escape the $ (writing $$) to use shell variables.

Summary

Debian packaging can be daunting at first, but once you use and understand the tools you can build new packages of your projects in a few minutes. I hope this guide helps you find a starting point for your Gradle-based projects.

Advanced deb-packaging with CMake

CMake has become our C/C++ build tool of choice because it provides good cross-platform support and very reasonable IDE (Visual Studio, CLion, QtCreator) integration. Another very nice feature is the included packaging support using the CPack module. It allows you to create native deployable artifacts for a plethora of systems, including NSIS installers for Windows, RPM and deb packages for Linux, DMG for Mac OS X and a couple more.

While all these binary generators share some CPACK_ variables, each generator also has specific variables to use exclusive features or requirements of its packaging system.
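
As a rough sketch of the mechanism (project name, sources and install destination are made up), enabling CPack in a CMakeLists.txt can be as simple as:

# CMakeLists.txt (sketch): CPack packages whatever the install() rules put into place
cmake_minimum_required(VERSION 3.5)
project(myproject VERSION 1.2.0)

add_executable(myproject main.cpp)
install(TARGETS myproject DESTINATION bin)

set(CPACK_GENERATOR "DEB")        # could also be "RPM", "NSIS", "DragNDrop", ...
set(CPACK_PACKAGE_CONTACT "John Doe <john.doe@company.net>")
include(CPack)

Running cpack (or building the package target) in the build directory then produces the artifact for the selected generator.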

Deb-packaging features

The Debian package management system is used not only by Debian but also by Ubuntu, Raspbian and many other Linux distributions. In addition to dependency handling and versioning, packagers can use several other features, namely:

  • Specifying a section for the packaged software, e.g. Development, Games, Science etc.
  • Specifying package priorities like optional, required, important, standard
  • Specifying the relation to other packages like breaks, enhances, conflicts, replaces and so on
  • Using maintainer scripts to customize the installation and removal process like pre- and post-install, pre- and post-removal
  • Dealing with configuration files to protect end user customizations
  • Installing and linking files and much more without writing shell scripts using ${project-name}.{install | links | ...} files

All these features make the software easier to package and easier to manage for your end users.

Using deb-features with CMake

Many of the mentioned features are directly available as appropriately named CMake variables, all starting with CPACK_DEBIAN_. I would like to specifically mention the CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA variable, where you can set the maintainer scripts, and one of my favorite features: conffiles.
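
A sketch of how the deb-specific part could look, assuming the maintainer scripts and the conffiles file live in a packaging/ subdirectory of the source tree (set these variables before include(CPack)):

set(CPACK_DEBIAN_PACKAGE_SECTION "science")
set(CPACK_DEBIAN_PACKAGE_PRIORITY "optional")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libstdc++6")
# hand maintainer scripts and the conffiles file (described below) to the deb generator
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA
    "${CMAKE_CURRENT_SOURCE_DIR}/packaging/postinst;${CMAKE_CURRENT_SOURCE_DIR}/packaging/conffiles")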

Deb protects files under /etc from accidental overwriting by default. If you want to protect files located somewhere else, you specify them in a file called conffiles, each on a separate line:

/opt/myproject/myproject.conf
/opt/myproject/myproject.properties

If the user made changes to these files she will be asked what to do when updating the package:

  • keep the own version
  • use the maintainer version
  • review the situation and merge manually.

For extra safety, files like myproject.conf.dpkg-dist and myproject.conf.dpkg-old are created so that no changes are lost.

Unfortunately, I did not get the linking feature working without using maintainer scripts. Nevertheless, I advise you to use CMake for your packaging work instead of packaging the native debhelper way.

It is much more natural for a CMake-based project and you can reuse much of your metadata for other target platforms. It also shields you from a lot of the gory details of debian packaging without removing too much of the power of deb-packages.

Debian packaging against the rules

In a former post I talked about packaging your own software in the most convenient and natural way for the target audience. Think of an MSI or .exe installer for Microsoft Windows, distribution-specific packages for Linux (maybe even by providing your own repositories) or smartphone apps via the standard app stores. In the case of Debian packages there are quite strict rules about filesystem layout, licensing and signatures. This is all fine if you want to get your software upstream into the official repositories.

If you are developing commercial software for specific clients things may be different! I suggest doing what serves the client's user experience (UX) best, even with regard to packaging for Debian or Linux.

Packaging for your users

Packaging for Linux means you need to make sure that your dependencies and versioning are well defined. If you miss out here, problems will arise when updating your software. Other things you may consider, even if they are against the rules:

  • Putting your whole application with executables, libraries, configuration and resources under the same prefix, e.g. /opt/${my_project} or /usr/local/${my_project}. That way the user finds everything in one place instead of scattered around in the file system.
    • On Debian this has some implications, like the need to use the conffiles feature for your configuration files
  • Package together what belongs together. Oftentimes it has no real benefit to split headers, libraries, executables etc. into different packages. Fewer packages are easier for your clients to handle.
  • Provide integration with operating system facilities like systemd or the desktop. Such a seamless integration eases use and administration of your software as no “new tricks” have to be learned.
    • A simple way for systemd is a unit file that calls an executable with an environment file for configuration
  • Adjust the user's PATH or put links to your executables in well-known directories like /usr/bin. Running your software from the command line should be easy and come with sensible defaults. Show sample usages to the user so they can apply “monkey see – monkey do”.

Example of a unit file:

[Unit]
Description=My Server

[Service]
EnvironmentFile=/opt/my_project/my-server.env
ExecStart=/usr/bin/my-server

[Install]
WantedBy=multi-user.target

In the environment file you can point to other configuration files like XML configs or the like if need be. Environment variables in general are a quite powerful way to customize the behaviour of a program on a per-process basis, so make sure your start scripts or executables support them for manual experimentation, too.
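
Such an environment file is a plain list of KEY=value lines; a hypothetical my-server.env could look like this (all values are made up):

# /opt/my_project/my-server.env - read via EnvironmentFile= in the unit above
MY_SERVER_PORT=8080
MY_SERVER_CONFIG=/opt/my_project/etc/my-server.xml
MY_SERVER_LOG_LEVEL=info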

Possible additional preparations

If you plan to deliver your packages without providing your own repository and want to enable your clients to install them easily themselves, you can aid them further.

If the target machines are few and can easily be prepared by you, install tools like gdebi that allow installation using a double click and a graphical interface.

If the target machines are numerous, implement automation with tools like Ansible and ensure unattended installation/update procedures.

Point your clients to easy tools they feel comfortable with. That could of course also be a command line utility like aptitude.

What to keep in mind

There is seldom a one-size-fits-all in custom software. Do what fits the project and your target audience best. Do not fear to break some rules if it improves the overall UX of your service.

Functional tests for Grails with Geb and geckodriver

Previously we had many functional tests using the selenium-rc plugin for Grails. Many were initially recorded using Selenium IDE, then refactored to be more maintainable. These refactorings introduced “driver” objects used to interact with common elements on the pages and runners which improved the API for walking through a multipage process.

Selenium-RC was deprecated quite a while ago and support for Firefox broke every once in a while. Finally we were forced to migrate to the current state of the art in Grails functional testing: Geb.

Generally I can say it is a major improvement over the old Selenium API. The page concept is similar to our own drivers, with some nice features:

  • At-Checkers provide a standardized way of checking if we are at the expected page
  • Default and custom per page timeouts using atCheckWaiting
  • Specification of relevant content elements using a jQuery-like syntax and support for CSS selectors
  • The so-called modules ease the interaction with form elements and the like
  • Much better error messages

While Geb is a real improvement over Selenium it comes with some quirks. Here is some advice that may help you in successfully using Geb in the context of your (Grails) web application.

Cross-platform testing in Grails

Geb (or more specifically the underlying WebDriver component) requires a geckodriver binary to work correctly with Firefox. This binary is naturally platform-dependent. We have a setup with mostly Windows machines for the developers and Linux build slaves and target systems. So we need binaries for all required platforms and have to configure them accordingly. We have simply put them into a folder in our project and added the following configuration to the test environment in Config.groovy:

environments {
  test {
    def basedir = new File(new File('.', 'infrastructure'), 'testing')
    def geckodriver = 'geckodriver'
    if (System.properties['os.name'].toLowerCase().contains('windows')) {
      geckodriver += '.exe'
    }
    System.setProperty('webdriver.gecko.driver', new File(basedir, geckodriver).canonicalPath)
  }
}

Problems with File-Uploads

If you are plagued with file uploads not working, it may be a problem with certain Firefox versions. Even though the fix has landed in Firefox 56, I want to add the workaround in case you still experience problems. Add the following to your GebConfig.groovy:

import org.openqa.selenium.firefox.FirefoxDriver
import org.openqa.selenium.firefox.FirefoxProfile

driver = {
  FirefoxProfile profile = new FirefoxProfile()
  // Workaround for issue https://github.com/mozilla/geckodriver/issues/858
  profile.setPreference('dom.file.createInChild', true)
  new FirefoxDriver(profile)
}

Minor drawbacks

While the Geb DSL is quite readable and allows concise tests, the IDE support is not there yet. You do not get much code assistance when writing the tests and calling functions of the page objects like in our own, code-based solution.

Conclusion

After taking the first few hurdles, writing new functional tests with Geb really feels good and is a breeze compared to our old Selenium tests. Converting the old ones will be a lot of work and will only happen on a case-by-case basis, but our coverage with Geb is ever increasing.

Evolution of programming languages

Programming languages evolve over time. They get new language features and their standard library is extended. Sounds great, doesn’t it? We all know that not going forward means you go backward.

But I observe very different approaches when looking at the several programming ecosystems we are using.

Featuritis

Java and especially C# added more and more “me too” features release after release, turning relatively lean languages into quite complex multi-paradigm languages. They started object-oriented and added generics, functional programming features, declarative programming (LINQ in C#) and different UI toolkits (AWT, Swing, JavaFX in Java; WinForms, WPF in C#) to the mix.

Often the new language features add their own set of quirks because they are an afterthought and not designed carefully enough.

For me, this lack of focus makes said language less attractive than more current approaches like Kotlin or Go.

In addition, deprecation often has no effect (see Java), where 20-year-old code and style still works, which increases the burden further. While this is great from a business perspective in that your effort to maintain compatibility is low, it does not help your code base. Different styles and old ways of doing something tend to remain forever.

Revolution

In Grails (I know, it is not a programming language, but it has its own ecosystem) we see more of a revolution. The core concept as a full stack framework stays the same but significant components are changed quite rapidly. We have seen many changes in technology like Jetty to Tomcat, Ivy to Maven, Selenium-RC to Geb, Gant to Gradle, and the list goes on.

This causes many, sometimes subtle, changes in behaviour that are a real pain when maintaining larger applications over many years.

Framework updates are often a time-consuming hassle but if you can afford it your code base benefits and will eventually become cleaner.

Clean(er) evolution

I really like the evolution in C++. It was relatively slow – many will argue too slow – in the past but it has picked up pace in the last few years. The goal is clearly stated and only features that support it make it in:

  • Make C++ a better language for systems programming and library building
  • Make C++ easier to teach and learn
  • Zero-cost abstractions
  • Better tool support

If you cannot make it zero-cost your chances are slim to get your feature in…

C at its core did not change much at all and remained focused on its merits. The upgrades mostly contained convenience features like line comments, additional data type definitions and multithreading.

Honest evolution – breaking backwards compatibility

In Python we have something I would call “honest evolution”. Python 3 introduced some breaking changes with the goal of a cleaner and more consistent language. Python 2 and 3 are incompatible, so the distinction in the major version number is fair. I like this approach of moving forward as it clearly communicates what is happening and gets rid of the sins of the past.

The downside is that many systems still come with both a Python 2 and a Python 3 interpreter and accompanying libraries. Fortunately, there are some options and tools to mitigate some of the incompatibilities in your code, like the __future__ module and python-six.
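
A tiny sketch of what such mitigation can look like (assuming the six package is installed; the names are purely illustrative):

# runs unchanged on Python 2 and Python 3
from __future__ import print_function, division   # opt in to Python 3 semantics on Python 2
import six

text = six.text_type("hello")            # unicode on Python 2, str on Python 3
print("running on Python", 2 if six.PY2 else 3)
print("true division:", 7 / 2)            # 3.5 on both interpreters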

At some point in the future (expected in 2020) there will only be support for Python 3. So start making the switch.

Do most language make false promises?

Some years ago I stumbled over this interesting article about C being the most effective programming language and the one making the fewest false promises. Essentially Damien Katz argues that the simplicity of C and its flaws lead to simple, fast and easy-to-reason-about code.

C is the total package. It is the only language that’s highly productive, extremely fast, has great tooling everywhere, a large community, a highly professional culture, and is truly honest about its tradeoffs.

-Damien Katz about the C Programming language

I am a Java developer most of the time but I also have reasonable experience in C, C++, C#, Groovy and Python and some other languages to a lesser extent. Damien’s article really made me think for quite some time about the languages I have been using. I think he is right in many aspects and has really good points about the tools and communities around the languages.

After quite some thought I do not completely agree with him.

My take on C

At one time I really liked the simplicity of C. I wrote gtk2hack in my spare time as an exercise and definitely see interoperability and a quick “build, run, debug” cycle as big wins for C. On the other hand I think that while it has a place in hardware and systems programming, many other applications have completely different requirements.

  • A standardized ABI means nothing to me if I am writing a service with a REST/JSON interface or a standalone GUI application.
  • Portability means nothing to me if the target system(s) are well defined and/or covered by the runtime of choice.
  • Startup times mean nothing to me if the system is only started once every few months and development is still fast because of hot-code replacement or other means.
  • etc.

But I am really missing more powerful abstractions and better error handling or resource management features. Data structures and memory management are a lot more painful than in other languages. And this is not (only) about garbage collection!

C++ especially has made big steps in the right direction in the last few years. Each new standard release provides additional features making code more readable and less error-prone. With zero-cost abstractions at the core of language evolution and the secondary aim of ease of use, I really like what will come to C++ in the future. And it has a very professional community, too.

Aims for the C++11 effort:

  • Make C++ a better language for systems programming and library building
  • Make C++ easier to teach and learn

-Bjarne Stroustrup, A Tour of C++

What we can learn from C

Instead of looking down at C and pointing at its flaws we should look at its strengths and our own weaknesses/flaws. All languages and environments I have used to date have their own set of annoyances and gotchas.

Java people should try building simple things and have a keen eye on dependencies, especially because the ecosystem is so rich and crowded. Also take care of resource management – the garbage collector is only half the deal.

Scala and C++ people should take a look at ABI stability and interoperability in general. Their compile times and “build, run, debug” cycles have much room for improvement, to say the least.

C# may look at simplicity instead of wildly adding new features, creating a language without opinion and a plethora of ways of implementing the same stuff. Either you ban features or you have to know them all to understand the code in a larger project.

Conclusion

My personal answer to the title of this blog: Yes, they make false promises. But they have a lot to offer, too.

So do not settle with the status quo of your language environment or code style of choice. Try to maintain an objective perspective and be aware of the weaknesses of the tools you are using. Most platforms improve over time and sometimes you have to re-evaluate your opinion regarding some technology.

I have preferred C++ to C for some time now and have not looked back yet. But I also constantly try different languages, platforms and frameworks and try to maintain a balanced view. There are often good reasons to choose one over the other for a particular project.

 

Packaging Python projects for Debian/Ubuntu

Deployment of software using built-in software management tools is very convenient and provides a nice user experience (UX) for the users. For Debian-based Linux distributions like Ubuntu, packaging software in .deb packages is the way to go. So how can we prepare our Python projects for packaging as a deb package? The good news is that Python is supported out-of-the-box by the Debian package build system.

Alternatively, you can use the distutils extension stdeb if you do not need complete flexibility in creating the packages.

Basic python deb-package

If you are using setuptools/distutils for your Python project, Debian packaging consists of editing the package metadata and adding --with python to the rules file. For a nice headstart we can generate templates of the Debian metadata files using two simple commands (the dh-make package is needed for dh_make):

# create a tarball with the current project sources
python setup.py sdist
# generate the debian package metadata files 
dh_make -p ${project_name}_${version} -f dist/${project_name}-${version}.tar.gz 

You have to edit at least the control file, the changelog and the rules file to build the Python package. In the rules file the make target % is the crucial point and should include the flag to build a Python project:

# main packaging script based on dh7 syntax
%:
	dh $@ --with python

After that you can build the package by issuing dpkg-buildpackage.
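
For a local, unsigned test build the invocation could look like this:

# -us -uc skip signing the source package and the .changes file
dpkg-buildpackage -us -uc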

The caveats

The Debian packaging system is great at complaining about non-conformant aspects of your package. It demands digital signatures, correct file and directory names including version strings etc. Unfortunately, it is not very helpful when you make packaging mistakes resulting in empty, incomplete or broken packages.

Issues with setup.py

The setup.py build script has to reside on the same level as the debian directory containing the package metadata. The packaging tools will not tell you if they cannot find the setup script. In addition, the build will always run setup.py using Python 2, even if you specified --with python3 in the rules file.

Packaging for specific python versions

If you want better control over the target Python versions for the package you should use Pybuild. You can do this with a little change to the rules file, e.g. a python3-only build using Pybuild:

# main packaging script based on dh7 syntax
%:
	dh $@ --with python3 --buildsystem=pybuild

For Pybuild to work it is crucial to add the needed Python interpreter(s), besides the mandatory build dependency dh-python, to the Build-Depends of the control file. For a python3-only build it could look like this:

Build-Depends: debhelper (>=9), dh-python, python3-all
...
Depends: ${python3:Depends}

Without the dh-python build dependency Pybuild will silently do nothing. Getting the build dependencies wrong will create incomplete or broken packages. Take extra care to get this right!

Conclusion

Debian packaging looks quite intimidating at first because there are so many ways to build a package. Many different tools can ease package creation but they also add confusion. Packaging Python software is done easily if you know the quirks. The Python examples from the Guide for Debian Maintainers are certainly worth a look!