GNOME has a disturbing, almost paralytic dependence on the GNU Autotools. I’ve already ranted about weird tools once tonight, and yet, here we go again.
Reading through the documentation for PyGI is annoying, mainly because it makes a very silly recommendation: use Autotools for new projects.
For those who live in a sane world where build systems actually work, the Autotools mess is an ancient attempt at using M4 (but not quite standard M4) to generate Bourne shell (but not quite standard Bourne shell) and Makefiles (but not quite standard Makefiles).
Here’s the catch: it was new in the mid-1990s. It’s nearly two decades old, and backwards- and forwards-incompatible, in a neurotic attempt at building a system that can generate totally cross-platform build tools.
In theory, this is a great idea. Except, of course, for the fact that standard M4 won’t work; you have to use GNU M4 to run the Autotools, run Autoconf configure scripts with GNU Bash (the “Bourne-again” shell), and compile Automake Makefiles with GNU Make. Oh, and all of them are themselves built with Autotools.
The Autotools are two main applications: Autoconf and Automake. Autoconf is a good way to take 6 kiB, semi-legible text files, and produce 200+ kiB shell scripts that are so densely packed with shell compatibility wrappers and system feature detection hacks they can take minutes to run.
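To get a feel for what a configure script does with those minutes: it compiles, links, and runs dozens of tiny test programs to answer questions like “does this system have fork()?”, caching the answers in config.cache. Here is a toy sketch of the idea in Python; the symbol-lookup shortcut via ctypes is my own stand-in, as real Autoconf does all of this with shell and a C compiler:

```python
import ctypes

# Symbols already linked into this process; libc is among them on Unix.
libc = ctypes.CDLL(None)

cache = {}  # the moral equivalent of config.cache

def check_func(name):
    """Roughly what AC_CHECK_FUNCS decides: does libc export this symbol?
    Probe once and remember the answer -- which is exactly what configure
    scripts fail to do across a recursive build."""
    if name not in cache:
        cache[name] = hasattr(libc, name)
    return cache[name]

print(check_func("fork"))   # True on any Unix
print(check_func("rfork"))  # False outside the BSDs
```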
Automake is an even better way to turn 3 kiB, nearly POSIXly correct Makefiles into 600+ kiB, somewhat POSIXly correct Makefile templates, which are then turned, via the magic of sed(1), into Makefiles so thoroughly obfuscated that it is impossible to determine their function.
Along with them, there’s the magic that is libtool, which is probably the most excusable of the lot simply because it has to abstract away the more stupid differences between various platforms in terms of shared (“dynamic”) and static library building. Even then, it’s another 200-300 kiB of shell scripting line noise.
There are so many bizarre and exotic extensions that it’s non-trivial to port anything away from them. And yet nearly any build system one could name does better.
pkg-config is a good start: it provides a bunch of preconfigured files so that applications know how to communicate with certain libraries. This, for instance, is one such pkg-config file for Gtk+, and it’s fairly straightforward to see how it can be trivially parsed for information about how to link against Gtk+ 3.0.
prefix=/usr
exec_prefix=/usr
libdir=/usr/lib64
includedir=/usr/include
targets=x11 broadway wayland
gtk_binary_version=3.0.0
gtk_host=x86_64-redhat-linux-gnu
Name: GTK+
Description: GTK+ Graphical UI Library
Version: 3.8.8
Requires: gdk-3.0 atk cairo cairo-gobject gdk-pixbuf-2.0 gio-2.0
Requires.private: atk atk-bridge-2.0 pangoft2 gio-unix-2.0
Libs: -L${libdir} -lgtk-3
Cflags: -I${includedir}/gtk-3.0
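To show just how trivially parseable that format is, here’s a minimal sketch of a .pc reader in Python. It only handles variable expansion; real pkg-config also resolves Requires recursively, merges private flags, and deals with escaping:

```python
import re

def _expand(value, variables):
    # Substitute ${var} references using earlier variable definitions.
    return re.sub(r"\$\{(\w+)\}", lambda m: variables.get(m.group(1), ""), value)

def parse_pc(text):
    """Split a .pc file into variables (name=value) and fields (Name: value)."""
    variables, fields = {}, {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        if "=" in line and (":" not in line or line.index("=") < line.index(":")):
            name, _, value = line.partition("=")
            variables[name.strip()] = _expand(value.strip(), variables)
        elif ":" in line:
            name, _, value = line.partition(":")
            fields[name.strip()] = _expand(value.strip(), variables)
    return variables, fields

sample = """\
prefix=/usr
libdir=/usr/lib64
includedir=/usr/include

Name: GTK+
Libs: -L${libdir} -lgtk-3
Cflags: -I${includedir}/gtk-3.0
"""
variables, fields = parse_pc(sample)
print(fields["Libs"])    # -L/usr/lib64 -lgtk-3
print(fields["Cflags"])  # -I/usr/include/gtk-3.0
```

pkg-config itself is essentially this plus dependency resolution: pkg-config --cflags --libs gtk+-3.0 prints the merged flags for Gtk+ and everything in its Requires chain.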
But we’re already beyond this. imake, the grand old regent, used the C preprocessor, cpp, in much the same way that Autotools uses M4. imake knew how systems were configured at its build time (in much the same way that CMake does), so it was trivially easy to expand a bunch of macros.
To illustrate the difference, compare the build times of X.org’s X11R6.9 and X11R7.0, which were identical source trees except for the build system. 6.9 used imake, and build-system overhead was tiny. 7.0 used Autotools, and each configure run took about a minute, maybe a minute and a half, on my systems, as it probed the same damn things over and over again. It’s really impressive for some of the userland applications, like xev or xfontsel, where the compile time is a tenth of the time Autotools takes to run.
CMake, as I’ve previously alluded to, works similarly, except that it does do a degree of runtime detection. It’s sufficiently hybridised that build times are very small anyway, which it trades off against a long pre-configuration phase in its second-stage build.
Ah, and then there’s the old adage, heralded by Peter Miller in AUUG 1997: recursive Make is harmful, which is true now more than ever when one looks at the recommended way to build projects with Autotools. Considering huge Autotools use cases, like Mozilla Gecko, demonstrates the problem: incremental builds from the top level take huge amounts of time simply because the hierarchy of Makefiles that need to be parsed is so complex.
Chromium’s use of GN (migrated from GYP) and Ninja demonstrates a good compromise, simply because Ninja was built intending to be a stripped-down Make on steroids, and because both GYP and GN are so much more flexible than Autotools. And GN is reportedly derived from a Google internal build tool, which takes moments to run because, like imake, it knows every possible build environment, and doesn’t need to sit around for a prolonged period trying to determine what’s installed.
FreeBSD recently put together a config.site, which is a filthy kludge to make Autotools perform better by giving it answers to questions it should already have. It’s similar to a config.cache, except it still falls over because it only covers base-system libraries.
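For reference, a config.site is just a shell fragment that every configure script sources before running its own checks; pre-seeding ac_cv_* cache variables makes it skip the corresponding probes. A sketch (the particular variable names here are illustrative, following the standard Autoconf naming scheme):

```shell
# /usr/local/share/config.site -- sourced by every configure run
ac_cv_header_stdlib_h=yes    # skips "checking for stdlib.h..."
ac_cv_func_fork_works=yes    # skips "checking whether fork works..."
test -z "$CFLAGS" && CFLAGS="-O2 -pipe"
```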
To summarise: Autotools is crap. Excise it from your project as soon as possible. Use any other build system you can find; CMake is an excellent bet because it understands multiple output engines.