Recently I’ve been putting some effort into getting the new buildtools working, and I am now beginning to get a good understanding of how and why the previous version became rather messy. The more we try to make life as simple as possible for software authors and packagers, the worse the code becomes. A big problem is that the way the system currently works is fundamentally incorrect, but we’ve become very accustomed to the working practices it encourages. It seems inevitable that, for any new system to work correctly, we must give up some of the “benefits” of the current system.
The biggest problem is well demonstrated in the way we manage the RPM specfile. The specfile is generated from an input template file (it contains @FOO@ macros). When we build a package using buildtools (e.g. with make rpm) these macros are evaluated at build-time, so you can get a different specfile for every platform. This means that building a package on one Red Hat platform using buildtools and then using the generated SRPM to build on other platforms is a flawed approach. It causes particular problems when moving from one architecture to another, because library paths change (e.g. PAM modules are stored in /lib64/security/ on 64-bit machines but in /lib/security/ on 32-bit machines). The specfile should have constant contents and drive the build process; it should not be a by-product of the build system.
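To make the problem concrete, here is a small sketch (the @SECURITYDIR@ macro name and file names are invented for illustration, not taken from buildtools) of how one template line yields different specfiles depending on the build host:

```shell
# hypothetical one-line fragment of a specfile template
echo '@SECURITYDIR@/pam_example.so' > files.in

# build-time expansion bakes in the build host's layout:
sed 's|@SECURITYDIR@|/lib64/security|' files.in > files.64bit   # 64-bit host
sed 's|@SECURITYDIR@|/lib/security|'  files.in > files.32bit    # 32-bit host

# the "same" specfile now differs per host
diff files.64bit files.32bit || echo "specfiles differ"
```

An SRPM generated on the 64-bit host carries the 64-bit path, which is exactly why rebuilding it on a 32-bit platform goes wrong.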
So, given this, what process should we go through, starting with a code change and ending with newly built packages?
I reckon it works something like this:
1. Edit code.
2. Commit changes into the revision-control system.
3. Generate a changelog entry.
4. Tag a new release.
5. Export the tagged release.
6. Generate cmake files, within the exported release, which will control the build process. Note that no macro-substitution or compilation is done at this stage.
7. Generate the specfile, filling in only the “static” macros (e.g. name, version, changelog).
8. Generate the source tarball.
9. Generate the SRPM.
10. For each target platform, build the RPM using the SRPM. At this stage cmake is actually used to do the macro-substitution, code compilation and file installation.
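The steps above can be sketched as a shell script. Everything here is illustrative: the package name, file names, the choice of sed, and the set of “static” macros are my assumptions, and the platform-specific packaging commands are shown only as comments since they vary.

```shell
#!/bin/sh
set -e
NAME=mypkg          # hypothetical package name
VERSION=1.2.3       # would come from the release tag

# Steps 1-5 (edit, commit, changelog, tag, export) happen in the
# revision-control system, e.g.:
#   git tag "v$VERSION" && git archive --prefix="$NAME-$VERSION/" "v$VERSION" | tar x

# Step 6: generate cmake files inside the export -- no substitution yet.

# Step 7: fill in only the static macros; platform macros would survive.
printf 'Name: @NAME@\nVersion: @VERSION@\n' > "$NAME.spec.in"
sed -e "s|@NAME@|$NAME|" -e "s|@VERSION@|$VERSION|" \
    "$NAME.spec.in" > "$NAME.spec"

# Steps 8-10: tarball, SRPM, then per-platform builds, where cmake
# finally does the real macro-substitution and compilation:
#   tar czf "$NAME-$VERSION.tar.gz" "$NAME-$VERSION/"
#   rpmbuild -bs "$NAME.spec"
#   rpmbuild --rebuild "$NAME-$VERSION-1.src.rpm"

cat "$NAME.spec"
```

The key property is that the specfile produced at step 7 is identical no matter which host runs the script; only step 10 is platform-specific.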
These details are intentionally quite high-level; throughout, you can replace SRPM/RPM with the packaging system for your favourite platform. To begin with this is the process we already know; it is only from step 7 onwards that it begins to diverge. I’ve already explained why completely general macro expansion in the specfile should not be allowed; building from the SRPM is also essential. I believe it is very important that we build our packages in the same way that external users would, as anything else is just not a good test of our packages.
The problem I’m banging my head against is that traditionally we have generated specfiles in the same way as we have filled in macros in the component code, schema and documentation. I don’t want to rule that out completely, but I want to know at what level things should and shouldn’t be substituted. It is really useful to have a @VERSION@ macro in the specfile so that I don’t have to remember to edit it by hand for each release. But is doing macro-substitution on file and directory names at release-time a good idea? My gut reaction is that it is not, but what’s the alternative?
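One way to draw the line, sketched below with invented macro and file names, is for the release step to expand only a whitelist of static macros (@VERSION@ among them) and deliberately pass everything else through untouched, leaving it for the per-platform build stage to resolve:

```shell
VERSION=2.0.1   # hypothetical release version

# a template mixing a static macro with a platform-dependent one
printf '%s\n' 'Version: @VERSION@' '@SECURITYDIR@/pam_example.so' > spec.in

# release time: expand only the static macro;
# @SECURITYDIR@ is deliberately left for build time
sed "s|@VERSION@|$VERSION|" spec.in > spec.out
cat spec.out
```

With this split the specfile still has constant contents across platforms (the unexpanded @SECURITYDIR@ appears verbatim in every copy), while the convenience of @VERSION@ is kept.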
This is going to require something of a shift in working practices for Informatics COs, so I’m really interested in thoughts and comments here.