Monday, October 13, 2008

Alternatives to Make, Part II

Last time I described how somebody insane enough could use CMake for their cross compiling needs. This time I'll consider the next item on the list of interesting Make replacements: SCons.


When using SCons as your build system you have two options: install SCons on the system, or include a copy in your source tree. The easiest way is to install a version of SCons on your system and state in your build scripts that they require that version or newer to run. This does put a burden on end users, as they must obtain their own copy of SCons, but it means your code repository isn't full of 3rd party stuff. Additionally, the machine doing the compiling must have Python installed. On Linux this is no problem, but on Windows it's Yet Another Dependency. This is because SCons, unlike CMake, uses an already-existing programming language to describe the build process - in this case, Python. The build scripts are called "SConstruct" files and, apart from some oddities, work like normal Python modules.
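If you rely on an installed SCons, the minimum-version requirement can be declared right at the top of the SConstruct. A minimal sketch:

```python
# SConstruct - bail out early on an SCons older than 1.0
EnsureSConsVersion(1, 0)
env = Environment()
```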

So on to the cross compiling issue - compiling our C or C++ code into a DS program. Unlike CMake, there's no set way to handle cross compilation. In fact, there are no real "best working practices" outlined anywhere in the (quite extensive) SCons documentation - you're free to do what you like. My recommendation here is to take a leaf out of CMake's book and separate out as much as possible into a toolchain file.

As with the CMake example, let's start off with the ansi_console example from devkitPro. This has a source sub directory and a main.c file. First, create a SConstruct file that will read in a toolchain called "arm":
env = Environment(tools=['arm'], toolpath=['.'])
The toolpath means "look for the tool file in the current directory". The toolchain file is a normal Python module - in this case a file called "arm.py". In order to make it a true SCons tool file we need to add two functions, "generate" and "exists". So that would be:
def generate(env, **kwargs):
    pass

def exists(env):
    return 1
I get the feeling that exists() is not actually used - it certainly isn't in SCons 1.0.1 - but the documentation says it is required. The generate function is called when we create the Environment; the Environment is passed in so we can mess about with its innards. So let's do the usual setup, which is to check that DEVKITARM and DEVKITPRO are set in the user's environment. By default SCons doesn't pull all the environment variables into its own Environment object. This is good, as we don't really want a build to depend "randomly" on some rogue variable. But! We do want to use some of them. Anyway, on to the code:
from os import environ, pathsep
from os.path import join

def find_devkitarm(env):
    # check that the cross compiler is available on the updated path
    return env.WhereIs('arm-eabi-gcc', env['ENV']['PATH'])

def check_devkit(env):
    ENV = env['ENV']
    for var in ('DEVKITARM', 'DEVKITPRO'):
        if var not in environ:
            print 'Please set %s. export %s=/path/to/%s' % (var, var, var)
            raise SystemExit(1)
        ENV[var] = environ[var]
    ENV['PATH'] = ENV['PATH'] + pathsep + join(environ['DEVKITARM'], 'bin')
    if not find_devkitarm(env):
        print 'DevkitARM was not found'
def generate(env, **kwargs):
    check_devkit(env)
That is pretty straightforward - it checks the environ(ment) and adds $DEVKITARM/bin to the path. Now we need to find out if we have a suitable compiler. Here things get a bit icky. Because SCons isn't designed a priori for cross compilation, it assumes that your GNU-based compiler is called plain "gcc". This means that, in order to cross compile, you'll need native gcc installed too, since we reuse SCons's gcc tool detection and environment setup and then overwrite parts of it with arm-eabi-gcc. A bit of an irritating flaw, and one which CMake has understood and implemented correctly. The alternative, of course, is to copy-paste the entire set of SCons gcc tools and replace gcc with arm-eabi-gcc - or a suitable prefix. In fact, this alternative approach may well work out better... anyway. For now we'll use the approach that assumes vanilla gcc and overwrites it with arm-eabi.
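For reference, the copy-the-tools alternative essentially boils down to deriving every tool name from a single prefix. A rough standalone sketch of that idea in plain Python (the cross_tools helper is my invention, not part of SCons):

```python
def cross_tools(prefix):
    # map SCons construction variable names to prefixed cross tools
    return dict(CC=prefix + 'gcc',
                CXX=prefix + 'g++',
                AR=prefix + 'ar',
                AS=prefix + 'as',
                OBJCOPY=prefix + 'objcopy')

# a tool file could then do: env.Replace(**cross_tools('arm-eabi-'))
print(cross_tools('arm-eabi-')['CC'])  # -> arm-eabi-gcc
```

The tool file below sticks with the overwrite approach instead.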
def setup_tools(env):
    prefix = 'arm-eabi-'
    gnu_tools = ['gcc', 'g++', 'gnulink', 'ar', 'gas']
    for tool in gnu_tools:
        env.Tool(tool)  # initialise the native tool, then overwrite below
    env['CC'] = prefix + 'gcc'
    env['CXX'] = prefix + 'g++'
    env['AR'] = prefix + 'ar'
    env['AS'] = prefix + 'as'
    env['OBJCOPY'] = prefix + 'objcopy'
    env['PROGSUFFIX'] = '.elf'

def generate(env, **kwargs):
    check_devkit(env)
    setup_tools(env)
So that sets up the environment for compiling, then overwrites the tool names with the arm-eabi equivalent. We also set the PROGSUFFIX (program suffix) to .elf - this makes life easier for the objcopy step. Now we need the "magic flags" that cause our Nintendo DS program to compile.
def add_flags(env):
    # add arm flags
    env.Append(CCFLAGS='-march=armv5te -mtune=arm946e-s'.split())
    env.Append(LINKFLAGS='-specs=ds_arm9.specs'.split())
    env.Append(LIBS=['nds9'])
    # add libnds
    libnds = join(environ['DEVKITPRO'], 'libnds')
    env.Append(LIBPATH=[join(libnds, 'lib')])
    env.Append(CPPPATH=[join(libnds, 'include')])

def generate(env, **kwargs):
    check_devkit(env)
    setup_tools(env)
    add_flags(env)
These flags are the usual ARM9 flags and libnds include path for compiling C code, plus libnds9 and the specs flag for linking. Without these devkitArm complains (this is as expected - it is a multi-platform ARM compiler, so you need to tell it the exact platform you want to target). There's one more step here - we need a way to build the .nds file from an .elf file - but first let's go back to the SConstruct. If you recall, at the top of the post we had just created an Environment. Now we have added a load of ARM stuff to that Environment. The next step is to create a program. In SCons we can do this as follows:
env.Program('ansi_console', os.path.join('source','main.c')) 
That's it! Except, no it isn't. This produces ansi_console.elf, which needs to be objcopy'd and ndstool'd. To do that we can go back to our tool file. Here we add in new "Builders" to do the work. A Builder is like the Program method we saw in the SConstruct - it takes the names of the source files and produces the outputs... so we add in the builders to our file:
import SCons.Builder

def add_builders(env):
    def generate_arm(source, target, env, for_signature):
        return '$OBJCOPY -O binary %s %s' % (source[0], target[0])
    def generate_nds(source, target, env, for_signature):
        if len(source) == 2:
            return 'ndstool -c %s -7 %s -9 %s' % (target[0], source[0], source[1])
        return 'ndstool -c %s -9 %s' % (target[0], source[0])
    env.Append(BUILDERS={
        'Ndstool': SCons.Builder.Builder(
            generator=generate_nds,
            suffix='.nds',
            src_suffix='.arm'),
        'Objcopy': SCons.Builder.Builder(
            generator=generate_arm,
            suffix='.arm',
            src_suffix='.elf')})

def generate(env, **kwargs):
    [f(env) for f in (check_devkit, setup_tools, add_flags, add_builders)]
That's a fair amount of typing! What does it do? Well, the "generate_arm" function is called when we want to generate an .arm file from an .elf file. It returns a SCons-y string that will be executed to do the actual work. Here it is our old OBJCOPY string - the $OBJCOPY is replaced automagically by SCons with the equivalent Environment variable. The "generate_nds" function is called when we want to generate an .nds from an .arm; it too returns the command line that will be executed. There's a bit of a trick there that checks whether we need to combine an ARM7 and ARM9 core, or just use the default ARM7, but apart from that it is straightforward. The "env.Append(BUILDERS=...)" bit creates new methods called Ndstool and Objcopy that can be used like Program. Passing in a generator function means the command line is built by calling our function at build time - you could pass a fixed string as an action instead.
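For comparison, the fixed-string alternative for the objcopy step might look something like this ($SOURCE and $TARGET are expanded by SCons when the command runs):

```python
env.Append(BUILDERS={
    'Objcopy': SCons.Builder.Builder(
        action='$OBJCOPY -O binary $SOURCE $TARGET',
        suffix='.arm',
        src_suffix='.elf')})
```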

Armed with our new methods, let's go back to the SConstruct. We can objcopy and ndstool the Program/elf file as follows:
env.Ndstool(env.Objcopy(env.Program('ansi_console', join('source','main.c'))))
That's it. There are a couple of things we can do to make this better. For example, rather than splurge the build artifacts all over the source tree, we can use a build directory. To do this we need to use SCons recursively. The recursive file is called an "SConscript", and all we have to remember is that to pass objects (like the env created in the top level SConstruct) down to the other SConscripts, we have to use an Export/Import mechanism. A bit confusing, but the code's easy enough:
# in SConstruct
env = Environment(tools=['arm'], toolpath=['.'])
SConscript('source/SConscript', exports='env', variant_dir='build', duplicate=0)

# in source/SConscript
Import('env')
env.Ndstool(env.Objcopy(env.Program('ansi_console', 'main.c')))
Passing exports to the SConscript file exports the named variables. Using Import imports them into the local namespace. Magical! There's also a Return() function to return variables from sub scripts. That's useful when tracking local library dependencies.
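As a hypothetical example of Return(), a sub-SConscript that builds a local library could hand the result back to its parent like this (the libfoo names are invented for illustration):

```python
# in libfoo/SConscript
Import('env')
lib = env.StaticLibrary('foo', ['foo.c'])
Return('lib')

# back in SConstruct - SConscript() returns whatever was Return()ed
lib = SConscript('libfoo/SConscript', exports='env')
```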

OK, so what about "dual core" programs? It turns out that it takes a bit of effort - more even than CMake, I'd say. We can either create a second Environment and set up the variables for ARM9 or ARM7 according to the core in question, or use a single Environment instance, set flags that should be used for ARM9 or ARM7, and apply these appropriately to each Program call that we make. The first approach ends up a bit messy, with ifs for processor type in several places. The second approach is cleaner, but means there is a bit more code in each env.Program call. I use the latter approach in Elite DS, so that's what I'll outline here. You have to set the compiler flags for arm9 and arm7 in the env variable. This can be done as follows, modifying the add_flags() function of our tool:
THUMB_FLAGS = ' -mthumb -mthumb-interwork '
PROCESSOR_CFLAGS = {
    '9': ' -march=armv5te -mtune=arm946e-s',
    '7': ' -mcpu=arm7tdmi -mtune=arm7tdmi'
}
LINK_FLAGS = ' -specs=ds_arm%c.specs -g -mno-fpu -Wl,-Map,${TARGET.base}.map -Wl,-gc-sections'
EXTRA_FLAGS = ' -Wno-strict-aliasing -fomit-frame-pointer -ffast-math '

def add_flags(env):
    ccflags = ' '.join([EXTRA_FLAGS, THUMB_FLAGS])
    env['CCFLAGS_ARM9'] = ' '.join([ccflags, PROCESSOR_CFLAGS['9']])
    env['CCFLAGS_ARM7'] = ' '.join([ccflags, PROCESSOR_CFLAGS['7']])
    env['LINKFLAGS_ARM9'] = LINK_FLAGS % '9'
    env['LINKFLAGS_ARM7'] = LINK_FLAGS % '7'
    env['LIBS_ARM9'] = ['fat', 'nds9']
    env['LIBS_ARM7'] = ['nds7']
    # add libnds
    libnds = join(environ['DEVKITPRO'], 'libnds')
    env.Append(LIBPATH=[join(libnds, 'lib')])
    env.Append(CPPPATH=[join(libnds, 'include')])

def generate(env, **kwargs):
    [f(env) for f in (check_devkit, setup_tools, add_flags, add_builders)]
Most of those flags are taken from the devkitPro examples and should be pretty familiar. In order to actually use them, we can override the flags used for each individual program. So in the SConscript, the env.Program line would become:
env.Program('ansi_console', join('source', 'main.c'),
            CCFLAGS=env['CCFLAGS_ARM9'],
            LINKFLAGS=env['LINKFLAGS_ARM9'],
            LIBS=env['LIBS_ARM9'])
More typing, but there's no need to pass extra Environments around. The equivalent for an ARM7 program would simply replace the 9s with 7s.
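For instance, an ARM7 core (with a hypothetical arm7_main.c source file) would use the 7-flavoured variables:

```python
env.Program('arm7_core', join('source', 'arm7_main.c'),
            CCFLAGS=env['CCFLAGS_ARM7'],
            LINKFLAGS=env['LINKFLAGS_ARM7'],
            LIBS=env['LIBS_ARM7'])
```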

As before with my CMake entry, here is a quick summary to help you decide whether or not to use SCons as your build system for cross compiling NDS programs.

The Bad

Scalability: I haven't mentioned this yet, but the biggest problem SCons has is that when you reach several hundred source files, the time it takes to parse the SConscripts and generate dependencies becomes considerable. If you also use SCons's autoconf-like Configure functions, then the configuration step is, by default, run each and every time you compile. The results are cached, but it still takes several seconds to go through the tests. This became an issue for me on Bunjalloo, which used to use SCons. Before compiling anything, SCons would typically sit around in silence for 15 seconds contemplating the build. I've seen builds that did "nothing" for minutes at a time which, when changed to CMake or the like, got through the same thinking time in seconds. KDE and OpenMoko are two famous SCons "deserters" due to performance problems. On the other hand, for Elite DS, which only has around a hundred source files, the do-nothing time is negligible and SCons works great.
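One possible mitigation for the Configure cost is to skip the checks entirely when they can't matter, e.g. when cleaning or printing help. A sketch, assuming a check for libnds (CheckLibWithHeader is one of SCons's standard checks):

```python
if not (GetOption('clean') or GetOption('help')):
    conf = Configure(env)
    if not conf.CheckLibWithHeader('nds9', 'nds.h', 'c'):
        print 'libnds not found'
    env = conf.Finish()
```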

Not really cross-compiling friendly: Unlike CMake, SCons is not built with cross compiling in mind. This is demonstrated by the hard-coding of "gcc", etc., in the SCons tools. This means you will probably have to install the native tools as well as the cross tools in order to compile anything. (Note: the new 1.1.0 release may have fixed this - it came out as I was writing this post!)

Syntax can get complicated: When writing scripts that use multiple local libraries, the Return() and Import/Export() mechanisms that SCons uses can get a bit unwieldy. You end up with lots of global variable names that you have to Import() and Export() across SConscripts, or else Return() everything to the parent file and let it sort out the mess.

Python can look intimidating: Unlike CMake, which still looks a bit Makefile-like, SCons scripts can have any old Python code in them. Without discipline, this can result in build scripts that are difficult to follow, especially if Python is not one of your stronger languages.

Dependencies: As with CMake, you still need to install stuff - namely the SCons build system itself. As more projects use SCons this will become less of a problem, but at the moment it can be annoying for users ("oh no! where is my Makefile?").

The Good

Python code: If you are familiar with Python already, you don't need to learn yet another language. There are no hacks to get loops working. Proper data structures can be used to describe the build. Any of the Python libraries can be imported and used. This is a very powerful advantage, and a multi-platform one too (provided you stick to cross-platform-safe Python code, of course).
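To make the "proper data structures" point concrete, here is a small self-contained sketch - the table contents mirror the ARM flags used earlier, but the names and helper are mine:

```python
# describe each CPU core as plain data, then derive flags from the table
cores = {
    'arm9': {'cpu_flags': '-march=armv5te -mtune=arm946e-s',
             'libs': ['fat', 'nds9']},
    'arm7': {'cpu_flags': '-mcpu=arm7tdmi -mtune=arm7tdmi',
             'libs': ['nds7']},
}

def flags_for(core):
    cfg = cores[core]
    return '%s %s' % (cfg['cpu_flags'],
                      ' '.join('-l' + lib for lib in cfg['libs']))

print(flags_for('arm9'))  # -> -march=armv5te -mtune=arm946e-s -lfat -lnds9
```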

Stable API: SCons is backwards compatible with the practically stone-age Python 1.5.2, and its build APIs change infrequently - functions are deprecated first and rarely removed. This makes changing from an older version to a newer one fairly painless. A full suite of tests means that regressions in new versions are pretty rare - if something goes wrong, it is likely to be a bug in your SConscript rather than a bug in SCons.

Great documentation: There is loads of good documentation on the SCons website. The manual is particularly well done, with an easy step-by-step guide. A very complete man page is also installed (on Linux, FreeBSD, etc. at least) that contains examples as well as the full API.


As with CMake, you probably need a good reason not to use a bog-standard Makefile. For me, being able to code the build logic in Python is the deciding factor. I think SCons is pretty good, and use it for Elite DS. I'd probably use SCons for other projects too, depending on their size (and my mood). The Bunjalloo build became a bit too slow with SCons, but I'll talk more about that next time when I discuss Waf. There seems to be quite an uptake of SCons for new OSS projects (Google's Chrome browser for one), especially in preference to autotools, and hopefully we'll see some improvement in speed because of this (the latest release, 1.1.0, boasts improved memory use, but I have yet to see any benchmarks). Until then, you could probably do a lot worse than download the latest release and have a mess around to see what it's all about.


  1. Anonymous 1:37 a.m.

    Are you planning on reviewing bjam? I'd be interested to see what you think.

    We had several large, complex projects using Make, and were frequently running into limitations or difficulties. To write clean, portable make scripts takes a huge amount of work. We used SCons for a while, but it doesn't really provide proper support for building on different platforms with different toolchains. You have to write a lot of support code yourself.

    Then we tried Boost.Build. It takes a lot of effort to learn, and wade through the confusing documentation. However, if you can do so, it turns out to be a very elegant and very powerful build system. It really is worth the pain of learning, as we now have nice, clean build scripts that can fairly transparently handle different toolchains, platforms, build variants and more.

  2. I had a great experience replacing a make based system with SCons for a custom tool chain.

    In the end the new system was faster than the old one, mainly due to the fact that the old one used gcc to compute dependencies as well.

    Highly recommended.

  3. Boost.Build/Jam isn't something I've ever really looked at. I tend to dig through any open source programs that I come across, and it always intrigues me when I see something other than the GNU build system or plain-old Makefiles. While CMake and SCons are becoming more popular, Jam files are still pretty rare. So I assumed it wasn't too popular. I might take a look now, as the lack of documentation could well be the reason for its rarity.

