CMake
CMake is a build system that generates files for use with the build tool on your platform. You write a configuration file, called CMakeLists.txt, run the cmake program to generate a load of Makefiles or project files, then run make (or whatever) to compile your project. On Windows, CMake generates project files for use with Visual Studio. It can also generate project files for CodeBlocks, Eclipse CDT and KDevelop.
A hefty obstacle for using CMake with the DS is that currently CMake doesn't support building multiple target architectures in one build tree. So compiling ARM7 and ARM9 cores and combining the result into the final .nds file requires a bit of fiddling. More on that later, as it can be done.
The first thing you will need is the latest version of CMake. The features needed to compile on the DS were only added quite recently (version 2.6 onwards).
After you've installed CMake, you can start hacking away. First we need to create a Toolchain File that CMake uses to understand your target platform. Normally CMake autodetects the local platform type and configures the compiler, linker, compilation flags and so on, to the correct values. When cross compiling for the DS, you have to tell CMake which compiler it should use. We don't want it to try and guess based on any native compiler installed.
As described in the CMake wiki, we first have to turn off the auto-detection. This is done by setting the variable CMAKE_SYSTEM_NAME to "Generic". Here we come across one of the oddities that takes a bit of getting used to: instead of the usual variable = value assignment syntax, CMake uses a function, so we have set(variable value). Whatever. The other 2 variables set here are optional:
set(CMAKE_SYSTEM_NAME Generic)
set(CMAKE_SYSTEM_VERSION 1)
set(CMAKE_SYSTEM_PROCESSOR arm-eabi)
A note on style here. Commands like if, message, endif, set and so on can be written either in ALL CAPS or in lower case, but the keywords inside commands have to be in UPPER CASE. I prefer to write the commands in lower case and everything else in upper case.

So, after those first set() lines we begin to describe which compiler we'll use. The approach I'm going to use doesn't stray from the standard DEVKITARM and DEVKITPRO environment variables, which should point to where you have the devkitARM toolchain and libnds installed. We set 2 CMake variables based on the environment variables as follows:
set(DEVKITARM $ENV{DEVKITARM})
set(DEVKITPRO $ENV{DEVKITPRO})
Actually this is quite good: by default CMake doesn't drag the whole environment into the namespace, so there's a better chance of our build being repeatable without having to set up a million environment variables first. After this, we can check that the variables are set with these lines of code:
if(NOT DEVKITARM)
message(FATAL_ERROR "Please set DEVKITARM in your environment")
endif(NOT DEVKITARM)
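The check for DEVKITPRO follows exactly the same pattern:
if(NOT DEVKITPRO)
message(FATAL_ERROR "Please set DEVKITPRO in your environment")
endif(NOT DEVKITPRO)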
Now we need to point CMake at the compilers. This is done by setting the CMAKE_C_COMPILER and CMAKE_CXX_COMPILER variables as follows. We also set CMAKE_FIND_ROOT_PATH, which lets us use objcopy, ar, ranlib and friends from the toolchain. The mode flags tell CMake to search for libraries and headers only within that root, and to look up programs on the host as usual. We could create a proper install base for the DS, with lib, include and so on, but that is not the usual way of doing things and would probably break a lot of the hand-coded Makefiles that people use, so we'll stick with the $DEVKITPRO/libnds convention. Anyway, here is the code for setting up the compilers:
set(CMAKE_C_COMPILER ${DEVKITARM}/bin/arm-eabi-gcc)
set(CMAKE_CXX_COMPILER ${DEVKITARM}/bin/arm-eabi-g++)
set(CMAKE_FIND_ROOT_PATH ${DEVKITARM})
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
If DEVKITARM points somewhere that doesn't really have the arm-eabi compiler installed, then CMake will throw up an error message when we run it:
CMake Error: your C compiler: "/path/to/devkitARM/bin/arm-eabi-gcc" was not found. Please set CMAKE_C_COMPILER to a valid compiler path or name.
CMake Error: your CXX compiler: "/path/to/devkitARM/bin/arm-eabi-g++" was not found. Please set CMAKE_CXX_COMPILER to a valid compiler path or name.
Apart from the compiler, we also need to indicate where libnds lives. This is done by telling CMake how to construct a library name - prepending lib and adding .a at the end - and where to search for the library and headers:
set(CMAKE_FIND_LIBRARY_PREFIXES lib)
set(CMAKE_FIND_LIBRARY_SUFFIXES .a)
include_directories(${DEVKITPRO}/libnds/include)
link_directories(${DEVKITPRO}/libnds/lib)
find_library(NDS9 nds9)
find_library(NDS7 nds7)
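Optionally, we can fail early when libnds isn't actually installed by checking the result of find_library. This is a small extra of my own, not strictly required, but it gives a friendlier message than a failed link later on:
if(NOT NDS9)
message(FATAL_ERROR "libnds not found under ${DEVKITPRO}/libnds - please install it")
endif(NOT NDS9)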
OK, that's it. The rest is now "vanilla" CMake. When we run the cmake program we need to tell it that we are cross compiling, which is done with the CMAKE_TOOLCHAIN_FILE variable. We can set it with a gcc-like "-D" flag:
cmake -DCMAKE_TOOLCHAIN_FILE=/full/path/to/devkitArm.cmake .
Now that we have this, we can take an example from the nds-examples distributed with devkitARM and try to compile it using CMake. Let's use the ansi_console example. Unzip the examples and copy ansi_console somewhere - it's in the Graphics/2D directory. Now create a file ansi_console/CMakeLists.txt and start editing it. The first thing to do here is to give the project a name:
project(NDS-EXAMPLES)
In order to compile correctly against libnds, we need to set a C define, ARM9. For this we use the add_definitions command. I'll also create a variable EXE_NAME that we can use instead of copying ansi_console everywhere; this way, changing the name of the binary later won't be too much hassle.
add_definitions(-DARM9)
set(EXE_NAME ansi_console)
Now we can describe how to build the "executable". The next line tells CMake to create an executable from the file source/main.c:
add_executable(${EXE_NAME} source/main.c)
That alone isn't enough: we also need to link with libnds for the ARM9 and pass in the required -specs flag.
target_link_libraries(${EXE_NAME} nds9)
set_target_properties(${EXE_NAME}
PROPERTIES
LINK_FLAGS -specs=ds_arm9.specs
COMPILE_FLAGS "-mthumb -mthumb-interwork")
The set_target_properties macro allows us to fiddle with the compilation and link flags. If you decide to use CMake for your NDS projects, you could later make a wrapper around this to set up the usual flags for an NDS binary, rather than copying the above lines everywhere (see the sketch just after the next snippet).

Up to here, this would normally be enough on a platform that has an operating system and knows how to load the executable file format; we could run cmake, passing in the devkitARM toolchain file and the path to the ansi_console directory, and be done. The DS doesn't have an OS, however, so we need to strip out all the "elfyness" - the ELF headers and whatnot - and add the special DS header. Converting the ELF to a plain binary is done with objcopy; adding the header is done with ndstool, provided as part of the devkitARM distribution. We can write macros to help with these steps, which allows me to write this kind of thing to finish off the process:
objcopy_file(${EXE_NAME})
ndstool_file(${EXE_NAME})
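As an aside, here is roughly what the flag-setting wrapper mentioned above might look like. It is only a sketch: the macro name NDS_ARM9_BINARY is my own invention, not something CMake or devkitARM provides.
macro(NDS_ARM9_BINARY target)
# Hypothetical helper: apply the usual ARM9 flags and link against libnds,
# so individual CMakeLists.txt files don't have to repeat these lines.
target_link_libraries(${target} nds9)
set_target_properties(${target}
PROPERTIES
LINK_FLAGS -specs=ds_arm9.specs
COMPILE_FLAGS "-mthumb -mthumb-interwork")
endmacro(NDS_ARM9_BINARY)
With that in place, the target_link_libraries and set_target_properties lines above would collapse into a single nds_arm9_binary(${EXE_NAME}) call.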
So where do objcopy_file and ndstool_file come from? We have 2 choices: either add the macros inline in the CMakeLists.txt file, or create a module. The first is easier, but the second lets us reuse the module elsewhere. So, in a new file called ndsmacros.cmake, I'll add the following lines to define the macros OBJCOPY_FILE and NDSTOOL_FILE:
macro(OBJCOPY_FILE EXE_NAME)
set(FO ${CMAKE_CURRENT_BINARY_DIR}/${EXE_NAME}.bin)
set(FI ${CMAKE_CURRENT_BINARY_DIR}/${EXE_NAME})
message(STATUS ${FO})
add_custom_command(
OUTPUT "${FO}"
COMMAND ${CMAKE_OBJCOPY}
ARGS -O binary ${FI} ${FO}
DEPENDS ${FI})
get_filename_component(TGT "${EXE_NAME}" NAME)
add_custom_target("TargetObjCopy_${TGT}" ALL DEPENDS ${FO} VERBATIM)
get_directory_property(extra_clean_files ADDITIONAL_MAKE_CLEAN_FILES)
set_directory_properties(
PROPERTIES
ADDITIONAL_MAKE_CLEAN_FILES "${extra_clean_files};${FO}")
set_source_files_properties("${FO}" PROPERTIES GENERATED TRUE)
endmacro(OBJCOPY_FILE)
if(NOT NDSTOOL_EXE)
message(STATUS "Looking for arm-eabi-objcopy")
find_program(NDSTOOL_EXE ndstool ${DEVKITARM}/bin)
if(NDSTOOL_EXE)
message(STATUS "Looking for arm-eabi-objcopy -- ${NDSTOOL_EXE}")
endif(NDSTOOL_EXE)
endif(NOT NDSTOOL_EXE)
if(NDSTOOL_EXE)
macro(NDSTOOL_FILE EXE_NAME)
set(FO ${CMAKE_CURRENT_BINARY_DIR}/${EXE_NAME}.nds)
set(I9 ${CMAKE_CURRENT_BINARY_DIR}/${EXE_NAME}.bin)
add_custom_command(
OUTPUT ${FO}
COMMAND ${NDSTOOL_EXE}
ARGS -c ${FO} -9 ${I9}
MAIN_DEPENDENCY ${I9}
)
get_filename_component(TGT "${EXE_NAME}" NAME)
add_custom_target("Target9_${TGT}" ALL DEPENDS ${FO} VERBATIM)
get_directory_property(extra_clean_files ADDITIONAL_MAKE_CLEAN_FILES)
set_directory_properties(
PROPERTIES
ADDITIONAL_MAKE_CLEAN_FILES "${extra_clean_files};${FO}")
set_source_files_properties(${FO} PROPERTIES GENERATED TRUE)
endmacro(NDSTOOL_FILE)
endif(NDSTOOL_EXE)
The first macro, OBJCOPY_FILE, uses the built-in CMake command add_custom_command. This has lots of options, but the ones used here say that CMake should run the CMAKE_OBJCOPY command, passing in the given arguments (ARGS). CMAKE_OBJCOPY is defined automagically from our CMAKE_FIND_ROOT_PATH in the devkitARM toolchain file; it corresponds to arm-eabi-objcopy in the devkitARM installation. The add_custom_target isn't really needed here, but we will need it later for combined ARM7/ARM9 cores. It adds a new top-level target, TargetObjCopy_[name of exe], to the "all" target; the target depends on the output file, so when we run "make all" the binary file gets built if needed. Whew! That was all a bit complicated. The rest of the macro adds the output file to the clean target and marks it as a generated file.

The "if" block in the middle is a bit like the configure checks from the GNU Build System: it tries to find the ndstool program and, if found, gives us a variable NDSTOOL_EXE that we can use to run it. The NDSTOOL_FILE macro uses ndstool to create a "single core" binary - actually, ndstool supplies a default ARM7 core, which is why we don't have to provide one. Again, we add a custom command and target to make sure the thing gets built when we run "make all", and add the .nds file to the list of things that get cleaned.
So we have ndsmacros.cmake. To include it there are 2 ways that vary subtly. The best approach is to set the CMAKE_MODULE_PATH variable to the directory containing our macro file and then include the macro as a module:
set(CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR})
include(ndsmacros)
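The other way is an explicit include() of the full file name, which works but hard-wires the location of the file:
include(${CMAKE_SOURCE_DIR}/ndsmacros.cmake)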
Either way assumes you have the macro file at the top of the ansi_console directory, which for a small example like this is fine. On a bigger project with lots of extra tools, you would probably park them off in a sub-directory to avoid clutter.

So that's more or less it. Let's run this and see what happens. Ah, first I should mention that we will compile in a separate build tree, keeping the source tree free of object files and other build artifacts. This is just good practice.
$ cd ansi_console
$ mkdir build
$ cd build
$ cmake -DCMAKE_TOOLCHAIN_FILE=$(pwd)/../devkitArm.cmake ..
-- The C compiler identification is GNU
-- The CXX compiler identification is GNU
-- Check for working C compiler: /path/to/devkitARM/bin/arm-eabi-gcc
-- Check for working C compiler: /path/to/devkitARM/bin/arm-eabi-gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /path/to/devkitARM/bin/arm-eabi-g++
-- Check for working CXX compiler: /path/to/devkitARM/bin/arm-eabi-g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Looking for grit
-- Looking for grit -- /path/to/devkitARM/bin/grit
-- Looking for ndstool
-- Looking for ndstool -- /path/to/devkitARM/bin/ndstool
-- In data
-- Configuring done
-- Generating done
-- Build files have been written to: /path/to/examples-cmake/ansi_console/build
Now if we look at what has been produced, we see that there's a normal Makefile in the build directory. So let's run it...
$ make
Scanning dependencies of target ansi_console
[ 25%] Building C object ansi_console/CMakeFiles/ansi_console.dir/source/main.c.obj
Linking C executable ansi_console
[ 25%] Built target ansi_console
Scanning dependencies of target Target9_ansi_console
[ 25%] Generating ansi_console.bin
[ 25%] Generating ansi_console.nds
Nintendo DS rom tool 1.38 - May 14 2008
by Rafael Vuijk, Dave Murphy, Alexei Karpenko
[ 75%] Built target Target9_ansi_console
Scanning dependencies of target TargetObjCopy_ansi_console
[100%] Built target TargetObjCopy_ansi_console
Nice coloured output there. And we have our ansi_console.nds - that's it, job done. As you can see, the Makefile tracks dependencies. For this simple example it isn't too important - everything really only depends on the main.c file - but for larger projects, not having to generate the dependencies manually is a big plus. Sadly we are limited to timestamp checks for detecting changes, the make legacy showing through. We can do better than that with hashes; I'll mention that when I discuss SCons next time.

There are still things missing from this, but armed with the basics we can improve the build without too much hassle. For example, we'd need a tool to convert images to data using grit. There are more flags to pass in for C++ code, to turn off exceptions and RTTI. And we still have the dual core problem. The grit step would be pretty much the same as our objcopy_file example, except it would turn a png file into a C source file. (A bit of a limitation, that - CMake doesn't compile assembler into executables, only C/C++.)
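To give an idea, such a grit macro could be built along the same lines as OBJCOPY_FILE. The sketch below is just that, a sketch: it assumes GRIT_EXE has already been located with find_program (as the configure output above suggests), and the -ftc and -o flags - ask grit for C output with a given base name - are assumptions you'd adjust to suit your images:
macro(GRIT_FILE IMG_NAME)
# Convert ${IMG_NAME}.png from the source directory into a generated C file
# in the build tree. GRIT_EXE is assumed to have been found with
# find_program, in the same way as NDSTOOL_EXE.
set(FI ${CMAKE_CURRENT_SOURCE_DIR}/${IMG_NAME}.png)
set(FO ${CMAKE_CURRENT_BINARY_DIR}/${IMG_NAME}.c)
add_custom_command(
OUTPUT ${FO}
COMMAND ${GRIT_EXE}
ARGS ${FI} -ftc -o ${CMAKE_CURRENT_BINARY_DIR}/${IMG_NAME}
DEPENDS ${FI})
set_source_files_properties(${FO} PROPERTIES GENERATED TRUE)
endmacro(GRIT_FILE)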
Dual core is not too bad - we already have the basics. If we follow the above but for the "combined" example, then the main CMakeLists.txt would have the following lines:
add_subdirectory(arm9)
add_subdirectory(arm7)
We wouldn't actually use the ADD_EXECUTABLE macro here in the top-level file. In the arm7/CMakeLists.txt file, we'd have something like this:
include_directories(${CMAKE_CURRENT_SOURCE_DIR})
add_definitions(-DARM7)
add_executable(combined_arm7 arm7.c)
target_link_libraries(combined_arm7 nds7)
set_target_properties(combined_arm7
PROPERTIES
LINK_FLAGS -specs=ds_arm7.specs)
objcopy_file(combined_arm7)
This layout is mandatory because the ADD_DEFINITIONS macro applies the -D flag in the directory where it is used and in all sub-directories. I tried using REMOVE_DEFINITIONS to work around the problem, but that does not work:
add_definitions(-DARM7)
add_executable(combined_arm7 arm7/arm7.c)
remove_definitions(-DARM7)
add_definitions(-DARM9)
add_executable(combined_arm9 arm9/arm9.c)
CMake uses the final set of definitions for all the executables, so we'd end up with the -DARM9 flag for the ARM7 binary too, which is not what we want.

OK, so the arm9/CMakeLists.txt file would have similar content to the arm7/CMakeLists.txt one, just switching all the "7"s for "9"s. At the end, we'd use the following macro call in combined/CMakeLists.txt to combine the 2 binaries into the .nds file:
ndstool_files(arm7/combined_arm7 arm9/combined_arm9 combined)
You have probably guessed that this macro is added to our ndsmacros.cmake file and takes 3 arguments:
macro(NDSTOOL_FILES arm7_NAME arm9_NAME exe_NAME)
set(FO ${CMAKE_CURRENT_BINARY_DIR}/${exe_NAME}.nds)
set(I9 ${CMAKE_CURRENT_BINARY_DIR}/${arm9_NAME}.bin)
set(I7 ${CMAKE_CURRENT_BINARY_DIR}/${arm7_NAME}.bin)
add_custom_command(
OUTPUT ${FO}
COMMAND ${NDSTOOL_EXE}
ARGS -c ${FO} -9 ${I9} -7 ${I7})
get_filename_component(TGT "${exe_NAME}" NAME)
get_filename_component(TGT7 "${arm7_NAME}" NAME)
get_filename_component(TGT9 "${arm9_NAME}" NAME)
add_custom_target("Target97_${TGT}" ALL DEPENDS ${FO} VERBATIM)
add_dependencies("Target97_${TGT}"
"TargetObjCopy_${TGT7}"
"TargetObjCopy_${TGT9}")
get_directory_property(extra_clean_files ADDITIONAL_MAKE_CLEAN_FILES)
set_directory_properties(
PROPERTIES
ADDITIONAL_MAKE_CLEAN_FILES "${extra_clean_files};${FO}")
endmacro(NDSTOOL_FILES)
This is a bit more hacky than I would have liked - that add_dependencies line shouldn't be needed. But if we use DEPENDS in the add_custom_command macro as the documentation suggests, then we only see errors like "No rule to make target `combined/arm7/combined_arm7.bin'". I imagine it is because of the use of sub-directories. We need sub-directories though, otherwise we cannot set flags per ARM core, so we'll have to live with the rather hacky solution here. Unless someone knows better and posts a comment! ;-)

That's the lot! Now we can compile either single or dual core .nds files and know how to check for the installed tools and libraries. So should you use CMake? Here is my round-up to help you decide!
The Bad
Documentation: CMake sometimes requires what feels like cargo cult coding to get the desired result. It took me a lot longer than I would have liked to get this working, and a big part of the problem was the lack of good documentation. The CMake wiki is, like most wikis, not very well organised, and most help ends up pointing you towards the main site's "documentation" page, which only tells you how to buy a CMake book. The cynical side of me believes that this is a deliberate strategy - why have good free docs when you can sell books?
Syntax: I think the whole mixed case thing is a bit yucky. ALL IN CAPS makes things tricky to read, but at least you know you'll get it right. Mixing case is easier on the eye, but has the potential for cock ups. Meh.
Make's limitations: As we've seen here, there are problems when we push CMake to do things it possibly wasn't meant to do. Often the lowest-common-denominator build tool underneath leaks through the abstraction CMake provides. When I was trying to get the nds file to depend on the 2 binaries built in the sub-directories, all the errors came from make - the cmake script looked correct. And those timestamp-based dependency checks... are you from the past?
Needs installing: Before we can do anything we have to install CMake itself, as well as the bog-standard make (or platform equivalent). This is extra bootstrapping that is not always necessary - waf manages to get round this on Linux, at least.
The Good
Cross platform: we can generate build scripts native to the host platform. Cool!
Configure checks: We can check that the build machine has everything we need to compile our project. If not, we can provide useful info on how to obtain whatever is missing. This is better than having the build explode with "error: foo.h not found" followed by a billion other errors.
Compact syntax: The syntax of CMake is streamlined for building stuff. It's less wordy than make alone and more to-the-point than the other competitors in this field.
Starting to gain critical mass: I think this is important. There's no point using something if you're going to be the only one that uses it. CMake is now gaining enough popularity that it is fairly well tested, has a stable API and enough features to make it worthwhile.
Conclusion
Nobody is going to move away from regular Makefiles to compile the standard "Hello, World!" example. But when your project starts to grow in size and you introduce more sub-libraries, or unit tests, or want better control over the way the code gets built, then it's nice to know there are better options out there. I think CMake is a worthy candidate for consideration. The learning curve is quite shallow, especially if you are already familiar with Make or bash scripting, and it is something that is gradually gaining more mainstream recognition. Hey, at least it isn't as difficult to use as autohell ;-)