This is autobook.info, produced by makeinfo version 4.7 from autobook.texi.

INFO-DIR-SECTION GNU programming tools
START-INFO-DIR-ENTRY
* Autoconf, Automake, Libtool: (autobook).      Using the GNU autotools.
END-INFO-DIR-ENTRY

   This file documents GNU Autoconf, Automake and Libtool.

   Copyright (C) 1999, 2000 Gary V. Vaughan, Ben Elliston, Tom Tromey, Ian Lance Taylor

   Permission is granted to make and distribute verbatim copies of this manual provided the copyright notice and this permission notice are preserved on all copies.

   Permission is granted to copy and distribute modified versions of this manual under the conditions for verbatim copying, provided that the entire resulting derived work is distributed under the terms of a permission notice identical to this one.

   Permission is granted to copy and distribute translations of this manual into another language, under the above conditions for modified versions, except that this permission notice may be stated in a translation approved by the Foundation.

File: autobook.info, Node: Guidelines for writing macros, Next: Implementation specifics, Prev: Reusing Existing Macros, Up: Writing New Macros for Autoconf

23.3 Guidelines for writing macros
==================================

There are some guidelines which should be followed when writing a macro. The criteria for a well-written macro are that it should be easy to use, well documented and, most importantly, portable. Portability is a difficult problem that requires much anticipation on the part of the macro writer. This section will discuss the design considerations for using a static Autoconf test at compile time versus a test at runtime. It will also cover some of the characteristics of a good macro, including non-interactive behavior, properly formatted output and a clean interface for the user of the macro.
* Menu:

* Non-interactive behavior::
* Testing system features at application runtime::
* Output from macros::
* Naming macros::
* Macro interface::

File: autobook.info, Node: Non-interactive behavior, Next: Testing system features at application runtime, Up: Guidelines for writing macros

23.3.1 Non-interactive behavior
-------------------------------

Autoconf's generated `configure' scripts are designed to be non-interactive - they should not prompt the user for input. Many users like the fact that `configure' can be used as part of an automated build process. By introducing code into `configure' which prompts a user for more information, you will prevent unattended operation. Instead, you should use the `AC_ARG_ENABLE' macro in `configure.in' to add extra options to `configure', or consider runtime configuration (*note Testing system features at application runtime::).

File: autobook.info, Node: Testing system features at application runtime, Next: Output from macros, Prev: Non-interactive behavior, Up: Guidelines for writing macros

23.3.2 Testing system features at application runtime
-----------------------------------------------------

When pondering how to handle a difficult portability problem or configurable option, consider whether the problem is better solved by performing tests at runtime or by providing a configuration file to customize the application. Keep in mind that the results of tests that Autoconf can perform will ultimately affect how the program will be built - and can limit the number of machines that the program can be moved to without recompiling it. Here is an example where this consideration had to be made in a real-life project:

The pthreads for Win32 project has sought to provide a standards-compliant implementation of the POSIX threads API. It does so by mapping the POSIX API functions into small functions which achieve the desired result using the Win32 thread API.
Windows 95, Windows 98 and Windows NT have different levels of support for a system call primitive that attempts to enter a critical section without blocking. The `TryEnterCriticalSection' function is missing on Windows 95, is an inoperative stub on Windows 98, and works as expected on Windows NT. If this behavior were to be checked by `configure' at compile time, then the resultant library would only work on the variant of Windows that it was compiled for. Because it's more common to distribute packages for Windows in binary form, this would be an unfortunate situation. Instead, it is sometimes preferable to handle this kind of portability problem with a test, performed by your code at runtime.

File: autobook.info, Node: Output from macros, Next: Naming macros, Prev: Testing system features at application runtime, Up: Guidelines for writing macros

23.3.3 Output from macros
-------------------------

Users who run `configure' expect a certain style of output as tests are performed. As such, you should use the well-defined interface to the existing Autoconf macros for generating output. Your tests should not arbitrarily echo messages to the standard output. Autoconf provides the following macros to output messages for you in a consistent way (*note Invoking configure::). They are introduced here with a brief description of their purpose and are documented in more detail in *Note Autoconf Macro Reference::. Typically, a test starts by invoking `AC_MSG_CHECKING' to describe to the user what the test is doing, and ends by invoking `AC_MSG_RESULT' to output the result of the test.

`AC_MSG_CHECKING'
     This macro is used to notify the user that a test is commencing. It prints the text `checking' followed by your message and ends with `...'. You should use `AC_MSG_RESULT' after this macro to output the result of the test.

`AC_MSG_RESULT'
     This macro notifies the user of a test result.
     In general, the result should be the word `yes' or `no' for boolean tests, or the actual value of the result, such as a directory or filename.

`AC_MSG_ERROR'
     This macro emits a hard error message and aborts `configure' - it should be used for fatal errors.

`AC_MSG_WARN'
     This macro emits a warning to the user and proceeds.

File: autobook.info, Node: Naming macros, Next: Macro interface, Prev: Output from macros, Up: Guidelines for writing macros

23.3.4 Naming macros
--------------------

Just like functions in a C program, it's important to choose a good name for your Autoconf macros. A well-chosen name helps to unambiguously describe the purpose of the macro. Macros in M4 are all named within a single namespace and, thus, it is necessary to follow a convention to ensure that names retain uniqueness. This reasoning goes beyond just avoiding collisions with other macros - if you happen to choose a name that is already known to M4 as a definition of any kind, your macro's name could be rewritten by the prior definition during macro processing.

One naming convention has emerged: prefixing each macro name with the name of the package that the macro originated in, or with the initials of the macro's author. Macros are usually named in a hierarchical fashion, with each part of the name separated by underscores. As you move left-to-right through each component of the name, the description becomes more detailed. There are some high-level categories of macros suggested by the Autoconf manual that you may wish to use when forming a descriptive name for your own macro. For example, if your macro tries to discover the existence of a particular C structure, you might wish to use `C' and `STRUCT' as components of its name.

`C'
     Tests related to constructs of the C programming language.

`DECL'
     Tests for variable declarations in header files.

`FUNC'
     Tests for functions present in (or absent from) libraries.

`HEADER'
     Tests for header files.

`LIB'
     Tests for libraries.
`PATH'
     Tests to discover absolute filenames (especially programs).

`PROG'
     Tests to determine the base names of programs.

`STRUCT'
     Tests for definitions of C structures in header files.

`SYS'
     Tests for operating system features, such as restartable system calls.

`TYPE'
     Tests for built-in or declared C data types.

`VAR'
     Tests for C variables in libraries.

Some examples of macro names formed in this way include:

`AC_PROG_CC'
     A test that looks for a program called `cc'.

`AC_C_INLINE'
     A test that discovers if the C keyword `inline' is recognized.

`bje_CXX_MUTABLE'
     A test, written by "bje", that discovers if the C++ keyword `mutable' is recognized.

File: autobook.info, Node: Macro interface, Prev: Naming macros, Up: Guidelines for writing macros

23.3.5 Macro interface
----------------------

When designing your macro, it is worth spending some time deciding on what your macro's interface - the macro's name and argument list - will be. Often, it will be possible to extract general purpose functionality into a generic macro and to write a second macro which is a client of the generic one. Like planning the prototype for a C function, this is usually a straightforward process of deciding what arguments are required by the macro to perform its function. However, there are a couple of further considerations, and they are discussed below.

M4 macros refer to their arguments by number, with a syntax such as `$1'. It is typically more difficult to read an M4 macro definition and understand what each argument's designation is than in a C function body, where the formal argument is referred to by its name. Therefore, it's a good idea to include a standard comment block above each macro that documents the macro and gives an indication of what each argument is for. Here is an example from the Autoconf source code:

     # AC_CHECK_FILE(FILE, [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
     # -------------------------------------------------------------
     #
     # Check for the existence of FILE.
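Following this convention, a complete macro definition with its documentation comment might be sketched as below. Note that this is an invented illustration - the macro name `bje_SYS_PROCFS' and the test it performs are hypothetical, not part of Autoconf:

```m4
# bje_SYS_PROCFS([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
# --------------------------------------------------------
#
# Check whether the `/proc' filesystem is available, executing
# ACTION-IF-FOUND or ACTION-IF-NOT-FOUND as appropriate.
AC_DEFUN([bje_SYS_PROCFS],
[AC_MSG_CHECKING([for the /proc filesystem])
if test -d /proc; then
  AC_MSG_RESULT([yes])
  $1
else
  AC_MSG_RESULT([no])
  $2
fi])
```

The comment block repeats the argument names used in the body, so a reader need not decode `$1' and `$2' from context.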
To remain general purpose, the existing Autoconf macros follow the convention of keeping side-effects outside the definition of the macro. Here, when a user invokes `AC_CHECK_FILE', they must provide shell code to implement the side effect that they want to occur if the `FILE' is found or is not found. Some macros implement a basic and desirable action, such as defining a symbol like `HAVE_UNISTD_H', if no user-defined actions are provided. In general, your macros should provide an interface which is consistent with the interfaces provided by the core Autoconf macros.

M4 macros may have variable argument lists, so it is possible to implement macros which have defaults for arguments. By testing each individual argument against the empty string with `ifelse', it is possible for users to accept the default behavior for individual arguments by passing empty values:

     AC_CHECK_FILE([/etc/passwd], [],
                   [AC_MSG_ERROR([something is really wrong])])

One final point to consider when designing the interface for a macro is how to handle macros that are generic in nature and, say, wish to set a cache variable whose name is based on one of the arguments. Consider the `AC_CHECK_HEADER' macro - it defines a symbol and makes an entry in the cache that reflects the result of the test it performs. `AC_CHECK_HEADER' takes an argument - namely the name of a header file to look for. This macro cannot just make a cache entry with a name like `ac_cv_check_header', since it would only work once; any further uses of this macro in `configure.in' would cause an incorrect result to be drawn from the cache. Instead, the name of the symbol that is defined and the name of the cache variable that is set need to be computed from one of the arguments: the name of the header file being sought. What we really need is to define `HAVE_UNISTD_H' and set the cache variable `ac_cv_header_unistd_h'.
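One way to compute both names from the argument can be sketched in plain Bourne shell - a simplification of what the real macros do, shown here only to make the idea concrete:

```shell
header=unistd.h

# Map anything that is not legal in an identifier to an underscore,
# then force the case: lowercase for the cache variable, uppercase
# for the preprocessor symbol passed to AC_DEFINE.
safe=`echo "$header" | sed 's/[^a-zA-Z0-9_]/_/g'`
cache_var=ac_cv_header_`echo "$safe" | tr 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz'`
cpp_symbol=HAVE_`echo "$safe" | tr 'abcdefghijklmnopqrstuvwxyz' 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'`

echo "$cache_var"    # ac_cv_header_unistd_h
echo "$cpp_symbol"   # HAVE_UNISTD_H
```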
This can be achieved with some `sed' and `tr' magic in the macro which transforms the filename into uppercase characters for the call to `AC_DEFINE' and into lowercase for the cache variable name. Unknown characters such as `.' need to be transformed into underscores.

Some existing macros also allow the user to pass in the name of a cache variable, so that the macro does not need to compute a name. In general, this should be avoided, as it makes the macro harder to use and exposes details of the caching system to the user.

File: autobook.info, Node: Implementation specifics, Next: Future directions for macro writers, Prev: Guidelines for writing macros, Up: Writing New Macros for Autoconf

23.4 Implementation specifics
=============================

This section provides some tips about how to actually go about writing your macros once you've decided what it is that you want to test and how to go about testing for it. It covers writing shell code for the tests, and optionally caching the results of those tests.

* Menu:

* Writing shell code::
* Using M4 correctly::
* Caching results::

File: autobook.info, Node: Writing shell code, Next: Using M4 correctly, Up: Implementation specifics

23.4.1 Writing shell code
-------------------------

It is necessary to adopt a technique of writing portable Bourne shell code. Often, shell programming tricks you might have learned are actually extensions provided by your favorite shell and are non-portable. When in doubt, check documentation or try the construct on another system's Bourne shell. For a thorough treatment of this topic, *Note Writing Portable Bourne Shell::.

File: autobook.info, Node: Using M4 correctly, Next: Caching results, Prev: Writing shell code, Up: Implementation specifics

23.4.2 Using M4 correctly
-------------------------

Writing macros involves interacting with the M4 macro processor, which expands your macros when they are used in `configure.in'.
It is crucial that your macros use M4 correctly - and in particular, that they quote strings correctly. *Note M4::, for a thorough treatment of this topic.

File: autobook.info, Node: Caching results, Prev: Using M4 correctly, Up: Implementation specifics

23.4.3 Caching results
----------------------

Autoconf provides a caching facility, whereby the results of a test may be stored in a cache file. The cache file is itself a Bourne shell script which is sourced by the `configure' script to set any `cache variables' to the values that are present in the cache file. The next time `configure' is run, the cache will be consulted for a prior result. If there is a prior result, the value is re-used and the code that performs that test is skipped. This speeds up subsequent runs of `configure' and the configuration of deep trees, which can share a cache file in the top-level directory (*note Invoking configure::).

A custom macro is not required to do caching, though it is considered best practice. Sometimes it doesn't make sense for a macro to do caching - tests for system aspects which may frequently change should not be cached. For example, a test for free disk space should not employ caching, as it is a dynamic characteristic.

The `AC_CACHE_CHECK' macro is a convenient wrapper for caching the results of tests. You simply provide a description of the test, the name of a cache variable to store the test result in, and the body of the test. If the test has not been run before, the cache will be primed with the result. If the result is already in the cache, then the cache variable will be set and the test will be skipped. Note that the name of the cache variable must contain `_cv_' in order to be saved correctly.
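As an illustration of the wrapper in use, here is a hypothetical macro - the name and the test are invented for this sketch - that caches its result in a correctly named `_cv_' variable:

```m4
# bje_PROG_ZIP
# ------------
# See whether the `zip' program is available on the build machine,
# caching the result in `bje_cv_prog_zip'.
AC_DEFUN([bje_PROG_ZIP],
[AC_CACHE_CHECK([for zip], [bje_cv_prog_zip],
[if (zip -help) </dev/null >/dev/null 2>&1; then
  bje_cv_prog_zip=yes
else
  bje_cv_prog_zip=no
fi])])
```

On a second run of `configure', `AC_CACHE_CHECK' finds `bje_cv_prog_zip' in the cache file and skips the body entirely, printing `(cached)' alongside the result.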
Here is the code for an Autoconf macro that ties together many of the concepts introduced in this chapter:

File: autobook.info, Node: Future directions for macro writers, Prev: Implementation specifics, Up: Writing New Macros for Autoconf

23.5 Future directions for macro writers
========================================

A future trend for Autoconf is to make it easier to write reliable macros and re-use macros written by others. This section will describe some of the ideas that are currently being explored by those actively working on Autoconf.

* Menu:

* Autoconf macro archive::
* Primitive macros to aid in building macros::

File: autobook.info, Node: Autoconf macro archive, Next: Primitive macros to aid in building macros, Up: Future directions for macro writers

23.5.1 Autoconf macro archive
-----------------------------

In mid-1999, an official Autoconf macro archive was established on the World Wide Web by Peter Simons in Germany. The archive collects Autoconf macros that may be useful to some users, but are not sufficiently general purpose to be included in the core Autoconf distribution. The URL for the macro archive is:

     http://www.gnu.org/software/ac-archive/

It is possible to retrieve macros that perform different kinds of tests from this archive. The macros can then be inserted, in line, into your `aclocal.m4' or `acinclude.m4' file. The archive has been steadily growing since its inception. Please try to submit your macros to the archive!

File: autobook.info, Node: Primitive macros to aid in building macros, Prev: Autoconf macro archive, Up: Future directions for macro writers

23.5.2 Primitive macros to aid in building macros
-------------------------------------------------

Writing new macros is one aspect of Autoconf that has proven troublesome to users in the past, since this is one area where Autoconf's implementation details leak out. Autoconf extensively uses `m4' to perform the translation of `configure.in' to `configure'.
Thus, it is necessary to understand implementation details such as M4's quoting rules in order to write Autoconf macros (*Note M4::). Another aspect of macro writing which is extremely hard to get right is writing portable Bourne shell scripts (*note Writing Portable Bourne Shell::). Writing portable software, be it in Bourne shell or C++, is something that can only be mastered with years of experience - and exposure to many different kinds of machines! Rather than expect all macro writers to acquire this experience, it makes sense for Autoconf to become a `knowledge base' for this experience.

With this in mind, one future direction for Autoconf will be to provide a library of low-level macros to assist in writing new macros. By way of hypothetical example, consider the benefit of using a macro named `AC_FOREACH' instead of needing to learn the hard way that some vendor's implementation of Bourne shell has a broken `for' loop construct. This idea will be explored in future versions of Autoconf.

When migrating existing packages to the GNU Autotools, which is the topic of the next chapter, it is worth remembering these guidelines as you write the necessary tests to make those packages portable.

File: autobook.info, Node: Migrating Existing Packages, Next: Integration with Cygnus Cygwin, Prev: Writing New Macros for Autoconf, Up: Top

24 Migrating an Existing Package to GNU Autotools
*************************************************

Sometimes you have to take an existing package and wrap it in an Autoconf framework. This is called _autoconfiscating_ (1) a package. This chapter gives an overview of the various approaches that have been taken when autoconfiscating, explains some important points through examples, and discusses some of the potential pitfalls. It is not an exhaustive guide to autoconfiscation, as this process is much more art than science.

24.1 Why autoconfiscate
=======================

There are a few reasons to autoconfiscate a package.
You might be porting your package to a new platform for the first time, or you might have outstripped the capabilities of an ad hoc system. Or, you might be assuming maintenance of a package and you want to make it fit in with other packages that use the GNU Autotools.

For instance, for libgcj, we wanted to distribute some libraries needed for proper operation, such as the zip archiving program and the Boehm garbage collector. In neither case was an autoconf framework available. However, we felt one was required in order to give the overall package a seamless and easy-to-use configuration and build system. This attention to ease of installation by users is important; it is one reason that the GNU Autotools were written.

In another case, a group I worked with was taking over maintenance of a preexisting package. We preferred an Autoconf-based solution to the home-grown one already in use by the package - the existing system was based on platform tests, not feature tests, and was difficult to navigate and extend.

24.2 Overview of the Two Approaches
===================================

There are two fundamental approaches to autoconfiscation, which we call `quick and dirty' and `the full pull'. In practice each project is a mix of the two. There are no hard-and-fast rules when autoconfiscating an existing package, particularly when you are planning to track future releases of the original source. However, since Autoconf is so flexible, it is usually possible to find some reasonable way to implement whatever is required. Automake isn't as flexible, and with `strangely' constructed packages you're sometimes required to make a difficult choice: restructure the package, or avoid automake.

  1. Quick And Dirty. In the quick and dirty approach, the goal is to get the framework up and running with the least effort. This is the approach we took when we autoconfiscated both zip and the Boehm garbage collector.
     Our reasons were simple: we knew we would be tracking the original packages closely, so we wanted to minimize the amount of work involved in importing the next release and subsequently merging in our changes. Also, both packages were written to be portable (but in very different ways), so major modifications to the source were not required.

  2. The Full Pull. Sometimes you'd rather completely convert a package to GNU Autotools. For instance, you might have just assumed maintenance of a package. Or, you might read this book and decide that your company's internal projects should use a state-of-the-art configuration system.

     The full pull is more work than the quick-and-dirty approach, but in the end it yields a more easily understood and more idiomatic package. This in turn has maintenance benefits due to the relative absence of quirks, traps, and special cases - oddities which creep into quick and dirty ports due to the need, in that case, to structure the build system around the package instead of having the ability to restructure the package to fit the build system.

24.3 Example: Quick And Dirty
=============================

As part of the `libgcj' project (2), I had to incorporate the `zip' program into our source tree. Since this particular program is only used in one part of the build, and since this program was already fairly portable, I decided to take a quick-and-dirty approach to autoconfiscation.

First I read through the `README' and `install.doc' files to see how `zip' is ordinarily built. From there I learned that `zip' came with a `Makefile' used to build all Unix ports (and, for the initial autoconfiscation, Unix was all I was interested in), so I read that. This file indicated that `zip' had few configurability options.
Running `ifnames' on the sources, both Unix and generic, confirmed that the `zip' sources were mostly self-configuring, using system-specific `#defines'--a practice which we recommend against; however, for a quick-and-dirty port it is not worth cleaning up:

     $ ifnames *.[ch] unix/*.[ch] | grep ^__ | head
     __386BSD__ unix/unix.c
     __CYGWIN32__ unix/osdep.h
     __CYGWIN__ unix/osdep.h
     __DATE__ unix/unix.c zipcloak.c zipnote.c zipsplit.c
     __DEBUG_ALLOC__ zip.c
     __ELF__ unix/unix.c
     __EMX__ fileio.c ttyio.h util.c zip.c
     __FreeBSD__ unix/unix.c
     __G ttyio.h
     __GNUC__ unix/unix.c zipcloak.c zipnote.c zipsplit.c

Based on this information I wrote my initial `configure.in', which is the one still in use today:

     AC_INIT(ziperr.h)
     AM_INIT_AUTOMAKE(zip, 2.1)
     AM_MAINTAINER_MODE
     AC_PROG_CC
     AC_HEADER_DIRENT
     AC_DEFINE(UNIX)
     AC_LINK_FILES(unix/unix.c, unix.c)
     AC_OUTPUT(Makefile)

The one mysterious part of this `configure.in' is the define of the `UNIX' preprocessor macro. This define came directly from `zip''s `unix/Makefile' file; `zip' uses this define to enable certain Unix-specific pieces of code.

In this particular situation, I lucked out. `zip' was unusually easy to autoconfiscate. Typically more actual checks are required in `configure.in', and more than a single iteration is required to get a workable configuration system.

From `unix/Makefile' I also learned which files were expected to be built in order to produce the `zip' executable. This information let me write my `Makefile.am':

     ## Process this file with automake to create Makefile.in.

     ## NOTE: this file doesn't really try to be complete.  In particular
     ## `make dist' won't work at all.  We're just aiming to get the
     ## program built.  We also don't bother trying to assemble code, or
     ## anything like that.
     AUTOMAKE_OPTIONS = no-dependencies

     INCLUDES = -I$(srcdir)/unix

     bin_PROGRAMS = zip
     zip_SOURCES = zip.c zipfile.c zipup.c fileio.c util.c globals.c \
     crypt.c ttyio.c unix.c crc32.c crctab.c deflate.c trees.c bits.c

     ## This isn't really correct, but we don't care.
     $(zip_OBJECTS) : zip.h ziperr.h tailor.h unix/osdep.h crypt.h \
     revision.h ttyio.h unix/zipup.h

This file provides a good look at some of the tradeoffs involved. In my case, I didn't care about full correctness of the resulting `Makefile.am' - I wasn't planning to maintain the project, I just wanted it to build in my particular set of environments. So, I sacrificed `dist' capability to make my work easier. Also, I decided to disable dependency tracking and instead make all the resulting object files depend on all the headers in the project. This approach is inefficient, but in my situation perfectly reasonable, as I wasn't planning to do any actual development on this package - I was simply looking to make it build so that it could be used to build the parts of the package I was actually hacking.

24.4 Example: The Full Pull
===========================

Suppose instead that I wanted to fully autoconfiscate `zip'. Let's ignore for now that `zip' can build on systems to which the GNU Autotools have not been ported, like TOPS-20--perhaps a big problem back in the real world.

The first step should always be to run `autoscan'. `autoscan' is a program which examines your source code and then generates a file called `configure.scan' which can be used as a rough draft of a `configure.in'. `autoscan' isn't perfect, and in fact in some situations can generate a `configure.scan' which `autoconf' won't directly accept, so you should examine this file by hand before renaming it to `configure.in'.

`autoscan' doesn't take into account macro names used by your program.
For instance, if `autoscan' decides to generate a check for `fcntl.h', it will just generate ordinary `autoconf' code which in turn might define `HAVE_FCNTL_H' at `configure' time. This just means that `autoscan' isn't a panacea - you will probably have to modify your source to take advantage of the code that `autoscan' generates.

Here is the `configure.scan' I get when I run `autoscan' on `zip':

     dnl Process this file with autoconf to produce a configure script.
     AC_INIT(bits.c)

     dnl Checks for programs.
     AC_PROG_AWK
     AC_PROG_CC
     AC_PROG_CPP
     AC_PROG_INSTALL
     AC_PROG_LN_S
     AC_PROG_MAKE_SET

     dnl Checks for libraries.
     dnl Replace `main' with a function in -lx:
     AC_CHECK_LIB(x, main)

     dnl Checks for header files.
     AC_HEADER_DIRENT
     AC_HEADER_STDC
     AC_CHECK_HEADERS(fcntl.h malloc.h sgtty.h strings.h sys/ioctl.h \
     termio.h unistd.h)

     dnl Checks for typedefs, structures, and compiler characteristics.
     AC_C_CONST
     AC_TYPE_SIZE_T
     AC_STRUCT_ST_BLKSIZE
     AC_STRUCT_ST_BLOCKS
     AC_STRUCT_ST_RDEV
     AC_STRUCT_TM

     dnl Checks for library functions.
     AC_PROG_GCC_TRADITIONAL
     AC_FUNC_MEMCMP
     AC_FUNC_MMAP
     AC_FUNC_SETVBUF_REVERSED
     AC_TYPE_SIGNAL
     AC_FUNC_UTIME_NULL
     AC_CHECK_FUNCS(getcwd mktime regcomp rmdir strstr)

     AC_OUTPUT(acorn/makefile unix/Makefile Makefile atari/Makefile)

As you can see, this isn't suitable for immediate use as `configure.in'. For instance, it generates several `Makefile's which we know we won't need. At this point there are two things to do in order to fix this file.

First, we must fix outright flaws in `configure.scan', add checks for libraries, and the like. For instance, we might also add code to see if we are building on Windows and set a variable appropriately:

     AC_CANONICAL_HOST
     case "$host" in
     *-cygwin* | *-mingw*)
        INCLUDES='-I$(srcdir)/win32'
        ;;
     *)
        # Assume Unix.
        INCLUDES='-I$(srcdir)/unix'
        ;;
     esac
     AC_SUBST(INCLUDES)

Second, we must make sure that the `zip' sources use the results we compute.
So, for instance, we would check the `zip' source to see if we should use `HAVE_MMAP', which is the result of calling `AC_FUNC_MMAP'. At this point you might also consider using a configuration header such as is generated by `AC_CONFIG_HEADER'. Typically this involves editing all your source files to include the header, but in the long run this is probably a cleaner way to go than using many `-D' options on the command line. If you are making major source changes in order to fully adapt your code to `autoconf''s output, adding a `#include' to each file will not be difficult. This step can be quite difficult if done thoroughly, as it can involve radical changes to the source. After this you will have a minimal but functional `configure.in' and a knowledge of what portability information your program has already incorporated.

Next, you want to write your `Makefile.am's. This might involve restructuring your package so that it can more easily conform to what Automake expects. This work might also involve source code changes if the program makes assumptions about the layout of the install tree - these assumptions might very well break if you follow the GNU rules about the install layout.

At the same time as you are writing your `Makefile.am's, you might consider _libtoolizing_ your package. This makes sense if you want to export shared libraries, or if you have libraries which several executables in your package use. In our example, since there is no library involved, we won't use Libtool.

The `Makefile.am' used in the minimal example is nearly sufficient for our use, but not quite. Here's how we change it to add dependency tracking and `dist' support:

     ## Process this file with automake to create Makefile.in.
     bin_PROGRAMS = zip

     if UNIX
     bin_SCRIPTS = unix/zipgrep
     os_sources = unix/unix.c
     else
     os_sources = win32/win32.c win32zip.c
     endif

     zip_SOURCES = zip.c zipfile.c zipup.c fileio.c util.c globals.c \
     crypt.c ttyio.c crc32.c crctab.c deflate.c trees.c \
     bits.c $(os_sources)

     ## It was easier to just list all the source files than to pick out the
     ## non-source files.
     EXTRA_DIST = algorith.doc README TODO Where crc_i386.S bits.c crc32.c \
     acorn/RunMe1st acorn/ReadMe acorn/acornzip.c acorn/makefile \
     acorn/match.s acorn/osdep.h acorn/riscos.c acorn/riscos.h \
     acorn/sendbits.s acorn/swiven.h acorn/swiven.s acorn/zipup.h crctab.c \
     crypt.c crypt.h deflate.c ebcdic.h fileio.c globals.c history \
     ...
     wizdll/wizdll.def wizdll/wizmain.c wizdll/wizzip.h wizdll/zipdll16.mak \
     wizdll/zipdll32.mak

The extremely long `EXTRA_DIST' macro above has been truncated for brevity, as denoted by the `...' line. Note that we no longer define `INCLUDES' - it is now automatically defined by `configure'. Note also that, due to a small technicality, this `Makefile.am' won't really work with Automake 1.4. Instead, we must modify things so that we don't try to compile `unix/unix.c' or other files from subdirectories.

---------- Footnotes ----------

(1) A term coined by Noah Friedman in the early days of Autoconf to denote the process of converting a package that configures itself without Autoconf to one which does.

(2) See `http://sourceware.cygnus.com/java/'

File: autobook.info, Node: Integration with Cygnus Cygwin, Next: Cross Compilation, Prev: Migrating Existing Packages, Up: Top

25 Using GNU Autotools with Cygnus Cygwin
*****************************************

It is possible to use the GNU Autotools to build software packages on Windows.
Since the tools were developed on Unix, it is easier to get them to work using Cygnus Solutions' Cygwin distribution, which provides a POSIX wrapper for the Win32 API (*note The Diversity of Unix Systems: Unix Diversity.), but it is certainly possible to run the tools within other Windows environments, notably Colin Peters' "Mingw32" and D.J. Delorie's "DJGPP". These development environments are freely available on the Internet(1). Unlike Cygwin, these other environments are designed for developing with the Win32 API directly, and consequently they are not as useful for porting Unix projects to Windows or writing code that works on both Windows and Unix; see *Note Unix/Windows Portability:: for more details. This chapter describes the process of using GNU Autotools with Cygwin, although some of this advice also applies to employing some of the other GNU-based Windows development environments.

It is notable that the recent Cygwin ports of GCC and "binutils" can produce binaries which run against the `cygwin1.dll' emulation layer, or which link against `CRTDLL.DLL', the Windows native "C RunTime Dynamic Link Library", depending on the needs of the particular source code. Recent versions(2) of the binutils implement the PE-COFF binary format used by Windows, so by specifying the `-mno-cygwin' compiler option to the Cygwin compiler and using only the API from `CRTDLL.DLL', you can build binaries which are independent of the `cygwin1.dll' DLL. Such binaries will generally run faster, since they bypass the POSIX emulation, and give easier access to Windows-specific things such as drive letters. Source code designed to be compiled this way will not compile on Unix, however, since it will be limited to the Win32 API provided by `CRTDLL.DLL'.
After reading this chapter, you will be able to install and use GNU
Autotools _natively_ under Windows using Cygnus Solutions' Cygwin
environment, both to develop your own packages with the aid of Cygwin,
and to compile, install, and to a certain degree port other people's
packages for use with Cygwin.  As a Unix package developer, you will
learn how to write your configury to be Windows friendly, and to be
aware of certain quirks of Windows which can affect the portability of
packages which need to work on Windows in addition to your Unix
development machine.

* Menu:

* Preliminaries::
* Installing GNU Autotools on Cygwin::
* Writing A Cygwin Friendly Package::
* DLLs with Libtool::
* Package Installation::

   ---------- Footnotes ----------

   (1) Mingw32 home page,
`http://www.geocities.com/Tokyo/Towers/6162/gcc.html'; and DJGPP home
page, `http://www.delorie.com/djgpp/'.

   (2) since Cygwin-b20.1, I believe.


File: autobook.info, Node: Preliminaries, Next: Installing GNU Autotools on Cygwin, Up: Integration with Cygnus Cygwin

25.1 Preliminaries
==================

As explained in *Note Installing GNU Autotools on Cygwin::, GNU
Autotools requires several other tools to operate.  Most Unices
provide the majority, if not all, of these prerequisites by default.
Windows, unfortunately, does not.  Cygwin is better than most in this
respect, and only a few extras are required.

   The latest net release of Cygwin(1) has a packaging mechanism which
downloads and installs various Unix tools that have been precompiled
for the Cygwin environment by the Cygnus folks.  To develop with GNU
Autotools and Cygwin, you need to install all of these packages to
make sure you have all of the necessary header files and compiler
tools.

Bourne shell
     Cygwin provides a port of "ash" which is smaller and faster than
     bash, but sometimes rejects arcane Bourne shell scripts.
     If you can stand to sacrifice a little speed, it is worth copying
     the supplied `bash.exe' to `/bin/sh.exe' to forestall any such
     problems.

GNU M4
     Cygwin provides a port of GNU M4.

GNU Make
     At the time of writing, developers need GNU Make in order to do
     dependency tracking (*note Automatic dependency tracking:
     Advanced GNU Automake Usage.), though this is set to change in a
     future release of Automake.  Cygwin version 1.1.1 comes with a
     port of GNU `make-3.77', which I have personally never had any
     problems with.  The received wisdom from users is to manually
     upgrade to the latest version, `make-3.79'(2), which compiles and
     installs from source without modification.  Should you experience
     (or anticipate) any Make related problems, you might try
     upgrading to this version or later.

GNU GCC
     At the time of writing, GNU GCC is also needed by Automake in
     order to do dependency tracking.  Cygwin version 1.1.1 comes with
     a port of the latest GNU GCC compiler.

Perl
     The current implementation of Automake (1.4) is written in
     `perl4', though it is likely that `perl5' will be needed for
     Automake 1.5.  The very latest versions of Perl now compile out
     of the box on Cygwin(3).

   There are some other pitfalls to installing a fully working Cygwin
environment on your Windows machine, but that is outside the scope of
this chapter.  Cygnus host a mailing list archive and an FAQ(4) to
provide some level of support, and these should be your first port of
call in case the installation does not go according to plan.

   ---------- Footnotes ----------

   (1) 1.1.1 at the time of writing.

   (2) `ftp://ftp.gnu.org/gnu/make/make-3.79.tar.gz'

   (3) You can get a precompiled package from
`http://cygutils.netpedia.net/', also an excellent resource for other
packages ported to Cygwin.
   (4) `http://sourceware.cygnus.com/cygwin/'


File: autobook.info, Node: Installing GNU Autotools on Cygwin, Next: Writing A Cygwin Friendly Package, Prev: Preliminaries, Up: Integration with Cygnus Cygwin

25.2 Installing GNU Autotools on Cygwin
=======================================

With all of the above infrastructure in place, each of the GNU
Autotools can be built natively and installed from source right out of
the box.  It is worth taking care with the installation directories,
as there is no package management under Cygwin, and it is easy to let
everything get thrown into a big pile in `/usr/local', which makes it
relatively difficult to upgrade and remove packages.

   Support for Cygwin has been in Autoconf for several years, as far
back as version 2.0 as best I can tell.  Building it has never been a
problem as long as GNU M4 and a Bourne shell are available; it is the
macros themselves which offer this support.  Of course, any Autoconf
macros you write yourself must be designed carefully to not make any
assumptions about being executed on Unix if the Cygwin compatibility
is to remain.  A binary package of Autoconf for Cygwin version 1.1.1
is available from the CygUtils website(1).

   Automake joined the fray much later than the Cygwin support code
was added to Autoconf, and has consequently always supported Cygwin.
Until the last release of Cygwin, the stumbling block had always been
finding (or building) a Cygwin compatible Perl interpreter for
Automake to use.  Thanks to the work of Eric Fifer, Perl 5.6.0 builds
right out of the box on Cygwin, removing this problem entirely.  Ready
built packages of Perl and Automake are available from the CygUtils
website.

   The initial Libtool support for Windows was written by Ian Lance
Taylor of Cygnus Solutions, when Cygwin was at release b18, *Note
Microsoft Windows: Microsoft Windows Development.
More recent releases of Cygwin in general, and GCC in particular, have
much better facilities for building and linking with Windows DLLs, to
the extent that with a little perseverance it is possible to build
DLLs with GCC from C++ sources, and to have those DLLs interoperate
with DLLs built with Windows development environments.  In time,
automation of these facilities will make their way into Libtool.

   The method that Libtool currently uses to build DLLs works with
Cygwin releases at least as far back as b18, and at least as far
forward as the version I am now using, Cygwin-1.1.1.  The same code
will also build DLLs correctly with Mingw32.

   There are certainly simpler ways to assemble a DLL, but Libtool
aims to combine two goals which are somewhat in contention with
Windows' treatment of DLLs: maximum portability across the various
flavours of DLL-using Windows build environments; and, not forgetting
Libtool's raison d'être, the abstraction of the many and varied ways
of building libraries on different targets behind a single unified
interface.  To meet these two goals, Libtool must only use tools which
exist across the range of versions it supports, and must at the same
time try to make DLLs appear to have the same characteristics as a
modern ELF shared library, such as the shared libraries under
GNU/Linux.  This is no mean feat, and in fact Libtool still has some
way to go in order to be able to do this convincingly.

   It turns out that Windows DLLs lack many, many features that
packages developed on Unix are likely to take for granted.  Emulation
of these missing features is making its way into Libtool.  Although
support for DLLs is improving steadily with every release, there are
some severe technical problems with the Windows library architecture
that will prevent Libtool from ever being able to build DLLs
completely transparently.  The details are extremely technical and
beyond the scope of this book.
As noted in *Note Installing the tools::, things will only work
correctly if each of Autoconf, Automake and Libtool is installed with
the same `--prefix' argument, since they all share a macro directory
in `$prefix/share/aclocal'.

   ---------- Footnotes ----------

   (1) The CygUtils website is `http://cygutils.netpedia.net/V1.1/'.


File: autobook.info, Node: Writing A Cygwin Friendly Package, Next: DLLs with Libtool, Prev: Installing GNU Autotools on Cygwin, Up: Integration with Cygnus Cygwin

25.3 Writing A Cygwin Friendly Package
======================================

One approach to using the Cygwin support offered by GNU Autotools is
to design your own package with an eye towards having it compile
nicely on Unix and on Windows; another is to tweak the configuration
of existing packages which use GNU Autotools but which do not compile
under Cygwin, or which do not behave quite right after compilation.

   There are several things you need to be aware of in order to design
a package to work seamlessly under Cygwin, and yet several more if
portability to DOS and (non-Cygwin) Windows is important too.  We
discussed many of these issues in *Note Unix/Windows Issues::.  In
this section, we will expand on those issues with ways in which GNU
Autotools can help deal with them.

   If you only need to build executables and static libraries, then
Cygwin provides an environment close enough to Unix that any packages
which ship with a relatively recent configuration will compile pretty
much out of the box, except for a few peculiarities of Windows which
are discussed throughout the rest of this section.  If you want to
build a package which has not been maintained for a while, and which
consequently uses an old Autoconf, then it is usually just a matter of
removing the generated files, rebootstrapping the package with the
installed (up to date!) Autoconf, and rerunning the `configure'
script.
On occasion some tweaks will be needed in the `configure.in' to
satisfy the newer `autoconf', but `autoconf' will almost always
diagnose these for you while it is being run.

* Menu:

* Text vs Binary Modes::
* File System Limitations::
* Executable Filename Extensions::


File: autobook.info, Node: Text vs Binary Modes, Next: File System Limitations, Up: Writing A Cygwin Friendly Package

25.3.1 Text vs Binary Modes
---------------------------

As discussed in *Note Unix/Windows Text/Binary::, text and binary
files are different on Windows.  Lines in a Windows text file end in a
carriage return/line feed pair, but a C program reading the file in
text mode will see a single line feed.

   Cygwin has several ways to hide this dichotomy, and the solution(s)
you choose will depend on how you plan to use your program.  I will
outline the relative tradeoffs you make with each choice:

mounting
     Before installing an operating system to your hard drive, you
     must first organise the disk into "partitions".  Under Windows,
     you might only have a single partition on the disk, which would
     be called `C:'(1).  Provided that some media is present, Windows
     allows you to access the contents of any drive letter - that is,
     you can access `A:' when there is a floppy disk in the drive, and
     `F:' provided you divided your available drives into sufficient
     partitions for that letter to be in use.

     With Unix, things are somewhat different: hard disks are still
     divided into partitions (typically several), but there is only a
     single filesystem "mounted" under the root directory.  You can
     use the `mount' command to hook a partition (or floppy drive or
     CD-ROM, etc.) into a subdirectory of the root filesystem:

          $ mount /dev/fd0 /mnt/floppy
          $ cd /mnt/floppy

     Until the directory is unmounted, the contents of the floppy disk
     will be available as part of the single Unix filesystem in the
     directory, `/mnt/floppy'.
     This is in contrast with Windows' multiple root directories,
     which can be accessed by changing filesystem root - to access the
     contents of a floppy disk:

          C:\WINDOWS\> A:
          A:> DIR
          ...

     Cygwin has a mounting facility to allow Cygwin applications to
     see a single unified file system starting at the root directory,
     by `mount'ing drive letters to subdirectories.  When mounting a
     directory you can set a flag to determine whether the files in
     that partition should be treated the same whether they are TEXT
     or BINARY mode files.  Mounting a file system to treat TEXT files
     the same as BINARY files means that Cygwin programs can behave in
     the same way as they might on Unix and treat all files as equal.
     Mounting a file system to treat TEXT files properly will cause
     Cygwin programs to translate between Windows CR-LF line end
     sequences and Unix LF line endings, which plays havoc with file
     seeking, and with many programs which make assumptions about the
     size of a `char' in a `FILE' stream.  However, `binmode' is the
     default method because it is the only way to interoperate between
     Windows binaries and Cygwin binaries.

     You can get a list of which drive letters are mounted to which
     directories, and the modes they are mounted with, by running the
     `mount' command without arguments:

          BASH.EXE-2.04$ mount
          Device              Directory        Type        flags
          C:\cygwin           /                user        binmode
          C:\cygwin\bin       /usr/bin         user        binmode
          C:\cygwin\lib       /usr/lib         user        binmode
          D:\home             /home            user        binmode

     As you can see, the Cygwin `mount' command allows you to `mount'
     arbitrary Windows directories as well as simple drive letters
     into the single filesystem seen by Cygwin applications.

binmode
     The `CYGWIN' environment variable holds a space separated list of
     setup options which exert some minor control over the way the
     `cygwin1.dll' (or `cygwinb19.dll' etc.) behaves.
     One such option is the `binmode' setting; if `CYGWIN' contains
     the `binmode' option, files which are opened through
     `cygwin1.dll' without an explicit text or binary mode will
     default to binary mode, which is closest to how Unix behaves.

system calls
     `cygwin1.dll', GNU libc and other modern C API implementations
     accept extra flags for `fopen' and `open' calls to determine in
     which mode a file is opened.  On Unix it makes no difference,
     and sadly most Unix programmers are not aware of this subtlety,
     so this tends to be the first thing that needs to be fixed when
     porting a Unix program to Cygwin.  The best way to use these
     calls portably is to use the following macros in a package's
     `configure.in' to be sure that the extra arguments are
     available:

          # _AB_AC_FUNC_FOPEN(b | t, USE_FOPEN_BINARY | USE_FOPEN_TEXT)
          # -----------------------------------------------------------
          define([_AB_AC_FUNC_FOPEN],
          [AC_CACHE_CHECK([whether fopen accepts "$1" mode],
                          [ab_cv_func_fopen_$1],
          [AC_TRY_RUN([#include <stdio.h>
          int
          main ()
          {
             FILE *fp = fopen ("conftest.bin", "w$1");
             fprintf (fp, "\n");
             fclose (fp);
             return 0;
          }],
                      [ab_cv_func_fopen_$1=yes],
                      [ab_cv_func_fopen_$1=no],
                      [ab_cv_func_fopen_$1=no])])
          if test x$ab_cv_func_fopen_$1 = xyes; then
            AC_DEFINE([$2], 1,
              [Define this if we can use the "$1" mode for fopen safely.])
          fi[]dnl
          ])# _AB_AC_FUNC_FOPEN


          # AB_AC_FUNC_FOPEN_BINARY
          # -----------------------
          # Test whether fopen accepts a "b" in the mode string for
          # binary file opening.  This makes no difference on most
          # unices, but some OSes convert every newline written to a
          # file to two bytes (CR LF), and every CR LF read from a
          # file is silently converted to a newline.
          AC_DEFUN([AB_AC_FUNC_FOPEN_BINARY],
          [_AB_AC_FUNC_FOPEN(b, USE_FOPEN_BINARY)])


          # AB_AC_FUNC_FOPEN_TEXT
          # ---------------------
          # Test whether fopen accepts a "t" in the mode string for
          # text file opening.  This makes no difference on most
          # unices, but other OSes use it to assert that every newline
          # written to a file writes two bytes (CR LF), and every CR
          # LF read from a file is silently converted to a newline.
          AC_DEFUN([AB_AC_FUNC_FOPEN_TEXT],
          [_AB_AC_FUNC_FOPEN(t, USE_FOPEN_TEXT)])


          # _AB_AC_FUNC_OPEN(O_BINARY|O_TEXT)
          # ---------------------------------
          AC_DEFUN([_AB_AC_FUNC_OPEN],
          [AC_CACHE_CHECK([whether fcntl.h defines $1],
                          [ab_cv_header_fcntl_h_$1],
          [AC_EGREP_CPP([$1],
          [#include <sys/types.h>
          #include <sys/stat.h>
          #include <fcntl.h>
          $1
          ],
                        [ab_cv_header_fcntl_h_$1=no],
                        [ab_cv_header_fcntl_h_$1=yes])
          if test "x$ab_cv_header_fcntl_h_$1" = xno; then
            AC_EGREP_CPP([_$1],
            [#include <sys/types.h>
          #include <sys/stat.h>
          #include <fcntl.h>
          _$1
          ],
                         [ab_cv_header_fcntl_h_$1=0],
                         [ab_cv_header_fcntl_h_$1=_$1])
          fi])
          if test "x$ab_cv_header_fcntl_h_$1" != xyes; then
            AC_DEFINE_UNQUOTED([$1], [$ab_cv_header_fcntl_h_$1],
              [Define this to a usable value if the system provides none])
          fi[]dnl
          ])# _AB_AC_FUNC_OPEN


          # AB_AC_FUNC_OPEN_BINARY
          # ----------------------
          # Test whether open accepts O_BINARY in the mode string for
          # binary file opening.  This makes no difference on most
          # unices, but some OSes convert every newline written to a
          # file to two bytes (CR LF), and every CR LF read from a
          # file is silently converted to a newline.
          AC_DEFUN([AB_AC_FUNC_OPEN_BINARY],
          [_AB_AC_FUNC_OPEN([O_BINARY])])


          # AB_AC_FUNC_OPEN_TEXT
          # --------------------
          # Test whether open accepts O_TEXT in the mode string for
          # text file opening.  This makes no difference on most
          # unices, but other OSes use it to assert that every newline
          # written to a file writes two bytes (CR LF), and every CR
          # LF read from a file is silently converted to a newline.
          AC_DEFUN([AB_AC_FUNC_OPEN_TEXT],
          [_AB_AC_FUNC_OPEN([O_TEXT])])

     Add the following preprocessor code to a common header file that
     will be included by any sources that use `fopen' calls:

          #define fopen rpl_fopen

     Save the following function to a file, and link that into your
     program, so that in combination with the preprocessor magic
     above, you can always specify text or binary mode to `open' and
     `fopen', and let this code take care of removing the flags on
     machines which do not support them:

          #if HAVE_CONFIG_H
          #  include <config.h>
          #endif

          #include <stdio.h>

          /* Use the system size_t if it has one, or fallback to
             config.h */
          #if STDC_HEADERS || HAVE_STDDEF_H
          #  include <stddef.h>
          #endif
          #if HAVE_SYS_TYPES_H
          #  include <sys/types.h>
          #endif

          /* One of the following headers will have prototypes for
             malloc and free on most systems.  If not, we don't add
             explicit prototypes which may generate a compiler warning
             in some cases -- explicit prototypes would certainly
             cause compilation to fail with a type clash on some
             platforms. */
          #if STDC_HEADERS || HAVE_STDLIB_H
          #  include <stdlib.h>
          #endif
          #if HAVE_MEMORY_H
          #  include <memory.h>
          #endif

          #if HAVE_STRING_H
          #  include <string.h>
          #else
          #  if HAVE_STRINGS_H
          #    include <strings.h>
          #  endif /* !HAVE_STRINGS_H */
          #endif /* !HAVE_STRING_H */

          #if ! HAVE_STRCHR
          /* BSD based systems have index() instead of strchr() */
          #  if HAVE_INDEX
          #    define strchr index
          #  else /* ! HAVE_INDEX */
          /* Very old C libraries have neither index() or strchr() */
          #    define strchr rpl_strchr

          static inline const char *strchr (const char *str, int ch);

          static inline const char *
          strchr (const char *str, int ch)
          {
            const char *p = str;
            while (p && *p && *p != (char) ch)
              {
                ++p;
              }

            return (*p == (char) ch) ? p : 0;
          }
          #  endif /* HAVE_INDEX */
          #endif /* HAVE_STRCHR */

          /* BSD based systems have bcopy() instead of strcpy() */
          #if ! HAVE_STRCPY
          #  define strcpy(dest, src)  bcopy(src, dest, strlen(src) + 1)
          #endif

          /* Very old C libraries have no strdup(). */
          #if ! HAVE_STRDUP
          #  define strdup(str)  strcpy(malloc(strlen(str) + 1), str)
          #endif

          FILE *
          rpl_fopen (const char *pathname, char *mode)
          {
            FILE *result = NULL;
            char *p = mode;

            /* Scan to the end of mode until we find 'b' or 't'. */
            while (*p && *p != 'b' && *p != 't')
              {
                ++p;
              }

            if (!*p)
              {
                fprintf(stderr, "*WARNING* rpl_fopen called without "
                        "mode 'b' or 't'\n");
              }

          #if USE_FOPEN_BINARY && USE_FOPEN_TEXT
            result = fopen(pathname, mode);
          #else
            {
              char ignore[3] = "bt";
              char *newmode = strdup(mode);
              char *q = newmode;

              p = newmode;

          #  if USE_FOPEN_TEXT
              /* "t" is accepted, so we need only strip "b". */
              strcpy(ignore, "b");
          #  endif
          #  if USE_FOPEN_BINARY
              /* "b" is accepted, so we need only strip "t". */
              strcpy(ignore, "t");
          #  endif

              /* Copy characters from mode to newmode missing out b
                 and/or t. */
              while (*p)
                {
                  while (*p && strchr(ignore, *p))
                    {
                      ++p;
                    }
                  if (*p)
                    *q++ = *p++;
                }
              *q = '\0';

              result = fopen(pathname, newmode);

              free(newmode);
            }
          #endif /* USE_FOPEN_BINARY && USE_FOPEN_TEXT */

            return result;
          }

     The correct operation of the file above relies on several things
     having been checked by the `configure' script, so you will also
     need to ensure that the following macros are present in your
     `configure.in' before you use this code:

          # configure.in -- Process this file with autoconf to produce configure
          AC_INIT(rpl_fopen.c)
          AC_PROG_CC
          AC_HEADER_STDC
          AC_CHECK_HEADERS(string.h strings.h, break)
          AC_CHECK_HEADERS(stdlib.h stddef.h sys/types.h memory.h)
          AC_C_CONST
          AC_TYPE_SIZE_T
          AC_CHECK_FUNCS(strchr index strcpy strdup)
          AB_AC_FUNC_FOPEN_BINARY
          AB_AC_FUNC_FOPEN_TEXT

   ---------- Footnotes ----------

   (1) Typically you would also have a floppy drive named `A:', and a
CD-ROM named `D:'.


File: autobook.info, Node: File System Limitations, Next: Executable Filename Extensions, Prev: Text vs Binary Modes, Up: Writing A Cygwin Friendly Package

25.3.2 File System Limitations
------------------------------

In *Note Unix/Windows Filesystems::, we discussed some of the
differences between Unix and Windows file systems.
This section expands on that discussion, covering filename differences
and separator and drive letter distinctions.

* Menu:

* 8.3 Filenames::
* Separators and Drive Letters::


File: autobook.info, Node: 8.3 Filenames, Next: Separators and Drive Letters, Up: File System Limitations

25.3.2.1 8.3 Filenames
......................

As discussed earlier, DOS file systems have severe restrictions on
possible file names: they must follow an 8.3 format.  *Note DOS
Filename Restrictions::.  This is quite a severe limitation, and it
affects some of the inner workings of GNU Autotools in two ways.

   The first is handled automatically, in that if `.libs' isn't a
legal directory name on the host system, Libtool and Automake will use
the directory `_libs' instead.  The other is that the traditional
`config.h.in' file is not legal under this scheme, and it must be
worked around with a little known feature of Autoconf:

     AC_CONFIG_HEADER(config.h:config.hin)


File: autobook.info, Node: Separators and Drive Letters, Prev: 8.3 Filenames, Up: File System Limitations

25.3.2.2 Separators and Drive Letters
.....................................

As discussed earlier (*note Windows Separators and Drive Letters::),
the Windows file systems use different delimiters for separating
directories and path elements than their Unix cousins.  There are
three places where this has an effect:

the shell command line
     Up until Cygwin b20.1, it was possible to refer to drive letter
     prefixed paths from the shell using the `//c/path/to/file' syntax
     to refer to the directory root at `C:\path\to\file'.
     Unfortunately, the Windows kernel confused this with its own
     network share notation, causing the shell to pause for a short
     while to look for a machine named `c' in its network
     neighbourhood.  Since release 1.0 of Cygwin, the
     `//c/path/to/file' notation now really does refer to a machine
     named `c' from Cygwin as well as from Windows.
     To refer to drive letter rooted paths on the local machine from
     Cygwin there is a new hybrid `c:/path/to/file' notation.  This
     notation also works in Cygwin b20, and is probably the system you
     should use.  On the other hand, using the new hybrid notation in
     shell scripts means that they won't run on old Cygwin releases.
     Shell code embedded in `configure.in' scripts should test whether
     the hybrid notation works, and use an alternate macro to
     translate hybrid notation to the old style if necessary.

     I must confess that from the command line I now use the longer
     `/cygdrive/c/path/to/file' notation, since completion doesn't yet
     work for the newer hybrid notation.  It is important to use the
     new notation in shell scripts however, or they will fail on the
     latest releases of Cygwin.

shell scripts
     For a shell script to work correctly on non-Cygwin development
     environments, it needs to be aware of and handle Windows path and
     directory separators and drive letters.  The Libtool scripts use
     the following idiom:

          case "$path" in
            # Accept absolute paths.
            [\\/]* | [A-Za-z]:[\\/]*)
              # take care of absolute paths
              insert some code here
              ;;
            *)
              # what is left must be a relative path
              insert some code here
              ;;
          esac

source code
     When porting Unix software to Cygwin, this is much less of an
     issue, because these differences are hidden beneath the emulation
     layer and by the `mount' command respectively; although I have
     found that GCC, for example, returns a mixed mode `/' and `\'
     delimited include path, which upsets Automake's dependency
     tracking on occasion.

   Cygwin provides convenience functions to convert back and forth
between the different notations, which we call "POSIX paths" or path
lists, and "WIN32 paths" or path lists:

 -- Function: int posix_path_list_p (const char *PATH)
     Return `0', unless PATH is a `/' and `:' separated path list.
     The determination is rather simplistic, in that a string which
     contains a `;' or begins with a single letter followed by a `:'
     causes the `0' return.
 -- Function: void cygwin_win32_to_posix_path_list (const char *WIN32,
          char *POSIX)
     Converts the `\' and `;' delimiters in WIN32 into the equivalent
     `/' and `:' delimiters while copying into the buffer at address
     POSIX.  This buffer must be preallocated before calling the
     function.

 -- Function: void cygwin_conv_to_posix_path (const char *PATH,
          char *POSIX_PATH)
     If PATH is a `\' delimited path, the equivalent, `/' delimited
     path is written to the buffer at address POSIX_PATH.  This buffer
     must be preallocated before calling the function.

 -- Function: void cygwin_conv_to_full_posix_path (const char *PATH,
          char *POSIX_PATH)
     If PATH is a, possibly relative, `\' delimited path, the
     equivalent, absolute, `/' delimited path is written to the buffer
     at address POSIX_PATH.  This buffer must be preallocated before
     calling the function.

 -- Function: void cygwin_posix_to_win32_path_list (const char *POSIX,
          char *WIN32)
     Converts the `/' and `:' delimiters in POSIX into the equivalent
     `\' and `;' delimiters while copying into the buffer at address
     WIN32.  This buffer must be preallocated before calling the
     function.

 -- Function: void cygwin_conv_to_win32_path (const char *PATH,
          char *WIN32_PATH)
     If PATH is a `/' delimited path, the equivalent, `\' delimited
     path is written to the buffer at address WIN32_PATH.  This buffer
     must be preallocated before calling the function.

 -- Function: void cygwin_conv_to_full_win32_path (const char *PATH,
          char *WIN32_PATH)
     If PATH is a, possibly relative, `/' delimited path, the
     equivalent, absolute, `\' delimited path is written to the buffer
     at address WIN32_PATH.  This buffer must be preallocated before
     calling the function.
You can use these functions something like this:

     void
     display_canonical_path (const char *maybe_relative_or_win32)
     {
       char buffer[MAX_PATH];
       cygwin_conv_to_full_posix_path (maybe_relative_or_win32,
                                       buffer);
       printf ("canonical path for %s: %s\n",
               maybe_relative_or_win32, buffer);
     }

   For your code to be fully portable however, you cannot rely on
these Cygwin functions, as they are not implemented on Unix, or even
on mingw or DJGPP.  Instead you should add the following to a shared
header, and be careful to use it when processing and building paths
and path lists:

     #if defined __CYGWIN32__ && !defined __CYGWIN__
     /* For backwards compatibility with Cygwin b19 and earlier, we
        define __CYGWIN__ here, so that we can rely on checking just
        for that macro. */
     #  define __CYGWIN__  __CYGWIN32__
     #endif

     #if defined _WIN32 && !defined __CYGWIN__
     /* Use Windows separators on all _WIN32 defining environments,
        except Cygwin. */
     #  define DIR_SEPARATOR_CHAR    '\\'
     #  define DIR_SEPARATOR_STR     "\\"
     #  define PATH_SEPARATOR_CHAR   ';'
     #  define PATH_SEPARATOR_STR    ";"
     #endif

     #ifndef DIR_SEPARATOR_CHAR
     /* Assume that not having this is an indicator that all are
        missing. */
     #  define DIR_SEPARATOR_CHAR    '/'
     #  define DIR_SEPARATOR_STR     "/"
     #  define PATH_SEPARATOR_CHAR   ':'
     #  define PATH_SEPARATOR_STR    ":"
     #endif /* !DIR_SEPARATOR_CHAR */

   With this in place we can use the macros defined above to write
code which will compile and work just about anywhere:

     char path[MAXBUFLEN];

     snprintf(path, MAXBUFLEN, "%ctmp%c%s", DIR_SEPARATOR_CHAR,
              DIR_SEPARATOR_CHAR, foo);
     file = fopen(path, "w+t");


File: autobook.info, Node: Executable Filename Extensions, Prev: File System Limitations, Up: Writing A Cygwin Friendly Package

25.3.3 Executable Filename Extensions
-------------------------------------

As I already noted in *Note Package Installation::, the fact that
Windows requires that all program files be named with the extension
`.exe' is the cause of several inconsistencies in package behaviour
between Windows and Unix.
For example, where Libtool is involved, if a package builds an
executable which is linked against an as yet uninstalled library,
`libtool' puts the real executable in the `.libs' (or `_libs')
subdirectory, and writes a shell script to the original destination of
the executable(1), which ensures the runtime library search paths are
adjusted to find the correct (uninstalled) libraries that it depends
upon.  On Windows, only a PE-COFF executable is allowed to bear the
`.exe' extension, so the wrapper script has to be named differently to
the executable it is substituted for (i.e. the script is only executed
correctly by the operating system if it does *not* have an `.exe'
extension).  The result of this confusion is that the `Makefile' can't
see some of the executables it builds with Libtool, because the
generated rules assume an `.exe' extension will be in evidence.  This
problem will be addressed in some future revision of Automake and
Libtool.  In the meantime, it is sometimes necessary to move the
executables from the `.libs' directory to their install destination by
hand.  The continual rebuilding of wrapped executables at each
invocation of `make' is another symptom of using wrapper scripts with
a different name to the executable which they represent.

   It is very important to correctly add the `.exe' extension to
program file names in your `Makefile.am', otherwise many of the
generated rules will not work correctly, since they will be expecting
a file without the `.exe' extension.  Fortunately, Automake will do
this for you wherever it is able to tell that a file is a program -
everything listed in `bin_PROGRAMS', for example.  Occasionally you
will find cases where there is no way for Automake to be sure of this,
in which case you must be sure to add the `$(EXEEXT)' suffix.
By structuring your `Makefile.am' carefully, this can be avoided in
the majority of cases:

     TESTS = $(check_SCRIPTS) script-test bin1-test$(EXEEXT)

could be rewritten as:

     check_PROGRAMS = bin1-test
     TESTS = $(check_SCRIPTS) script-test $(check_PROGRAMS)

   The value of `EXEEXT' is always set correctly with respect to the
host machine if you use Libtool in your project.  If you don't use
Libtool, you must manually call the Autoconf macro, `AC_EXEEXT', in
your `configure.in' to make sure that it is initialised correctly.  If
you don't call this macro (either directly or implicitly with
`AC_PROG_LIBTOOL'), your project will almost certainly not build
correctly on Cygwin.

   ---------- Footnotes ----------

   (1) *Note Executing Uninstalled Binaries::.


File: autobook.info, Node: DLLs with Libtool, Next: Package Installation, Prev: Writing A Cygwin Friendly Package, Up: Integration with Cygnus Cygwin

25.4 DLLs with Libtool
======================

Windows DLLs are very different from their nearest equivalent on Unix:
shared libraries.  This makes Libtool's job of hiding both behind the
same abstraction extremely difficult - it is not fully implemented at
the time of writing.  As a package author that wants to use DLLs on
Windows with Libtool, you must construct your packages very carefully
to enable them to build and link with DLLs in the same way that they
build and link with shared libraries on Unix.

   Some of the difficulties that must be addressed follow:

   * At link time, a DLL effectively consists of two parts: the DLL
     itself, which contains the shared object code, and an import
     library, which consists of the "stub"(1) functions which are
     actually linked into the executable, at a rate of one stub per
     entry point.  Unix has a run time loader which links shared
     libraries into the main program as it is executed, so the shared
     library is but a single file.
   * Pointer comparisons do not always work as expected when the
     pointers cross a DLL boundary, since you can be comparing the
     addresses of the stubs in the import library rather than the
     addresses of the actual objects in the DLL.  GCC provides the
     `__declspec' extension to alleviate this problem a little.

   * The search algorithm for the runtime library loader is very
     different to the algorithms typically used on Unix; I'll explain
     how to deal with this in *Note Runtime Loading of DLLs::.

   * All of the symbols required by a DLL at runtime must be resolved
     at link time.  With some creative use of import libraries, it is
     usually possible to work around this shortcoming, but it is easy
     to forget this limitation if you are developing on a modern
     system which has lazy symbol resolution.  Be sure to keep it at
     the back of your mind if you intend to have your package portable
     to Windows.

   * Worst of all is that it is impossible to reference a non-pointer
     item imported from a DLL.  In practice, when you think you have
     exported a data item from a DLL, you are actually exporting its
     address (in fact the address of the address if you take the
     import library into consideration), and it is necessary to add an
     extra level of indirection to any non-pointers imported from a
     DLL to take this into account.  The GNU gcc `__declspec'
     extension can handle this automatically too, at the expense of
     obfuscating your code a little.

   Cygwin support in Libtool is very new, and is being developed very
quickly; newer versions generally improve vastly over their
predecessors when it comes to Cygwin, so you should get the newest
release you can.  The rest of this section is correct with respect to
Libtool version 1.3.5.  In some future version, Libtool might be able
to work as transparently as Autoconf and Automake, but for now
designing your packages as described in this chapter will help Libtool
to help us have DLLs and Unix shared libraries from the same codebase.
The bottom line here is that setting a package up to build and use modules and libraries as both DLLs _and_ Unix shared libraries is not straightforward, but the rest of this section provides a recipe which I have used successfully in several projects, including the module loader for GNU `m4' 1.5, which works correctly with DLLs on Windows. Let's create "hello world" as a DLL, and an executable where the runtime loader loads the DLL.

* Menu:

* DLL Support with GNU Autotools::
* A Makefile.am for DLLs::
* A configure.in for DLLs::
* Handling Data Exports from DLLs::
* Runtime Loading of DLLs::

---------- Footnotes ----------

(1) In general, a stub function will satisfy the linker's requirements to resolve an undefined symbol at link time, but has no functionality of its own. In this context, the stubs do have some boilerplate code to pass execution flow into the correct full function in the DLL.

File: autobook.info, Node: DLL Support with GNU Autotools, Next: A Makefile.am for DLLs, Up: DLLs with Libtool

25.4.1 DLL Support with GNU Autotools
-------------------------------------

Here are the contents of the three source files used as an example for the remainder of this chapter (for brevity, they are missing most of the special code one would normally use to maximise portability):

`hello.h' documents the interface to `libhello.dll':

     #ifndef HELLO_H
     #define HELLO_H 1

     extern int hello (const char *who);

     #endif /* !HELLO_H */

`hello.c' is the implementation of `libhello.dll':

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #include <stdio.h>
     #include "hello.h"

     int
     hello (const char *who)
     {
       printf("Hello, %s!\n", who);
       return 0;
     }

`main.c' is the source for the executable which uses `libhello.dll':

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #include "hello.h"

     int
     main (int argc, const char *const argv[])
     {
       return hello("World");
     }

File: autobook.info, Node: A Makefile.am for DLLs, Next: A configure.in for DLLs, Prev: DLL Support with GNU Autotools, Up: DLLs with Libtool

25.4.2 A Makefile.am for DLLs
-----------------------------

First of all we will "autoconfiscate"(1) the source files above with a minimal setup:

`Makefile.am' is used to generate the `Makefile.in' template for the `configure' script:

     ## Process this file with automake to produce Makefile.in.

     lib_LTLIBRARIES     = libhello.la
     libhello_la_SOURCES = hello.c
     libhello_la_LDFLAGS = -no-undefined -version-info 0:0:0
     include_HEADERS     = hello.h

     bin_PROGRAMS        = hello
     hello_SOURCES       = main.c
     hello_LDADD         = libhello.la

The new feature introduced in this file is the use of the `-no-undefined' flag in the `libhello_la_LDFLAGS' value. This flag is required for Windows DLL builds. It asserts to the linker that there are no undefined symbols in the `libhello.la' target, which is one of the requirements for building a DLL outlined earlier. *Note Creating Libtool Libraries with Automake::. For an explanation of the contents of the rest of this `Makefile.am', *Note Introducing GNU automake: Introducing GNU Automake.

---------- Footnotes ----------

(1) Some people prefer to use the term "autoconfuse" - if you should meet any, be sure to tell them about this book.

File: autobook.info, Node: A configure.in for DLLs, Next: Handling Data Exports from DLLs, Prev: A Makefile.am for DLLs, Up: DLLs with Libtool

25.4.3 A configure.in for DLLs
------------------------------

`configure.in' is used to generate the `configure' script:

     # Process this file with autoconf to create configure.

     AC_INIT(hello.h)
     AM_CONFIG_HEADER(config.h:config.hin)
     AM_INIT_AUTOMAKE(hello, 1.0)

     AC_PROG_CC
     AM_PROG_CC_STDC
     AC_C_CONST
     AM_PROG_LIBTOOL

     AC_OUTPUT(Makefile)

The `AC_PROG_CC' and `AM_PROG_CC_STDC' macros in the `configure.in' above will conspire to find a suitable compiler for the C code in this example, and to discover any extra switches required to put that compiler into an ANSI mode.
I have used the `const' keyword in the sources, so I need to specify the `AC_C_CONST' macro, in case the compiler doesn't understand it, and finally I have specified the `AM_PROG_LIBTOOL' macro since I want the library to be built with Libtool.

In order to set the build environment up, we need to create the autogenerated files:

     $ ls
     Makefile.am  configure.in  hello.c  hello.h  main.c
     $ aclocal
     $ autoheader
     $ libtoolize --force --copy
     $ automake --foreign --add-missing --copy
     automake: configure.in: installing ./install-sh
     automake: configure.in: installing ./mkinstalldirs
     automake: configure.in: installing ./missing
     $ autoconf
     $ ls
     Makefile.am   config.hin    hello.c     ltmain.sh      stamp-h.in
     Makefile.in   config.sub    hello.h     main.c
     aclocal.m4    configure     install-sh  missing
     config.guess  configure.in  ltconfig    mkinstalldirs

If you have already tried to build DLLs with Libtool, you have probably noticed that the first point of failure is during the configuration process. For example, running the new `configure' script you might see:

     ...
     checking if libtool supports shared libraries... yes
     checking if package supports dlls... no
     checking whether to build shared libraries... no
     ...

`libtool' provides a macro, `AC_LIBTOOL_WIN32_DLL', which must be added to a package's `configure.in' to communicate to the `libtool' machinery that the package supports DLLs. Without this macro, `libtool' will never try to build a DLL on Windows. Add this macro to `configure.in' before the `AM_PROG_LIBTOOL' macro, and try again:

     $ make
     cd . && aclocal
     cd . && automake --foreign Makefile
     cd . && autoconf
     ...
     checking if libtool supports shared libraries... yes
     checking if package supports dlls... yes
     checking whether to build shared libraries... yes
     ...
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -Wp,-MD,.deps/hello.pp \
       -c -DDLL_EXPORT -DPIC hello.c -o .libs/hello.lo
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -Wp,-MD,.deps/hello.pp \
       -c hello.c -o hello.o >/dev/null 2>&1
     mv -f .libs/hello.lo hello.lo
     ...
     gcc -g -O2 -o .libs/hello main.o .libs/libimp-hello-0-0-0.a \
       -Wl,--rpath -Wl,/usr/local/lib
     creating hello
     ...
     $ ./hello
     Hello, World!

If you watch the full output of the `make' command, you will see that Libtool uses a rather contorted method of building DLLs, with several invocations each of `dlltool' and `gcc'. I have omitted these from the example above, since they really are very ugly, and in any case are almost incomprehensible to most people. To see it all in its full horror you can always examine the output after running the commands yourself! In a future release of Cygwin, recent work on the binutils linker by DJ Delorie will allow `gcc' to link DLLs in a single pass, using the same syntax used on other systems to produce shared libraries. Libtool will adopt this method when it becomes available, deprecating the use of `dlltool'.

I have extracted the interesting lines from amongst the many calls to `dlltool'(1) and `gcc' generated by `make' in the shell log. The main thing to notice is that we have a `hello' binary, which is executable, and which gives the right result when we run it! From the partial log above, it certainly appears that it has built `libhello' as a DLL and linked that into `hello', but just to double check we can use `ldd'(2):

     $ libtool --mode=execute ldd ./hello
     lt-hello.exe -> /tmp/.libs/lt-hello.exe
     libhello-0-0-0.dll -> /tmp/.libs/libhello-0-0-0.dll
     cygwin1.dll -> /usr/bin/cygwin1.dll
     kernel32.dll -> /WINNT/system32/kernel32.dll
     ntdll.dll -> /WINNT/system32/ntdll.dll
     advapi32.dll -> /WINNT/system32/advapi32.dll
     user32.dll -> /WINNT/system32/user32.dll
     gdi32.dll -> /WINNT/system32/gdi32.dll
     rpcrt4.dll -> /WINNT/system32/rpcrt4.dll

So now you know how to build and link a simple Windows DLL using GNU Autotools: you add `-no-undefined' to the Libtool library `LDFLAGS', and include the `AC_LIBTOOL_WIN32_DLL' macro in your `configure.in'.
---------- Footnotes ----------

(1) Part of the Binutils port to Windows, and necessary to massage compiler objects into a working DLL.

(2) This is a shell script for Cygwin which emulates the behaviour of `ldd' on GNU/Linux, available online from `http://www.oranda.demon.co.uk/dist/ldd'.

File: autobook.info, Node: Handling Data Exports from DLLs, Next: Runtime Loading of DLLs, Prev: A configure.in for DLLs, Up: DLLs with Libtool

25.4.4 Handling Data Exports from DLLs
--------------------------------------

Unfortunately, things are not quite that simple in reality, except in the rare cases where no data symbols are exported across a DLL boundary. If you look back at the example in *Note A configure.in for DLLs: A configure.in for DLLs, you will notice that the Libtool object, `hello.lo', was built with the preprocessor macro `DLL_EXPORT' defined. Libtool does this deliberately so that it is possible to distinguish between a static object build and a Libtool object build from within the source code. Let's add a data export to the DLL source to illustrate:

The `hello.h' header must be changed quite significantly:

     #ifndef HELLO_H
     #define HELLO_H 1

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #ifdef _WIN32
     #  ifdef DLL_EXPORT
     #    define HELLO_SCOPE __declspec(dllexport)
     #  else
     #    ifdef LIBHELLO_DLL_IMPORT
     #      define HELLO_SCOPE extern __declspec(dllimport)
     #    endif
     #  endif
     #endif
     #ifndef HELLO_SCOPE
     #  define HELLO_SCOPE extern
     #endif

     HELLO_SCOPE const char *greet;
     extern int hello (const char *who);

     #endif /* !HELLO_H */

The nasty block of preprocessor code would need to be shared among all the source files which comprise the `libhello.la' Libtool library, which in this example is just `hello.c'.
It needs to take care of five different cases:

compiling `hello.lo'
     When compiling the Libtool object which will be included in the DLL, we need to tell the compiler which symbols are exported data, so that it can do the automatic extra dereference required to refer to that data from a program which uses this DLL. We need to flag the data with `__declspec(dllexport)'. *Note DLLs with Libtool::.

compilation unit which will link with `libhello-0-0-0.dll'
     When compiling an object which will import data from the DLL, again we need to tell the compiler, so that it can perform the extra dereference, except this time we use `extern __declspec(dllimport)'. From the preprocessor block, you will see that we need to define `LIBHELLO_DLL_IMPORT' to get this define, which I will describe shortly.

compiling `hello.o'
     When compiling the object for inclusion in the static archive, we must be careful to hide the `__declspec()' declarations from the compiler, or else it will start dereferencing variables for us by mistake at runtime, and in all likelihood cause a segmentation fault. In this case we want the compiler to see a simple `extern' declaration.

compilation unit which will link with `libhello.a'
     Similarly, an object which references a data symbol that will be statically linked into the final binary from a static archive must not see any of the `__declspec()' code, and requires a simple `extern'.

non-Windows host
     It seems obvious, but we must also be careful not to contaminate the code when it is compiled on a machine which doesn't need to jump through the DLL hoops.

The changes to `hello.c' are no different from what would be required on a Unix machine.
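As a quick sanity check of the five cases above, the selection logic of the `HELLO_SCOPE' block can be mirrored in a small standalone program of my own (it is not part of the example package). On a non-Windows host, where none of `_WIN32', `DLL_EXPORT' or `LIBHELLO_DLL_IMPORT' is defined, it reports the plain `extern' case:

     #include <stdio.h>

     /* Mirror the HELLO_SCOPE selection from `hello.h', but record the
        chosen declaration as a string so it can be printed.  */
     #ifdef _WIN32
     #  ifdef DLL_EXPORT
     #    define HELLO_SCOPE_STRING "__declspec(dllexport)"         /* hello.lo */
     #  else
     #    ifdef LIBHELLO_DLL_IMPORT
     #      define HELLO_SCOPE_STRING "extern __declspec(dllimport)" /* DLL client */
     #    endif
     #  endif
     #endif
     #ifndef HELLO_SCOPE_STRING
     #  define HELLO_SCOPE_STRING "extern"   /* static archive, or non-Windows */
     #endif

     int
     main (void)
     {
       printf ("HELLO_SCOPE is: %s\n", HELLO_SCOPE_STRING);
       return 0;
     }

Compiling this with `-D_WIN32 -DDLL_EXPORT', with `-D_WIN32 -DLIBHELLO_DLL_IMPORT', or with no extra flags exercises the three distinct outcomes.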
I have declared the `greet' variable to allow the caller to override the default greeting:

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #include <stdio.h>
     #include "hello.h"

     const char *greet = "Hello";

     int
     hello (const char *who)
     {
       printf("%s, %s!\n", greet, who);
       return 0;
     }

Again, since the DLL specific changes have been encapsulated in the `hello.h' file, the enhancements to `main.c' are unsurprising too:

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #include "hello.h"

     int
     main (int argc, const char *const argv[])
     {
       if (argc > 1)
         {
           greet = argv[1];
         }

       return hello("World");
     }

The final thing to be aware of is ensuring that `LIBHELLO_DLL_IMPORT' is defined when we link an executable against the `libhello' DLL, but not defined if we link it against the static archive. It is impossible to automate this completely, particularly when the executable in question is from another package and is using the installed `hello.h' header. In that case it is the responsibility of the author of that package to probe the system with `configure', to decide whether it will be linking with the DLL or the static archive, and to define `LIBHELLO_DLL_IMPORT' as appropriate. Things are a little simpler when everything is under the control of a single package, but even then it isn't quite possible to tell for sure whether Libtool is going to build a DLL or only a static library. For example, if some dependencies are dropped for being static, Libtool may disregard `-no-undefined' (*note Creating Libtool Libraries with Automake::). One possible solution is:

  1. Define a function in the library that returns 1 when the library is built as a DLL. Fortunately that's easy to accomplish thanks to `-DDLL_EXPORT'; in this case, by adding the following to `hello.c':

          #if defined WIN32 && defined DLL_EXPORT
          char
          libhello_is_dll (void)
          {
            return 1;
          }
          #endif /* WIN32 && DLL_EXPORT */

  2. Link a program with the library, and check whether it is a DLL by seeing if the link succeeded.

  3.
     To get cross builds to work, you must, in the same vein, test whether linking a program which calls `libhello_is_dll' succeeds, in order to tell whether or not to define `LIBHELLO_DLL_IMPORT'.

As an example of building the `hello' binary, we can add the following code to `configure.in', just before the call to `AC_OUTPUT':

     # ----------------------------------------------------------------------
     # Win32 objects need to tell the header whether they will be linking
     # with a dll or static archive in order that everything is imported
     # to the object in the same way that it was exported from the
     # archive (extern for static, __declspec(dllimport) for dlls)
     # ----------------------------------------------------------------------
     LIBHELLO_DLL_IMPORT=
     case "$host" in
     *-*-cygwin* | *-*-mingw* )
       if test X"$enable_shared" = Xyes; then
         AC_TRY_LINK_FUNC([libhello_is_dll],
                          [LIBHELLO_DLL_IMPORT=-DLIBHELLO_DLL_IMPORT])
       fi
       ;;
     esac
     AC_SUBST(LIBHELLO_DLL_IMPORT)

We must also arrange for the flag to be passed while compiling any objects which will end up in a binary which links with the DLL. For this simple example, only `main.c' is affected, and we can add the following rule to the end of `Makefile.am':

     main.o: main.c
             $(COMPILE) @LIBHELLO_DLL_IMPORT@ -c main.c

In a more realistic project, there would probably be dozens of files involved, in which case it would probably be easier to move them all to a separate subdirectory, and give them a `Makefile.am' of their own which could include:

     CPPFLAGS = @LIBHELLO_DLL_IMPORT@

Now, let's put all this into practice, and check that it works:

     $ make
     cd . && aclocal
     cd . && automake --foreign Makefile
     cd . && autoconf
     ...
     checking for gcc option to produce PIC ... -DDLL_EXPORT
     checking if gcc PIC flag -DDLL_EXPORT works... yes
     ...
     checking whether to build shared libraries... yes
     ...
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -Wp,-MD,.deps/hello.pp \
       -c -DDLL_EXPORT -DPIC hello.c -o .libs/hello.lo
     gcc -DHAVE_CONFIG_H -I. -I. -I. \
       -g -O2 -Wp,-MD,.deps/hello.pp \
       -c hello.c -o hello.o >/dev/null 2>&1
     ...
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -DLIBHELLO_DLL_IMPORT \
       -c main.c
     ...
     gcc -g -O2 -o .libs/hello main.o .libs/libimp-hello-0-0-0.a \
       -Wl,--rpath -Wl,/usr/local/lib
     creating hello
     ...
     $ ./hello
     Hello, World!
     $ ./hello Howdy
     Howdy, World!

The recipe also works if I use only the static archives:

     $ make clean
     ...
     $ ./configure --disable-shared
     ...
     checking whether to build shared libraries... no
     ...
     $ make
     ...
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -Wp,-MD,.deps/hello.pp \
       -c hello.c -o hello.o
     ...
     ar cru .libs/libhello.a hello.o
     ...
     gcc -DHAVE_CONFIG_H -I. -I. -I. -g -O2 -c main.c
     ...
     gcc -g -O2 -o hello main.o ./.libs/libhello.a
     $ ./hello
     Hello, World!
     $ ./hello "G'Day"
     G'Day, World!

And just to be certain that I am really testing a new statically linked executable:

     $ ldd ./hello
     hello.exe -> /tmp/hello.exe
     cygwin1.dll -> /usr/bin/cygwin1.dll
     kernel32.dll -> /WINNT/system32/kernel32.dll
     ntdll.dll -> /WINNT/system32/ntdll.dll
     advapi32.dll -> /WINNT/system32/advapi32.dll
     user32.dll -> /WINNT/system32/user32.dll
     gdi32.dll -> /WINNT/system32/gdi32.dll
     rpcrt4.dll -> /WINNT/system32/rpcrt4.dll

File: autobook.info, Node: Runtime Loading of DLLs, Prev: Handling Data Exports from DLLs, Up: DLLs with Libtool

25.4.5 Runtime Loading of DLLs
------------------------------

DLLs built using the recipe described in this chapter can be loaded at runtime in at least three different ways:

   * Using the Cygwin emulation of the POSIX `dlopen'/`dlclose'/`dlsym' API. Note however that the emulation is broken up until at least version b20.1, and `dlopen(NULL)' doesn't work at all.

   * Using the Windows `LoadLibrary'/`FreeLibrary'/`GetProcAddress' API.

   * Using libltdl, which is covered in more detail in *Note Using GNU libltdl: Using GNU libltdl.
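As a sketch of the second approach, here is a minimal loader using the Windows `LoadLibrary'/`GetProcAddress'/`FreeLibrary' calls. The DLL and symbol names are assumptions carried over from the example above, and the Win32 branch will only compile on Windows; elsewhere the program simply reports that the API is absent, so the file compiles everywhere:

     #include <stdio.h>

     #ifdef _WIN32
     #  include <windows.h>

     int
     main (void)
     {
       /* Load the example DLL built earlier in this section.  */
       HMODULE module = LoadLibrary ("libhello-0-0-0.dll");
       if (!module)
         {
           fprintf (stderr, "cannot load libhello-0-0-0.dll\n");
           return 1;
         }

       /* Look up the entry point by name, and cast the generic
          function pointer to the right type before calling it.  */
       int (*phello) (const char *)
         = (int (*) (const char *)) GetProcAddress (module, "hello");
       if (phello)
         phello ("World");

       FreeLibrary (module);
       return 0;
     }
     #else

     int
     main (void)
     {
       puts ("LoadLibrary is a Win32 API; use dlopen or libltdl here");
       return 0;
     }
     #endif

Note that `GetProcAddress' finds `hello' by its exported name, so this works regardless of whether the executable was linked against the import library.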
File: autobook.info, Node: Package Installation, Prev: DLLs with Libtool, Up: Integration with Cygnus Cygwin

25.5 Package Installation
=========================

Having successfully built a GNU Autotools managed package, a Systems Administrator will typically want to install the binaries, libraries and headers of the package. The GNU standards dictate that this be done with the command `make install', and indeed Automake always generates `Makefile's which work in this way. Unfortunately, this `make install' command is often thwarted by the peculiarities of the Windows file system: after an apparently successful installation, the Windows installation conventions are often not satisfied, so the installed package may not work, even though the uninstalled build is fully operational. There are a couple of issues which are worthy of discussion:

Prior to release 1.1.0, the Cygwin `install' program did not understand the `.exe' file extension. Fixing it was only a matter of writing a shell script wrapper for the `install' binary. Even though the current release is well behaved in this respect, `.exe' handling is still the cause of some complications. *Note Executable Filename Extensions::.

If a package builds any DLLs with `libtool', they are installed to `$prefix/lib' by default, since this is where shared libraries would be installed on Unix. Windows searches for DLLs at runtime using the user's executable search path (`$PATH'), which generally doesn't contain library paths. The first evidence you will see of this problem is when DLLs you have installed are not found by executables which depend on them. There are two ways to fix it: the installed DLLs can be moved by hand from their installation directory into the equivalent executable destination, say from `/usr/local/lib' to `/usr/local/bin'; or better, you can extend your binary search path to include library directories.
Adding the following to your `.profile' would be a good start:

     PATH=$PATH:/usr/local/lib:/usr/lib:/lib

Once you are comfortable with setting your packages up like this, they will be relatively well behaved on Windows and Unix. Of course, you must also write portable code; see *Note Writing Portable C with GNU Autotools: Writing Portable C.

File: autobook.info, Node: Cross Compilation, Next: Installing GNU Autotools, Prev: Integration with Cygnus Cygwin, Up: Top

26 Cross Compilation with GNU Autotools
***************************************

Normally, when you build a program, it runs on the system on which you built it. For example, if you compile a simple program, you can immediately run it on the same machine. This is normally how GNU Autotools is used as well. You run the `configure' script on a particular machine, you run `make' on the same machine, and the resulting program also runs on the same machine. However, there are cases where it is useful to build a program on one machine and run it on another. One common example is a program which runs on an "embedded system". An embedded system is a special purpose computer, often part of a larger system, such as the computers found within modern automobiles. An embedded system often does not support a general programming environment, so there is no way to run a shell or a compiler on the embedded system. However, it is still necessary to write programs to run on the embedded system. These programs are built on a different machine, normally a general purpose computer. The resulting programs can not be run directly on the general purpose computer. Instead, they are copied onto the embedded system and run there. (We are omitting many details and possibilities of programming embedded systems here, but this should be enough to understand the points relevant to GNU Autotools. For more information, see a book such as `Programming Embedded Systems' by Michael Barr.)
Another example where it is useful to build a program on one machine and run it on another is the case when one machine is much faster. It can sometimes be useful to use the faster machine as a compilation server, to build programs which are then copied to the slower machine and run there. Building a program on one type of system which runs on a different type of system is called "cross compiling". Doing this requires a specially configured compiler, known as a "cross compiler". Similarly, we speak of cross assemblers, cross linkers, etc. When it is necessary to explicitly distinguish the ordinary sort of compiler, whose output runs on the same type of system, from a cross compiler, we call the ordinary compiler a "native compiler". Although the debugger is not strictly speaking a compilation tool, it is meaningful to speak of a cross debugger: a debugger which is used to debug code which runs on another system. GNU Autotools supports cross compilation in two distinct though related ways. Firstly, GNU Autotools supports configuring and building a cross compiler or other cross compilation tools. Secondly, GNU Autotools supports building tools using a cross compiler (this is sometimes called a "Canadian Cross"). In the rest of this chapter we will explain how to use GNU Autotools to do these tasks. If you are not interested in doing cross compilation, you may skip this chapter. However, if you are developing `configure' scripts, we recommend that you at least skim this chapter to get some hints as to how to write them so that it is possible to build your package using a cross compiler; in particular, see *Note Supporting Cross Compiler::. Even if your package is useless for an embedded system, it is possible that somebody with a very fast compilation server will want to use it to cross compile your package. 
* Menu:

* Host and Target::
* Specifying the Target::
* Using the Target Type::
* Building with a Cross Compiler::

File: autobook.info, Node: Host and Target, Next: Specifying the Target, Up: Cross Compilation

26.1 Host and Target
====================

We will first discuss using GNU Autotools to build cross compilation tools. For example, the information in this section will explain how to configure and build the GNU cc compiler as a cross compiler. When building cross compilation tools, there are two different systems involved: the system on which the tools will run, and the system for which the tools will generate code. The system on which the tools will run is called the "host" system. The system for which the tools generate code is called the "target" system. For example, suppose you have a compiler which runs on a GNU/Linux system and generates ELF programs for a MIPS-based embedded system. In this case, the GNU/Linux system is the host, and the MIPS ELF system is the target. Such a compiler could be called a GNU/Linux cross MIPS ELF compiler, or, equivalently, an `i386-linux-gnu' cross `mips-elf' compiler. We discussed the latter sorts of names earlier; see *Note Configuration Names::. Naturally, most programs are not cross compilation tools. For those programs, it does not make sense to speak of a target. It only makes sense to speak of a target for programs like the GNU compiler or the GNU binutils which actually produce running code. For example, it does not make sense to speak of the target of a program like `make'. Most cross compilation tools can also serve as native tools. For a native compilation tool, it is still meaningful to speak of a target. For a native tool, the target is the same as the host. For example, for a GNU/Linux native compiler, the host is GNU/Linux, and the target is also GNU/Linux.
File: autobook.info, Node: Specifying the Target, Next: Using the Target Type, Prev: Host and Target, Up: Cross Compilation

26.2 Specifying the Target
==========================

By default, the `configure' script will assume that the target is the same as the host. This is the more common case; for example, when the target is the same as the host, you get a native compiler rather than a cross compiler. If you want to build a cross compilation tool, you must specify the target explicitly by using the `--target' option when you run `configure' (*note Invoking configure::). The argument to `--target' is the configuration name of the system for which you wish to generate code. *Note Configuration Names::. For example, to build tools which generate code for a MIPS ELF embedded system, you would use `--target mips-elf'.

File: autobook.info, Node: Using the Target Type, Next: Building with a Cross Compiler, Prev: Specifying the Target, Up: Cross Compilation

26.3 Using the Target Type
==========================

A `configure' script for a cross compilation tool will use the `--target' option to control how it is built, so that the resulting program will produce programs which run on the appropriate system. In this section we explain how you can write your own `configure' scripts to support the `--target' option. You must start by putting `AC_CANONICAL_SYSTEM' in `configure.in'. `AC_CANONICAL_SYSTEM' will look for a `--target' option and canonicalize it using the `config.sub' shell script (for more information about configuration names, canonicalizing them, and `config.sub', *note Configuration Names::). `AC_CANONICAL_SYSTEM' will also run `AC_CANONICAL_HOST' to get the host information. The host and target type will be recorded in the following shell variables:

`host'
     The canonical configuration name of the host. This will normally be determined by running the `config.guess' shell script, although the user is permitted to override this by using an explicit `--host' option.
`target'
     The canonical configuration name of the target.

`host_alias'
     The argument to the `--host' option, if used. Otherwise, the same as the `host' variable.

`target_alias'
     The argument to the `--target' option. If the user did not specify a `--target' option, this will be the same as `host_alias'.

`host_cpu'
`host_vendor'
`host_os'
     The first three parts of the canonical host configuration name.

`target_cpu'
`target_vendor'
`target_os'
     The first three parts of the canonical target configuration name.

Note that if `host' and `target' are the same string, you can assume a native configuration. If they are different, you can assume a cross configuration. It is possible for `host' and `target' to represent the same system, but for the strings to not be identical. For example, if `config.guess' returns `sparc-sun-sunos4.1.4', and somebody configures with `--target sparc-sun-sunos4.1', then the slight differences between the two versions of SunOS may be unimportant for your tool. However, in the general case it can be quite difficult to determine whether the differences between two configuration names are significant or not. Therefore, by convention, if the user specifies a `--target' option without specifying a `--host' option, it is assumed that the user wants to configure a cross compilation tool. The `target' variable should not be handled in the same way as the `target_alias' variable. In general, whenever the user may actually see a string, `target_alias' should be used. This includes anything which may appear in the file system, such as a directory name or part of a tool name. It also includes any tool output, unless it is clearly labelled as the canonical target configuration name. This permits the user to use the `--target' option to specify how the tool will appear to the outside world. On the other hand, when checking for characteristics of the target system, `target' should be used.
This is because a wide variety of `--target' options may map into the same canonical configuration name. You should not attempt to duplicate the canonicalization done by `config.sub' in your own code. By convention, cross tools are installed with a prefix of the argument used with the `--target' option, also known as `target_alias'. If the user does not use the `--target' option, and thus is building a native tool, no prefix is used. For example, if `gcc' is configured with `--target mips-elf', then the installed binary will be named `mips-elf-gcc'. If `gcc' is configured without a `--target' option, then the installed binary will be named `gcc'. The Autoconf macro `AC_ARG_PROGRAM' will handle the names of binaries for you. If you are using Automake, no more need be done; the programs will automatically be installed with the correct prefixes. Otherwise, see the Autoconf documentation for `AC_ARG_PROGRAM'.

File: autobook.info, Node: Building with a Cross Compiler, Prev: Using the Target Type, Up: Cross Compilation

26.4 Building with a Cross Compiler
===================================

It is possible to build a program which uses GNU Autotools on one system and to run it on a different type of system. In other words, it is possible to build programs using a cross compiler. In this section, we explain what this means, how to build programs this way, and how to write your `configure' scripts to support it. Building a program on one system and running it on another is sometimes referred to as a "Canadian Cross"(1).

* Menu:

* Canadian Cross Example::
* Canadian Cross Concepts::
* Build Cross Host Tools::
* Build and Host Options::
* Canadian Cross Tools::
* Supporting Cross Compiler::

---------- Footnotes ----------

(1) The name Canadian Cross comes from the most complex case, in which three different types of systems are used. At the time that these issues were being hashed out, Canada had three national political parties.
File: autobook.info, Node: Canadian Cross Example, Next: Canadian Cross Concepts, Up: Building with a Cross Compiler

26.4.1 Canadian Cross Example
-----------------------------

We'll start with an example of a Canadian Cross, to make sure that the concepts are clear. Using a GNU/Linux system, you can build a program which will run on a Solaris system. You would use a GNU/Linux cross Solaris compiler to build the program. You could not run the resulting programs on your GNU/Linux system. After all, they are Solaris programs. Instead, you would have to copy the result over to a Solaris system before you could run it. Naturally, you could simply build the program on the Solaris system in the first place. However, perhaps the Solaris system is not available for some reason; perhaps you don't actually have one, but you want to build the tools for somebody else to use. Or perhaps your GNU/Linux system is much faster than your Solaris system. A Canadian Cross build is most frequently used when building programs to run on a non-Unix system, such as DOS or Windows. It may be simpler to configure and build on a Unix system than to support the GNU Autotools on a non-Unix system.

File: autobook.info, Node: Canadian Cross Concepts, Next: Build Cross Host Tools, Prev: Canadian Cross Example, Up: Building with a Cross Compiler

26.4.2 Canadian Cross Concepts
------------------------------

When building a Canadian Cross, there are at least two different systems involved: the system on which the tools are being built, and the system on which the tools will run. The system on which the tools are being built is called the "build" system. The system on which the tools will run is called the host system. For example, if you are building a Solaris program on a GNU/Linux system, as in the previous example, the build system would be GNU/Linux, and the host system would be Solaris. Note that we already discussed the host system above; see *Note Host and Target::.
It is, of course, possible to build a cross compiler using a Canadian Cross (i.e., build a cross compiler using a cross compiler). In this case, the system for which the resulting cross compiler generates code is the target system. An example of building a cross compiler using a Canadian Cross would be building a Windows cross MIPS ELF compiler on a GNU/Linux system. In this case the build system would be GNU/Linux, the host system would be Windows, and the target system would be MIPS ELF.  File: autobook.info, Node: Build Cross Host Tools, Next: Build and Host Options, Prev: Canadian Cross Concepts, Up: Building with a Cross Compiler 26.4.3 Build Cross Host Tools ----------------------------- In order to configure a program for a Canadian Cross build, you must first build and install the set of cross tools you will use to build the program. These tools will be build cross host tools. That is, they will run on the build system, and will produce code that runs on the host system. It is easy to confuse the meaning of build and host here. Always remember that the build system is where you are doing the build, and the host system is where the resulting program will run. Therefore, you need a build cross host compiler. In general, you must have a complete cross environment in order to do the build. This normally means a cross compiler, cross assembler, and so forth, as well as libraries and header files for the host system. Setting up a complete cross environment can be complex, and is beyond the scope of this book. You may be able to get more information from the `crossgcc' mailing list and FAQ; see `http://www.objsw.com/CrossGCC/'.  File: autobook.info, Node: Build and Host Options, Next: Canadian Cross Tools, Prev: Build Cross Host Tools, Up: Building with a Cross Compiler 26.4.4 Build and Host Options ----------------------------- When you run `configure' for a Canadian Cross, you must use both the `--build' and `--host' options. 
The `--build' option is used to specify the configuration name of the build system. This can normally be the result of running the `config.guess' shell script, and when using a Unix shell it is reasonable to use `--build=`config.guess`'. The `--host' option is used to specify the configuration name of the host system. As we explained earlier, `config.guess' is used to set the default value for the `--host' option (*note Using the Target Type::). We can now see that since `config.guess' returns the type of system on which it is run, it really identifies the build system. Since the host system is normally the same as the build system (or, in other words, people do not normally build using a cross compiler), it is reasonable to use the result of `config.guess' as the default for the host system when the `--host' option is not used. It might seem that if the `--host' option were used without the `--build' option that the `configure' script could run `config.guess' to determine the build system, and presume a Canadian Cross if the result of `config.guess' differed from the `--host' option. However, for historical reasons, some configure scripts are routinely run using an explicit `--host' option, rather than using the default from `config.guess'. As noted earlier, it is difficult or impossible to reliably compare configuration names (*note Using the Target Type::). Therefore, by convention, if the `--host' option is used, but the `--build' option is not used, then the build system defaults to the host system. (This convention may be changing in the Autoconf 2.5 release. Check the release notes.)  File: autobook.info, Node: Canadian Cross Tools, Next: Supporting Cross Compiler, Prev: Build and Host Options, Up: Building with a Cross Compiler 26.4.5 Canadian Cross Tools --------------------------- You must explicitly specify the cross tools which you want to use to build the program. This is done by setting environment variables before running the `configure' script. 
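For example, such an invocation might look like the sketch below. The `solaris-*' tool names and the configuration triplets are hypothetical; substitute the names of your actual build cross host tools.

```shell
# Hypothetical Canadian Cross invocation: build on GNU/Linux,
# host on Solaris, using cross tools named solaris-*.
CC=solaris-gcc AR=solaris-ar RANLIB=solaris-ranlib \
  ./configure --build=`./config.guess` --host=sparc-sun-solaris2.7
```

The backquoted `./config.guess` prints the configuration name of the machine running the command, which is exactly what the `--build' option should name.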
You must normally set at least the environment variables `CC', `AR', and `RANLIB' to the cross tools which you want to use to build. For some programs, you must set additional cross tools as well, such as `AS', `LD', or `NM'. You would set these environment variables to the build cross host tools which you are going to use. For example, if you are building a Solaris program on a GNU/Linux system, and your GNU/Linux cross Solaris compiler were named `solaris-gcc', then you would set the environment variable `CC' to `solaris-gcc'.  File: autobook.info, Node: Supporting Cross Compiler, Prev: Canadian Cross Tools, Up: Building with a Cross Compiler 26.4.6 Supporting Building with a Cross Compiler ------------------------------------------------ If you want to make it possible to build a program which you are developing using a cross compiler, you must take some care when writing your `configure.in' and `make' rules. Simple cases will normally work correctly. However, it is not hard to write configure tests which will fail when building with a cross compiler, so some care is required to avoid this. You should write your `configure' scripts to support building with a cross compiler if you can, because that will permit others to build your program on a fast compilation server. * Menu: * Supporting Cross Compiler in Configure:: * Supporting Cross Compiler in Make::  File: autobook.info, Node: Supporting Cross Compiler in Configure, Next: Supporting Cross Compiler in Make, Up: Supporting Cross Compiler 26.4.6.1 Supporting Building with a Cross Compiler in Configure Scripts ....................................................................... In a `configure.in' file, after calling `AC_PROG_CC', you can find out whether the program is being built by a cross compiler by examining the shell variable `cross_compiling'. If the compiler is a cross compiler, which means that this is a Canadian Cross, `cross_compiling' will be `yes'. 
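The variable can then be tested with ordinary shell code. The following fragment is a sketch of ours, not from the book; `AC_PROG_CC' and `AC_MSG_WARN' are standard Autoconf macros:

```shell
dnl Sketch: after AC_PROG_CC has run, the cross_compiling shell
dnl variable records whether the selected compiler is a build cross
dnl host compiler.
AC_PROG_CC
if test "x$cross_compiling" = xyes; then
  AC_MSG_WARN([cross compiling: tests that must run on the host are skipped])
fi
```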
In a normal configuration, `cross_compiling' will be `no'. You ordinarily do not need to know the type of the build system in a `configure' script. However, if you do need that information, you can get it by using the macro `AC_CANONICAL_SYSTEM', the same macro which is used to determine the target system. This macro will set the variables `build', `build_alias', `build_cpu', `build_vendor', and `build_os', which correspond to the similar `target' and `host' variables, except that they describe the build system. *Note Using the Target Type::. When writing tests in `configure.in', you must remember that you want to test the host environment, not the build environment. Macros which use the compiler, such as `AC_CHECK_FUNCS', will test the host environment. That is because the tests will be done by running the compiler, which is actually a build cross host compiler. If the compiler can find the function, that means that the function is present in the host environment. Tests like `test -f /dev/ptyp0', on the other hand, will test the build environment. Remember that the `configure' script is running on the build system, not the host system. If your `configure' script examines files, those files will be on the build system. Whatever you determine based on those files may or may not be the case on the host system. Most Autoconf macros will work correctly when building with a cross compiler. The main exception is `AC_TRY_RUN'. This macro tries to compile and run a test program. This will fail when building with a cross compiler, because the program will be compiled for the host system, which means that it will not run on the build system. The `AC_TRY_RUN' macro provides an optional argument to tell the `configure' script what to do when building with a cross compiler.
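For instance, the fourth argument supplies the behaviour to fall back on when the test program cannot be run; the cache variable name below is invented for illustration:

```shell
dnl AC_TRY_RUN(PROGRAM, [IF-TRUE], [IF-FALSE], [IF-CROSS-COMPILING])
AC_TRY_RUN([int main() { return 0; }],
           [my_cv_run_ok=yes],
           [my_cv_run_ok=no],
           [my_cv_run_ok=cross])
```

When `configure' is run with a cross compiler, only the last argument is used, so the program itself can fall back to performing the test at run time.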
If that argument is not present, you will get a warning when you run `autoconf':

     warning: AC_TRY_RUN called without default to allow cross compiling

This tells you that the resulting `configure' script will not work when building with a cross compiler. In some cases, while it may be better to perform a test at configure time, it is also possible to perform the test at run time (*note Testing system features at application runtime::). In such a case you can use the cross compiling argument to `AC_TRY_RUN' to tell your program that the test could not be performed at configure time. There are a few other Autoconf macros which will not work correctly when building with a cross compiler: a partial list is `AC_FUNC_GETPGRP', `AC_FUNC_SETPGRP', `AC_FUNC_SETVBUF_REVERSED', and `AC_SYS_RESTARTABLE_SYSCALLS'. The `AC_CHECK_SIZEOF' macro is generally not very useful when building with a cross compiler; it permits an optional argument indicating the default size, but there is no way to know what the correct default should be.


File: autobook.info, Node: Supporting Cross Compiler in Make, Prev: Supporting Cross Compiler in Configure, Up: Supporting Cross Compiler

26.4.6.2 Supporting Building with a Cross Compiler in Makefiles
...............................................................

The main cross compiling issue in a `Makefile' arises when you want to use a subsidiary program to generate code or data which you will then include in your real program. If you compile this subsidiary program using `$(CC)' in the usual way, you will not be able to run it. This is because `$(CC)' will build a program for the host system, but the program is being built on the build system. You must instead use a compiler for the build system, rather than the host system. This compiler is conventionally called `$(CC_FOR_BUILD)'. A `configure' script should normally permit the user to define `CC_FOR_BUILD' explicitly in the environment.
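A sketch of the conventional defaulting logic follows; the exact variable handling is illustrative, not a canonical recipe:

```shell
dnl Respect CC_FOR_BUILD from the environment; otherwise default to
dnl $(CC) for a native configuration, or plain cc when cross compiling.
AC_PROG_CC
if test -z "$CC_FOR_BUILD"; then
  if test "x$cross_compiling" = xyes; then
    CC_FOR_BUILD=cc
  else
    CC_FOR_BUILD='$(CC)'
  fi
fi
AC_SUBST(CC_FOR_BUILD)
```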
Your configure script should help by selecting a reasonable default value. If the `configure' script is not being run with a cross compiler (i.e., the `cross_compiling' shell variable is `no' after calling `AC_PROG_CC'), then the proper default for `CC_FOR_BUILD' is simply `$(CC)'. Otherwise, a reasonable default is simply `cc'. Note that you should not include `config.h' in a file you are compiling with `$(CC_FOR_BUILD)'. The `configure' script will build `config.h' with information for the host system. However, you are compiling the file using a compiler for the build system (a native compiler). Subsidiary programs are normally simple filters which do no user interaction, and it is often possible to write them in a highly portable fashion so that the absence of `config.h' is not crucial. The `gcc' `Makefile.in' shows a complex situation in which certain files, such as `rtl.c', must be compiled into both subsidiary programs run on the build system and into the final program. This approach may be of interest for advanced GNU Autotools hackers. Note that, at least in GCC 2.95, the build system compiler is rather confusingly called `HOST_CC'.  File: autobook.info, Node: Installing GNU Autotools, Next: Autoconf Macro Reference, Prev: Cross Compilation, Up: Top Appendix A Installing GNU Autotools *********************************** The GNU Autotools may already be installed at your site, particularly if you are using a GNU/Linux system. If you don't have these tools installed, or do not have the most recent versions, this appendix will help you install them. * Menu: * Prerequisite tools:: * Downloading GNU Autotools:: * Installing the tools::  File: autobook.info, Node: Prerequisite tools, Next: Downloading GNU Autotools, Up: Installing GNU Autotools A.1 Prerequisite tools ====================== The GNU Autotools make use of a few additional tools to get their jobs done. This makes it necessary to gather all of the prerequisite tools to get started. 
Before installing GNU Autotools, it is necessary to obtain and install these tools. The GNU Autotools are all built around the assumption that the system will have a relatively functional version of the Bourne shell. If your system is missing a Bourne shell or your shell behaves differently from most, as is the case with the Bourne shell provided with Ultrix, then you might like to obtain and install GNU `bash'. *Note Downloading GNU Autotools::, for details on obtaining GNU packages. If you are using a Windows system, the easiest way to obtain a Bourne shell and all of the shell utilities that you will need is to download and install Cygnus Solutions' Cygwin product. You can locate further information about Cygwin by reading `http://www.cygnus.com/cygwin/'. Autoconf requires GNU M4. Vendor-provided versions of M4 have proven to be troublesome, so Autoconf checks that GNU M4 is installed on your system. Again, *note Downloading GNU Autotools::, for details on obtaining GNU packages such as M4. At the time of writing, the latest version is 1.4. Earlier versions of GNU M4 will work, but they may not be as efficient. Automake requires Perl version 5 or greater. You should download and install a version of Perl for your platform which meets these requirements.


File: autobook.info, Node: Downloading GNU Autotools, Next: Installing the tools, Prev: Prerequisite tools, Up: Installing GNU Autotools

A.2 Downloading GNU Autotools
=============================

The GNU Autotools are distributed as part of the GNU project, under the terms of the GNU General Public License. Each tool is packaged in a compressed archive that you can retrieve from sources such as Internet FTP archives and CD-ROM distributions. While you may use any source that is convenient to you, it is best to use one of the recognized GNU mirror sites. A current list of mirror sites is listed at `http://www.gnu.org/order/ftp.html'.
The directory layout of the GNU archives has recently been improved to make it easier to locate particular packages. The new scheme places package archive files under a subdirectory whose name reflects the base name of the package. For example, GNU Autoconf 2.13 can be found at:

     /gnu/autoconf/autoconf-2.13.tar.gz

The filenames corresponding to the latest versions of GNU Autotools, at the time of writing, are:

     autoconf-2.13.tar.gz
     automake-1.4.tar.gz
     libtool-1.3.5.tar.gz

These packages are stored as `tar' archives and compressed with the `gzip' compression utility. Once you have obtained all of these packages, you should unpack them using the following commands:

     gunzip TOOL-VERSION.tar.gz
     tar xfv TOOL-VERSION.tar

GNU `tar' archives are created with a directory name prefixed to all of the files in the archive. This means that files will be tidily unpacked into an appropriately named subdirectory, rather than being written all over your current working directory.


File: autobook.info, Node: Installing the tools, Prev: Downloading GNU Autotools, Up: Installing GNU Autotools

A.3 Installing the tools
========================

When installing GNU Autotools, it is a good idea to install the tools in the same location (eg. `/usr/local'). This allows the tools to discover each others' presence at installation time. The location shown in the examples below will be the default, `/usr/local', as this choice will make the tools available to all users on the system. Installing Autoconf is usually a quick and simple exercise, since Autoconf itself uses `configure' to prepare itself for building and installation. Automake and Libtool can be installed using the same steps as for Autoconf. As a matter of personal preference, I like to create a separate build tree when configuring packages to keep the source tree free of derived files such as object files. Applying what we know about invoking `configure' (*note Invoking configure::), we can now configure and build Autoconf.
The only `configure' option we're likely to want to use is `--prefix', so if you want to install the tools in another location, include this option on the command line. It might be desirable to install the package elsewhere when operating in networked environments.

     $ mkdir ac-build && cd ac-build
     $ ~/autoconf-2.13/configure

You will see `configure' running its tests and producing a `Makefile' in the build directory:

     creating cache ./config.cache
     checking for gm4... no
     checking for gnum4... no
     checking for m4... /usr/bin/m4
     checking whether we are using GNU m4... yes
     checking for mawk... no
     checking for gawk... gawk
     checking for perl... /usr/bin/perl
     checking for a BSD compatible install... /usr/bin/install -c
     updating cache ./config.cache
     creating ./config.status
     creating Makefile
     creating testsuite/Makefile

To build Autoconf, type the following:

     $ make all

Autoconf has no architecture-specific files to be compiled, so this process finishes quickly. To install files into `/usr/local', it may be necessary to become the root user before installing.

     # make install

Autoconf is now installed on your system.


File: autobook.info, Node: PLATFORMS, Next: Generated File Dependencies, Prev: Autoconf Macro Reference, Up: Top

Appendix B PLATFORMS
********************

This table lists platforms and toolchains known to be supported by Libtool. Each row represents completion of the self test suite shipped with the Libtool distribution on the platform named in that row. There is a `PLATFORMS' file maintained in the Libtool source tree, updated whenever a Libtool user volunteers updated information, or when the Libtool team runs pre-release tests on the platforms to which they individually have access. The table from the latest source tree at the time of writing follows:

`canonical host name'
     This is the configuration triplet returned by `config.guess' on each system for which the test suite was executed.
     Where the developer who ran the tests considered it to be significant, versions of tools in the compiler toolchain are named below the configuration triplet.

`compiler'
     The compiler used for the tests.

`libtool release'
     The version number of the Libtool distribution most recently tested for the associated configuration triplet. The GNU Autotools all use an alpha version numbering system where `odd' letters (a, c, e, g etc.) represent many CVS snapshots between the `even' lettered (b, d, f etc.) alpha release versions. After version 1.4, the CVS revision number of the `Changelog' file will be appended to odd lettered CVS snapshots, `1.4a 1.641.2.54', for example.

`results'
     Either `ok' if the Libtool test suite passed all tests, or optionally `NS' if the test suite would only pass when the distribution was configured with the `--disable-shared' option.

     -------------------------------------------------------
     canonical host name          compiler  libtool  results
       (tools versions)                     release
     -------------------------------------------------------
     alpha-dec-osf4.0*            gcc       1.3b     ok
       (egcs-1.1.2)
     alpha-dec-osf4.0*            cc        1.3b     ok
     alpha-dec-osf3.2             gcc       0.8      ok
     alpha-dec-osf3.2             cc        0.8      ok
     alpha-dec-osf2.1             gcc       1.2f     NS
     alpha*-unknown-linux-gnu     gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1.0.23)
     hppa2.0w-hp-hpux11.00        cc        1.2f     ok
     hppa2.0-hp-hpux10.20         cc        1.3.2    ok
     hppa1.1-hp-hpux10.20         gcc       1.2f     ok
     hppa1.1-hp-hpux10.20         cc        1.2f     ok
     hppa1.1-hp-hpux10.10         gcc       1.2f     ok
     hppa1.1-hp-hpux10.10         cc        1.2f     ok
     hppa1.1-hp-hpux9.07          gcc       1.2f     ok
     hppa1.1-hp-hpux9.07          cc        1.2f     ok
     hppa1.1-hp-hpux9.05          gcc       1.2f     ok
     hppa1.1-hp-hpux9.05          cc        1.2f     ok
     hppa1.1-hp-hpux9.01          gcc       1.2f     ok
     hppa1.1-hp-hpux9.01          cc        1.2f     ok
     i*86-*-beos                  gcc       1.2f     ok
     i*86-*-bsdi4.0.1             gcc       1.3c     ok
       (gcc-2.7.2.1)
     i*86-*-bsdi4.0               gcc       1.2f     ok
     i*86-*-bsdi3.1               gcc       1.2e     NS
     i*86-*-bsdi3.0               gcc       1.2e     NS
     i*86-*-bsdi2.1               gcc       1.2e     NS
     i*86-pc-cygwin               gcc       1.3b     NS
       (egcs-1.1 stock b20.1 compiler)
     i*86-*-dguxR4.20MU01         gcc       1.2      ok
     i*86-*-freebsdelf4.0         gcc       1.3c     ok
       (egcs-1.1.2)
     i*86-*-freebsdelf3.2         gcc       1.3c     ok
       (gcc-2.7.2.1)
     i*86-*-freebsdelf3.1         gcc       1.3c     ok
       (gcc-2.7.2.1)
     i*86-*-freebsdelf3.0         gcc       1.3c     ok
     i*86-*-freebsd3.0            gcc       1.2e     ok
     i*86-*-freebsd2.2.8          gcc       1.3c     ok
       (gcc-2.7.2.1)
     i*86-*-freebsd2.2.6          gcc       1.3b     ok
       (egcs-1.1 & gcc-2.7.2.1, native ld)
     i*86-*-freebsd2.1.5          gcc       0.5      ok
     i*86-*-gnu                   gcc       1.3c     ok
       (1.602)
     i*86-*-netbsd1.4             gcc       1.3c     ok
       (egcs-1.1.1)
     i*86-*-netbsd1.3.3           gcc       1.3c     ok
       (gcc-2.7.2.2+myc2)
     i*86-*-netbsd1.3.2           gcc       1.2e     ok
     i*86-*-netbsd1.3I            gcc       1.2e     ok
       (egcs 1.1?)
     i*86-*-netbsd1.2             gcc       0.9g     ok
     i*86-*-linux-gnu             gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1.0.23)
     i*86-*-linux-gnulibc1        gcc       1.2f     ok
     i*86-*-openbsd2.5            gcc       1.3c     ok
       (gcc-2.8.1)
     i*86-*-openbsd2.4            gcc       1.3c     ok
       (gcc-2.8.1)
     i*86-*-solaris2.7            gcc       1.3b     ok
       (egcs-1.1.2, native ld)
     i*86-*-solaris2.6            gcc       1.2f     ok
     i*86-*-solaris2.5.1          gcc       1.2f     ok
     i*86-ncr-sysv4.3.03          gcc       1.2f     ok
     i*86-ncr-sysv4.3.03          cc        1.2e     ok
       (cc -Hnocopyr)
     i*86-pc-sco3.2v5.0.5         cc        1.3c     ok
     i*86-pc-sco3.2v5.0.5         gcc       1.3c     ok
       (gcc 95q4c)
     i*86-pc-sco3.2v5.0.5         gcc       1.3c     ok
       (egcs-1.1.2)
     i*86-UnixWare7.1.0-sysv5     cc        1.3c     ok
     i*86-UnixWare7.1.0-sysv5     gcc       1.3c     ok
       (egcs-1.1.1)
     m68k-next-nextstep3          gcc       1.2f     NS
     m68k-sun-sunos4.1.1          gcc       1.2f     NS
       (gcc-2.5.7)
     m88k-dg-dguxR4.12TMU01       gcc       1.2      ok
     m88k-motorola-sysv4          gcc       1.3      ok
       (egcs-1.1.2)
     mips-sgi-irix6.5             gcc       1.2f     ok
       (gcc-2.8.1)
     mips-sgi-irix6.4             gcc       1.2f     ok
     mips-sgi-irix6.3             gcc       1.3b     ok
       (egcs-1.1.2, native ld)
     mips-sgi-irix6.3             cc        1.3b     ok
       (cc 7.0)
     mips-sgi-irix6.2             gcc       1.2f     ok
     mips-sgi-irix6.2             cc        0.9      ok
     mips-sgi-irix5.3             gcc       1.2f     ok
       (egcs-1.1.1)
     mips-sgi-irix5.3             gcc       1.2f     NS
       (gcc-2.6.3)
     mips-sgi-irix5.3             cc        0.8      ok
     mips-sgi-irix5.2             gcc       1.3b     ok
       (egcs-1.1.2, native ld)
     mips-sgi-irix5.2             cc        1.3b     ok
       (cc 3.18)
     mipsel-unknown-openbsd2.1    gcc       1.0      ok
     powerpc-ibm-aix4.3.1.0       gcc       1.2f     ok
       (egcs-1.1.1)
     powerpc-ibm-aix4.2.1.0       gcc       1.2f     ok
       (egcs-1.1.1)
     powerpc-ibm-aix4.1.5.0       gcc       1.2f     ok
       (egcs-1.1.1)
     powerpc-ibm-aix4.1.5.0       gcc       1.2f     NS
       (gcc-2.8.1)
     powerpc-ibm-aix4.1.4.0       gcc       1.0      ok
     powerpc-ibm-aix4.1.4.0       xlc       1.0i     ok
     rs6000-ibm-aix4.1.5.0        gcc       1.2f     ok
       (gcc-2.7.2)
     rs6000-ibm-aix4.1.4.0        gcc       1.2f     ok
       (gcc-2.7.2)
     rs6000-ibm-aix3.2.5          gcc       1.0i     ok
     rs6000-ibm-aix3.2.5          xlc       1.0i     ok
     sparc-sun-solaris2.7         gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1 & native ld)
     sparc-sun-solaris2.6         gcc       1.3.2    ok
       (egcs-1.1.2, GNU ld 2.9.1 & native ld)
     sparc-sun-solaris2.5.1       gcc       1.2f     ok
     sparc-sun-solaris2.5         gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1 & native ld)
     sparc-sun-solaris2.5         cc        1.3b     ok
       (SC 3.0.1)
     sparc-sun-solaris2.4         gcc       1.0a     ok
     sparc-sun-solaris2.4         cc        1.0a     ok
     sparc-sun-solaris2.3         gcc       1.2f     ok
     sparc-sun-sunos4.1.4         gcc       1.2f     ok
     sparc-sun-sunos4.1.4         cc        1.0f     ok
     sparc-sun-sunos4.1.3_U1      gcc       1.2f     ok
     sparc-sun-sunos4.1.3C        gcc       1.2f     ok
     sparc-sun-sunos4.1.3         gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1 & native ld)
     sparc-sun-sunos4.1.3         cc        1.3b     ok
     sparc-unknown-bsdi4.0        gcc       1.2c     ok
     sparc-unknown-linux-gnulibc1 gcc       1.2f     ok
     sparc-unknown-linux-gnu      gcc       1.3b     ok
       (egcs-1.1.2, GNU ld 2.9.1.0.23)
     sparc64-unknown-linux-gnu    gcc       1.2f     ok

Notes:
- "ok" means "all tests passed".
- "NS" means "Not Shared", but OK for static libraries

You too can contribute to this file, either if you use a platform which is missing from the table entirely, or if you are using a newer release of Libtool than the version listed in the table. From a freshly unpacked release, do the following:

     $ cd libtool-1.4
     $ ./configure
     ...
     ---------------------------------------------------------
     Configuring libtool 1.4a (1.641.2.54 2000/06/18 03:02:52)
     ---------------------------------------------------------
     ...
     checking host system type... i586-pc-linux-gnu
     checking build system type... i586-pc-linux-gnu
     ...
     $ make
     ...
     $ make check
     ...
     ===================
     All 76 tests passed
     ===================
     ...
If there are no test failures, and you see a message similar to the above, send a short message to the Libtool mailing list stating what you did and the configuration triplet for your platform as reported for the `host system' by `configure' (see the example directly above), and the precise version number of the release you have tested as reported by `libtool --version':

     $ pwd
     /tmp/cvs/libtool
     $ ./libtool --version
     ltmain.sh (GNU libtool) 1.4a (1.641.2.41 2000/05/29 10:40:46)

The official `PLATFORMS' file will be updated shortly thereafter.


File: autobook.info, Node: Generated File Dependencies, Next: OPL, Prev: PLATFORMS, Up: Top

Appendix C Generated File Dependencies
**************************************

These diagrams show the data flows associated with each of the tools you might need to use when bootstrapping a project with GNU Autotools. A lot of files are consumed and produced by these tools, and it is important that all of the required input files are present (and correct) at each stage - `configure' requires `Makefile.in' and produces `Makefile' for example. There are many of these relationships, and these diagrams should help you to visualize the dependencies. They will be invaluable while you learn your way around GNU Autotools, but before long you will find that you need to refer to them rarely, if at all. They do not show how the individual files are laid out in a project directory tree, since some of them, `config.guess' for example, have no single place at which they must appear, and others, `Makefile.am' for example, may be present in several places, depending on how you want to structure your project directories. The key to the diagrams in this appendix follows:

* The boxes are the individual tools which comprise GNU Autotools.

* Where multiple interlinked boxes appear in a single diagram, this represents one tool itself running other helper programs. If a box is behind another box, it is a (group of) helper program(s) that may be automatically run by the boxes in front.
* Dotted arrows are for optional files, which may be a part of the process.

* Where an input arrow and output arrow are aligned horizontally, the output is created from the input by the process between the two.

* Words in parentheses, "()", are for deprecated files which are supported but no longer necessary.

Notice that in some cases, a file output during one stage of the whole process becomes the driver for a subsequent stage. Each of the following diagrams represents the execution of one of the tools in GNU Autotools; they are presented in the order that we recommend you run them, though some stages may not be required for your project. You shouldn't run `libtoolize' if your project doesn't use `libtool', for example.

* Menu:

* aclocal process::
* autoheader process::
* automake and libtoolize process::
* autoconf process::
* configure process::
* make process::


File: autobook.info, Node: aclocal process, Next: autoheader process, Up: Generated File Dependencies

C.1 aclocal
===========

The `aclocal' program creates the file `aclocal.m4' by combining stock installed macros, user defined macros and the contents of `acinclude.m4' to define all of the macros required by `configure.in' in a single file. `aclocal' was created as a fix for some missing functionality in Autoconf, and as such we consider it a wart. In due course `aclocal' itself will disappear, and Autoconf will perform the same function unaided.

     user input files   optional input   process   output files
     ================   ==============   =======   ============

     acinclude.m4 - - - - -.
                            V
                        .-------,
     configure.in ----->|aclocal|
     {user macro files}>|       |------> aclocal.m4
                        `-------'


File: autobook.info, Node: autoheader process, Next: automake and libtoolize process, Prev: aclocal process, Up: Generated File Dependencies

C.2 autoheader
==============

`autoheader' runs `m4' over `configure.in', but with key macros defined differently than when `autoconf' is executed, such that suitable `cpp' definitions are output to `config.h.in'.

     user input files   optional input   process   output files
     ================   ==============   =======   ============

     aclocal.m4 - - - - - - - .
     (acconfig.h) - - - -.    |
                         V    V
                      .----------,
     configure.in --->|autoheader|----> config.h.in
                      `----------'


File: autobook.info, Node: automake and libtoolize process, Next: autoconf process, Prev: autoheader process, Up: Generated File Dependencies

C.3 automake and libtoolize
===========================

`automake' will call `libtoolize' to generate some extra files if the macro `AC_PROG_LIBTOOL' is used in `configure.in'. If it is not present then `automake' will install `config.guess' and `config.sub' by itself. `libtoolize' can also be run manually if desired; `automake' will only run `libtoolize' automatically if `ltmain.sh' and `ltconfig' are missing.
     user input files   optional input   processes   output files
     ================   ==============   =========   ============

                                           .--------,
                                           |        | - - -> COPYING
                                           |        | - - -> INSTALL
                                           |        |------> install-sh
                                           |        |------> missing
                                           |automake|------> mkinstalldirs
     configure.in ------------------------>|        |
     Makefile.am ------------------------->|        |------> Makefile.in
                                           |        |------> stamp-h.in
                                       .---+        | - - -> config.guess
                                       |   |        | - - -> config.sub
                                       |   `------+-'
                                       |          |
                                       |  .----------,
                                       `->|          | - - - -> config.guess
                                          |libtoolize| - - - -> config.sub
                                          |          |--------> ltmain.sh
                                          |          |--------> ltconfig
                                          `----------'

The versions of `config.guess' and `config.sub' installed differ between releases of Automake and Libtool, and might be different depending on whether `libtoolize' is used to install them or not. Before releasing your own package you should get the latest versions of these files from `ftp://ftp.gnu.org/gnu/config', in case there have been changes since releases of the GNU Autotools.


File: autobook.info, Node: autoconf process, Next: configure process, Prev: automake and libtoolize process, Up: Generated File Dependencies

C.4 autoconf
============

`autoconf' expands the `m4' macros in `configure.in', perhaps using macro definitions from `aclocal.m4', to generate the `configure' script.

     user input files   optional input   processes   output files
     ================   ==============   =========   ============

     aclocal.m4 - - - - - -.
                           V
                       .--------,
     configure.in ---->|autoconf|------> configure
                       `--------'


File: autobook.info, Node: configure process, Next: make process, Prev: autoconf process, Up: Generated File Dependencies

C.5 configure
=============

The purpose of the preceding processes was to create the input files necessary for `configure' to run correctly. You would ship your project with the generated `script' and the files in columns, "other input" and "processes" (except `config.cache'), but `configure' is designed to be run by the person installing your package.
Naturally, you will run it too while you develop your project, but the files it produces are specific to your development machine, and are not shipped with your package - the person installing it later will run `configure' and generate "output files" specific to their own machine. Running the `configure' script on the build host executes the various tests originally specified by the `configure.in' file, and then creates another script, `config.status'. This new script generates the `config.h' header file from `config.h.in', and `Makefile's from the named `Makefile.in's. Once `config.status' has been created, it can be executed by itself to regenerate files without rerunning all the tests. Additionally, if `AC_PROG_LIBTOOL' was used, then `ltconfig' is used to generate a `libtool' script.

     user input files   other input   processes   output files
     ================   ===========   =========   ============

                         .---------,
     config.site - - - ->|         |
     config.cache - - - >|configure| - - -> config.cache
                         |         +-,
                         `-+-------' |
                           |         |----> config.status
     config.h.in -------->|config-   |----> config.h
     Makefile.in -------->| .status  |----> Makefile
                           |         |----> stamp-h
                           |         +--,
                         .-+         |  |
                         | `------+--'  |
     ltmain.sh --------->|ltconfig|-------> libtool
                         |        |     |
                         `-+------'     |
                           |config.guess|
                           | config.sub |
                           `------------'


File: autobook.info, Node: make process, Prev: configure process, Up: Generated File Dependencies

C.6 make
========

The final tool to be run is `make'. Like `configure', it is designed to execute on the build host. `make' will use the rules in the generated `Makefile' to compile the project sources with the aid of various other scripts generated earlier on.
     user input files   other input   processes   output files
     ================   ===========   =========   ============

                                        .--------,
     Makefile ------------------------->|        |
     config.h ------------------------->|  make  |
     {project sources} ---------------->|        |--------> {project targets}
                                      .-+        +--,
                                      | `--------'  |
                                      |   libtool   |
                                      |   missing   |
                                      |  install-sh |
                                      |mkinstalldirs|
                                      `-------------'


File: autobook.info, Node: Autoconf Macro Reference, Next: PLATFORMS, Prev: Installing GNU Autotools, Up: Top

Appendix D Autoconf Macro Reference
***********************************

This is an alphabetical list of each Autoconf macro used in this book, along with a description of what each does. They are provided for your reference while reading this book. The descriptions are only brief; see the appropriate reference manual for a complete description.

`AC_ARG_ENABLE(FEATURE, HELP-TEXT, [IF-GIVEN], [IF-NOT-GIVEN])'
     This macro allows the maintainer to specify additional package options accepted by `configure'-for example, `--enable-zlib'. The action shell code may access any arguments to the option in the shell variable `enableval'. For example, `--enable-buffers=128' would cause `configure' to set `enableval' to `128'.

`AC_ARG_PROGRAM'
     This macro places a `sed' transformation program into the output variable `program_transform_name' that can be used to transform the filenames of installed programs. If the `--program-prefix', `--program-suffix' or `--program-transform-name' options are passed to `configure', an appropriate transformation program will be generated. If no options are given, but the type of the host system differs from the type of the target system, program names are transformed by prefixing them with the type of the target (eg. `arm-elf-gcc').

`AC_ARG_WITH(PACKAGE, HELP-TEXT, [IF-GIVEN], [IF-NOT-GIVEN])'
     This macro allows the maintainer to specify additional packages that this package should work with (for example, a library to manipulate shadow passwords).
The user indicates this preference by invoking `configure' with an option such as `--with-shadow'. If an optional argument is given, this value is available to shell code in the shell variable `withval'. `AC_CACHE_CHECK(MESSAGE, CACHE-VARIABLE, COMMANDS)' This macro is a convenient front-end to the `AC_CACHE_VAL' macro that takes care of printing messages to the user, including whether or not the result was found in the cache. It should be used in preference to `AC_CACHE_VAL'. `AC_CACHE_VAL(CACHE-VARIABLE, COMMANDS)' This is a low-level macro which implements the Autoconf cache feature. If the named variable is set at runtime (for instance, if it was read from `config.cache'), then this macro does nothing. Otherwise, it runs the shell code in COMMANDS, which is assumed to set the cache variable. `AC_CANONICAL_HOST' This macro determines the type of the host system and sets the output variable `host', as well as other more obscure variables. `AC_CANONICAL_SYSTEM' This macro determines the type of the build, host and target systems and sets the output variables `build', `host' and `target', amongst other more obscure variables. `AC_CHECK_FILE(FILE, [IF-FOUND], [IF-NOT-FOUND])' This macro tests for the existence of a file in the file system of the build system, and runs the appropriate shell code depending on whether or not the file is found. `AC_CHECK_FUNCS(FUNCTION-LIST, [IF-FOUND], [IF-NOT-FOUND])' This looks for a series of functions. If the function `quux' is found, the C preprocessor macro `HAVE_QUUX' will be defined. In addition, if the IF-FOUND argument is given, it will be run (as shell code) when a function is found - this code can use the `sh' `break' command to prevent `AC_CHECK_FUNCS' from looking for the remaining functions in the list. The shell code in IF-NOT-FOUND is run if a function is not found. `AC_CHECK_HEADER(HEADER, [IF-FOUND], [IF-NOT-FOUND])' This macro executes some specified shell code if a header file exists. 
If it is not present, alternative shell code is executed instead. `AC_CHECK_HEADERS(HEADER-LIST, [IF-FOUND], [IF-NOT-FOUND])' This looks for a series of headers. If the header `quux.h' is found, the C preprocessor macro `HAVE_QUUX_H' will be defined. In addition, if the IF-FOUND argument is given, it will be run (as shell code) when a header is found - this code can use the `sh' `break' command to prevent `AC_CHECK_HEADERS' from looking for the remaining headers in the list. The shell code in IF-NOT-FOUND is run if a header is not found. `AC_CHECK_LIB(LIBRARY, FUNCTION, [IF-FOUND], [IF-NOT-FOUND], [OTHER-LIBRARIES])' This looks for the named function in the named library specified by its base name. For instance the math library, `libm.a', would be named simply `m'. If the function is found in the library `foo', then the C preprocessor macro `HAVE_LIBFOO' is defined. `AC_CHECK_PROG(VARIABLE, PROGRAM-NAME, VALUE-IF-FOUND, [VALUE-IF-NOT-FOUND], [PATH], [REJECT])' Checks to see if the program named by PROGRAM-NAME exists in the path PATH. If found, it sets the shell variable VARIABLE to the value VALUE-IF-FOUND; if not it uses the value VALUE-IF-NOT-FOUND. If VARIABLE is already set at runtime, this macro does nothing. `AC_CHECK_SIZEOF(TYPE, [SIZE-IF-CROSS-COMPILING])' This macro determines the size of C and C++ built-in types and defines `SIZEOF_type' to the size, where `type' is transformed-all characters to upper case, spaces to underscores and `*' to `P'. If the type is unknown to the compiler, the size is set to 0. An optional argument specifies a default size when cross-compiling. The `configure' script will abort with an error message if it tries to cross-compile without this default size. `AC_CONFIG_AUX_DIR(DIRECTORY)' This macro allows an alternative directory to be specified for the location of auxiliary scripts such as `config.guess', `config.sub' and `install-sh'. By default, `$srcdir', `$srcdir/..' and `$srcdir/../..' are searched for these files. 
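In `configure.in', the checking macros above are typically combined. The fragment below is only an illustrative sketch - the header, function, library and file names are invented examples, not requirements of any real package:

```m4
dnl Sketch only: all names below are illustrative.
AC_CHECK_HEADERS(unistd.h fcntl.h)
AC_CHECK_FUNCS(strerror snprintf)
dnl Look for cos() in libm; defines HAVE_LIBM and adds -lm if found.
AC_CHECK_LIB(m, cos)
dnl Abort early if a mandatory source file is missing.
AC_CHECK_FILE($srcdir/src/main.c, ,
  [AC_MSG_ERROR(cannot find src/main.c)])
dnl Defines SIZEOF_LONG; assume 4 bytes when cross-compiling.
AC_CHECK_SIZEOF(long, 4)
```

Each of these macros caches its result, so rerunning `configure' (or `config.status --recheck') is cheap.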
`AC_CONFIG_HEADER(HEADER-LIST)' This indicates that you want to use a config header, as opposed to having all the C preprocessor macros defined via `-D' options in the `DEFS' `Makefile' variable. Each header named in HEADER-LIST is created at runtime by `configure' (via `AC_OUTPUT'). There are a variety of optional features for use with config headers (different naming schemes and so forth); see the reference manual for more information. `AC_C_CONST' This macro defines the C preprocessor macro `const' to the string `const' if the C compiler supports the `const' keyword. Otherwise it is defined to be the empty string. `AC_C_INLINE' This macro tests if the C compiler can accept the `inline' keyword. It defines the C preprocessor macro `inline' to be the keyword accepted by the compiler or the empty string if it is not accepted at all. `AC_DEFINE(VARIABLE, [VALUE], [DESCRIPTION])' This is used to define C preprocessor macros. The first argument is the name of the macro to define. The VALUE argument, if given, is the value of the macro. The final argument can be used to avoid adding an `#undef' for the macro to `acconfig.h'. `AC_DEFINE_UNQUOTED(VARIABLE, [VALUE], [DESCRIPTION])' This is like `AC_DEFINE', but it handles the quoting of VALUE differently. This macro is used when you want to compute the value instead of having it used verbatim. `AC_DEFUN(NAME, BODY)' This macro is used to define new macros. It is similar to M4's `define' macro, except that it performs additional internal functions. `AC_DISABLE_FAST_INSTALL' This macro can be used to disable Libtool's `fast install' feature. `AC_DISABLE_SHARED' This macro changes the default behavior of `AC_PROG_LIBTOOL' so that shared libraries will not be built by default. The user can still override this new default by using `--enable-shared'. `AC_DISABLE_STATIC' This macro changes the default behavior of `AC_PROG_LIBTOOL' so that static libraries will not be built by default. 
The user can still override this new default by using `--enable-static'. `AC_EXEEXT' Sets the output variable `EXEEXT' to the extension of executables produced by the compiler. It is usually set to the empty string on Unix systems and `.exe' on Windows. `AC_FUNC_ALLOCA' This macro defines the C preprocessor macro `HAVE_ALLOCA' if the various tests indicate that the C compiler has built-in `alloca' support. If there is an `alloca.h' header file, this macro defines `HAVE_ALLOCA_H'. If, instead, the `alloca' function is found in the standard C library, this macro defines `C_ALLOCA' and sets the output variable `ALLOCA' to `alloca.o'. `AC_FUNC_GETPGRP' This macro tests if the `getpgrp' function takes a process ID as an argument or not. If it does not, the C preprocessor macro `GETPGRP_VOID' is defined. `AC_FUNC_MEMCMP' This macro tests for a working version of the `memcmp' function. If absent, or it does not work correctly, `memcmp.o' is added to the `LIBOBJS' output variable. `AC_FUNC_MMAP' Defines the C preprocessor macro `HAVE_MMAP' if the `mmap' function exists and works. `AC_FUNC_SETVBUF_REVERSED' On some systems, the order of the `mode' and `buf' arguments is reversed with respect to the ANSI C standard. If so, this macro defines the C preprocessor macro `SETVBUF_REVERSED'. `AC_FUNC_UTIME_NULL' Defines the C preprocessor macro `HAVE_UTIME_NULL' if a call to `utime' with a NULL `utimbuf' pointer sets the file's timestamp to the current time. `AC_FUNC_VPRINTF' Defines the C preprocessor macro `HAVE_VPRINTF' if the `vprintf' function is available. If not and the `_doprnt' function is available instead, this macro defines `HAVE_DOPRNT'. `AC_HEADER_DIRENT' This macro searches a number of specific header files for a declaration of the C type `DIR'. Depending on which header file the declaration is found in, this macro may define one of the C preprocessor macros `HAVE_DIRENT_H', `HAVE_SYS_NDIR_H', `HAVE_SYS_DIR_H' or `HAVE_NDIR_H'. 
Refer to the Autoconf manual for an example of how these macros should be used in your source code. `AC_HEADER_STDC' This macro defines the C preprocessor macro `STDC_HEADERS' if the system has the ANSI standard C header files. It determines this by testing for the existence of the `stdlib.h', `stdarg.h', `string.h' and `float.h' header files and testing if `string.h' declares `memchr', `stdlib.h' declares `free', and `ctype.h' macros such as `isdigit' work with 8-bit characters. `AC_INIT(FILENAME)' This macro performs essential initialization for the generated `configure' script. An optional argument may provide the name of a file from the source directory to ensure that the directory has been specified correctly. `AC_LIBTOOL_DLOPEN' Call this macro before `AC_PROG_LIBTOOL' to indicate that your package wants to use Libtool's support for `dlopen'ed modules. `AC_LIBTOOL_WIN32_DLL' Call this macro before `AC_PROG_LIBTOOL' to indicate that your package has been written to build DLLs on Windows. If this macro is not called, Libtool will only build static libraries on Windows. `AC_LIB_LTDL' This macro does the `configure'-time checks needed to cause `ltdl.c' to be compiled correctly. That is, this is used to enable dynamic loading via `libltdl'. `AC_LINK_FILES(SOURCE-LIST, DEST-LIST)' Use this macro to create a set of links; if possible, symlinks are made. The two arguments are parallel lists: the first element of DEST-LIST is the name of a to-be-created link whose target is the first element of SOURCE-LIST. `AC_MSG_CHECKING(MESSAGE)' This macro outputs a message to the user in the usual style of `configure' scripts: leading with the word `checking' and ending in `...'. This message gives the user an indication that the `configure' script is still working. A subsequent invocation of `AC_MSG_RESULT' should be used to output the result of a test. `AC_MSG_ERROR(MESSAGE)' This macro outputs an error message to standard error and aborts the `configure' script. 
It should only be used for fatal error conditions. `AC_MSG_RESULT(MESSAGE)' This macro should be invoked after a corresponding invocation of `AC_MSG_CHECKING' with the result of a test. Often the result string can be as simple as `yes' or `no'. `AC_MSG_WARN(MESSAGE)' This macro outputs a warning to standard error, but allows the `configure' script to continue. It should be used to notify the user of abnormal, but non-fatal, conditions. `AC_OBJEXT' Sets the output variable `OBJEXT' to the extension of object files produced by the compiler. Usually, it is set to `.o' on Unix systems and `.obj' on Windows. `AC_OUTPUT(FILES, [EXTRA-COMMANDS], [INIT-COMMANDS])' This macro must be called at the end of every `configure.in'. It creates each file listed in FILES. For a given file, by default, `configure' reads the template file whose name is the name of the input file with `.in' appended - for instance, `Makefile' is generated from `Makefile.in'. This default can be overridden by using a special naming convention for the file. For each name `foo' given as an argument to `AC_SUBST', `configure' will replace any occurrence of `@foo@' in the template file with the value of the shell variable `foo' in the generated file. This macro also generates the config header, if `AC_CONFIG_HEADER' was called, and any links, if `AC_LINK_FILES' was called. The additional arguments can be used to further tailor the output processing. `AC_OUTPUT_COMMANDS(EXTRA-COMMANDS, [INIT-COMMANDS])' This macro works like the optional final arguments of `AC_OUTPUT', except that it can be called more than once from `configure.in'. (This makes it possible for macros to use this feature and yet remain modular.) See the reference manual for the precise definition of this macro. `AC_PROG_AWK' This macro searches for an `awk' program and sets the output variable `AWK' to be the best one it finds. `AC_PROG_CC' This checks for the C compiler to use and sets the shell variable `CC' to the value. 
If the GNU C compiler is being used, this sets the shell variable `GCC' to `yes'. This macro sets the shell variable `CFLAGS' if it has not already been set. It also calls `AC_SUBST' on `CC' and `CFLAGS'. `AC_PROG_CC_STDC' This macro attempts to discover a command line option needed to have the C compiler accept ANSI C. If such an option is found, it is added to `CC'. If it is not possible to get the C compiler to accept ANSI C, the shell variable `ac_cv_prog_cc_stdc' will be set to `no'. `AC_PROG_CPP' This macro sets the output variable `CPP' to a command that runs the C preprocessor. If `$CC -E' does not work, it will set the variable to `/lib/cpp'. `AC_PROG_CXX' This is like `AC_PROG_CC', but it checks for the C++ compiler, and sets the variables `CXX', `GXX' and `CXXFLAGS'. `AC_PROG_GCC_TRADITIONAL' This macro determines if GCC requires the `-traditional' option in order to compile code that uses `ioctl' and, if so, adds `-traditional' to the `CC' output variable. This condition is rarely encountered, and then mostly on old systems. `AC_PROG_INSTALL' This looks for an `install' program and sets the output variables `INSTALL', `INSTALL_DATA', `INSTALL_PROGRAM', and `INSTALL_SCRIPT'. This macro assumes that if an `install' program cannot be found on the system, your package will have `install-sh' available in the directory chosen by `AC_CONFIG_AUX_DIR'. `AC_PROG_LEX' This looks for a `lex'-like program and sets the `Makefile' variable `LEX' to the result. It also sets `LEXLIB' to whatever might be needed to link against `lex' output. `AC_PROG_LIBTOOL' This macro is the primary way to integrate Libtool support into `configure'. If you are using Libtool, you should call this macro in `configure.in'. Among other things, it adds support for the `--enable-shared' `configure' flag. `AC_PROG_LN_S' This sets the `Makefile' variable `LN_S' to `ln -s' if symbolic links work in the current working directory. Otherwise it sets `LN_S' to just `ln'.
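The logic behind `AC_PROG_LN_S' can be sketched in plain shell. This is an approximation for illustration, not the literal code Autoconf emits:

```shell
# Try to create a symbolic link in the current directory; if that
# fails, fall back to plain `ln'.  (Approximation of the test that
# AC_PROG_LN_S performs.)
rm -f conftestdata
if ln -s X conftestdata 2>/dev/null; then
  rm -f conftestdata
  LN_S='ln -s'
else
  LN_S='ln'
fi
echo "LN_S=$LN_S"
```

On any system with working symbolic links this prints `LN_S=ln -s'; on a filesystem without symlink support it falls back to hard links.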
`AC_PROG_MAKE_SET' Some versions of `make' need to have the `Makefile' variable `MAKE' set in `Makefile' in order for recursive builds to work. This macro checks whether this is needed, and, if so, it sets the `Makefile' variable `SET_MAKE' to the result. `AM_INIT_AUTOMAKE' calls this macro, so if you are using Automake, you don't need to call it or use `SET_MAKE' in `Makefile.am'. `AC_PROG_RANLIB' This searches for the `ranlib' program. It sets the `Makefile' variable `RANLIB' to the result. If `ranlib' is not found, or not needed on the system, then the result is `:'. `AC_PROG_YACC' This searches for the `yacc' program - it tries `bison', `byacc', and `yacc'. It sets the `Makefile' variable `YACC' to the result. `AC_REPLACE_FUNCS(FUNCTION-LIST)' This macro takes a single argument, which is a list of functions. For a given function `func', `configure' will do a link test to try to find it. If the function cannot be found, then `func.o' will be added to `LIBOBJS'. If the function can be found, then `configure' will define the C preprocessor symbol `HAVE_FUNC'. `AC_REQUIRE(MACRO-NAME)' This macro takes a single argument, which is the name of another macro. (Note that you must quote the argument correctly: `AC_REQUIRE([FOO])' is correct, while `AC_REQUIRE(FOO)' is not.) If the named macro has already been invoked, then `AC_REQUIRE' does nothing. Otherwise, it invokes the named macro with no arguments. `AC_REVISION(REVISION)' This macro takes a single argument, a version string. Autoconf will copy this string into the generated `configure' file. `AC_STRUCT_ST_BLKSIZE' Defines the C preprocessor macro `HAVE_ST_BLKSIZE' if `struct stat' has an `st_blksize' member. `AC_STRUCT_ST_BLOCKS' Defines the C preprocessor macro `HAVE_ST_BLOCKS' if `struct stat' has an `st_blocks' member. `AC_STRUCT_ST_RDEV' Defines the C preprocessor macro `HAVE_ST_RDEV' if `struct stat' has an `st_rdev' member.
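The quoting requirement for `AC_REQUIRE' is easiest to see in context. A hypothetical macro definition - `MY_CHECK_FEATURE' is an invented name used purely for illustration - might read:

```m4
dnl Sketch only: MY_CHECK_FEATURE is an invented example macro.
AC_DEFUN([MY_CHECK_FEATURE],
[AC_REQUIRE([AC_PROG_CC])dnl correct: the argument is quoted
AC_MSG_CHECKING([for the hypothetical feature])
AC_MSG_RESULT([yes])])
```

Because `AC_REQUIRE' expands its argument at most once, `AC_PROG_CC' will not be run a second time even if several macros require it.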
`AC_STRUCT_TM' This macro looks for `struct tm' in `time.h' and defines `TM_IN_SYS_TIME' if it is not found there. `AC_SUBST(NAME)' This macro takes a single argument, which is the name of a shell variable. When `configure' generates the files listed in `AC_OUTPUT' (e.g., `Makefile'), it will substitute the variable's value (at the end of the `configure' run - the value can be changed after `AC_SUBST' is called) anywhere a string of the form `@NAME@' is seen. `AC_TRY_COMPILE(INCLUDES, BODY, [IF-OK], [IF-NOT-OK])' This macro is used to try to compile a given function, whose body is given in BODY. INCLUDES lists any `#include' statements needed to compile the function. If the code compiles correctly, the shell commands in IF-OK are run; if not, IF-NOT-OK is run. Note that this macro will not try to link the test program - it will only try to compile it. `AC_TRY_LINK(INCLUDES, BODY, [IF-FOUND], [IF-NOT-FOUND])' This is used like `AC_TRY_COMPILE', but it tries to link the resulting program. The libraries and options in the `LIBS' shell variable are passed to the link. `AC_TRY_RUN(PROGRAM, [IF-TRUE], [IF-FALSE], [IF-CROSS-COMPILING])' This macro tries to compile and link the program whose text is in PROGRAM. If the program compiles, links, and runs successfully, the shell code IF-TRUE is run. Otherwise, the shell code IF-FALSE is run. If the current configure is a cross-configure, then the program is not run, and on a successful compile and link, the shell code IF-CROSS-COMPILING is run. `AC_TYPE_SIGNAL' This macro defines the C preprocessor macro `RETSIGTYPE' to be the correct return type of signal handlers. For instance, it might be `void' or `int'. `AC_TYPE_SIZE_T' This macro looks for the type `size_t'. If not defined on the system, it defines it (as a macro) to be `unsigned'.
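`AC_TRY_COMPILE' is commonly combined with `AC_CACHE_CHECK' and `AC_DEFINE' to record the result of a compile test. The fragment below is a sketch of this pattern (the cache variable name `my_cv_struct_st_blksize' is invented for illustration; the built-in `AC_STRUCT_ST_BLKSIZE' macro performs an equivalent test for you):

```m4
AC_CACHE_CHECK([for st_blksize in struct stat],
  my_cv_struct_st_blksize,
[AC_TRY_COMPILE([#include <sys/types.h>
#include <sys/stat.h>],
[struct stat s; s.st_blksize;],
my_cv_struct_st_blksize=yes, my_cv_struct_st_blksize=no)])
if test "$my_cv_struct_st_blksize" = yes; then
  AC_DEFINE(HAVE_ST_BLKSIZE, 1,
    [Define if struct stat has an st_blksize member.])
fi
```

Using a cache variable means the (slow) compile test runs only once; subsequent `configure' runs read the answer from `config.cache'.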
`AM_CONDITIONAL(NAME, TESTCODE)' This Automake macro takes two arguments: the name of a conditional and a shell statement that is used to determine whether the conditional should be true or false. If the shell code returns a successful (0) status, then the conditional will be true. Any conditional in your `configure.in' is automatically available for use in any `Makefile.am' in that project. `AM_CONFIG_HEADER(HEADER)' This is just like `AC_CONFIG_HEADER', but does some additional setup required by Automake. If you are using Automake, use this macro. Otherwise, use `AC_CONFIG_HEADER'. `AM_INIT_AUTOMAKE(PACKAGE, VERSION, [NODEFINE])' This macro is used to do all the standard initialization required by Automake. It has two required arguments: the package name and the version number. This macro sets and calls `AC_SUBST' on the shell variables `PACKAGE' and `VERSION'. By default it also defines these variables (via `AC_DEFINE_UNQUOTED'). However, this macro also accepts an optional third argument which, if not empty, means that the `AC_DEFINE_UNQUOTED' calls for `PACKAGE' and `VERSION' should be suppressed. `AM_MAINTAINER_MODE' This macro is used to enable a special Automake feature, maintainer mode, which we've documented elsewhere (*note Maintaining Input Files::). `AM_PROG_CC_STDC' This macro takes no arguments. It is used to try to get the C compiler to be ANSI compatible. It does this by adding different options known to work with various system compilers. This macro is most typically used in conjunction with Automake when you want to use the automatic de-ANSI-fication feature. `AM_PROG_LEX' This is like `AC_PROG_LEX', but it does some additional processing used by Automake-generated `Makefile's. If you are using Automake, then you should use this. Otherwise, you should use `AC_PROG_LEX' (and perhaps `AC_DECL_YYTEXT', which `AM_PROG_LEX' calls). `AM_WITH_DMALLOC' This macro adds support for the `--with-dmalloc' flag to `configure'. 
If the user chooses to enable `dmalloc' support, then this macro will define the preprocessor symbol `WITH_DMALLOC' and will add `-ldmalloc' to the `Makefile' variable `LIBS'.  File: autobook.info, Node: OPL, Next: Index, Prev: Generated File Dependencies, Up: Top Appendix E OPL ************** OPEN PUBLICATION LICENSE Draft v0.4, 8 June 1999 I. REQUIREMENTS ON BOTH UNMODIFIED AND MODIFIED VERSIONS The Open Publication works may be reproduced and distributed in whole or in part, in any medium physical or electronic, provided that the terms of this license are adhered to, and that this license or an incorporation of it by reference (with any options elected by the author(s) and/or publisher) is displayed in the reproduction. Proper form for an incorporation by reference is as follows: Copyright (c) by . This material may be distributed only subject to the terms and conditions set forth in the Open Publication License, vX.Y or later (the latest version is presently available at ). The reference must be immediately followed with any options elected by the author(s) and/or publisher of the document (see section VI). Commercial redistribution of Open Publication-licensed material is permitted. Any publication in standard (paper) book form shall require the citation of the original publisher and author. The publisher and author's names shall appear on all outer surfaces of the book. On all outer surfaces of the book the original publisher's name shall be as large as the title of the work and cited as possessive with respect to the title. II. COPYRIGHT The copyright to each Open Publication is owned by its author(s) or designee. III. SCOPE OF LICENSE The following license terms apply to all Open Publication works, unless otherwise explicitly stated in the document. Mere aggregation of Open Publication works or a portion of an Open Publication work with other works or programs on the same media shall not cause this license to apply to those other works. 
The aggregate work shall contain a notice specifying the inclusion of the Open Publication material and appropriate copyright notice. SEVERABILITY. If any part of this license is found to be unenforceable in any jurisdiction, the remaining portions of the license remain in force. NO WARRANTY. Open Publication works are licensed and provided "as is" without warranty of any kind, express or implied, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose or a warranty of non-infringement. IV. REQUIREMENTS ON MODIFIED WORKS All modified versions of documents covered by this license, including translations, anthologies, compilations and partial documents, must meet the following requirements: 1) The modified version must be labeled as such. 2) The person making the modifications must be identified and the modifications dated. 3) Acknowledgement of the original author and publisher if applicable must be retained according to normal academic citation practices. 4) The location of the original unmodified document must be identified. 5) The original author's (or authors') name(s) may not be used to assert or imply endorsement of the resulting document without the original author's (or authors') permission. V. GOOD-PRACTICE RECOMMENDATIONS In addition to the requirements of this license, it is requested from and strongly recommended of redistributors that: 1) If you are distributing Open Publication works on hardcopy or CD-ROM, you provide email notification to the authors of your intent to redistribute at least thirty days before your manuscript or media freeze, to give the authors time to provide updated documents. This notification should describe modifications, if any, made to the document. 2) All substantive modifications (including deletions) be either clearly marked up in the document or else described in an attachment to the document. 
Finally, while it is not mandatory under this license, it is considered good form to offer a free copy of any hardcopy and CD-ROM expression of an Open Publication-licensed work to its author(s). VI. LICENSE OPTIONS The author(s) and/or publisher of an Open Publication-licensed document may elect certain options by appending language to the reference to or copy of the license. These options are considered part of the license instance and must be included with the license (or its incorporation by reference) in derived works. A. To prohibit distribution of substantively modified versions without the explicit permission of the author(s). "Substantive modification" is defined as a change to the semantic content of the document, and excludes mere changes in format or typographical corrections. To accomplish this, add the phrase `Distribution of substantively modified versions of this document is prohibited without the explicit permission of the copyright holder.' to the license reference or copy. B. To prohibit any publication of this work or derivative works in whole or in part in standard (paper) book form for commercial purposes is prohibited unless prior permission is obtained from the copyright holder. To accomplish this, add the phrase `Distribution of the work or derivative of the work in any standard (paper) book form is prohibited unless prior permission is obtained from the copyright holder.' to the license reference or copy. OPEN PUBLICATION POLICY APPENDIX: (This is not considered part of the license.) Open Publication works are available in source format via the Open Publication home page at . Open Publication authors who want to include their own license on Open Publication works may do so, as long as their terms are not more restrictive than the Open Publication license. If you have questions about the Open Publication License, please contact TBD, and/or the Open Publication Authors' List at , via email.  
File: autobook.info, Node: Index, Prev: OPL, Up: Top Index ***** [index] * Menu: * #!env: Magic Numbers. (line 51) * --build option: Build and Host Options. (line 6) * --host option: Build and Host Options. (line 6) * --target option: Specifying the Target. (line 11) * -all-static, libtool option: Linking an Executable. (line 87) * -DPIC: Position Independent Code. (line 24) * -static, libtool option: Linking an Executable. (line 84) * 8.3 filenames: DOS Filename Restrictions. (line 6) * 8.3 filenames in GNU Autotools: 8.3 Filenames. (line 6) * AC_CANONICAL_SYSTEM: Using the Target Type. (line 12) * AC_DISABLE_FAST_INSTALL: Extra Macros for Libtool. (line 14) * AC_DISABLE_SHARED: Extra Macros for Libtool. (line 21) * AC_DISABLE_STATIC: Extra Macros for Libtool. (line 21) * AC_LIBTOOL_WIN32_DLL: A configure.in for DLLs. (line 60) * autogen.sh: Bootstrapping. (line 38) * back-linking: Introducing libltdl. (line 57) * binary files: Unix/Windows Text/Binary. (line 6) * binary mode fopen: Text vs Binary Modes. (line 82) * binary mode open: Text vs Binary Modes. (line 82) * bootstrap script: Bootstrapping. (line 54) * build option: Build and Host Options. (line 6) * C language portability: Writing Portable C. (line 6) * canadian cross in configure: Supporting Cross Compiler in Configure. (line 6) * canadian cross in make: Supporting Cross Compiler in Make. (line 6) * canadian cross, configuring: Build and Host Options. (line 6) * case-folding filesystems: Windows File Name Case. (line 6) * configuration name: Configuration Names. (line 6) * configure build system: Build and Host Options. (line 6) * configure cross compiler support: Supporting Cross Compiler in Configure. (line 6) * configure host: Build and Host Options. (line 6) * configure target: Specifying the Target. (line 11) * configuring a canadian cross: Build and Host Options. (line 6) * cross compilation: Cross Compilation. (line 6) * cross compiler support in configure: Supporting Cross Compiler in Configure. 
(line 6) * cross compiler support in make: Supporting Cross Compiler in Make. (line 6) * CRTDLL.DLL: Integration with Cygnus Cygwin. (line 6) * Cygwin autotools compilation: Installing GNU Autotools on Cygwin. (line 6) * CYGWIN binmode setting: Text vs Binary Modes. (line 73) * Cygwin Bourne shell: Preliminaries. (line 18) * Cygwin full.exe: Preliminaries. (line 6) * Cygwin gcc: Preliminaries. (line 40) * Cygwin M4: Preliminaries. (line 25) * Cygwin Make: Preliminaries. (line 28) * Cygwin mount: Text vs Binary Modes. (line 16) * Cygwin package portability: Writing A Cygwin Friendly Package. (line 6) * Cygwin Perl: Preliminaries. (line 45) * Cygwin sh.exe: Preliminaries. (line 18) * Cygwin static packages: Writing A Cygwin Friendly Package. (line 17) * Cygwin usertools.exe: Preliminaries. (line 6) * directory separator character: Separators and Drive Letters. (line 6) * DJGPP: Integration with Cygnus Cygwin. (line 6) * file name case in Windows: Windows File Name Case. (line 6) * host option: Build and Host Options. (line 6) * host system: Host and Target. (line 6) * HOST_CC: Supporting Cross Compiler in Make. (line 31) * library terminology: Introducing GNU Libtool. (line 28) * Libtool library: Introducing GNU Libtool. (line 18) * Libtool object <1>: Creating Shared Libraries with libtool. (line 42) * Libtool object: The Libtool Library. (line 6) * LIBTOOL_DEPS: Extra Macros for Libtool. (line 73) * loaders: libltdl Loader Mechanism. (line 6) * LTALLOCA: Extra Macros for Libtool. (line 61) * LTLIBOBJS: Extra Macros for Libtool. (line 38) * make cross compiler support: Supporting Cross Compiler in Make. (line 6) * partial linking: Creating Convenience Libraries with libtool. (line 6) * path element separator character: Separators and Drive Letters. (line 6) * path separator, mixed mode: Separators and Drive Letters. (line 56) * PE-COFF binary format: Integration with Cygnus Cygwin. (line 6) * PIC: Position Independent Code. 
(line 12) * pseudo-library: Introducing GNU Libtool. (line 28) * shared library: Creating Shared Libraries with libtool. (line 6) * target option: Specifying the Target. (line 11) * target system: Host and Target. (line 6) * text files: Unix/Windows Text/Binary. (line 6) * text mode fopen: Text vs Binary Modes. (line 82) * text mode open: Text vs Binary Modes. (line 82) * version.texi: Including Texinfo Documentation. (line 60) * Windows CR-LF <1>: Text vs Binary Modes. (line 6) * Windows CR-LF: Unix/Windows Text/Binary. (line 6) * Windows text line terminator <1>: Text vs Binary Modes. (line 6) * Windows text line terminator: Unix/Windows Text/Binary. (line 6) * Windows, Autoconf: Installing GNU Autotools on Cygwin. (line 13) * Windows, Automake: Installing GNU Autotools on Cygwin. (line 22) * Windows, Cygwin: Integration with Cygnus Cygwin. (line 6) * Windows, Libtool philosophy: Installing GNU Autotools on Cygwin. (line 30) * Windows, mingw: Integration with Cygnus Cygwin. (line 6) * wrapper scripts: Executing Uninstalled Binaries. (line 6)