This is autobook.info, produced by makeinfo version 4.7 from
autobook.texi.

INFO-DIR-SECTION GNU programming tools
START-INFO-DIR-ENTRY
* Autoconf, Automake, Libtool: (autobook).     Using the GNU autotools.
END-INFO-DIR-ENTRY

   This file documents GNU Autoconf, Automake and Libtool.

   Copyright (C) 1999, 2000 Gary V. Vaughan, Ben Elliston, Tom Tromey,
Ian Lance Taylor

   Permission is granted to make and distribute verbatim copies of this
manual provided the copyright notice and this permission notice are
preserved on all copies.

   Permission is granted to copy and distribute modified versions of
this manual under the conditions for verbatim copying, provided that
the entire resulting derived work is distributed under the terms of a
permission notice identical to this one.

   Permission is granted to copy and distribute translations of this
manual into another language, under the above conditions for modified
versions, except that this permission notice may be stated in a
translation approved by the Foundation.


File: autobook.info, Node: Top, Next: Introduction, Prev: (dir), Up: (dir)

The GNU Project Build Tools
***************************

Preamble
========

This book is published on paper by New Riders (ISBN 1-57870-190-2);
their webpage for the book is at `http://www.newriders.com/autoconf/'.

Technical reviewers of the paper book were Akim Demaille, Phil Edwards,
Bruce Korb, Alexandre Oliva, Didier Verna, Benjamin Koznik and Jason
Molenda.

The paper book includes acknowledgements to: Richard Stallman and the
FSF, Gord Matzigkeit, David Mackenzie, Akim Demaille, Phil Edwards,
Bruce Korb, Alexandre Oliva, Didier Verna, Benjamin Koznik, Jason
Molenda, the Gnits group, and the New Riders staff.

From the paper book, quoted from "About the Authors":

"Gary V. Vaughan spent three years as a Computer Systems Engineering
undergraduate at the University of Warwick and then two years at
Coventry University studying Computer Science.  He has been employed
as a professional C programmer in several industry sectors for the
past seven years, most recently as a scientist for the Defence
Evaluation and Research Agency.  Over the past 10 years or so, Gary
has contributed to several free software projects, including AutoGen,
Cygwin, Enlightenment, and GNU M4.  He currently helps maintain GNU
Libtool for the Free Software Foundation.

Ben Elliston works at Red Hat Inc., telecommuting from his home in
Canberra, Australia.  He develops instruction set simulators and
adapts the GNU development tools to new microprocessors.  He has
worked with GNU tools for many years and is a past maintainer of GNU
Autoconf.  He has a bachelor's degree in Computer Engineering from the
University of Canberra.

Tom Tromey works at Red Hat Inc., where he works on GCJ, the Java
front end to the GNU Compiler Collection.  Patches of his appear in
GCC, Emacs, Gnome, Autoconf, GDB, and probably other packages he has
forgotten about.  He is the primary author of GNU Automake.

Ian Lance Taylor is co-founder and CTO of Zembu Labs.  Previously, he
worked at Cygnus Solutions, where he designed and wrote features and
ports for many free software programs, including the GNU Compiler
Collection and the GNU binutils.  He was the maintainer of the GNU
binutils for several years.  He is the author of GNU/Taylor UUCP.  He
was one of the first contributors to Autoconf, and has also made
significant contributions to Automake and Libtool."

Until 2001-09-05, this package was maintained by Gary V. Vaughan, Tom
Tromey, Ian Lance Taylor and Ben Elliston.
It was (and is) available via anonymous CVS:

     cvs -d :pserver:anoncvs@sources.redhat.com:/cvs/autobook login
     cvs -d :pserver:anoncvs@sources.redhat.com:/cvs/autobook co autobook

(The password for the `login' command is "anoncvs".)  The sources
there would build autobook-0.5a.tar.gz.  Furthermore,
`http://sources.redhat.com/autobook/autobook-1.4.tar.gz', a tarball
containing HTML pages, is linked to from
`http://sources.redhat.com/autobook/'.

I've made some changes to the sources I've found in the redhat.com CVS
(see ChangeLog), and maintain the sources via CVS on
topaz.conuropsis.org.  I maintain a webpage at
`http://mdcc.cx/autobook/'; the package can be downloaded from there.
It also has instructions on how to get informed about changes to the
autobook package.

This package is not blessed by the maintainers of the official
autobook-1.4.tar.gz.  Therefore, I take responsibility for all errors
you might find in this package.  Of course, all credit should go to
Vaughan, Tromey, Taylor and Elliston for their excellent work.

Beware!  The Autobook is getting somewhat obsolete, I'm afraid: the
text has not been updated (apart from minor corrections) since
2001-09.  The Autobook most likely describes autoconf version 2.13,
automake version 1.4 and libtool version 1.3.5 (see Appendix A.2:
Downloading GNU Autotools).  As of January 2005, the released versions
are autoconf 2.59a, automake 1.9.4 and libtool 1.5.6.  Therefore,
regard the texinfo documentation shipped with these tools as the
authoritative source of information.  See the unofficial Autobook
webpage at `http://mdcc.cx/autobook/' for pointers to other sources of
information on the GNU Autotools.

Joost van Baal
January 2005

Magic Happens Here
==================

Do you remember the 1980s?  Veteran users of free software on Unix
could testify that though there were a lot of programs distributed as
source code back then (over USENET), there was not a lot of
consistency in how to compile and install them.  The more complicated
a package was, the more likely it was to have its own unique build
procedure that had to be learned first.  And there were no widely used
approaches to portability problems.  Each software author handled them
in a different way, if they did at all.

Fast forward to the present.  A de facto standard is in widespread use
for solving those problems, and it's not just free software packages
that are using it; some proprietary programs from the largest computer
companies are built using this software.  It even does Windows.  As it
evolved in the 1990s it demonstrated the power of some good ideas:
sharing expertise, automating repetitive work, and having consistency
where it is helpful without sacrificing flexibility where it is
helpful.

What is "it"?  The GNU Autotools, a group of utilities developed in
the 1990s for the GNU Project.  The authors of this book and I were
some of its principal developers, but it turned out to help solve many
other people's problems as well, and many other people contributed to
it.  It is one of the many projects that developed through cooperation
while making what is now often called GNU/Linux.  The community made
the GNU Autotools widespread, as people adopted it for their own
programs and extended it where they found that was needed.  The
creation of Libtool is that type of contribution.

Autoconf, Automake, and Libtool were developed separately, to make
tackling the problem of software configuration more manageable by
partitioning it.  But they were designed to be used as a system, and
they make more sense when you have documentation for the whole system.

This book stands a level above the software packages, giving the
expertise of its authors in using this whole system to its fullest.
It was written by people who have lived closest to the problems and
their solutions in software.

Magic happens under the hood, where experts have tinkered until the
GNU Autotools engine can run on everything from jet fuel to whale oil.
But there is a different kind of magic, in the cooperation and sharing
that built a widely used system over the Internet, for anyone to use
and improve.  Now, as the authors share their knowledge and
experience, you are part of the community, too.  Perhaps its spirit
will inspire you to make your own contributions.

David MacKenzie
Germantown, Maryland
June 2000

* Menu:

* Introduction::
* History::
* Invoking configure::
* Introducing Makefiles::
* A Minimal GNU Autotools Project::
* Writing configure.in::
* Introducing GNU Automake::
* Bootstrapping::
* A Small GNU Autotools Project::
* Introducing GNU Libtool::
* Using GNU Libtool::
* A Large GNU Autotools Project::
* Rolling Distribution Tarballs::
* Installing and Uninstalling::
* Writing Portable C::
* Writing Portable C++::
* Dynamic Loading::
* Using GNU libltdl::
* Advanced GNU Automake Usage::
* A Complex GNU Autotools Project::
* M4::
* Writing Portable Bourne Shell::
* Writing New Macros for Autoconf::
* Migrating Existing Packages::
* Integration with Cygnus Cygwin::
* Cross Compilation::
* Installing GNU Autotools::
* Autoconf Macro Reference::
* PLATFORMS::
* Generated File Dependencies::
* OPL::
* Index::


File: autobook.info, Node: Introduction, Next: History, Prev: Top, Up: Top

1 Introduction
**************

Autoconf, Automake and Libtool are packages for making your software
more portable and for simplifying the process of building it--usually
on someone else's system.  Software portability and effective build
systems are crucial aspects of modern software engineering practice.
It is unlikely that a software project would be started today with the
expectation that the software would run on only one platform.
Hardware constraints may change the choice of platform, new customers
with different kinds of systems may emerge, or your vendor might
introduce incompatible changes in newer versions of their operating
system.  In addition, tools that make building software easier and
less error prone are valuable.

Autoconf is a tool that makes your packages more portable by
performing tests to discover system characteristics before the package
is compiled.  Your source code can then adapt to these differences.

Automake is a tool for generating `Makefile's--descriptions of what to
build--that conform to a number of standards.  Automake substantially
simplifies the process of describing the organization of a package and
performs additional functions such as dependency tracking between
source files.

Libtool is a command line interface to the compiler and linker that
makes it easy to portably generate static and shared libraries,
regardless of the platform it is running on.

* Menu:

* What this book is::
* What the book is not::
* Who should read this book::
* How this book is organized::


File: autobook.info, Node: What this book is, Next: What the book is not, Up: Introduction

1.1 What this book is
=====================

This book is a tutorial for Autoconf, Automake and Libtool, hereafter
referred to as the GNU Autotools.  The GNU manuals that accompany the
tools adequately document each tool in isolation.  Until now, there
has not been a guide that has described how these tools work
_together_.

As these tools have evolved over the years, design decisions have been
made by contributors who clearly understand the associated problems,
but little documentation exists that captures why things are the way
they are.  By way of example, one might wonder why some Autoconf
macros use shell constructs like:

     if test "x$var" = xbar; then
       echo yes 1>&5
     fi

instead of the simpler:

     if [ $var = bar ]; then
       echo yes 1>&5
     fi

(Briefly: the `x' prefix keeps the comparison well-formed even when
`$var' is empty or begins with a character that `test' would misparse
as an operator.)  Much of this reasoning is recorded in this book.


File: autobook.info, Node: What the book is not, Next: Who should read this book, Prev: What this book is, Up: Introduction

1.2 What the book is not
========================

This book is not a definitive reference to Autoconf, Automake or
Libtool.  Attempting to do so would fill this book with information
that is doomed to obsolescence.  For instance, you will not find a
description of every predefined macro provided by Autoconf.  Instead,
the book will attempt to help you understand any macro you encounter
and to influence how you approach software portability and package
building.  The GNU manual for each tool should be consulted as a
reference.

This book briefly introduces pertinent concepts, but does not attempt
to teach them comprehensively.  You will find an introduction to
writing `Makefile's and Bourne shell scripts, but you should consult
other references to become familiar with these broader topics.


File: autobook.info, Node: Who should read this book, Next: How this book is organized, Prev: What the book is not, Up: Introduction

1.3 Who should read this book
=============================

Revealing the mystery around the GNU Autotools is likely to raise the
interest of a wide audience of software developers, system
administrators and technical managers.

Software developers, especially those involved with free software
projects, will find it valuable to understand how to use these tools.
The GNU Autotools are enjoying growing popularity in the free software
community.  Developers of in-house projects can reap the same benefits
by using these tools.

System administrators can benefit from a working knowledge of these
tools--a common task for system administrators is to compile and
install packages which commonly use the GNU Autotools framework.
Occasionally, a feature test may produce a false result, leading to a
compilation error or a misbehaving program.  Some hacking is usually
sufficient to get the package to compile, but knowing the correct way
to fix the problem can assist the package maintainer.

Finally, technical managers may find the discussion to be an insight
into the complex nature of software portability and the process of
building a large project.


File: autobook.info, Node: How this book is organized, Prev: Who should read this book, Up: Introduction

1.4 How this book is organized
==============================

Like any good tutorial, this book starts with an explanation of simple
concepts and builds on these fundamentals to progress to advanced
topics.

Part I of the book provides a history of the development of these
tools and why they exist.

Part II contains most of the book's content, starting with an
introduction to concepts such as `Makefile's and configuration
triplets.  Later chapters introduce each tool and how to manage
projects of varying sizes using the tools in concert.  Programs
written in C and C++ can be non-portable if written carelessly.
Chapters 14 and 15 offer guidelines for writing portable programs in C
and C++, respectively.

Part III provides information that you are unlikely to find in any
other documentation, based on extensive experience with the tools.  It
comprises chapters that treat some advanced, yet essential, concepts
such as the `m4' macro processor and how to write portable Bourne
shell scripts.  Chapter 23 outlines how to migrate an existing package
to the GNU Autotools framework and will be of interest to many
developers.  One of the most mystifying aspects of using the GNU
Autotools is building packages in a cross-compilation environment.
This is demystified in Chapter 25.


File: autobook.info, Node: History, Next: Invoking configure, Prev: Introduction, Up: Top

2 History
*********

In this chapter we provide a brief history of the tools described in
this book.  You don't need to know this history in order to use the
tools.  However, the history of how the tools developed over time
helps explain why the tools act the way that they do today.  Also, in
a book like this, it's only fair for us to credit the original authors
and sources of inspiration, and to explain what they did.

* Menu:

* Unix Diversity::
* First Configure Programs::
* Configure Development::
* Automake Development::
* Libtool Development::
* Microsoft Windows Development::


File: autobook.info, Node: Unix Diversity, Next: First Configure Programs, Up: History

2.1 The Diversity of Unix Systems
=================================

Of the programs discussed in this book, the first to be developed was
Autoconf.  Its development was determined by the history of the Unix
operating system.

The first version of Unix was written by Dennis Ritchie and Ken
Thompson at Bell Labs in 1969.  During the 1970s, Bell Labs was not
permitted to sell Unix commercially, but did distribute Unix to
universities at relatively low cost.  The University of California at
Berkeley added their own improvements to the Unix sources; the result
was known as the BSD version of Unix.  In the early 1980s, AT&T signed
an agreement permitting them to sell Unix commercially.  The first
AT&T version of Unix was known as System III.

As the popularity of Unix increased during the 1980s, several other
companies modified the Unix sources to create their own variants.
Examples include SunOS from Sun Microsystems, Ultrix from Digital
Equipment Corporation, and HP-UX from Hewlett Packard.

Although all of the Unix variants were fundamentally similar, there
were various differences between them.  They had slightly different
sets of header files and slightly different lists of functions in the
system libraries, as well as more significant differences in areas
such as terminal handling and job control.

The emerging POSIX standards helped to eliminate some of these
differences.  However, in some areas POSIX introduced new features,
leading to more variants.  Also, different systems adopted the POSIX
standard at different times, leading to further disparities.

All of these variations caused problems for programs distributed as
source code.  Even a function as straightforward as `memcpy' was not
available everywhere; the BSD system library provided the similar
function `bcopy' instead, but the order of arguments was reversed.
Program authors who wanted their programs to run on a wide variety of
Unix variants had to be familiar with the detailed differences between
the variants.
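
To make the `memcpy'/`bcopy' example concrete, here is a hedged sketch
of the kind of hand-rolled shim a program author of that era might
have carried around.  The `HAVE_MEMCPY' guard is a hypothetical
stand-in for whatever system-detection convention the author used; it
is not code taken from any particular package:

     /* Hypothetical portability shim: on systems whose C library
        lacks memcpy but provides the BSD bcopy, emulate memcpy.
        Note that bcopy takes (source, dest, length) while memcpy
        takes (dest, source, length).  */
     #ifndef HAVE_MEMCPY
     #  define memcpy(dest, src, n)  bcopy((src), (dest), (n))
     #endif

Every author who cared about portability maintained some variation of
this sort of glue by hand, for each function that differed from system
to system.
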
Program authors also had to worry about the ways in which the variants
changed from one version to another, as variants on the one hand
converged on the POSIX standard and on the other continued to
introduce new and different features.  While it was generally possible
to use `#ifdef' to identify particular systems and versions, it became
increasingly difficult to know which versions had which features.  It
became clear that some more organized approach was needed to handle
the differences between Unix variants.


File: autobook.info, Node: First Configure Programs, Next: Configure Development, Prev: Unix Diversity, Up: History

2.2 The First Configure Programs
================================

By 1992, four different systems had been developed to help with source
code portability:

   * The Metaconfig program, by Larry Wall, Harlan Stenn, and Raphael
     Manfredi.

   * The Cygnus `configure' script, by K. Richard Pixley, and the
     original GCC `configure' script, by Richard Stallman.  These are
     quite similar, and the developers communicated regularly.  GCC is
     the GNU Compiler Collection, formerly the GNU C compiler.

   * The GNU Autoconf package, by David MacKenzie.

   * Imake, part of the X Window System.

These systems all split building a program into two steps: a
configuration step, and a build step.  For all the systems, the build
step used the standard Unix `make' program.  The `make' program reads
a set of rules in a `Makefile', and uses them to build a program.  The
configuration step would generate `Makefile's, and perhaps other
files, which would then be used during the build step.

Metaconfig and Autoconf both use feature tests to determine the
capabilities of the system.  They use Bourne shell scripts (all
variants of Unix support the Bourne shell in one form or another) to
run various tests to see what the system can support.

The Cygnus `configure' script and the original GCC `configure' script
are also Bourne shell scripts.  They rely on small configuration files
for each system variant, both header files and `Makefile' fragments.
In early versions, the user compiling the program had to tell the
script which type of system the program should be built for; they were
later enhanced with a shell script written by Per Bothner which
determines the system type based on the standard Unix `uname' program
and other information.

Imake is a portable C program.  Imake can be customized for a
particular system, and run as part of building a package.  However, it
is more normally distributed with a package, including all the
configuration information needed for supported systems.

Metaconfig and Autoconf are programs used by program authors.  They
produce a shell script which is distributed with the program's source
code.  A user who wants to build the program runs the shell script in
order to configure the source code for the particular system on which
it is to be built.  The Cygnus and GCC `configure' scripts, and
`imake', do not have this clear distinction between use by the
developer and use by the user.

The Cygnus and GCC `configure' scripts included features to support
cross development, both to support building a cross-compiler which
compiles code to be run on another system, and to support building a
program using a cross-compiler.  Autoconf, Metaconfig and Imake did
not have these features (they were later added to Autoconf); they only
worked for building a program on the system on which it was to run.

The scripts generated by Metaconfig are interactive by default: they
ask questions of the user as they go along.
This permits them to determine certain characteristics of the system
which are difficult or impossible to test for automatically, such as
the behavior of setuid programs.  The Cygnus and GCC `configure'
scripts, and the scripts generated by `autoconf', and the `imake'
program, are not interactive: they determine everything themselves.
When using Autoconf, the package developer normally writes the script
to accept command line options for features which cannot be tested
for, or sometimes requires the user to edit a header file after the
`configure' script has run.


File: autobook.info, Node: Configure Development, Next: Automake Development, Prev: First Configure Programs, Up: History

2.3 Configure Development
=========================

The Cygnus `configure' script and the original GCC `configure' script
both had to be updated for each new Unix variant they supported.  This
meant that packages which used them were continually out of date as
new Unix variants appeared.  It was not hard for the developer to add
support for a new system variant; however, it was not something which
package users could easily do themselves.

The same was true of Imake as it was commonly used.  While it was
possible for a user to build and configure Imake for a particular
system, it was not commonly done.  In practice, packages which use
Imake, such as the X Window System, are shipped with detailed
configuration information for specific Unix variants.

Because Metaconfig and Autoconf used feature tests, the scripts they
generated were often able to work correctly on new Unix variants
without modification.  This made them more flexible and easier to work
with over time, and led to the wide adoption of Autoconf.

In 1994, David MacKenzie extended Autoconf to incorporate the features
of the Cygnus `configure' script and the original GCC `configure'
script.  This included support for using system-specific header file
and `Makefile' fragments, and support for cross-compilation.  GCC has
since been converted to use Autoconf, eliminating the GCC `configure'
script.  Most programs which use the Cygnus `configure' script have
also been converted, and no new programs are being written to use the
Cygnus `configure' script.

The `metaconfig' program is still used today to configure Perl and a
few other programs.  `imake' is still used to configure the X Window
System.  However, these tools are not generally used for new packages.


File: autobook.info, Node: Automake Development, Next: Libtool Development, Prev: Configure Development, Up: History

2.4 Automake Development
========================

By 1994, Autoconf was a solid framework for handling the differences
between Unix variants.  However, program developers still had to write
large `Makefile.in' files in order to use it.  The `configure' script
generated by `autoconf' would transform the `Makefile.in' file into a
`Makefile' used by the `make' program.

A `Makefile.in' file has to describe how to build the program.  In the
Imake equivalent of a `Makefile.in', known as an `Imakefile', it is
only necessary to describe which source files are used to build the
program.  When Imake generates a `Makefile', it adds the rules for how
to build the program itself.  Later versions of the BSD `make' program
also include rules for building a program.

Since most programs are built in much the same way, there was a great
deal of duplication in `Makefile.in' files.  Also, the GNU project
developed a reasonably complex set of standards for `Makefile's, and
it was easy to get some of the details wrong.

These factors led to the development of Automake.  `automake', like
`autoconf', is a program run by a developer.  The developer writes
files named `Makefile.am'; these use a simpler syntax than ordinary
`Makefile's.  `automake' reads the `Makefile.am' files and produces
`Makefile.in' files.  The idea is that a script generated by
`autoconf' converts these `Makefile.in' files into `Makefile's.

As with Imake and BSD `make', the `Makefile.am' file need only
describe the files used to build a program.  `automake' automatically
adds the necessary rules when it generates the `Makefile.in' file.
`automake' also adds any rules required by the GNU `Makefile'
standards.

The first version of Automake was written by David MacKenzie in 1994.
It was completely rewritten in 1995 by Tom Tromey.


File: autobook.info, Node: Libtool Development, Next: Microsoft Windows Development, Prev: Automake Development, Up: History

2.5 Libtool Development
=======================

Over time, Unix systems added support for shared libraries.
Conventional libraries, or static libraries, are linked into a program
image.  This means that each program which uses a static library
includes some or all of the library in the program binary on disk.

Shared libraries, on the other hand, are a separate file.  A program
which uses a shared library does not include a copy of the library; it
only includes the name of the library.  Many programs can use a single
shared library.

Using a shared library reduces disk space requirements.  Since the
system can generally share a single executable instance of the shared
library among many programs, it also reduces swap space requirements
at run time.  Another advantage is that it is possible to fix a bug by
updating the single shared library file on disk, without requiring all
the programs which use the library to be rebuilt.

The first Unix shared library implementation was in System V release 3
from AT&T.  The idea was rapidly adopted by other Unix vendors,
appearing in SunOS, HP-UX, AIX, and Digital Unix among others.
Unfortunately, each implementation differed in the creation and use of
shared libraries and in the specific features which were supported.

Naturally, packages distributed as source code which included
libraries wanted to be able to build their own shared libraries.
Several different implementations were written in the
Autoconf/Automake framework.

In 1996, Gordon Matzigkeit began work on a package known as Libtool.
Libtool is a collection of shell scripts which handle the differences
between shared library generation and use on different systems.  It is
closely tied to Automake, although it is possible to use it
independently.  Over time, Libtool has been enhanced to support more
Unix variants and to provide an interface for standardizing shared
library features.


File: autobook.info, Node: Microsoft Windows Development, Prev: Libtool Development, Up: History

2.6 Microsoft Windows
=====================

In 1995, Microsoft released Windows 95, which soon became the most
widely-used operating system in the world.  Autoconf and Libtool were
written to support portability across Unix variants, but they provided
a framework to support portability to Windows as well.  This made it
possible for a program to support both Unix and Windows from a single
source code base.

The key requirement of both Autoconf and Libtool was the Unix shell.
The GNU bash shell was ported to Windows as part of the Cygwin
project, which was originally written by Steve Chamberlain.
The Cygwin project implements the basic Unix API in Windows, making it
possible to port Unix programs directly.  Once the shell and the Unix
`make' program (also provided by Cygwin) were available, it was
possible to make Autoconf and Libtool support Windows directly, using
either the Cygwin interface or the Visual C++ tools from Microsoft.
This involved handling details like the different file extensions used
by the different systems, as well as yet another set of shared library
features.  The first version of this work was done by Ian Lance Taylor
in 1998.  Automake has also been ported to Windows.  It requires Perl
to be installed (*note Prerequisite tools::).


File: autobook.info, Node: Invoking configure, Next: Introducing Makefiles, Prev: History, Up: Top

3 How to run configure and make
*******************************

* Menu:

* Configuring::
* Files generated by configure::
* The most useful Makefile targets::
* Configuration Names::

A package constructed using Autoconf will come with a `configure'
script.  A user who wants to build and install the package must run
this script to prepare the source tree for building on their
particular system.  The actual build process is performed using the
`make' program.

The `configure' script tests system features.  For example, it might
test whether the C library defines the `time_t' data type for use by
the `time()' C library function.  The `configure' script then makes
the results of those tests available to the program while it is being
built.

This chapter explains how to invoke a `configure' script from the
perspective of a user--someone who just wants to take your package and
compile it on their system with a minimum of fuss.  It is because
Autoconf works as well as it does that it is usually possible to build
a package on any kind of machine with a simple `configure; make'
command line.  The topics covered in this chapter include how to
invoke `configure', the files that `configure' generates and the most
useful `Makefile' targets--actions that you want `make' to
perform--that will be available when compiling the package (*note
Introducing Makefiles::).


File: autobook.info, Node: Configuring, Next: Files generated by configure, Up: Invoking configure

3.1 Configuring
===============

A `configure' script takes a large number of command line options.
The set of options can vary from one package to the next, although a
number of basic options are always present.  The available options can
be discovered by running `configure' with the `--help' option.
Although many of these options are esoteric, it's worthwhile knowing
of their existence when configuring packages with special installation
requirements.  Each option is briefly described below:

`--cache-file=FILE'
     `configure' runs tests on your system to determine the
     availability of features (or bugs!).  The results of these tests
     can be stored in a _cache file_ to speed up subsequent
     invocations of `configure'.  The presence of a well-primed cache
     file makes a big improvement when configuring a complex tree
     which has `configure' scripts in each subtree.

`--help'
     Outputs a help message.  Even experienced users of `configure'
     need to use `--help' occasionally, as complex projects will
     include additional options for per-project configuration.  For
     example, `configure' in the GCC package allows you to control
     whether the GNU assembler will be built and used by GCC in
     preference to a vendor's assembler.

`--no-create'
     One of the primary functions of `configure' is to generate output
     files.
     This option prevents `configure' from generating such output
     files.  You can think of this as a kind of _dry run_, although
     the cache will still be modified.

`--quiet'
`--silent'
     As `configure' runs its tests, it outputs brief messages telling
     the user what the script is doing.  This was done because
     `configure' can be slow.  If there was no such output, the user
     would be left wondering what is happening.  By using this option,
     you too can be left wondering!

`--version'
     Prints the version of Autoconf that was used to generate the
     `configure' script.

`--prefix=PREFIX'
     The `--prefix' option is one of the most frequently used.  If
     generated `Makefile's choose to observe the argument you pass
     with this option, it is possible to entirely relocate the
     architecture-independent portion of a package when it is
     installed.  For example, when installing a package like Emacs,
     the following command line will cause the Emacs Lisp files to be
     installed in `/opt/gnu/share':

          $ ./configure --prefix=/opt/gnu

     It is important to stress that this behavior is dependent on the
     generated files making use of this information.  For developers
     writing these files, Automake simplifies this process a great
     deal.  Automake is introduced in *Note Introducing GNU Automake::.

`--exec-prefix=EPREFIX'
     Similar to `--prefix', except that it sets the location of
     installed files which are architecture-dependent.  The compiled
     `emacs' binary is such a file.  If this option is not given, the
     default `exec-prefix' value inserted into generated files is set
     to the same value as the `prefix'.

`--bindir=DIR'
     Specifies the location of installed binary files.  While there
     may be other generated files which are binary in nature, binary
     files here are defined to be programs that are run directly by
     users.

`--sbindir=DIR'
     Specifies the location of installed superuser binary files.
     These are programs which are usually only run by the superuser.

`--libexecdir=DIR'
     Specifies the location of installed executable support files.
     Contrasted with `binary files', these files are never run
     directly by users, but may be executed by the binary files
     mentioned above.

`--datadir=DIR'
     Specifies the location of generic data files.

`--sysconfdir=DIR'
     Specifies the location of read-only data used on a single machine.

`--sharedstatedir=DIR'
     Specifies the location of data which may be modified, and which
     may be shared across several machines.

`--localstatedir=DIR'
     Specifies the location of data which may be modified, but which
     is specific to a single machine.

`--libdir=DIR'
     Specifies where object code libraries should be installed.

`--includedir=DIR'
     Specifies where C header files should be installed.  Header files
     for other languages such as C++ may be installed here also.

`--oldincludedir=DIR'
     Specifies where C header files should be installed for compilers
     other than GCC.

`--infodir=DIR'
     Specifies where Info format documentation files should be
     installed.  Info is the documentation format used by the GNU
     project.

`--mandir=DIR'
     Specifies where manual pages should be installed.

`--srcdir=DIR'
     This option does not affect installation.  Instead, it tells
     `configure' where the source files may be found.  It is normally
     not necessary to specify this, since the `configure' script is
     normally in the same directory as the source files.

`--program-prefix=PREFIX'
     Specifies a prefix which should be added to the name of a program
     when installing it.
     For example, using `--program-prefix=g' when configuring a
     program normally named `tar' will cause the installed program to
     be named `gtar' instead.  As with the other installation options,
     this `configure' option only works if it is utilized by the
     `Makefile.in' file.

`--program-suffix=SUFFIX'
     Specifies a suffix which should be appended to the name of a
     program when installing it.

`--program-transform-name=PROGRAM'
     Here, PROGRAM is a `sed' script.  When a program is installed,
     its name will be run through `sed -e SCRIPT' to produce the
     installed name.

`--build=BUILD'
     Specifies the type of system on which the package will be built.
     If not specified, the default will be the same configuration name
     as the host.

`--host=HOST'
     Specifies the type of system on which the package will run--or
     _be hosted_.  If not specified, the host triplet is determined by
     executing `config.guess'.

`--target=TARGET'
     Specifies the type of system which the package is to be targeted
     to.  This makes the most sense in the context of programming
     language tools like compilers and assemblers.  If not specified,
     the default will be the same configuration name as the host.

`--disable-FEATURE'
     Some packages may choose to provide compile-time configurability
     for large-scale options such as using the Kerberos authentication
     system or an experimental compiler optimization pass.  If the
     default is to provide such features, they may be disabled with
     `--disable-FEATURE', where FEATURE is the feature's designated
     name.  For example:

          $ ./configure --disable-gui

`--enable-FEATURE[=ARG]'
     Conversely, some packages may provide features which are disabled
     by default.  To enable them, use `--enable-FEATURE', where
     FEATURE is the feature's designated name.  A feature may accept
     an optional argument.  For example:

          $ ./configure --enable-buffers=128

     Using `--enable-FEATURE=no' is synonymous with
     `--disable-FEATURE', described above.

`--with-PACKAGE[=ARG]'
     In the free software community, there is a healthy tendency to
     reuse existing packages and libraries where possible.  At the
     time when a source tree is configured by `configure', it is
     possible to provide hints about other installed packages.  For
     example, the BLT widget toolkit relies on Tcl and Tk.  To
     configure BLT, it may be necessary to give `configure' some hints
     about where you have installed Tcl and Tk:

          $ ./configure --with-tcl=/usr/local --with-tk=/usr/local

     Using `--with-PACKAGE=no' is synonymous with `--without-PACKAGE',
     which is described below.

`--without-PACKAGE'
     Sometimes you may not want your package to inter-operate with
     some pre-existing package installed on your system.  For example,
     you might not want your new compiler to use GNU `ld'.  You can
     prevent this by using an option such as:

          $ ./configure --without-gnu-ld

`--x-includes=DIR'
     This option is really a specific instance of a `--with-PACKAGE'
     option.  At the time when Autoconf was initially being developed,
     it was common to use `configure' to build programs to run on the
     X Window System as an alternative to Imake.  The `--x-includes'
     option provides a way to guide the configure script to the
     directory containing the X11 header files.

`--x-libraries=DIR'
     Similarly, the `--x-libraries' option provides a way to guide
     `configure' to the directory containing the X11 libraries.

It is unnecessary, and often undesirable, to run `configure' from
within the source tree.  Instead, a well-written `Makefile' generated
by `configure' will be able to build packages whose source files
reside in another tree.
The advantages of building derived files in a separate tree from the
source code are fairly obvious: the derived files, such as object
files, would clutter the source tree.  This would also make it
impossible to build those same object files on a different system or
with a different configuration.  Instead, it is recommended to use
three trees: a source tree, a build tree and an _install tree_.  Here
is a closing example of how to build the GNU malloc package in this
way:

     $ gtar zxf mmalloc-1.0.tar.gz
     $ mkdir build && cd build
     $ ../mmalloc-1.0/configure
     creating cache ./config.cache
     checking for gcc... gcc
     checking whether the C compiler (gcc ) works... yes
     checking whether the C compiler (gcc ) is a cross-compiler... no
     checking whether we are using GNU C... yes
     checking whether gcc accepts -g... yes
     checking for a BSD compatible install... /usr/bin/install -c
     checking host system type... i586-pc-linux-gnu
     checking build system type... i586-pc-linux-gnu
     checking for ar... ar
     checking for ranlib... ranlib
     checking how to run the C preprocessor... gcc -E
     checking for unistd.h... yes
     checking for getpagesize... yes
     checking for working mmap... yes
     checking for limits.h... yes
     checking for stddef.h... yes
     updating cache ../config.cache
     creating ./config.status

Now that this build tree is configured, it is possible to go on and
build the package and install it into the default location of
`/usr/local':

     $ make all && make install


File: autobook.info, Node: Files generated by configure, Next: The most useful Makefile targets, Prev: Configuring, Up: Invoking configure

3.2 Files generated by configure
================================

After you have invoked `configure', you will discover a number of
generated files in your build tree.  The build directory structure
created by `configure' and the number of files will vary from package
to package.  Each of the generated files is described below, and their
relationships are shown in *Note Generated File Dependencies:::

`config.cache'
     `configure' can cache the results of system tests that have been
     performed to speed up subsequent tests.  This file contains the
     cache data and is a plain text file that can be hand-modified or
     removed if desired.

`config.log'
     As `configure' runs, it outputs a message describing each test it
     performs and the result of each test.  There is substantially
     more output produced by the shell and utilities that `configure'
     invokes, but it is hidden from the user to keep the output
     understandable.  The output is instead redirected to
     `config.log'.  This file is the first place to look when
     `configure' goes haywire or a test produces a nonsense result.  A
     common scenario is that `configure', when run on a Solaris
     system, will tell you that it was unable to find a working C
     compiler.  An examination of `config.log' will show that Solaris'
     default `/usr/ucb/cc' is a program that informs the user that the
     optional C compiler is not installed.

`config.status'
     `configure' generates a shell script called `config.status' that
     may be used to recreate the current configuration.  That is, all
     generated files will be regenerated.  This script can also be
     used to re-run `configure' if the `--recheck' option is given.

`config.h'
     Many packages that use `configure' are written in C or C++.  Some
     of the tests that `configure' runs involve examining variability
     in the C and C++ programming languages and implementations
     thereof.
     So that source code can programmatically deal with these
     differences, `#define' preprocessor directives can be optionally
     placed in a _config header_, usually called `config.h', as
     `configure' runs.  Source files may then include the `config.h'
     file and act accordingly:

          #if HAVE_CONFIG_H
          #  include <config.h>
          #endif /* HAVE_CONFIG_H */

          #if HAVE_UNISTD_H
          #  include <unistd.h>
          #endif /* HAVE_UNISTD_H */

     We recommend always using a config header.

`Makefile'
     One of the common functions of `configure' is to generate
     `Makefile's and other files.  As it has been stressed, a
     `Makefile' is just a file often generated by `configure' from a
     corresponding input file (usually called `Makefile.in').  The
     following section will describe how you can use `make' to process
     this `Makefile'.  There are other cases where generating files in
     this way can be helpful.  For instance, a Java developer might
     wish to make use of a `defs.java' file generated from
     `defs.java.in'.


File: autobook.info, Node: The most useful Makefile targets, Next: Configuration Names, Prev: Files generated by configure, Up: Invoking configure

3.3 The most useful Makefile targets
====================================

By now `configure' has generated the output files such as a
`Makefile'.  Most projects include a `Makefile' with a basic set of
well-known _targets_ (*note Targets and dependencies::).  A target is
the name of a task that you want `make' to perform--usually it is to
build all of the programs belonging to your package (commonly known as
the _all_ target).  From your build directory, the following commands
are likely to work for a configured package:

`make all'
     Builds all derived files sufficient to declare the package built.

`make check'
     Runs any self-tests that the package may have.

`make install'
     Installs the package in a predetermined location.

`make clean'
     Removes all derived files.

There are other less commonly used targets which are likely to be
recognized, particularly if the package includes a `Makefile' which
conforms to the GNU `Makefile' standard or is generated by `automake'.
You may wish to inspect the generated `Makefile' to see what other
targets have been included.


File: autobook.info, Node: Configuration Names, Prev: The most useful Makefile targets, Up: Invoking configure

3.4 Configuration Names
=======================

The GNU Autotools name all types of computer systems using a
"configuration name".  This is a name for the system in a standardized
format.  Some example configuration names are `sparc-sun-solaris2.7',
`i586-pc-linux-gnu', or `i386-pc-cygwin'.

All configuration names used to have three parts, and in some
documentation they are still called "configuration triplets".  A three
part configuration name is CPU-MANUFACTURER-OPERATING_SYSTEM.
Currently configuration names are permitted to have four parts on
systems which distinguish the kernel and the operating system, such as
GNU/Linux.  In these cases, the configuration name is
CPU-MANUFACTURER-KERNEL-OPERATING_SYSTEM.

When using a configuration name in an option to a tool such as
`configure', it is normally not necessary to specify an entire name.
In particular, the middle field (MANUFACTURER, described below) is
often omitted, leading to strings such as `i386-linux' or
`sparc-sunos'.  The shell script `config.sub' is used to translate
these shortened strings into the canonical form.  On most Unix
variants, the shell script `config.guess' will print the correct
configuration name for the system it is run on.
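
For example, running these scripts on a GNU/Linux machine like the one
used in the build example above might look like this (the output shown
here is illustrative; it depends on your machine and on the vintage of
the scripts):

     $ ./config.guess
     i586-pc-linux-gnu
     $ ./config.sub i386-linux
     i386-pc-linux-gnu
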
`config.guess' does this by running the standard `uname' program, and
by examining other characteristics of the system.  On some systems,
`config.guess' requires a working C compiler or an assembler.

Because `config.guess' can normally determine the configuration name
for a machine, it is only necessary for a user or developer to specify
a configuration name in unusual cases, such as when building a
cross-compiler.  Here is a description of each field in a
configuration name:

CPU
     The type of processor used on the system.  This is typically
     something like `i386' or `sparc'.  More specific variants are
     used as well, such as `mipsel' to indicate a little endian MIPS
     processor.

MANUFACTURER
     A somewhat freeform field which indicates the manufacturer of the
     system.  This is often simply `unknown'.  Other common strings
     are `pc' for an IBM PC compatible system, or the name of a
     workstation vendor, such as `sun'.

OPERATING_SYSTEM
     The name of the operating system which is run on the system.
     This will be something like `solaris2.5' or `winnt4.0'.  There is
     no particular restriction on the version number, and strings like
     `aix4.1.4.0' are seen.  Configuration names may be used to
     describe all sorts of systems, including embedded systems which
     do not run any operating system.  In this case, the operating
     system field is normally used to indicate the object file format,
     such as `elf' or `coff'.

KERNEL
     This is used mainly for GNU/Linux systems.  A typical GNU/Linux
     configuration name is `i586-pc-linux-gnulibc1'.  In this case the
     kernel, `linux', is separated from the operating system,
     `gnulibc1'.

`configure' allows fine control over the format of binary files.  It
is not necessary to build a package for a given kind of machine on
that machine natively--instead, a cross-compiler can be used.
Moreover, if the package you are trying to build is itself capable of
operating in a cross configuration, then the build system need not be
the same kind of machine used to host the cross-configured package
once the package is built!  Consider some examples:

Compiling a simple package for a GNU/Linux system:

     HOST = BUILD = TARGET = `i586-pc-linux-gnu'

Cross-compiling a package on a GNU/Linux system that is intended to
run on an IBM AIX machine:

     BUILD = `i586-pc-linux-gnu'
     HOST = TARGET = `rs6000-ibm-aix3.2'

Building a Solaris-hosted MIPS-ECOFF cross-compiler on a GNU/Linux
system:

     BUILD = `i586-pc-linux-gnu'
     HOST = `sparc-sun-solaris2.4'
     TARGET = `mips-idt-ecoff'


File: autobook.info, Node: Introducing Makefiles, Next: A Minimal GNU Autotools Project, Prev: Invoking configure, Up: Top

4 Introducing `Makefile's
*************************

A `Makefile' is a specification of dependencies between files and how
to resolve those dependencies such that an overall goal, known as a
_target_, can be reached.  `Makefile's are processed by the `make'
utility.  Other references describe the syntax of `Makefile's and the
various implementations of `make' in detail.  This chapter provides an
overview of `Makefile's and gives just enough information to write
custom rules in a `Makefile.am' (*note Introducing GNU Automake::) or
`Makefile.in'.

* Menu:

* Targets and dependencies::
* Makefile syntax::
* Suffix rules::
* Macros::


File: autobook.info, Node: Targets and dependencies, Next: Makefile syntax, Up: Introducing Makefiles

4.1 Targets and dependencies
============================

The `make' program attempts to bring a target up to date by bringing
all of the target's dependencies up to date.  These dependencies may
have further dependencies.
Thus, a potentially complex dependency graph forms when processing a
typical `Makefile'.  From a simple `Makefile' that looks like this:

     all: foo

     foo: foo.o bar.o baz.o

     .c.o:
             $(CC) $(CFLAGS) -c $< -o $@

     .l.c:
             $(LEX) $< && mv lex.yy.c $@

We can draw a dependency graph that looks like this:

                    all
                     |
                    foo
                     |
             .-------+-------.
            /        |        \
         foo.o     bar.o     baz.o
           |         |         |
         foo.c     bar.c     baz.c
                               |
                             baz.l

Unless the `Makefile' contains a directive to `make', all targets are
assumed to be filenames and rules must be written to create these
files or somehow bring them up to date.

When leaf nodes are found in the dependency graph, the `Makefile' must
include a set of shell commands to bring the dependent up to date with
the dependency.  Much to the chagrin of many `make' users, _up to
date_ means the target has a more recent timestamp than its
dependencies.  Moreover, each of these shell commands is run in its
own sub-shell and, unless the `Makefile' instructs `make' otherwise,
each command must exit with an exit code of 0 to indicate success.

Target rules can be written which are executed unconditionally.  This
is achieved by specifying that the target has no dependencies.  A
simple rule which should be familiar to most users is:

     clean:
             -rm *.o core


File: autobook.info, Node: Makefile syntax, Next: Suffix rules, Prev: Targets and dependencies, Up: Introducing Makefiles

4.2 Makefile syntax
===================

`Makefile's have a rather particular syntax that can trouble new
users.  There are many implementations of `make', some of which
provide non-portable extensions.  An abridged description of the
syntax follows which, for portability, may be stricter than you may be
used to.

Comments start with a `#' and continue until the end of line.  They
may appear anywhere except in command sequences--if they do, they will
be interpreted by the shell running the command.  The following
`Makefile' shows the layout of three individual targets and their
dependencies:

     target1: dep1 dep2 ... depN
             cmd1
             cmd2
             ...
             cmdN

     target2: dep4 dep5
             cmd1
             cmd2

     dep4 dep5:
             cmd1

Target rules start at the beginning of a line and are followed by a
colon.  Following the colon is a whitespace separated list of
dependencies.  A series of lines follow which contain shell commands
to be run by a sub-shell (the default is the Bourne shell).  Each of
these lines _must_ be prefixed by a horizontal tab character.  This is
the most common mistake made by new `make' users.

These commands may be prefixed by an `@' character to prevent `make'
from echoing the command line prior to executing it.  They may also
optionally be prefixed by a `-' character to allow the rule to
continue if the command returns a non-zero exit code.  The combination
of both characters is permitted.


File: autobook.info, Node: Macros, Prev: Suffix rules, Up: Introducing Makefiles

4.3 Macros
==========

A number of useful macros exist which may be used anywhere throughout
the `Makefile'.  Macros start with a dollar sign, like shell
variables.  Our first `Makefile' used a few:

     $(CC) $(CFLAGS) -c $< -o $@

Here, syntactic forms of `$(..)' are `make' variable expansions.  It
is possible to define a `make' variable using a `VAR=VALUE' syntax:

     CC = ec++

In a `Makefile', `$(CC)' will then be literally replaced by `ec++'.
`make' has a number of built-in variables and default values.  The
default value for `$(CC)' is *cc*.  Other built-in macros exist with
fixed semantics.  The two most common macros are `$@' and `$<'.  They
represent the names of the target and the first dependency for the
rule in which they appear.
`$@' is available in any rule, but for some versions of `make' `$<' is
only available in suffix rules.  Here is a simple `Makefile':

     all: dummy
             @echo "$@ depends on dummy"

     dummy:
             touch $@

This is what `make' outputs when processing this `Makefile':

     $ make
     touch dummy
     all depends on dummy

The GNU Make manual documents these macros in more detail.


File: autobook.info, Node: Suffix rules, Next: Macros, Prev: Makefile syntax, Up: Introducing Makefiles

4.4 Suffix rules
================

To simplify a `Makefile', there is a special kind of rule syntax known
as a _suffix rule_.  This is a wildcard pattern that can match
targets.  Our first `Makefile' used some.  Here is one:

     .c.o:
             $(CC) $(CFLAGS) -c $< -o $@

Unless a more specific rule matches the target being sought, this rule
will match any target that ends in `.o'.  These files are said to
always be dependent on the corresponding `.c' file.

With some background material now presented, let's take a look at
these tools in use.


File: autobook.info, Node: A Minimal GNU Autotools Project, Next: Writing configure.in, Prev: Introducing Makefiles, Up: Top

5 A Minimal GNU Autotools Project
*********************************

* Menu:

* User-Provided Input Files::
* Generated Output Files::
* Maintaining Input Files::
* Packaging Generated Files::
* Documentation and ChangeLogs::

This chapter describes how to manage a minimal project using the GNU
Autotools.  A minimal project is defined to be the smallest possible
project that can still illustrate a sufficient number of principles of
using the tools.  By studying a smaller project, it becomes easier to
understand the more complex interactions between these tools when
larger projects require advanced features.

The example project used throughout this chapter is a fictitious
command interpreter called `foonly'.  `foonly' is written in C, but
like many interpreters, uses a lexical analyzer and a parser expressed
using the `lex' and `yacc' tools.  The package will be developed to
adhere to the GNU `Makefile' standard, which is the default behavior
for Automake.

There are many features of the GNU Autotools that this small project
will not utilize.  The most noteworthy one is libraries; this package
does not produce any libraries of its own, so Libtool will not feature
in this chapter.  The more complex projects presented in *Note A Small
GNU Autotools Project:: and *Note A Large GNU Autotools Project:: will
illustrate how Libtool participates in the build system.  The purpose
of this chapter will be to provide a high-level overview of the
user-written files and how they interact.


File: autobook.info, Node: User-Provided Input Files, Next: Generated Output Files, Up: A Minimal GNU Autotools Project

5.1 User-Provided Input Files
=============================

The smallest project requires the user to provide only two files.  The
remainder of the files needed to build the package are generated by
the GNU Autotools (*note Generated Output Files::).

   * `Makefile.am' is an input to `automake'.

   * `configure.in' is an input to `autoconf'.

I like to think of `Makefile.am' as a high-level, bare-bones
specification of a project's build requirements: what needs to be
built, and where does it go when it is installed?  This is probably
Automake's greatest strength--the description is about as simple as it
could possibly be, yet the final product is a `Makefile' with an array
of convenient `make' targets.

The `configure.in' is a template of macro invocations and shell code
fragments that are used by `autoconf' to produce a `configure' script
(*note Generated File Dependencies::).  `autoconf' copies the contents
of `configure.in' to `configure', expanding macros as they occur in
the input.  Other text is copied verbatim.

Let's take a look at the contents of the user-provided input files
that are relevant to this minimal project.  Here is the `Makefile.am':

     ## Makefile.am -- Process this file with automake to produce Makefile.in
     bin_PROGRAMS = foonly
     foonly_SOURCES = main.c foo.c foo.h nly.c scanner.l parser.y
     foonly_LDADD = @LEXLIB@

This `Makefile.am' specifies that we want a program called `foonly' to
be built and installed in the `bin' directory when `make install' is
run.  The source files that are used to build `foonly' are the C
source files `main.c', `foo.c' and `nly.c', the header `foo.h', the
`lex' program in `scanner.l' and a `yacc' grammar in `parser.y'.  This
points out a particularly nice aspect of Automake: because `lex' and
`yacc' both generate intermediate C programs from their input files,
Automake knows how to build such intermediate files and link them into
the final executable.  Finally, we must remember to link in a suitable
`lex' library, if `configure' concludes that one is needed.

And here is the `configure.in':

     dnl Process this file with autoconf to produce a configure script.
     AC_INIT(main.c)
     AM_INIT_AUTOMAKE(foonly, 1.0)
     AC_PROG_CC
     AM_PROG_LEX
     AC_PROG_YACC
     AC_OUTPUT(Makefile)

This `configure.in' invokes some mandatory Autoconf and Automake
initialization macros, and then calls on some Autoconf macros from the
`AC_PROG' family to find a suitable C compiler and suitable `lex' and
`yacc' programs.  Finally, the `AC_OUTPUT' macro is used to cause the
generated `configure' script to output a `Makefile'--but from what?
It is processed from `Makefile.in', which Automake produces for you
based on your `Makefile.am' (*note Generated File Dependencies::).


File: autobook.info, Node: Generated Output Files, Next: Maintaining Input Files, Prev: User-Provided Input Files, Up: A Minimal GNU Autotools Project

5.2 Generated Output Files
==========================

By studying the diagram in *Note Generated File Dependencies::, it
should be possible to see which commands must be run to generate the
required output files from the input files shown in the last section.
First, we generate `configure':

     $ aclocal
     $ autoconf

Because `configure.in' contains macro invocations which are not known
to `autoconf' itself--`AM_INIT_AUTOMAKE' being a case in point--it is
necessary to collect all of the macro definitions for `autoconf' to
use when generating `configure'.  This is done using the `aclocal'
program, so called because it generates `aclocal.m4' (*note Generated
File Dependencies::).  If you were to examine the contents of
`aclocal.m4', you would find the definition of the `AM_INIT_AUTOMAKE'
macro contained within.

After running `autoconf', you will find a `configure' script in the
current directory.  It is important to run `aclocal' first because
`automake' relies on the contents of `configure.in' and `aclocal.m4'.
On to `automake':

     $ automake --add-missing
     automake: configure.in: installing ./install-sh
     automake: configure.in: installing ./mkinstalldirs
     automake: configure.in: installing ./missing
     automake: Makefile.am: installing ./INSTALL
     automake: Makefile.am: required file ./NEWS not found
     automake: Makefile.am: required file ./README not found
     automake: Makefile.am: installing ./COPYING
     automake: Makefile.am: required file ./AUTHORS not found
     automake: Makefile.am: required file ./ChangeLog not found

The `--add-missing' option copies some boilerplate files from your Automake installation into the current directory.  Files such as `COPYING', which contain the GNU General Public License, change infrequently and so can be generated without user intervention.  A number of utility scripts are also installed--these are used by the generated `Makefile's, particularly by the `install' target.  Notice that some required files are still missing.  These are:

`NEWS'
     A record of user-visible changes to a package.  The format is not strict, but the changes to the most recent version should appear at the top of the file.

`README'
     The first place a user will look to get an overview of the purpose of a package, and perhaps special installation instructions.

`AUTHORS'
     Lists the names, and usually mail addresses, of individuals who worked on the package.

`ChangeLog'
     The ChangeLog is an important file--it records the changes that are made to a package.  The format of this file is quite strict (*note Documentation and ChangeLogs::).

For now, we'll do enough to placate Automake:

     $ touch NEWS README AUTHORS ChangeLog
     $ automake --add-missing

Automake has now produced a `Makefile.in'.  At this point, you may wish to take a snapshot of this directory before we really let loose with automatically generated files.  By now, the contents of the directory will be looking fairly complete and reminiscent of the top-level directory of a GNU package you may have installed in the past:

     AUTHORS    INSTALL      NEWS        install-sh  mkinstalldirs
     COPYING    Makefile.am  README      configure   missing
     ChangeLog  Makefile.in  aclocal.m4  configure.in

It should now be possible to package up your tree in a `tar' file and give it to other users for them to install on their own systems.  One of the `make' targets that Automake generates in `Makefile.in' makes it easy to generate distributions (*note Rolling Distribution Tarballs::).  A user would merely have to unpack the `tar' file, run `configure' (*note Invoking configure::) and finally type `make all':

     $ ./configure
     creating cache ./config.cache
     checking for a BSD compatible install... /usr/bin/install -c
     checking whether build environment is sane... yes
     checking whether make sets ${MAKE}... yes
     checking for working aclocal... found
     checking for working autoconf... found
     checking for working automake... found
     checking for working autoheader... found
     checking for working makeinfo... found
     checking for gcc... gcc
     checking whether the C compiler (gcc ) works... yes
     checking whether the C compiler (gcc ) is a cross-compiler... no
     checking whether we are using GNU C... yes
     checking whether gcc accepts -g... yes
     checking how to run the C preprocessor... gcc -E
     checking for flex... flex
     checking for flex... (cached) flex
     checking for yywrap in -lfl... yes
     checking lex output file root... lex.yy
     checking whether yytext is a pointer... yes
     checking for bison... bison -y
     updating cache ./config.cache
     creating ./config.status
     creating Makefile

     $ make all
     gcc -DPACKAGE=\"foonly\" -DVERSION=\"1.0\" -DYYTEXT_POINTER=1 -I. -I. \
       -g -O2 -c main.c
     gcc -DPACKAGE=\"foonly\" -DVERSION=\"1.0\" -DYYTEXT_POINTER=1 -I. -I. \
       -g -O2 -c foo.c
     flex scanner.l && mv lex.yy.c scanner.c
     gcc -DPACKAGE=\"foonly\" -DVERSION=\"1.0\" -DYYTEXT_POINTER=1 -I. -I. \
       -g -O2 -c scanner.c
     bison -y parser.y && mv y.tab.c parser.c
     if test -f y.tab.h; then \
       if cmp -s y.tab.h parser.h; then rm -f y.tab.h; \
       else mv y.tab.h parser.h; fi; \
     else :; fi
     gcc -DPACKAGE=\"foonly\" -DVERSION=\"1.0\" -DYYTEXT_POINTER=1 -I. -I. \
       -g -O2 -c parser.c
     gcc -g -O2 -o foonly main.o foo.o scanner.o parser.o -lfl


File: autobook.info, Node: Maintaining Input Files, Next: Packaging Generated Files, Prev: Generated Output Files, Up: A Minimal GNU Autotools Project

5.3 Maintaining Input Files
===========================

If you edit any of the GNU Autotools input files in your package, it is necessary to regenerate the machine-generated files for these changes to take effect.  For instance, if you add a new source file to the `foonly_SOURCES' variable in `Makefile.am', it is necessary to re-generate the derived file `Makefile.in'.  If you are building your package, you need to re-run `configure' to re-generate the site-specific `Makefile', and then re-run `make' to compile the new source file and link it into `foonly'.

It is possible to regenerate these files by running the required tools, one at a time.  However, as we can see above, it can be difficult to compute the dependencies--does a particular change require `aclocal' to be run?  Does a particular change require `autoconf' to be run?  There are two solutions to this problem.

The first solution is to use the `autoreconf' command.  This tool regenerates all derived files by re-running all of the necessary tools in the correct order.  It is somewhat of a brute force solution, but it works very well, particularly if you are not trying to accommodate other maintainers, or regular maintenance that would render this command bothersome.

The alternative is Automake's `maintainer mode'.  By invoking the `AM_MAINTAINER_MODE' macro from `configure.in', `automake' will activate an `--enable-maintainer-mode' option in `configure'.  This is explained at length in *Note Bootstrapping::.


File: autobook.info, Node: Packaging Generated Files, Next: Documentation and ChangeLogs, Prev: Maintaining Input Files, Up: A Minimal GNU Autotools Project

5.4 Packaging Generated Files
=============================

The debate about what to do with generated files is one which is keenly contested on the relevant Internet mailing lists.  There are two points of view, and I will present both of them to you so that you can try to decide what the best policy is for your project.

One argument is that generated files should not be included with a package, but rather only the `preferred form' of the source code should be included.  By this definition, `configure' is a derived file, just like an object file, and it should not be included in the package.  Thus, the user should use the GNU Autotools to bootstrap themselves prior to building the package.  I believe there is some merit to this purist approach, as it discourages the practice of packaging derived files.

The other argument is that the advantages of providing these files can far outweigh the violation of good software engineering practice mentioned above.  By including the generated files, users have the convenience of not needing to be concerned with keeping up to date with all of the different versions of the tools in active use.
This is especially true for Autoconf, as `configure' scripts are often generated by maintainers using locally modified versions of `autoconf' and locally installed macros.  If `configure' were regenerated by the user, the result could be different to that intended.  Of course, this is poor practice, but it happens to reflect reality.

I believe the answer is to include generated files in the package when the package is going to be distributed to a wide user community (i.e. the general public).  For in-house packages, the former argument might make more sense, since the tools may also be held under version control.


File: autobook.info, Node: Documentation and ChangeLogs, Prev: Packaging Generated Files, Up: A Minimal GNU Autotools Project

5.5 Documentation and ChangeLogs
================================

As with any software project, it is important to maintain documentation as the project evolves--the documentation must reflect the current state of the software, but it must also accurately record the changes that have been made in the past.  The GNU coding standard rigorously enforces the maintenance of documentation.  Automake, in fact, implements some of the standard by checking for the presence of a `ChangeLog' file when `automake' is run!

A number of files exist, with standardized filenames, for storing documentation in GNU packages.  The complete GNU coding standard, which offers some useful insights, can be found at `http://www.gnu.org/prep/standards.html'.  Other projects, including in-house projects, can use these same tried-and-true techniques.  The purpose of most of the standard documentation files was outlined earlier (*note Generated Output Files::), but the `ChangeLog' deserves additional treatment.

When recording changes in a `ChangeLog', one entry is made per person.  Logical changes are grouped together, while logically distinct changes (i.e. `change sets') are separated by a single blank line.  Here is an example from Automake's own `ChangeLog':

     1999-11-21  Tom Tromey

             * automake.in (finish_languages): Only generate suffix
             rule when not doing dependency tracking.

             * m4/init.m4 (AM_INIT_AUTOMAKE): Use AM_MISSING_INSTALL_SH.
             * m4/missing.m4 (AM_MISSING_INSTALL_SH): New macro.

             * depend2.am: Use @SOURCE@, @OBJ@, @LTOBJ@, @OBJOBJ@, and
             @BASE@.  Always use -o.

Another important point to make about `ChangeLog' entries is that they should be brief.  It is not necessary for an entry to explain in detail _why_ a change was made, but rather _what_ the change was.  If a change is not straightforward, then the explanation of _why_ belongs in the source code itself.  The GNU coding standard offers the complete set of guidelines for keeping `ChangeLog's.  Although any text editor can be used to create ChangeLog entries, Emacs provides a major mode to help you write them.


File: autobook.info, Node: Writing configure.in, Next: Introducing GNU Automake, Prev: A Minimal GNU Autotools Project, Up: Top

6 Writing `configure.in'
************************

Writing a portable `configure.in' is a tricky business.  Since you can put arbitrary shell code into `configure.in', your options seem overwhelming.  There are many questions the first-time Autoconf user asks: What constructs are portable and what constructs aren't portable?  How do I decide what to check for?  What shouldn't I check for?  How do I best use Autoconf's features?  What shouldn't I put in `configure.in'?  In what order should I run my checks?  When should I look at the name of the system instead of checking for specific features?
* Menu: * What is Portability?:: * Brief introduction to portable sh:: * Ordering Tests:: * What to check for:: * Using Configuration Names::  File: autobook.info, Node: What is Portability?, Next: Brief introduction to portable sh, Up: Writing configure.in 6.1 What is Portability? ======================== Before we talk about the mechanics of deciding what to check for and how to check for it, let's ask ourselves a simple question: what is portability? Portability is a quality of the code that enables it to be built and run on a variety of platforms. In the Autoconf context, portability usually refers to the ability to run on Unix-like systems--sometimes including Windows. When I first started using Autoconf, I had a hard time deciding what to check for in my `configure.in'. At the time, I was maintaining a proprietary program that ran only on SunOS 4. However, I was interested in porting it to Solaris, OSF/1, and possibly Irix. The approach I took, while workable, was relatively time-consuming and painful: I wrote a minimal `configure.in' and then proceeded to simply try to build my program on Solaris. Each time I encountered a build problem, I updated `configure.in' and my source and started again. Once it built correctly, I started testing to see if there were runtime problems related to portability. Since I didn't start with a relatively portable base, and since I was unaware of the tools available to help with adding Autoconf support to a package (*note Migrating Existing Packages::), it was much more difficult than it had to be. If at all possible, it is better to write portable code to begin with. There are a large number of Unix-like systems in the world, including many systems which, while still running, can only be considered obsolete. While it is probably possible to port some programs to all such systems, typically it isn't useful to even try. Porting to everything is a difficult process, especially given that it usually isn't possible to test on all platforms, and that new operating systems, with their own bugs and idiosyncrasies are released every year. We advocate a pragmatic approach to portability: we write our programs to target a fairly large, but also fairly modern, cross-section of Unix-like systems. As deficiencies are discovered in our portability framework, we update `configure.in' and our sources, and move on. In practice, this is an effective approach.  File: autobook.info, Node: Brief introduction to portable sh, Next: Ordering Tests, Prev: What is Portability?, Up: Writing configure.in 6.2 Brief introduction to portable sh ===================================== If you read a number of `configure.in's, you'll quickly notice that they tend to be written in an unusual style. For instance, you'll notice you hardly ever see the `[' program used; instead you'll see `test' invoked. We won't go into all the details of writing a portable shell script here; instead we leave that for *Note Writing Portable Bourne Shell::. Like other aspects of portability, the approach you take to writing shell scripts in `configure.in' and `Makefile.am' should depend on your goals. Some platforms have notoriously broken `sh' implementations. For instance, Ultrix `sh' doesn't implement `unset'. Of course, the GNU Autotools are written in the most portable style possible, so as not to limit your possibilities. Also, it doesn't really make sense to talk about portable `sh' programming in the abstract. 
`sh' by itself does very little; most actual work is done by separate programs, each with its own potential portability problems.  For instance, some options are not portable between systems, and some seemingly common programs don't exist on every system - so not only do you have to know which `sh' constructs are not portable, but you also must know which programs you can (and cannot) use, and which options to those programs are portable.

This seems daunting, but in practice it doesn't seem to be too hard to write portable shell scripts - once you've internalized the rules.  Unfortunately, this process can take a long time.  Meanwhile, a pragmatic `try and see' approach, while noting other portable code you've seen elsewhere, works fairly well.  Once again, it pays to be aware of which architectures you'll probably care about - you will make different choices if you are writing an extremely portable program like `emacs' or `gcc' than if you are writing something that will only run on various flavors of Linux.  Also, the cost of having unportable code in `configure.in' is relatively low - in general it is fairly easy to rewrite pieces on demand as unportable constructs are found.


File: autobook.info, Node: Ordering Tests, Next: What to check for, Prev: Brief introduction to portable sh, Up: Writing configure.in

6.3 Ordering Tests
==================

In addition to the problem of writing portable `sh' code, another problem which confronts first-time `configure.in' writers is determining the order in which to run the various tests.  Autoconf indirectly (via the `autoscan' program, which we cover in *Note Migrating Existing Packages::) suggests a standard ordering, which is what we describe here.

The standard ordering is:

  1. Boilerplate.  This section should include standard boilerplate code, such as the call to `AC_INIT' (which must be first), `AM_INIT_AUTOMAKE', `AC_CONFIG_HEADER', and perhaps `AC_REVISION'.

  2. Options.  The next section should include macros which add command-line options to `configure', such as `AC_ARG_ENABLE'.  It is typical to put support code for the option in this section as well, if it is short enough, like this example from `libgcj':

          AC_ARG_ENABLE(getenv-properties,
          [  --disable-getenv-properties don't set system properties from GCJ_PROPERTIES])

          dnl Whether GCJ_PROPERTIES is used depends on the target.
          if test -n "$enable_getenv_properties"; then
             enable_getenv_properties=${enable_getenv_properties_default-yes}
          fi
          if test "$enable_getenv_properties" = no; then
             AC_DEFINE(DISABLE_GETENV_PROPERTIES)
          fi

  3. Programs.  Next it is traditional to check for programs that are either needed by the configure process, the build process, or by one of the programs being built.  This usually involves calls to macros like `AC_CHECK_PROG' and `AC_PATH_TOOL'.

  4. Libraries.  Checks for libraries come before checks for other objects visible to C (or C++, or anything else).  This is necessary because some other checks work by trying to link or run a program; by checking for libraries first you ensure that the resulting programs can be linked.

  5. Headers.  Next come checks for existence of headers.

  6. Typedefs and structures.  We do checks for typedefs after checking for headers for the simple reason that typedefs appear in headers, and we need to know which headers we can use before we look inside them.

  7. Functions.  Finally we check for functions.
     These come last because functions have dependencies on the preceding items: when searching for functions, libraries are needed in order to correctly link, headers are needed in order to find prototypes (this is especially important for C++, which has stricter prototyping rules than C), and typedefs are needed for those functions which use or return types which are not built in.

  8. Output.  This is done by invoking `AC_OUTPUT'.

This ordering should be considered a rough guideline, and not a list of hard-and-fast rules.  Sometimes it is necessary to interleave tests, either to make `configure.in' easier to maintain, or because the tests themselves do need to be in a different order.  For instance, if your project uses both C and C++ you might choose to do all the C++ checks after all the C checks are done, in order to make `configure.in' a bit easier to read.


File: autobook.info, Node: What to check for, Next: Using Configuration Names, Prev: Ordering Tests, Up: Writing configure.in

6.4 What to check for
=====================

Deciding what to check for is really the central part of writing `configure.in'.  Once you've read the Autoconf reference manual, the "how"s of writing a particular test should be fairly clear.  The "when"s might remain a mystery - and it's just as easy to check for too many things as it is to check for too few.

One notable area of divergence between various Unix-like systems is that the same programs don't exist on all systems, and, even when they do, they don't always work in the same way.  For these problems we recommend, when possible, following the advice of the GNU Coding Standards: use the most common options from a relatively limited set of programs.  Failing that, try to stick to programs and options specified by POSIX, perhaps augmenting this approach by doing checks for known problems on platforms you care about.

Checking for tools and their differences is usually a fairly small part of a `configure' script; more common are checks for functions, libraries, and the like.  Except for a few core libraries like `libc' (and, usually, `libm'), and libraries like `libX11' which typically aren't considered system libraries, there isn't much agreement about library names or contents between Unix systems.  Still, libraries are easy to handle, because decisions about libraries almost always only affect the various `Makefile's.  That means that checking for another library typically doesn't require major (or even, sometimes, any) changes to the source code.  Also, because adding a new library test has a small impact on the development cycle - effectively just re-running `configure' and then a relink - you can effectively adopt a lax approach to libraries.  For instance, you can just make things work on the few systems you immediately care about and then handle library changes on an as-needed basis.

Suppose you do end up with a link problem.  How do you handle it?  The first thing to do is use `nm' to look through the system libraries to see if the missing function exists.  If it does, and it is in a library you can use, then the solution is easy - just add another `AC_CHECK_LIB'.  Note that just finding the function in a library is not enough, because on some systems, some "standard" libraries are undesirable; `libucb' is the most common example of a library which you should avoid.

If you can't find the function in a system library then you have a somewhat more difficult problem: a non-portable function.  There are basically three approaches to a missing function.
Below we talk about functions, but really these same approaches apply, more or less, to typedefs, structures, and global variables.

The first approach is to write a replacement function and either conditionally compile it, or put it into an appropriately-named file and use `AC_REPLACE_FUNCS'.  For instance, Tcl uses `AC_REPLACE_FUNCS(strstr)' to handle systems that have no `strstr' function.

The second approach is used when there is a similar function with a different name.  The idea here is to check for all the alternatives and then modify your source to use whichever one might exist.  The idiom here is to use `break' in the second argument to `AC_CHECK_FUNCS'; this is used both to skip unnecessary tests and to indicate to the reader that these checks are related.  For instance, here is how `libgcj' checks for `inet_aton' or `inet_addr'; it only uses the first one found:

     AC_CHECK_FUNCS(inet_aton inet_addr, break)

Code to use the results of these checks looks something like:

     #if HAVE_INET_ATON
       ... use inet_aton here
     #else
     #if HAVE_INET_ADDR
       ... use inet_addr here
     #else
     #error Function missing!
     #endif
     #endif

Note how we've made it a compile-time error if the function does not exist.  In general it is best to make errors occur as early as possible in the build process.

The third approach to non-portable functions is to write code such that these functions are only optionally used.  For instance, if you are writing an editor you might decide to use `mmap' to map a file into the editor's memory.  However, since `mmap' is not portable, you would also write a function to use the more portable `read'.

Handling known non-portable functions is only part of the problem, however.  The pragmatic approach works fairly well, but it is somewhat inefficient if you are primarily developing on a more modern system, like GNU/Linux, which has few functions missing.  In this case the problem is that you might not notice non-portable constructs in your code until it has largely been finished.

Unfortunately, there's no high road to solving this problem.  In the end, you need to have a working knowledge of the range of existing Unix systems.  Knowledge of standards such as POSIX and XPG can be useful here, as a first cut - if it isn't in POSIX, you should at least consider checking for it.  However, standards are not a panacea - not all systems are POSIX compliant, and sometimes there are bugs in system functions which you must work around.

One final class of problems you might encounter is that it is also easy to check for too much.  This is bad because it adds unnecessary maintenance burden to your program.  For instance, sometimes you'll see code that checks for a header which is present on every system of interest.  However, there's no point in doing that - using such a header is mostly portable.  Again, this can only be addressed by having a practical knowledge, which is only really possible by examining your target systems.


File: autobook.info, Node: Using Configuration Names, Prev: What to check for, Up: Writing configure.in

6.5 Using Configuration Names
=============================

While feature tests are definitely the best approach, a `configure' script may occasionally have to make a decision based on a configuration name.  This may be necessary if certain code must be compiled differently based on something which cannot be tested using a standard Autoconf feature test.
For instance, the `expect' package needs to find information about the system's `tty' implementation; this can't reliably be done when cross compiling without examining the particular configuration name.

It is normally better to test for particular features, rather than to test for a particular system type.  This is because as Unix and other operating systems evolve, different systems copy features from one another.  When there is no alternative to testing the configuration name in a `configure' script, it is best to define a macro which describes the feature, rather than defining a macro which describes the particular system.  This permits the same macro to be used on other systems which adopt the same feature (*note Writing New Macros for Autoconf::).

Testing for a particular system is normally done using a case statement in the Autoconf `configure.in' file.  The `case' statement might look something like the following, assuming that `host' is a shell variable holding a canonical configuration name--which will be the case if `configure.in' uses the `AC_CANONICAL_HOST' or `AC_CANONICAL_SYSTEM' macros.

     case "${host}" in
     i[[3456]]86-*-linux-gnu*) do something ;;
     sparc*-sun-solaris2.[[56789]]*) do something ;;
     sparc*-sun-solaris*) do something ;;
     mips*-*-elf*) do something ;;
     esac

Note the doubled square brackets in this piece of code.  These are used to work around an ugly implementation detail of `autoconf'--it uses M4 under the hood.  Without these extra brackets, the square brackets in the `case' statement would be swallowed by M4, and would not appear in the resulting `configure'.  This nasty detail is discussed at more length in *Note M4::.

It is particularly important to use `*' after the operating system field, in order to match the version number which will be generated by `config.guess'.  In most cases you must be careful to match a range of processor types.  For most processor families, a trailing `*' suffices, as in `mips*' above.  For the i386 family, something along the lines of `i[34567]86' suffices at present.  For the m68k family, you will need something like `m68*'.  Of course, if you do not need to match on the processor, it is simpler to just replace the entire field by a `*', as in `*-*-irix*'.


File: autobook.info, Node: Introducing GNU Automake, Next: Bootstrapping, Prev: Writing configure.in, Up: Top

7 Introducing GNU Automake
**************************

The primary goal of Automake is to generate `Makefile.in's compliant with the GNU Makefile Standards.  Along the way, it tries to remove boilerplate and drudgery.  It also helps the `Makefile' writer by implementing features (for instance automatic dependency tracking and parallel `make' support) that most maintainers don't have the patience to implement by hand.  It also implements some best practices as well as workarounds for vendor `make' bugs - both of which require arcane knowledge not generally available.

A secondary goal for Automake is that it works well with other free software, and, specifically, GNU tools.  For example, Automake has support for Dejagnu-based test suites.

Chances are that you don't care about the GNU Coding Standards.  That's okay.  You'll still appreciate the convenience that Automake provides, and you'll find that the GNU standards compliance feature, for the most part, assists rather than impedes.

Automake helps the maintainer with five large tasks, and countless minor ones.  The basic functional areas are:

  1. Build
  2. Check
  3. Clean
  4. Install and uninstall
  5. Distribution
We cover the first three items in this chapter, and the others in later chapters.  Before we get into the details, let's talk a bit about some general principles of Automake.

* Menu:

* General Automake principles::
* Introduction to Primaries::
* The easy primaries::
* Programs and libraries::
* Frequently Asked Questions::
* Multiple directories::
* Testing::


File: autobook.info, Node: General Automake principles, Next: Introduction to Primaries, Up: Introducing GNU Automake

7.1 General Automake principles
===============================

Automake at its simplest turns a file called `Makefile.am' into a GNU-compliant `Makefile.in' for use with `configure'.  Each `Makefile.am' is written according to `make' syntax; Automake recognizes special macro and target names and generates code based on these.  There are a few Automake rules which differ slightly from `make' rules:

   * Ordinary `make' comments are passed through to the output, but comments beginning with `##' are Automake comments and are not passed through.

   * Automake supports `include' directives.  These directives are not passed through to the `Makefile.in', but instead are processed by `automake' - files included this way are treated as if they were textually included in `Makefile.am' at that point.  This can be used to add boilerplate to each `Makefile.am' in a project via a centrally-maintained file.  The filename to include can start with `$(top_srcdir)' to indicate that it should be found relative to the top-most directory of the project; if it is a relative path or if it starts with `$(srcdir)' then it is relative to the current directory.  For example, here is how you would reference boilerplate code from the file `config/Make-rules' (where `config' is a top-level directory in the project):

          include $(top_srcdir)/config/Make-rules

   * Automake supports conditionals which are not passed directly through to `Makefile.in'.  This feature is discussed in *Note Advanced GNU Automake Usage::.

   * Automake supports macro assignment using `+='; these assignments are translated by Automake into ordinary `=' assignments in `Makefile.in'.

All macros and targets, including those which Automake does not recognize, are passed through to the generated `Makefile.in' - this is a powerful extension mechanism.  Sometimes Automake will define macros or targets internally.  If these are also defined in `Makefile.am' then the definition in `Makefile.am' takes precedence.  This feature provides an easy way to tailor specific parts of the output in small ways.  Note, however, that it is a mistake to override parts of the generated code that aren't documented (and thus `exported' by Automake).  Overrides like this stand a good chance of not working with future Automake releases.

Automake also scans `configure.in'.  Sometimes it uses the information it discovers to generate extra code, and sometimes to provide extra error checking.  Automake also turns every `AC_SUBST' into a `Makefile' variable.  This is convenient in more ways than one: not only does it mean that you can refer to these macros in `Makefile.am' without extra work, but, since Automake scans `configure.in' before it reads any `Makefile.am', it also means that special variables and overrides Automake recognizes can be defined once in `configure.in'.
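For instance, a value substituted once in `configure.in' like this (the variable name here is invented purely for illustration):

     EXAMPLE_CFLAGS="-DEXAMPLE"
     AC_SUBST(EXAMPLE_CFLAGS)

can then be referenced from any `Makefile.am' in the project as `$(EXAMPLE_CFLAGS)', with no further declaration needed.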
File: autobook.info, Node: Introduction to Primaries, Next: The easy primaries, Prev: General Automake principles, Up: Introducing GNU Automake

7.2 Introduction to Primaries
=============================

Each type of object that Automake understands has a special root variable name associated with it.  This root is called a "primary".  Many actual variable names put into `Makefile.am' are constructed by adding various prefixes to a primary.

For instance, scripts--interpreted executable programs--are associated with the `SCRIPTS' primary.  Here is how you would list scripts to be installed in the user's `bindir':

     bin_SCRIPTS = magic-script

(Note that the mysterious `bin_' prefix will be discussed later.)

The contents of a primary-derived variable are treated as targets in the resulting `Makefile'.  For instance, in our example above, we could generate `magic-script' using `sed' by simply introducing it as a target:

     bin_SCRIPTS = magic-script

     magic-script: magic-script.in
             sed -e 's/whatever//' < $(srcdir)/magic-script.in > magic-script
             chmod +x magic-script


File: autobook.info, Node: The easy primaries, Next: Programs and libraries, Prev: Introduction to Primaries, Up: Introducing GNU Automake

7.3 The easy primaries
======================

This section describes the common primaries that are relatively easy to understand; the more complicated ones are discussed in the next section.

`DATA'
     This is the easiest primary to understand.  A macro of this type lists a number of files which are installed verbatim.  These files can appear either in the source directory or the build directory.

`HEADERS'
     Macros of this type list header files.  These are separate from `DATA' macros because this allows for extra error checking in some cases.

`SCRIPTS'
     This is used for executable scripts (interpreted programs).  These are different from `DATA' because they are installed with different permissions and because they have the program name transform applied to them (e.g., the `--program-transform-name' argument to `configure').  Scripts are also different from compiled programs because the latter can be stripped while scripts cannot.

`MANS'
     This lists man pages.  Installing man pages is more complicated than you might think due to the lack of a single common practice.  One developer might name a man page in the source tree `foo.man' and then rename to the real name (`foo.1') at install time.  Another developer might instead use numeric suffixes in the source tree and install using the same name.  Sometimes an alphabetic code follows the numeric suffix (e.g., `quux.3n'); this code must be stripped before determining the correct install directory (this file must still be installed in `$(man3dir)').  Automake supports all of these modes of operation:

        * `man_MANS' can be used when numeric suffixes are already in place:

               man_MANS = foo.1 bar.2 quux.3n

        * `man1_MANS', `man2_MANS', etc., can be used to force renaming at install time.  This renaming is skipped if the suffix already begins with the correct number.  For instance:

               man1_MANS = foo.man
               man3_MANS = quux.3n

          Here `foo.man' will be installed as `foo.1' but `quux.3n' will keep its name at install time.

`TEXINFOS'
     GNU programs traditionally use the Texinfo documentation format, not man pages.  Automake has full support for Texinfo, including some additional features such as versioning and `install-info' support.  We won't go into that here except to mention that it exists.  See the Automake reference manual for more information.
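Pulling several of the easy primaries together, a `Makefile.am' fragment might look like this (all of the file names are invented for illustration):

     pkgdata_DATA = defaults.conf
     include_HEADERS = foo.h
     bin_SCRIPTS = magic-script
     man_MANS = foo.1

Each variable simply lists the files to be handled, while the prefix on the variable name determines where they will be installed.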
Automake supports a variety of lesser-used primaries such as `JAVA' and `LISP' (and, in the next major release, `PYTHON').  See the reference manual for more information on these.


File: autobook.info, Node: Programs and libraries, Next: Frequently Asked Questions, Prev: The easy primaries, Up: Introducing GNU Automake

7.4 Programs and libraries
==========================

The preceding primaries have all been relatively easy to use.  Now we'll discuss a more complicated set, namely those used to build programs and libraries.  These primaries are more complex because building a program is more complex than building a script (which often doesn't even need building at all).

Use the `PROGRAMS' primary for programs, `LIBRARIES' for libraries, and `LTLIBRARIES' for Libtool libraries (*note Introducing GNU Libtool::).  Here is a minimal example:

     bin_PROGRAMS = doit

This creates the program `doit' and arranges to install it in `bindir'.  First `make' will compile `doit.c' to produce `doit.o'.  Then it will link `doit.o' to create `doit'.

Of course, if you have more than one source file, and most programs do, then you will want to be able to list them somehow.  You will do this via the program's `SOURCES' variable.  Each program or library has a set of associated variables whose names are constructed by appending suffixes to the `normalized' name of the program.  The "normalized name" is the name of the object with non-alphanumeric characters changed to underscores.  For instance, the normalized name of `quux' is `quux', but the normalized name of `install-info' is `install_info'.  Normalized names are used because they correspond to `make' syntax, and, like all macros, Automake propagates these definitions into the resulting `Makefile.in'.  So if `doit' is to be built from files `main.c' and `doit.c', we would write:

     bin_PROGRAMS = doit
     doit_SOURCES = doit.c main.c

The same holds for libraries.  In the zlib package we might make a library called `libzlib.a'.  Then we would write:

     lib_LIBRARIES = libzlib.a
     libzlib_a_SOURCES = adler32.c compress.c crc32.c deflate.c deflate.h \
     gzio.c infblock.c infblock.h infcodes.c infcodes.h inffast.c inffast.h \
     inffixed.h inflate.c inftrees.c inftrees.h infutil.c infutil.h trees.c \
     trees.h uncompr.c zconf.h zlib.h zutil.c zutil.h

We can also do this with libtool libraries.  For instance, suppose we want to build `libzlib.la' instead:

     lib_LTLIBRARIES = libzlib.la
     libzlib_la_SOURCES = adler32.c compress.c crc32.c deflate.c deflate.h \
     gzio.c infblock.c infblock.h infcodes.c infcodes.h inffast.c inffast.h \
     inffixed.h inflate.c inftrees.c inftrees.h infutil.c infutil.h trees.c \
     trees.h uncompr.c zconf.h zlib.h zutil.c zutil.h

As you can see, making shared libraries with Automake and Libtool is just as easy as making static libraries.

In the above example, we listed header files in the `SOURCES' variable.  These are ignored (except by `make dist' (1)) but can serve to make your `Makefile.am' a bit clearer (and sometimes shorter, if you aren't installing headers).

Note that you can't use `configure' substitutions in a `SOURCES' variable.  Automake needs to know the _static_ list of files which can be compiled into your program.  There are still various ways to conditionally compile files, for instance Automake conditionals or the use of the `LDADD' variable.  The static list of files is also used in some versions of Automake's automatic dependency tracking.  The general rule is that each source file which might be compiled should be listed in some `SOURCES' variable.
If the source is conditionally compiled, it can be listed in an `EXTRA' variable.  For instance, suppose in this example `@FOO_OBJ@' is conditionally set by `configure' to `foo.o' when `foo.c' should be compiled:

     bin_PROGRAMS = foo
     foo_SOURCES = main.c
     foo_LDADD = @FOO_OBJ@
     foo_DEPENDENCIES = @FOO_OBJ@
     EXTRA_foo_SOURCES = foo.c

In this case, `EXTRA_foo_SOURCES' is used to list sources which are conditionally compiled; this tells Automake that they exist even though it can't deduce their existence automatically.

In the above example, note the use of the `foo_LDADD' macro.  This macro is used to list other object files and libraries which should be linked into the `foo' program.  Each program or library has several such associated macros which can be used to customize the link step; here we list the most common ones:

`_DEPENDENCIES'
     Extra dependencies which are added to the program's dependency list.  If not specified, this is automatically computed based on the value of the program's `_LDADD' macro.

`_LDADD'
     Extra objects which are passed to the linker.  This is only used by programs and shared libraries.

`_LDFLAGS'
     Flags which are passed to the linker.  This is separate from `_LDADD' to allow `_DEPENDENCIES' to be auto-computed.

`_LIBADD'
     Like `_LDADD', but used for static libraries and not programs.

You aren't required to define any of these macros.

---------- Footnotes ----------

(1) *Note Rolling Distribution Tarballs::


File: autobook.info, Node: Frequently Asked Questions, Next: Multiple directories, Prev: Programs and libraries, Up: Introducing GNU Automake

7.5 Frequently Asked Questions
==============================

Experience has shown that there are several common questions that arise as people begin to use automake for their own projects.  It seemed prudent to mention these issues here.

Users often want to make a library (or program, but for some reason it comes up more frequently with libraries) whose sources live in subdirectories:

     lib_LIBRARIES = libsub.a
     libsub_a_SOURCES = subdir1/something.c ...

If you try this with Automake 1.4, you'll get an error:

     $ automake
     automake: Makefile.am: not supported: source file subdir1/something.c is in subdirectory

For libraries, this problem is most simply solved by using Libtool convenience libraries.  For programs, there is no simple solution.  Many people elect to restructure their package in this case.  The next major release of Automake addresses this problem.

Another general problem that comes up is that of setting compilation flags.  Most rules have flags--for instance, compilation of C code automatically uses `CFLAGS'.  However, these variables are considered user variables.  Setting them in `Makefile.am' is unsafe, because the user will expect to be able to override them at will.  To handle this, for each flag variable, Automake introduces an `AM_' version which can be set in `Makefile.am'.  For instance, we could set some flags for C and C++ compilation like so:

     AM_CFLAGS = -DFOR_C
     AM_CXXFLAGS = -DFOR_CXX

Finally, people often ask how to compile a single source file in two different ways.  For instance, the `etags.c' file which comes with Emacs can be compiled with different `-D' options to produce the `etags' and `ctags' programs.

With Automake 1.4 this can only be done by writing your own compilation rules, like this:

     bin_PROGRAMS = etags ctags
     etags_SOURCES = etags.c
     ctags_SOURCES =
     ctags_LDADD = ctags.o

     etags.o: etags.c
             $(CC) $(CFLAGS) -DETAGS ...

     ctags.o: etags.c
             $(CC) $(CFLAGS) -DCTAGS ...
This is tedious and hard to maintain for larger programs.  Automake 1.5 will support a much more natural approach:

     bin_PROGRAMS = etags ctags
     etags_SOURCES = etags.c
     etags_CFLAGS = -DETAGS
     ctags_SOURCES = etags.c
     ctags_CFLAGS = -DCTAGS


File: autobook.info, Node: Multiple directories, Next: Testing, Prev: Frequently Asked Questions, Up: Introducing GNU Automake

7.6 Multiple directories
========================

So far, we've only dealt with single-directory projects.  Automake can also handle projects with many directories.  The variable `SUBDIRS' is used to list the subdirectories which should be built.  Here is an example from Automake itself:

     SUBDIRS = . m4 tests

Automake does not need to know the list of subdirectories statically, so there is no `EXTRA_SUBDIRS' variable.  You might think that Automake would use `SUBDIRS' to see which `Makefile.am's to scan, but it actually gets this information from `configure.in'.  This means that, if you have a subdirectory which is optionally built, you should still list it unconditionally in your call to `AC_OUTPUT' and then arrange for it to be substituted (or not, as appropriate) at `configure' time.

Subdirectories are always built in the order they appear, but cleaning rules (e.g., `maintainer-clean') are always run in the reverse order.  The reason for this odd reversal is that it is wrong to remove a file before removing all the files which depend on it.

You can put `.' into `SUBDIRS' to control when the objects in the current directory are built, relative to the objects in the subdirectories.  In the example above, targets in `.' will be built before subdirectories are built.  If `.' does not appear in `SUBDIRS', it is built following all the subdirectories.


File: autobook.info, Node: Testing, Prev: Multiple directories, Up: Introducing GNU Automake

7.7 Testing
===========

Automake also includes simple support for testing your program.  The simplest form of this is the `TESTS' variable.  This variable holds a list of tests which are run when the user runs `make check'.  Each test is built (if necessary) and then executed.  For each test, `make' prints a single line indicating whether the test has passed or failed.  Failure means exiting with a non-zero status, with the special exception that an exit status of `77' (1) means that the test should be ignored.  `make check' also prints a summary showing the number of passes and fails.

Automake also supports the notion of an _xfail_, which is a test which is expected to fail.  Sometimes this is useful when you want to track a known failure, but you aren't prepared to fix it right away.  Tests which are expected to fail should be listed in both `TESTS' and `XFAIL_TESTS'.

The special prefix `check' can be used with primaries to indicate that the objects should only be built at `make check' time.  For example, here is how you can build a program that will only be used during the testing process:

     check_PROGRAMS = test-program
     test_program_SOURCES = ...

Automake also supports the use of DejaGNU, the GNU test framework.  DejaGNU support can be enabled using the `dejagnu' option:

     AUTOMAKE_OPTIONS = dejagnu

The resulting `Makefile.in' will include code to invoke the `runtest' program appropriately.

---------- Footnotes ----------

(1) A number chosen arbitrarily by the Automake developers.
File: autobook.info, Node: Bootstrapping, Next: A Small GNU Autotools Project, Prev: Introducing GNU Automake, Up: Top

8 Bootstrapping
***************

There are many programs in the GNU Autotools, each of which has a complex set of inputs.  When one of these inputs changes, it is important to run the proper programs in the proper order.  Unfortunately, it is hard to remember both the dependencies and the ordering.

For instance, whenever you edit `configure.in', you must remember to re-run `aclocal' in case you added a reference to a new macro.  You must also rebuild `configure' by running `autoconf'; `config.h' by running `autoheader', in case you added a new `AC_DEFINE'; and `automake' to propagate any new `AC_SUBST's to the various `Makefile.in's.  If you edit a `Makefile.am', you must re-run `automake'.  In both these cases, you must then remember to re-run `config.status --recheck' if `configure' changed, followed by `config.status' to rebuild the `Makefile's.

When doing active development on the build system for your project, these dependencies quickly become painful.  Of course, Automake knows how to handle this automatically.  By default, `automake' generates a `Makefile.in' which knows all these dependencies and which automatically re-runs the appropriate tools in the appropriate order.  These rules assume that the correct versions of the tools are all in your `PATH'.

It helps to have a script ready to do all of this for you once, before you have generated a `Makefile' that will automatically run the tools in the correct order, or when you make a fresh checkout of the code from a CVS repository where the developers don't keep generated files under source control.  There are at least two opposing schools of thought regarding how to go about this - the `autogen.sh' school and the `bootstrap' school:

`autogen.sh'
     From the outset, this is a poor name for a bootstrap script, since there is already a GNU automatic text generation tool called AutoGen.  Often packages that follow this convention have the script automatically run the generated `configure' script after the bootstrap process, passing `autogen.sh' arguments through to `configure'.  Except you don't know what options you want yet, since you can't run `configure --help' until `configure' has been generated.  I suggest that if you find yourself compiling a project set up in this way, you type:

          $ /bin/sh ./autogen.sh --help

     and ignore the spurious warning that tells you `configure' will be executed.

`bootstrap'
     Increasingly, projects are starting to call their bootstrap scripts `bootstrap'.  Such scripts simply run the various commands required to bring the source tree into a state where the end user can simply:

          $ configure
          $ make
          $ make install

     Unfortunately, proponents of this school of thought don't put the bootstrap script in their distributed tarballs, since the script is unnecessary except when the build environment of a developer's machine has changed.  This means the proponents of the autogen.sh school may never see the advantages of the other method.

Autoconf comes with a program called `autoreconf' which essentially does the work of the `bootstrap' script.  `autoreconf' is rarely used because, historically, it has not been very well known, and only in Autoconf 2.13 did it acquire the ability to work with Automake.  Unfortunately, even the Autoconf 2.13 `autoreconf' does not handle `libtoolize' and some `automake'-related options that are frequently nice to use.

We recommend the `bootstrap' method, until `autoreconf' is fixed.
At this point `bootstrap' has not been standardized, so here is a version of the script we used while writing this book (1):

     #! /bin/sh

     aclocal \
       && automake --gnu --add-missing \
       && autoconf

We don't use `autoreconf' here because that script (as of Autoconf 2.13) also does not handle the `--add-missing' option, which we want.  A typical `bootstrap' might also run `libtoolize' or `autoheader'.

It is also important for all developers on a project to have the same versions of the tools installed so that these rules don't inadvertently cause problems due to differences between tool versions.  This version skew problem turns out to be fairly significant in the field.  So, `automake' provides a way to disable these rules by default, while still allowing users to enable them when they know their environment is set up correctly.

In order to enable this mode, you must first add `AM_MAINTAINER_MODE' to `configure.in'.  This will add the `--enable-maintainer-mode' option to `configure'; when specified this flag will cause these so-called `maintainer rules' to be enabled.

Note that maintainer mode is a controversial feature.  Some people like to use it because it causes fewer bug reports in some situations.  For instance, CVS does not preserve relative timestamps on files.  If your project has both `configure.in' and `configure' checked in, and maintainer mode is not in use, then sometimes `make' will decide to rebuild `configure' even though it is not really required.  This in turn means more headaches for your developers - on a large project most developers won't touch `configure.in' and many may not even want to install the GNU Autotools (2).

The other camp claims that end users should use the same build system that developers use, that maintainer mode is simply unaesthetic, and furthermore that the modality of maintainer mode is dangerous--you can easily forget what mode you are in and thus forget to rebuild, and thus correctly test, a change to the configure or build system.  When maintainer mode is not in use, the Automake-supplied `missing' script will be used to warn users when it appears that they need a maintainer tool that they do not have.

The approach you take depends strongly on the social structures surrounding your project.

---------- Footnotes ----------

(1) This book is built using `automake' and `autoconf'.  We couldn't find a use for `libtool'.

(2) Shock, horror


File: autobook.info, Node: A Small GNU Autotools Project, Next: Introducing GNU Libtool, Prev: Bootstrapping, Up: Top

9 A Small GNU Autotools Project
*******************************

This chapter introduces a small--but real--worked example, to illustrate some of the features, and highlight some of the pitfalls, of the GNU Autotools discussed so far.  All of the source can be downloaded from the book's web page(1).  The text is peppered with my own pet ideas, accumulated over several years of working with the GNU Autotools, and you should be able to apply these easily to your own projects.

I will begin by describing some of the choices and problems I encountered during the early stages of the development of this project.  Then, by way of illustration of the issues covered, I will move on to showing you a general infrastructure that I use as the basis for all of my own projects, followed by the specifics of the implementation of a portable command line shell library.  This chapter then finishes with a sample shell application that uses that library.
Later, in *Note A Large GNU Autotools Project:: and *Note A Complex GNU Autotools Project::, the example introduced here will be gradually expanded as new features of GNU Autotools are revealed.

* Menu:

* GNU Autotools in Practice::
* A Simple Shell Builders Library::
* A Sample Shell Application::

---------- Footnotes ----------

(1) `http://sources.redhat.com/autobook/'


File: autobook.info, Node: GNU Autotools in Practice, Next: A Simple Shell Builders Library, Up: A Small GNU Autotools Project

9.1 GNU Autotools in Practice
=============================

This section details some of the specific problems I encountered when starting this project, and is representative of the sorts of things you are likely to want to do in projects of your own, but for which the correct solution may not be immediately evident.  You can always refer back to this section for some inspiration if you come across similar situations.  I will talk about some of the decisions I made about the structure of the project, and also the trade-offs for the other side of the argument - you might find that the opposite choice to the one I make here is more relevant to a particular project of yours.

* Menu:

* Project Directory Structure::
* C Header Files::
* C++ Compilers::
* Function Definitions::
* Fallback Function Implementations::
* K&R Compilers::


File: autobook.info, Node: Project Directory Structure, Next: C Header Files, Up: GNU Autotools in Practice

9.1.1 Project Directory Structure
---------------------------------

Before starting to write code for any project, you need to decide on the directory structure you will use to organise the code.  I like to build each component of a project in its own subdirectory, and to keep the configuration sources separate from the source code.  The great majority of GNU projects I have seen use a similar method, so adopting it yourself will likely make your project more familiar to your developers by association.

The top-level directory is used for configuration files, such as `configure' and `aclocal.m4', and for a few other sundry files, `README' and a copy of the project license for example.

Any significant libraries will have a subdirectory of their own, containing all of the sources and headers for that library along with a `Makefile.am' and anything else that is specific to just that library.  Libraries that form a small, closely related group - a set of pluggable application modules, for example - are kept together in a single directory.

The sources and headers for the project's main application will be stored in yet another subdirectory, traditionally named `src'.  There are other conventional directories your developers might expect too: a `doc' directory for project documentation, and a `test' directory for the project self test suite.

To keep the project top-level directory as uncluttered as possible, as I like to do, you can take advantage of Autoconf's `AC_CONFIG_AUX_DIR' by creating another directory, say `config', which will be used to store many of the GNU Autotools intermediate files, such as `install-sh'.  I always store all project-specific Autoconf M4 macros in this same subdirectory.
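The `AC_CONFIG_AUX_DIR' invocation itself is a one-liner near the top of `configure.in' (the `AC_INIT' argument here is a placeholder for illustration):

     AC_INIT(src/main.c)
     AC_CONFIG_AUX_DIR(config)

With this in place, `automake --add-missing' and friends will deposit their support scripts in `config/' instead of cluttering the top-level directory.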
So, this is what you should start with:

     $ pwd
     ~/mypackage
     $ ls -F
     Makefile.am  config/     configure.in  lib/  test/
     README       configure*  doc/          src/


File: autobook.info, Node: C Header Files, Next: C++ Compilers, Prev: Project Directory Structure, Up: GNU Autotools in Practice

9.1.2 C Header Files
--------------------

There is a small amount of boiler-plate that should be added to all header files, not least of which is a small amount of code to prevent the contents of the header from being scanned multiple times.  This is achieved by enclosing the entire file in a preprocessor conditional which evaluates to false after the first time it has been seen by the preprocessor.  Traditionally, the macro used is in all upper case, and named after the installation path without the installation prefix.  Imagine a header that will be installed to `/usr/local/include/sys/foo.h', for example.  The preprocessor code would be as follows:

     #ifndef SYS_FOO_H
     #define SYS_FOO_H 1
     ...
     #endif /* !SYS_FOO_H */

Apart from comments, the entire content of the rest of this header file must be between these few lines.  It is worth mentioning that inside the enclosing `ifndef', the macro `SYS_FOO_H' must be defined before any other files are `#include'd.  It is a common mistake to not define that macro until the end of the file, but mutual dependency cycles are only stalled if the guard macro is defined before the `#include' which starts that cycle(1).

If a header is designed to be installed, it must `#include' other installed project headers from the local tree using angle-brackets.  There are some implications to working like this:

   * You must be careful that the names of header file directories in the source tree match the names of the directories in the install tree.  For example, when I plan to install the aforementioned `foo.h' to `/usr/local/include/project/foo.h', from which it will be included using `#include <project/foo.h>', then in order for the same include line to work in the source tree, I must name the source directory it is installed from `project' too, or other headers which use it will not be able to find it until after it has been installed.

   * When you come to developing the next version of a project laid out in this way, you must be careful about finding the correct header.  Automake takes care of that for you by using `-I' options that force the compiler to look for uninstalled headers in the current source directory before searching the system directories for installed headers of the same name.

   * You don't have to install all of your headers to `/usr/include' - you can use subdirectories.  And all without having to rewrite the headers at install time.

---------- Footnotes ----------

(1) An `#include' cycle is the situation where file `a.h' `#include's file `b.h', and `b.h' `#include's file `a.h' - either directly or through some longer chain of `#include's.


File: autobook.info, Node: C++ Compilers, Next: Function Definitions, Prev: C Header Files, Up: GNU Autotools in Practice

9.1.3 C++ Compilers
-------------------

In order for a C++ program to use a library compiled with a C compiler, it is necessary for any symbols exported from the C library to be declared between `extern "C" {' and `}'.  This code is important, because a C++ compiler "mangles"(1) all variable and function names, whereas a C compiler does not.  On the other hand, a C compiler will not understand these lines, so you must be careful to make them invisible to the C compiler.
Sometimes you will see this method used, written out in long hand in every installed header file, like this:

     #ifdef __cplusplus
     extern "C" {
     #endif

     ...

     #ifdef __cplusplus
     }
     #endif

But that is a lot of unnecessary typing if you have a few dozen headers in your project. Also the additional braces tend to confuse text editors, such as emacs, which do automatic source indentation based on brace characters.

Far better, then, to declare them as macros in a common header file, and use the macros in your headers:

     #ifdef __cplusplus
     # define BEGIN_C_DECLS  extern "C" {
     # define END_C_DECLS    }
     #else /* !__cplusplus */
     # define BEGIN_C_DECLS
     # define END_C_DECLS
     #endif /* __cplusplus */

I have seen several projects that name such macros with a leading underscore - `_BEGIN_C_DECLS'. Any symbol with a leading underscore is reserved for use by the compiler implementation, so you shouldn't name *any* symbols of your own in this way. By way of example, I recently ported the Small(2) language compiler to Unix, and almost all of the work was writing a Perl script to rename huge numbers of symbols in the compiler's reserved namespace to something more sensible so that GCC could even parse the sources. Small was originally developed on Windows, and the author had used a lot of symbols with a leading underscore. Although his symbol names didn't clash with his own compiler, in some cases they were the same as symbols used by GCC.

---------- Footnotes ----------

(1) For an explanation of name mangling *Note Writing Portable C++ with GNU Autotools: Writing Portable C++.

(2) `http://www.compuphase.com/small.htm'

File: autobook.info, Node: Function Definitions, Next: Fallback Function Implementations, Prev: C++ Compilers, Up: GNU Autotools in Practice

9.1.4 Function Definitions
--------------------------

As a stylistic convention, the return types for all function definitions should be on a separate line. The main reason for this is that it makes it very easy to find the functions in a source file, by looking for a single identifier at the start of a line followed by an open parenthesis:

     $ egrep '^[_a-zA-Z][_a-zA-Z0-9]*[ \t]*\(' error.c
     set_program_name (const char *path)
     error (int exit_status, const char *mode, const char *message)
     sic_warning (const char *message)
     sic_error (const char *message)
     sic_fatal (const char *message)

There are emacs lisp functions and various code analysis tools, such as `ansi2knr' (*note K&R Compilers::), which rely on this formatting convention, too. Even if you don't use those tools yourself, your fellow developers might like to, so it is a good convention to adopt.

File: autobook.info, Node: Fallback Function Implementations, Next: K&R Compilers, Prev: Function Definitions, Up: GNU Autotools in Practice

9.1.5 Fallback Function Implementations
---------------------------------------

Due to the huge number of Unix varieties in common use today, many of the C library functions that you take for granted on your preferred development platform are very likely missing from some of the architectures you would like your code to compile on. Fundamentally there are two ways to cope with this:

* Use only the few library calls that are available everywhere. In reality this is not actually possible because there are two lowest common denominators with mutually exclusive APIs, one rooted in BSD Unix (`bcopy', `rindex') and the other in SYSV Unix (`memcpy', `strrchr'). The only way to deal with this is to define one API in terms of the other using the preprocessor.
The newer POSIX standard deprecates many of the BSD originated calls (with exceptions such as the BSD socket API). Even on non-POSIX platforms, there has been so much cross pollination that often both varieties of a given call may be provided, however you would be wise to write your code using POSIX endorsed calls, and where they are missing, define them in terms of whatever the host platform provides.

This approach requires a lot of knowledge about various system libraries and standards documents, and can leave you with reams of preprocessor code to handle the differences between APIs. You will also need to perform a lot of checking in `configure.in' to figure out which calls are available. For example, to allow the rest of your code to use the `strcpy' call with impunity, you would need the following code in `configure.in':

     AC_CHECK_FUNCS(strcpy bcopy)

And the following preprocessor code in a header file that is seen by every source file:

     #if !HAVE_STRCPY
     # if HAVE_BCOPY
     #  define strcpy(dest, src)  bcopy (src, dest, 1 + strlen (src))
     # else /* !HAVE_BCOPY */
     error no strcpy or bcopy
     # endif /* HAVE_BCOPY */
     #endif /* HAVE_STRCPY */

* Alternatively you could provide your own fallback implementations of function calls you know are missing on some platforms. In practice you don't need to be as knowledgeable about problematic functions when using this approach. You can look in GNU libiberty(1) or François Pinard's libit project(2) to see for which functions other GNU developers have needed to implement fallback code. The libit project is especially useful in this respect as it comprises canonical versions of fallback functions, and suitable Autoconf macros assembled from across the entire GNU project. I won't give an example of setting up your package to use this approach, since that is how I have chosen to structure the project described in this chapter.

Rather than writing code to the lowest common denominator of system libraries, I am a strong advocate of the latter school of thought in the majority of cases. As with all things it pays to take a pragmatic approach; don't be afraid of the middle ground - weigh the options on a case by case basis.

---------- Footnotes ----------

(1) Available at `ftp://sourceware.cygnus.com/pub/binutils/'.

(2) Distributed from `http://www.iro.umontreal.ca/~pinard/libit'.

File: autobook.info, Node: K&R Compilers, Prev: Fallback Function Implementations, Up: GNU Autotools in Practice

9.1.6 K&R Compilers
-------------------

K&R C is the name now used to describe the original C language specified by Brian Kernighan and Dennis Ritchie (hence, `"K&R"'). I have yet to see a C compiler that doesn't support code written in the K&R style, yet it has fallen very much into disuse in favor of the newer ANSI C standard. Although it is increasingly common for vendors to "unbundle" their ANSI C compiler, the GCC project(1) is available for all of the architectures I have ever used.

There are four differences between the two C standards:

1. ANSI C expects full type specification in function prototypes, such as you might supply in a library header file:

     extern int functionname (const char *parameter1, size_t parameter2);

The nearest equivalent in K&R style C is a forward declaration, which allows you to use a function before its corresponding definition:

     extern int functionname ();

As you can imagine, K&R has very bad type safety, and does not perform any checks that only function arguments of the correct type are used.
2. The function headers of each function definition are written differently. Where you might see the following written in ANSI C:

     int
     functionname (const char *parameter1, size_t parameter2)
     {
       ...
     }

K&R expects the parameter type declarations separately, like this:

     int
     functionname (parameter1, parameter2)
          const char *parameter1;
          size_t parameter2;
     {
       ...
     }

3. There is no concept of an untyped pointer in K&R C. Where you might be used to seeing `void *' pointers in ANSI code, you are forced to overload the meaning of `char *' for K&R compilers.

4. Variadic functions are handled with a different API in K&R C, imported with `#include <varargs.h>'. A K&R variadic function definition looks like this:

     int
     functionname (va_alist)
          va_dcl
     {
       va_list ap;
       char *arg;

       va_start (ap);
       ...
       arg = va_arg (ap, char *);
       ...
       va_end (ap);

       return arg ? strlen (arg) : 0;
     }

ANSI C provides a similar API, imported with `#include <stdarg.h>', though it cannot express a variadic function with no named arguments such as the one above. In practice, this isn't a problem since you always need at least one parameter, either to specify the total number of arguments somehow, or else to mark the end of the argument list. An ANSI variadic function definition looks like this:

     int
     functionname (char *format, ...)
     {
       va_list ap;
       char *arg;

       va_start (ap, format);
       ...
       arg = va_arg (ap, char *);
       ...
       va_end (ap);

       return format ? strlen (format) : 0;
     }

Except in very rare cases where you are writing a low level project (GCC for example), you probably don't need to worry about K&R compilers too much. However, supporting them can be very easy, and if you are so inclined, can be handled either by employing the `ansi2knr' program supplied with Automake, or by careful use of the preprocessor.

Using `ansi2knr' in your project is described in some detail in *Note Automatic de-ANSI-fication: (Automake)Automatic de-ANSI-fication, but boils down to the following:

- Add this macro to your `configure.in' file:

     AM_C_PROTOTYPES

- Rewrite the contents of `LIBOBJS' and/or `LTLIBOBJS' in the following fashion:

     # This is necessary so that .o files in LIBOBJS are also built via
     # the ANSI2KNR-filtering rules.
     Xsed='sed -e "s/^X//"'
     LIBOBJS=`echo X"$LIBOBJS"|\
         [$Xsed -e 's/\.[^.]* /.\$U& /g;s/\.[^.]*$/.\$U&/']`

Personally, I dislike this method, since every source file is filtered and rewritten with ANSI function prototypes and declarations converted to K&R style, adding a fair overhead in additional files in your build tree, and in compilation time. This would be reasonable were the abstraction sufficient to allow you to forget about K&R entirely, but `ansi2knr' is a simple program, and does not address any of the other differences between compilers that I raised above, and it cannot handle macros in your function prototypes or definitions. If you decide to use `ansi2knr' in your project, you must make the decision before you write any code, and be aware of its limitations as you develop.

For my own projects, I prefer to use a set of preprocessor macros along with a few stylistic conventions so that all of the differences between K&R and ANSI compilers are actually addressed, and so that the unfortunate few who have no access to an ANSI compiler (and who cannot use GCC for some reason) needn't suffer the overheads of `ansi2knr'. The four differences in style listed at the beginning of this subsection are addressed as follows:
1. The function prototype argument lists are declared inside a `PARAMS' macro invocation so that K&R compilers will still be able to compile the source tree. `PARAMS' removes ANSI argument lists from function prototypes for K&R compilers. Some developers continue to use `__P' for this purpose, but strictly speaking, macros starting with `_' (and especially `__') are reserved for the compiler and the system headers, so using `PARAMS', as follows, is safer:

     #if __STDC__
     # ifndef NOPROTOS
     #  define PARAMS(args)  args
     # endif
     #endif
     #ifndef PARAMS
     # define PARAMS(args)   ()
     #endif

This macro is then used for all function declarations like this:

     extern int functionname PARAMS((const char *parameter));

2. When the `PARAMS' macro is used for all function declarations, ANSI compilers are given all the type information they require to do full compile time type checking. The function definitions proper must then be declared in K&R style so that K&R compilers don't choke on ANSI syntax. There is a small amount of overhead in writing code this way, however: the ANSI compile time type checking can only work in conjunction with K&R function definitions if it first sees an ANSI function prototype. This forces you to develop the good habit of prototyping _every single_ function in your project. Even the `static' ones.

3. The easiest way to work around the lack of `void *' pointers, is to define a new type that is conditionally set to `void *' for ANSI compilers, or `char *' for K&R compilers. You should add the following to a common header file:

     #if __STDC__
     typedef void *void_ptr;
     #else /* !__STDC__ */
     typedef char *void_ptr;
     #endif /* __STDC__ */

4. The difference between the two variadic function APIs poses a stickier problem, and the solution is ugly. But it _does_ work. First you must check for the headers in `configure.in':

     AC_CHECK_HEADERS(stdarg.h varargs.h, break)

Having done this, add the following code to a common header file:

     #if HAVE_STDARG_H
     # include <stdarg.h>
     # define VA_START(a, f)  va_start(a, f)
     #else
     # if HAVE_VARARGS_H
     #  include <varargs.h>
     #  define VA_START(a, f) va_start(a)
     # endif
     #endif
     #ifndef VA_START
     error no variadic api
     #endif

You must now supply each variadic function with both a K&R and an ANSI definition, like this:

     int
     #if HAVE_STDARG_H
     functionname (const char *format, ...)
     #else
     functionname (format, va_alist)
          const char *format;
          va_dcl
     #endif
     {
       va_list ap;
       char *arg;

       VA_START (ap, format);
       ...
       arg = va_arg (ap, char *);
       ...
       va_end (ap);

       return arg ? strlen (arg) : 0;
     }

---------- Footnotes ----------

(1) GCC must be compilable by K&R compilers so that it can be built and installed in an ANSI compiler free environment.

File: autobook.info, Node: A Simple Shell Builders Library, Next: A Sample Shell Application, Prev: GNU Autotools in Practice, Up: A Small GNU Autotools Project

9.2 A Simple Shell Builders Library
===================================

An application which most developers try their hand at sooner or later is a Unix shell. There is a lot of functionality common to all traditional command line shells, which I thought I would push into a portable library to get you over the first hurdle when that moment is upon you. Before elaborating on any of this I need to name the project. I've called it "sic", from the Latin "so it is", because like all good project names it is somewhat pretentious and it lends itself to the recursive acronym "sic is cumulative".
The gory details of the minutiae of the source are beyond the scope of this book, but to convey a feel for the need for Sic, some of the goals which influenced the design follow:

* Sic must be very small so that, in addition to being used as the basis for a full blown shell, it can be linked (unadorned) into an application and used for trivial tasks, such as reading startup configuration.

* It must not be tied to a particular syntax or set of reserved words. If you use it to read your startup configuration, I don't want to force you to use my syntax and commands.

* The boundary between the library (`libsic') and the application must be well defined. Sic will take strings of characters as input, and internally parse and evaluate them according to registered commands and syntax, returning results or diagnostics as appropriate.

* It must be extremely portable - that is what I am trying to illustrate here, after all.

* Menu:

* Portability Infrastructure::
* Library Implementation::
* Beginnings of a configure.in for Small Project::

File: autobook.info, Node: Portability Infrastructure, Next: Library Implementation, Up: A Simple Shell Builders Library

9.2.1 Portability Infrastructure
--------------------------------

As I explained in *Note Project Directory Structure::, I'll first create the project directories, a toplevel directory and a subdirectory to put the library sources into. I want to install the library header files to `/usr/local/include/sic', so the library subdirectory must be named appropriately. *Note C Header Files::.

     $ mkdir sic
     $ mkdir sic/sic
     $ cd sic/sic

I will describe the files I add in this section in more detail than the project specific sources, because they comprise an infrastructure that I use relatively unchanged for all of my GNU Autotools projects. You could keep an archive of these files, and use them as a starting point each time you begin a new project of your own.

* Menu:

* Error Management::
* Memory Management::
* Generalised List Data Type::

File: autobook.info, Node: Error Management, Next: Memory Management, Up: Portability Infrastructure

9.2.1.1 Error Management
........................

A good place to start with any project design is the error management facility. In Sic I will use a simple group of functions to display simple error messages. Here is `sic/error.h':

     #ifndef SIC_ERROR_H
     #define SIC_ERROR_H 1

     #include <sic/common.h>

     BEGIN_C_DECLS

     extern const char *program_name;

     extern void set_program_name (const char *argv0);
     extern void sic_warning      (const char *message);
     extern void sic_error        (const char *message);
     extern void sic_fatal        (const char *message);

     END_C_DECLS

     #endif /* !SIC_ERROR_H */

This header file follows the principles set out in *Note C Header Files::.

I am storing the `program_name' variable in the library that uses it, so that I can be sure that the library will build on architectures that don't allow undefined symbols in libraries(1).

Keeping those preprocessor macro definitions designed to aid code portability together (in a single file), is a good way to maintain the readability of the rest of the code. For this project I will put that code in `common.h':

     #ifndef SIC_COMMON_H
     #define SIC_COMMON_H 1

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include <stdio.h>
     #include <sys/types.h>

     #if STDC_HEADERS
     # include <stdlib.h>
     # include <string.h>
     #elif HAVE_STRINGS_H
     # include <strings.h>
     #endif /*STDC_HEADERS*/

     #if HAVE_UNISTD_H
     # include <unistd.h>
     #endif

     #if HAVE_ERRNO_H
     # include <errno.h>
     #endif /*HAVE_ERRNO_H*/
     #ifndef errno
     /* Some systems #define this! */
     extern int errno;
     #endif

     #endif /* !SIC_COMMON_H */

You may recognise some snippets of code from the Autoconf manual here--in particular the inclusion of the project `config.h', which will be generated shortly. Notice that I have been careful to conditionally include any headers which are not guaranteed to exist on every architecture. The rule of thumb here is that only `stdio.h' is ubiquitous (though I have never heard of a machine that has no `sys/types.h'). You can find more details of some of these in *Note Existing Tests: (autoconf)Existing Tests.

Here is a little more code from `common.h':

     #ifndef EXIT_SUCCESS
     # define EXIT_SUCCESS  0
     # define EXIT_FAILURE  1
     #endif

The implementation of the error handling functions goes in `error.c' and is very straightforward:

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include "common.h"
     #include "error.h"

     static void error (int exit_status, const char *mode,
                        const char *message);

     static void
     error (int exit_status, const char *mode, const char *message)
     {
       fprintf (stderr, "%s: %s: %s.\n", program_name, mode, message);

       if (exit_status >= 0)
         exit (exit_status);
     }

     void
     sic_warning (const char *message)
     {
       error (-1, "warning", message);
     }

     void
     sic_error (const char *message)
     {
       error (-1, "ERROR", message);
     }

     void
     sic_fatal (const char *message)
     {
       error (EXIT_FAILURE, "FATAL", message);
     }

I also need a definition of `program_name'; `set_program_name' copies the filename component of `path' into the exported data, `program_name'. The `xstrdup' function just calls `strdup', but `abort's if there is not enough memory to make the copy:

     const char *program_name = NULL;

     void
     set_program_name (const char *path)
     {
       if (!program_name)
         program_name = xstrdup (basename (path));
     }

---------- Footnotes ----------

(1) AIX and Windows being the main culprits.

File: autobook.info, Node: Memory Management, Next: Generalised List Data Type, Prev: Error Management, Up: Portability Infrastructure

9.2.1.2 Memory Management
.........................

A useful idiom common to many GNU projects is to wrap the memory management functions to localise "out of memory handling", naming them with an `x' prefix. By doing this, the rest of the project is relieved of having to remember to check for `NULL' returns from the various memory functions. These wrappers use the `error' API to report memory exhaustion and abort the program. I have placed the implementation code in `xmalloc.c':

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include "common.h"
     #include "error.h"

     void *
     xmalloc (size_t num)
     {
       void *new = malloc (num);
       if (!new)
         sic_fatal ("Memory exhausted");
       return new;
     }

     void *
     xrealloc (void *p, size_t num)
     {
       void *new;

       if (!p)
         return xmalloc (num);

       new = realloc (p, num);
       if (!new)
         sic_fatal ("Memory exhausted");

       return new;
     }

     void *
     xcalloc (size_t num, size_t size)
     {
       void *new = xmalloc (num * size);
       bzero (new, num * size);
       return new;
     }

Notice in the code above, that `xcalloc' is implemented in terms of `xmalloc', since `calloc' itself is not available in some older C libraries. Also, the `bzero' function is actually deprecated in favour of `memset' in modern C libraries - I'll explain how to take this into account later in *Note Beginnings of a configure.in for Small Project::.
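As mentioned above, `xstrdup' follows the same pattern; the project actually borrows its implementation from libiberty (see below), but a minimal version of the idiom would look something like this sketch - illustrative only, not the libiberty code:

     #include "common.h"

     char *
     xstrdup (const char *string)
     {
       /* Like strdup, except that memory exhaustion is fatal rather
          than returning NULL, so callers need not check the result. */
       char *new = (char *) xmalloc (strlen (string) + 1);
       strcpy (new, string);
       return new;
     }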
Rather than create a separate `xmalloc.h' file, which would need to be `#include'd from almost everywhere else, the logical place to declare these functions is in `common.h', since the wrappers will be called from almost everywhere else in the code:

     #ifdef __cplusplus
     # define BEGIN_C_DECLS  extern "C" {
     # define END_C_DECLS    }
     #else
     # define BEGIN_C_DECLS
     # define END_C_DECLS
     #endif

     #define XCALLOC(type, num) \
             ((type *) xcalloc ((num), sizeof(type)))
     #define XMALLOC(type, num) \
             ((type *) xmalloc ((num) * sizeof(type)))
     #define XREALLOC(type, p, num) \
             ((type *) xrealloc ((p), (num) * sizeof(type)))
     #define XFREE(stale) do { \
             if (stale) { free (stale); stale = 0; } \
             } while (0)

     BEGIN_C_DECLS

     extern void *xcalloc   (size_t num, size_t size);
     extern void *xmalloc   (size_t num);
     extern void *xrealloc  (void *p, size_t num);
     extern char *xstrdup   (const char *string);
     extern char *xstrerror (int errnum);

     END_C_DECLS

By using the macros defined here, allocating and freeing heap memory is reduced from:

     char **argv = (char **) xmalloc (sizeof (char *) * 3);
     do_stuff (argv);
     if (argv)
       free (argv);

to the simpler and more readable:

     char **argv = XMALLOC (char *, 3);
     do_stuff (argv);
     XFREE (argv);

In the same spirit, I have borrowed `xstrdup.c' and `xstrerror.c' from project GNU's libiberty. *Note Fallback Function Implementations::.

File: autobook.info, Node: Generalised List Data Type, Prev: Memory Management, Up: Portability Infrastructure

9.2.1.3 Generalised List Data Type
..................................

In many C programs you will see various implementations and re-implementations of lists and stacks, each tied to its own particular project. It is surprisingly simple to write a catch-all implementation, as I have done here with a generalised list operation API in `list.h':

     #ifndef SIC_LIST_H
     #define SIC_LIST_H 1

     #include <sic/common.h>

     BEGIN_C_DECLS

     typedef struct list {
       struct list *next;   /* chain forward pointer */
       void *userdata;      /* in case you want to use raw Lists */
     } List;

     extern List *list_new      (void *userdata);
     extern List *list_cons     (List *head, List *tail);
     extern List *list_tail     (List *head);
     extern size_t list_length  (List *head);

     END_C_DECLS

     #endif /* !SIC_LIST_H */

The trick is to ensure that any structures you want to chain together have their forward pointer in the first field. Having done that, the generic functions declared above can be used to manipulate any such chain by casting it to `List *' and back again as necessary. For example:

     struct foo {
       struct foo *next;

       char *bar;
       struct baz *qux;
       ...
     };

     ...
       struct foo *foo_list = NULL;

       foo_list = (struct foo *) list_cons ((List *) new_foo (),
                                            (List *) foo_list);
     ...

The implementation of the list manipulation functions is in `list.c':

     #include "list.h"

     List *
     list_new (void *userdata)
     {
       List *new = XMALLOC (List, 1);

       new->next = NULL;
       new->userdata = userdata;

       return new;
     }

     List *
     list_cons (List *head, List *tail)
     {
       head->next = tail;
       return head;
     }

     List *
     list_tail (List *head)
     {
       return head->next;
     }

     size_t
     list_length (List *head)
     {
       size_t n;

       for (n = 0; head; ++n)
         head = head->next;

       return n;
     }

File: autobook.info, Node: Library Implementation, Next: Beginnings of a configure.in for Small Project, Prev: Portability Infrastructure, Up: A Simple Shell Builders Library

9.2.2 Library Implementation
----------------------------

In order to set the stage for the later chapters which expand upon this example, in this subsection I will describe the purpose of the sources that combine to implement the shell library.
I will not dissect the code introduced here--you can download the sources from the book's webpages at `http://sources.redhat.com/autobook/'.

The remaining sources for the library, beyond the support files described in the previous subsection, are divided into four pairs of files:

* Menu:

* sic.c & sic.h::
* builtin.c & builtin.h::
* eval.c & eval.h::
* syntax.c & syntax.h::

File: autobook.info, Node: sic.c & sic.h, Next: builtin.c & builtin.h, Up: Library Implementation

9.2.2.1 `sic.c' & `sic.h'
.........................

Here are the functions for creating and managing sic parsers.

     #ifndef SIC_SIC_H
     #define SIC_SIC_H 1

     #include <sic/common.h>
     #include <sic/builtin.h>
     #include <sic/list.h>
     #include <sic/syntax.h>

     typedef struct sic {
       char *result;                /* result string */
       size_t len;                  /* bytes used by result field */
       size_t lim;                  /* bytes allocated to result field */
       struct builtintab *builtins; /* tables of builtin functions */
       SyntaxTable **syntax;        /* dispatch table for syntax of input */
       List *syntax_init;           /* stack of syntax state initialisers */
       List *syntax_finish;         /* stack of syntax state finalizers */
       SicState *state;             /* state data from syntax extensions */
     } Sic;

     #endif /* !SIC_SIC_H */

This structure has fields to store registered command (`builtins') and syntax (`syntax') handlers, along with other state information (`state') that can be used to share information between various handlers, and some room to build a result or error string (`result').

File: autobook.info, Node: builtin.c & builtin.h, Next: eval.c & eval.h, Prev: sic.c & sic.h, Up: Library Implementation

9.2.2.2 `builtin.c' & `builtin.h'
.................................

Here are the functions for managing tables of builtin commands in each `Sic' structure:

     typedef int (*builtin_handler) (Sic *sic,
                                     int argc, char *const argv[]);

     typedef struct {
       const char *name;
       builtin_handler func;
       int min, max;
     } Builtin;

     typedef struct builtintab BuiltinTab;

     extern Builtin *builtin_find (Sic *sic, const char *name);
     extern int builtin_install   (Sic *sic, Builtin *table);
     extern int builtin_remove    (Sic *sic, Builtin *table);

File: autobook.info, Node: eval.c & eval.h, Next: syntax.c & syntax.h, Prev: builtin.c & builtin.h, Up: Library Implementation

9.2.2.3 `eval.c' & `eval.h'
...........................

Having created a `Sic' parser, and populated it with some `Builtin' handlers, a user of this library must tokenize and evaluate its input stream. These files define a structure for storing tokenized strings (`Tokens'), and functions for converting `char *' strings both to and from this structure type:

     #ifndef SIC_EVAL_H
     #define SIC_EVAL_H 1

     #include <sic/common.h>
     #include <sic/sic.h>

     BEGIN_C_DECLS

     typedef struct {
       int  argc;       /* number of elements in ARGV */
       char **argv;     /* array of pointers to elements */
       size_t lim;      /* number of bytes allocated */
     } Tokens;

     extern int eval       (Sic *sic, Tokens *tokens);
     extern int untokenize (Sic *sic, char **pcommand, Tokens *tokens);
     extern int tokenize   (Sic *sic, Tokens **ptokens, char **pcommand);

     END_C_DECLS

     #endif /* !SIC_EVAL_H */

These files also define the `eval' function, which examines a `Tokens' structure in the context of the given Sic parser, dispatching the `argv' array to a relevant `Builtin' handler, also written by the library user.

File: autobook.info, Node: syntax.c & syntax.h, Prev: eval.c & eval.h, Up: Library Implementation

9.2.2.4 `syntax.c' & `syntax.h'
...............................

When `tokenize' splits a `char *' string into parts, by default it breaks the string into words delimited by whitespace.
These files define the interface for changing this default behaviour, by registering callback functions which the parser will run when it meets an `interesting' symbol in the input stream. Here are the declarations from `syntax.h':

     BEGIN_C_DECLS

     typedef int SyntaxHandler (struct sic *sic, BufferIn *in,
                                BufferOut *out);

     typedef struct syntax {
       SyntaxHandler *handler;
       char *ch;
     } Syntax;

     extern int syntax_install (struct sic *sic, Syntax *table);
     extern SyntaxHandler *syntax_handler (struct sic *sic, int ch);

     END_C_DECLS

A `SyntaxHandler' is a function called by `tokenize' as it consumes its input to create a `Tokens' structure; the two functions associate a table of such handlers with a given `Sic' parser, and find the particular handler for a given character in that `Sic' parser, respectively.

File: autobook.info, Node: Beginnings of a configure.in for Small Project, Prev: Library Implementation, Up: A Simple Shell Builders Library

9.2.3 Beginnings of a `configure.in'
------------------------------------

Now that I have some code, I can run `autoscan' to generate a preliminary `configure.in'. `autoscan' will examine all of the sources in the current directory tree looking for common points of non-portability, adding macros suitable for detecting the discovered problems. `autoscan' generates the following in `configure.scan':

     # Process this file with autoconf to produce a configure script.
     AC_INIT(sic/eval.h)

     # Checks for programs.

     # Checks for libraries.

     # Checks for header files.
     AC_HEADER_STDC
     AC_CHECK_HEADERS(strings.h unistd.h)

     # Checks for typedefs, structures, and compiler characteristics.
     AC_C_CONST
     AC_TYPE_SIZE_T

     # Checks for library functions.
     AC_FUNC_VPRINTF
     AC_CHECK_FUNCS(strerror)

     AC_OUTPUT()

Since the generated `configure.scan' does not overwrite your project's `configure.in', it is a good idea to run `autoscan' periodically even in established project source trees, and compare the two files. Sometimes `autoscan' will find some portability issue you have overlooked, or weren't aware of.

Looking through the documentation for the macros in this `configure.scan', `AC_C_CONST' and `AC_TYPE_SIZE_T' will take care of themselves (provided I ensure that `config.h' is included in every source file), and `AC_HEADER_STDC' and `AC_CHECK_HEADERS(unistd.h)' are already taken care of in `common.h'.

`autoscan' is no silver bullet! Even here in this simple example, I need to manually add macros to check for the presence of `errno.h':

     AC_CHECK_HEADERS(errno.h strings.h unistd.h)

I also need to manually add the Autoconf macro for generating `config.h'; a macro to initialise `automake' support; and a macro to check for the presence of `ranlib'. These should go close to the start of `configure.in':

     ...
     AC_CONFIG_HEADER(config.h)
     AM_INIT_AUTOMAKE(sic, 0.5)

     AC_PROG_CC
     AC_PROG_RANLIB
     ...

Recall that the use of `bzero' in *Note Memory Management:: is not entirely portable. The trick is to provide a `bzero' work-alike, depending on which functions Autoconf detects, by adding the following towards the end of `configure.in':

     ...
     AC_CHECK_FUNCS(bzero memset, break)
     ...

With the addition of this small snippet of code to `common.h', I can now make use of `bzero' even when linking with a C library that has no implementation of its own:

     #if !HAVE_BZERO && HAVE_MEMSET
     # define bzero(buf, bytes)  ((void) memset (buf, 0, bytes))
     #endif
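Pulling these additions together, the hand-maintained `configure.in' at this point in the project would look something like the following sketch - a reconstruction for illustration only; the ordering of the checks follows the comments in `configure.scan', and the `strerror' check it suggested is dealt with next:

     # Process this file with autoconf to produce a configure script.
     AC_INIT(sic/eval.h)
     AC_CONFIG_HEADER(config.h)
     AM_INIT_AUTOMAKE(sic, 0.5)

     # Checks for programs.
     AC_PROG_CC
     AC_PROG_RANLIB

     # Checks for header files.
     AC_HEADER_STDC
     AC_CHECK_HEADERS(errno.h strings.h unistd.h)

     # Checks for typedefs, structures, and compiler characteristics.
     AC_C_CONST
     AC_TYPE_SIZE_T

     # Checks for library functions.
     AC_FUNC_VPRINTF
     AC_CHECK_FUNCS(strerror)
     AC_CHECK_FUNCS(bzero memset, break)

     AC_OUTPUT()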
An interesting macro suggested by `autoscan' is `AC_CHECK_FUNCS(strerror)'. This tells me that I need to provide a replacement implementation of `strerror' for the benefit of architectures which don't have it in their system libraries. This is resolved by providing a file with a fallback implementation for the named function, and creating a library from it and any others that `configure' discovers to be lacking from the system library on the target host.

You will recall that `configure' is the shell script the end user of this package will run on their machine to test that it has all the features the package wants to use. The library that is created will allow the rest of the project to be written in the knowledge that any functions required by the project but missing from the installer's system libraries will be available nonetheless.

GNU `libiberty' comes to the rescue again - it already has an implementation of `strerror.c' that I was able to use with a little modification.

Being able to supply a simple implementation of `strerror', as the `strerror.c' file from `libiberty' does, relies on there being a well defined `sys_errlist' variable. It is a fair bet, however, that if the target host has no `strerror' implementation, the system `sys_errlist' will be broken or missing. I need to write a configure macro to check whether the system defines `sys_errlist', and tailor the code in `strerror.c' to use this knowledge.

To avoid clutter in the top-level directory, I am a great believer in keeping as many of the configuration files as possible in their own sub-directory. First of all, I will create a new directory called `config' inside the top-level directory, and put `sys_errlist.m4' inside it:

     AC_DEFUN(SIC_VAR_SYS_ERRLIST,
     [AC_CACHE_CHECK([for sys_errlist],
     sic_cv_var_sys_errlist,
     [AC_TRY_LINK([int *p;], [extern int sys_errlist; p = &sys_errlist;],
                 sic_cv_var_sys_errlist=yes, sic_cv_var_sys_errlist=no)])
     if test x"$sic_cv_var_sys_errlist" = xyes; then
       AC_DEFINE(HAVE_SYS_ERRLIST, 1,
         [Define if your system libraries have a sys_errlist variable.])
     fi])

I must then add a call to this new macro in the `configure.in' file, being careful to put it in the right place - somewhere between _typedefs and structures_ and _library functions_ according to the comments in `configure.scan':

     SIC_VAR_SYS_ERRLIST

GNU Autotools can also be set to store most of their files in a subdirectory, by calling the `AC_CONFIG_AUX_DIR' macro near the top of `configure.in', preferably right after `AC_INIT':

     AC_INIT(sic/eval.c)
     AC_CONFIG_AUX_DIR(config)
     AM_CONFIG_HEADER(config.h)
     ...

Having made this change, many of the files added by running `autoconf' and `automake --add-missing' will be put in the "aux_dir".

The source tree now looks like this:

     sic/
       +-- configure.scan
       +-- config/
       |     +-- sys_errlist.m4
       +-- replace/
       |     +-- strerror.c
       +-- sic/
             +-- builtin.c
             +-- builtin.h
             +-- common.h
             +-- error.c
             +-- error.h
             +-- eval.c
             +-- eval.h
             +-- list.c
             +-- list.h
             +-- sic.c
             +-- sic.h
             +-- syntax.c
             +-- syntax.h
             +-- xmalloc.c
             +-- xstrdup.c
             +-- xstrerror.c

In order to correctly utilise the fallback implementation, `AC_CHECK_FUNCS(strerror)' needs to be removed and `strerror' added to `AC_REPLACE_FUNCS':

     # Checks for library functions.
     AC_REPLACE_FUNCS(strerror)
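The modified `strerror.c' itself is not reproduced in full here, but its shape might look something like the following sketch - illustrative only, since the real file is adapted from libiberty, and the exact declarations of `sys_errlist' and `sys_nerr' vary from system to system:

     /* strerror.c -- fallback implementation, only ever compiled in
        (via @LIBOBJS@) when the C library has no strerror of its own. */

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #if HAVE_SYS_ERRLIST
     /* Declarations differ between systems; some declare these
        in <stdio.h> or <errno.h>. */
     extern char *sys_errlist[];
     extern int sys_nerr;
     #endif

     char *
     strerror (int errnum)
     {
     #if HAVE_SYS_ERRLIST
       if (errnum >= 0 && errnum < sys_nerr)
         return sys_errlist[errnum];
     #endif
       return (char *) "unknown error";
     }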
How the pieces fit together will be clearer if you look at the `Makefile.am' for the `replace' subdirectory:

     ## Makefile.am -- Process this file with automake to produce Makefile.in

     INCLUDES             = -I$(top_builddir) -I$(top_srcdir)

     noinst_LIBRARIES     = libreplace.a
     libreplace_a_SOURCES =
     libreplace_a_LIBADD  = @LIBOBJS@

This code tells `automake' that I want to build a library for use within the build tree (i.e. not installed - `noinst'), and that it has no source files by default. The clever part here is that when someone comes to install Sic, they will run `configure' which will test for `strerror', and add `strerror.o' to `LIBOBJS' if the target host environment is missing its own implementation. Now, when `configure' creates `replace/Makefile' (as I asked it to with `AC_OUTPUT'), `@LIBOBJS@' is replaced by the list of objects required on the installer's machine.

Having done all this at configure time, when my user runs `make', the files required to replace functions missing from their target machine will be added to `libreplace.a'.

Unfortunately this is not quite enough to start building the project. First I need to add a top-level `Makefile.am' from which to ultimately create a top-level `Makefile' that will descend into the various subdirectories of the project:

     ## Makefile.am -- Process this file with automake to produce Makefile.in

     SUBDIRS = replace sic

And `configure.in' must be told where it can find instances of `Makefile.in':

     AC_OUTPUT(Makefile replace/Makefile sic/Makefile)

I have written a `bootstrap' script for Sic, for details see *Note Bootstrapping:::

     #! /bin/sh

     set -x
     aclocal -I config
     autoheader
     automake --foreign --add-missing --copy
     autoconf

The `--foreign' option to `automake' tells it to relax the GNU standards for various files that should be present in a GNU distribution. Using this option saves me from having to create empty files as we did in *Note A Minimal GNU Autotools Project::.

Right. Let's build the library! First, I'll run `bootstrap':

     $ ./bootstrap
     + aclocal -I config
     + autoheader
     + automake --foreign --add-missing --copy
     automake: configure.in: installing config/install-sh
     automake: configure.in: installing config/mkinstalldirs
     automake: configure.in: installing config/missing
     + autoconf

The project is now in the same state that an end-user would see, having unpacked a distribution tarball. What follows is what an end user might expect to see when building from that tarball:

     $ ./configure
     creating cache ./config.cache
     checking for a BSD compatible install... /usr/bin/install -c
     checking whether build environment is sane... yes
     checking whether make sets ${MAKE}... yes
     checking for working aclocal... found
     checking for working autoconf... found
     checking for working automake... found
     checking for working autoheader... found
     checking for working makeinfo... found
     checking for gcc... gcc
     checking whether the C compiler (gcc ) works... yes
     checking whether the C compiler (gcc ) is a cross-compiler... no
     checking whether we are using GNU C... yes
     checking whether gcc accepts -g... yes
     checking for ranlib... ranlib
     checking how to run the C preprocessor... gcc -E
     checking for ANSI C header files... yes
     checking for unistd.h... yes
     checking for errno.h... yes
     checking for string.h... yes
     checking for working const... yes
     checking for size_t... yes
     checking for strerror... yes
     updating cache ./config.cache
     creating ./config.status
     creating Makefile
     creating replace/Makefile
     creating sic/Makefile
     creating config.h

Compare this output with the contents of `configure.in', and notice how each macro is ultimately responsible for one or more consecutive tests (via the Bourne shell code generated in `configure'). Now that the `Makefile's have been successfully created, it is safe to call `make' to perform the actual compilation:

     $ make
     make all-recursive
     make[1]: Entering directory `/tmp/sic'
     Making all in replace
     make[2]: Entering directory `/tmp/sic/replace'
     rm -f libreplace.a
     ar cru libreplace.a
     ranlib libreplace.a
     make[2]: Leaving directory `/tmp/sic/replace'
     Making all in sic
     make[2]: Entering directory `/tmp/sic/sic'
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c builtin.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c error.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c eval.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c list.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c sic.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c syntax.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c xmalloc.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c xstrdup.c
     gcc -DHAVE_CONFIG_H -I. -I. -I.. -I.. -g -O2 -c xstrerror.c
     rm -f libsic.a
     ar cru libsic.a builtin.o error.o eval.o list.o sic.o syntax.o \
     xmalloc.o xstrdup.o xstrerror.o
     ranlib libsic.a
     make[2]: Leaving directory `/tmp/sic/sic'
     make[1]: Leaving directory `/tmp/sic'

On this machine, as you can see from the output of `configure' above, I have no need of the fallback implementation of `strerror', so `libreplace.a' is empty. On another machine this might not be the case. In any event, I now have a compiled `libsic.a' - so far, so good.

File: autobook.info, Node: A Sample Shell Application, Prev: A Simple Shell Builders Library, Up: A Small GNU Autotools Project

9.3 A Sample Shell Application
==============================

What I need now is a program that uses `libsic.a', if only to give me confidence that it is working. In this section, I will write a simple shell which uses the library. But first, I'll create a directory to put it in:

     $ mkdir src
     $ ls -F
     COPYING    Makefile.am  aclocal.m4  configure*    config/   sic/
     INSTALL    Makefile.in  bootstrap*  configure.in  replace/  src/
     $ cd src

In order to put this shell together, we need to provide just a few things for integration with `libsic.a'...

* Menu:

* sic_repl.c::
* sic_syntax.c::
* sic_builtin.c::
* sic.c & sic.h (again)::

File: autobook.info, Node: sic_repl.c, Next: sic_syntax.c, Up: A Sample Shell Application

9.3.1 `sic_repl.c'
------------------

In `sic_repl.c'(1) there is a loop for reading strings typed by the user, evaluating them and printing the results. GNU readline is ideally suited to this, but it is not always available - or sometimes people simply may not wish to use it.

With the help of GNU Autotools, it is very easy to cater for building with and without GNU readline. `sic_repl.c' uses this function to read lines of input from the user:

     static char *
     getline (FILE *in, const char *prompt)
     {
       static char *buf = NULL;  /* Always allocated and freed
                                    from inside this function. */
       XFREE (buf);

       buf = (char *) readline ((char *) prompt);

     #ifdef HAVE_ADD_HISTORY
       if (buf && *buf)
         add_history (buf);
     #endif

       return buf;
     }

To make this work, I must write an Autoconf macro which adds an option to `configure', so that when the package is installed, it will use the readline library if `--with-readline' is used:

     AC_DEFUN(SIC_WITH_READLINE,
     [AC_ARG_WITH(readline,
     [  --with-readline         compile with the system readline library],
     [if test x"${withval-no}" != xno; then
       sic_save_LIBS=$LIBS
       AC_CHECK_LIB(readline, readline)
       if test x"${ac_cv_lib_readline_readline}" = xno; then
         AC_MSG_ERROR(libreadline not found)
       fi
       LIBS=$sic_save_LIBS
     fi])
     AM_CONDITIONAL(WITH_READLINE, test x"${with_readline-no}" != xno)
     ])

Having put this macro in the file `config/readline.m4', I must also call the new macro (`SIC_WITH_READLINE') from `configure.in'.

---------- Footnotes ----------

(1) Read Eval Print Loop.

File: autobook.info, Node: sic_syntax.c, Next: sic_builtin.c, Prev: sic_repl.c, Up: A Sample Shell Application

9.3.2 `sic_syntax.c'
--------------------

The syntax of the commands in the shell I am writing is defined by a set of syntax handlers which are loaded into `libsic' at startup. I can get the C preprocessor to do most of the repetitive code for me, and just fill in the function bodies:

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include "sic.h"

     /* List of builtin syntax. */
     #define syntax_functions                \
             SYNTAX(escape,  "\\")           \
             SYNTAX(space,   " \f\n\r\t\v")  \
             SYNTAX(comment, "#")            \
             SYNTAX(string,  "\"")           \
             SYNTAX(endcmd,  ";")            \
             SYNTAX(endstr,  "")

     /* Prototype Generator. */
     #define SIC_SYNTAX(name)                \
             int name (Sic *sic, BufferIn *in, BufferOut *out)

     #define SYNTAX(name, string)            \
             extern SIC_SYNTAX (CONC (syntax_, name));
     syntax_functions
     #undef SYNTAX

     /* Syntax handler mappings. */
     Syntax syntax_table[] = {

     #define SYNTAX(name, string)            \
             { CONC (syntax_, name), string },
       syntax_functions
     #undef SYNTAX

       { NULL, NULL }
     };

This code writes the prototypes for the syntax handler functions, and creates a table which associates each with one or more characters that might occur in the input stream. The advantage of writing the code this way is that when I want to add a new syntax handler later, it is a simple matter of adding a new row to the `syntax_functions' macro, and writing the function itself.
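For example, teaching the shell to treat a (hypothetical) pipe character specially would amount to one new row and one new function. The `syntax_pipe' handler below is purely illustrative and not part of the Sic sources:

     /* New row for the syntax_functions table: */
             SYNTAX(pipe,    "|")

     /* And the matching handler, using the signature declared by
        the SIC_SYNTAX macro above: */
     int
     syntax_pipe (Sic *sic, BufferIn *in, BufferOut *out)
     {
       /* Consume characters from IN, write any rewritten text to OUT,
          and record shared state in SIC as required. */
       return SIC_OKAY;  /* assuming the same status codes used by
                            the builtin handlers */
     }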
File: autobook.info, Node: sic_builtin.c, Next: sic.c & sic.h (again), Prev: sic_syntax.c, Up: A Sample Shell Application

9.3.3 `sic_builtin.c'
---------------------

In addition to the syntax handlers I have just added to the Sic shell, the language of this shell is also defined by the builtin commands it provides. The infrastructure for this file is built from a table of functions which is fed into various C preprocessor macros, just as I did for the syntax handlers.

One builtin handler function has special status, `builtin_unknown'. This is the builtin that is called if the Sic library cannot find a suitable builtin function to handle the current input command. At first this doesn't sound especially important - but it is the key to any shell implementation. When there is no builtin handler for the command, the shell will search the user's command path, `$PATH', to find a suitable executable. And this is the job of `builtin_unknown':

     int
     builtin_unknown (Sic *sic, int argc, char *const argv[])
     {
       char *path = path_find (argv[0]);
       int status = SIC_ERROR;

       if (!path)
         {
           sic_result_append (sic, "command \"");
           sic_result_append (sic, argv[0]);
           sic_result_append (sic, "\" not found");
         }
       else if (path_execute (sic, path, argv) != SIC_OKAY)
         {
           sic_result_append (sic, "command \"");
           sic_result_append (sic, argv[0]);
           sic_result_append (sic, "\" failed: ");
           sic_result_append (sic, strerror (errno));
         }
       else
         status = SIC_OKAY;

       return status;
     }

     static char *
     path_find (const char *command)
     {
       char *path = xstrdup (command);

       if (*command == '/')
         {
           if (access (command, X_OK) < 0)
             goto notfound;
         }
       else
         {
           char *PATH = getenv ("PATH");
           char *pbeg, *pend;
           size_t len;

           for (pbeg = PATH; *pbeg != '\0'; pbeg = pend)
             {
               pbeg += strspn (pbeg, ":");
               len = strcspn (pbeg, ":");
               pend = pbeg + len;
               path = XREALLOC (char, path, 2 + len + strlen (command));
               *path = '\0';
               strncat (path, pbeg, len);
               if (path[len - 1] != '/')
                 strcat (path, "/");
               strcat (path, command);

               if (access (path, X_OK) == 0)
                 break;
             }

           if (*pbeg == '\0')
             goto notfound;
         }

       return path;

     notfound:
       XFREE (path);
       return NULL;
     }

Running `autoscan' again at this point adds `AC_CHECK_FUNCS(strcspn strspn)' to `configure.scan'. This tells me that these functions are not truly portable. As before I provide fallback implementations for these functions in case they are missing from the target host - and as it turns out, they are easy to write:

     /* strcspn.c -- implement strcspn() for architectures without it */

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include <sys/types.h>

     #if STDC_HEADERS
     # include <string.h>
     #elif HAVE_STRINGS_H
     # include <strings.h>
     #endif

     #if !HAVE_STRCHR
     # ifndef strchr
     #  define strchr index
     # endif
     #endif

     size_t
     strcspn (const char *string, const char *reject)
     {
       size_t count = 0;
       while (strchr (reject, *string) == 0)
         ++count, ++string;

       return count;
     }

There is no need to add any code to `Makefile.am', because the `configure' script will automatically add the names of the missing function sources to `@LIBOBJS@'. This implementation uses the `autoconf' generated `config.h' to get information about the availability of headers and type definitions.
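The matching `strspn' fallback is equally short; here is one way it might be written, in the same style - a sketch rather than the actual file from the Sic sources:

     /* strspn.c -- implement strspn() for architectures without it */

     #if HAVE_CONFIG_H
     # include <config.h>
     #endif

     #include <sys/types.h>

     #if STDC_HEADERS
     # include <string.h>
     #elif HAVE_STRINGS_H
     # include <strings.h>
     #endif

     #if !HAVE_STRCHR
     # ifndef strchr
     #  define strchr index
     # endif
     #endif

     size_t
     strspn (const char *string, const char *accept)
     {
       size_t count = 0;

       /* The explicit *string test is needed here, where strcspn can
          rely on strchr stopping the loop of its own accord: strchr
          matches the terminating NUL in ACCEPT, so without the test
          the loop would run past the end of STRING. */
       while (*string && strchr (accept, *string) != 0)
         ++count, ++string;

       return count;
     }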
It is interesting that `autoscan' reports that `strchr' and `strrchr', which are used in the fallback implementations of `strcspn' and `strspn' respectively, are themselves not portable! Luckily, the Autoconf manual tells me exactly how to deal with this: by adding some code to my `common.h' (paraphrased from the literal code in the manual):

     #if !STDC_HEADERS
     # if !HAVE_STRCHR
     #  define strchr index
     #  define strrchr rindex
     # endif
     #endif

And another macro in `configure.in':

     AC_CHECK_FUNCS(strchr strrchr)

File: autobook.info, Node: sic.c & sic.h (again), Prev: sic_builtin.c, Up: A Sample Shell Application

9.3.4 `sic.c' & `sic.h'
-----------------------

Since the application binary has no installed header files, there is little point in maintaining a corresponding header file for every source; all of the structures shared by these files, and the non-static functions in these files, are declared in `sic.h':

     #ifndef SIC_H
     #define SIC_H 1

     #include <sic/common.h>
     #include <sic/sic.h>
     #include <sic/builtin.h>

     BEGIN_C_DECLS

     extern Syntax syntax_table[];
     extern Builtin builtin_table[];

     extern int evalstream    (Sic *sic, FILE *stream);
     extern int evalline      (Sic *sic, char **pline);
     extern int source        (Sic *sic, const char *path);
     extern int syntax_init   (Sic *sic);
     extern int syntax_finish (Sic *sic, BufferIn *in, BufferOut *out);

     END_C_DECLS

     #endif /* !SIC_H */

To hold together everything you have seen so far, the `main' function creates a Sic parser and initialises it by adding syntax handler functions and builtin functions from the two tables defined earlier, before handing control to `evalstream' which will eventually exit when the input stream is exhausted.

     int
     main (int argc, char * const argv[])
     {
       int result = EXIT_SUCCESS;
       Sic *sic = sic_new ();

       /* initialise the system */
       if (sic_init (sic) != SIC_OKAY)
         sic_fatal ("sic initialisation failed");
       signal (SIGINT, SIG_IGN);
       setbuf (stdout, NULL);

       /* initial symbols */
       sicstate_set (sic, "PS1", "] ", NULL);
       sicstate_set (sic, "PS2", "- ", NULL);

       /* evaluate the input stream */
       evalstream (sic, stdin);

       exit (result);
     }

Now, the shell can be built and used:

     $ bootstrap
     ...
     $ ./configure --with-readline
     ...
     $ make
     ...
     make[2]: Entering directory `/tmp/sic/src'
     gcc -DHAVE_CONFIG_H -I. -I.. -I../sic -I.. -I../sic -g -c sic.c
     gcc -DHAVE_CONFIG_H -I. -I.. -I../sic -I.. -I../sic -g -c sic_builtin.c
     gcc -DHAVE_CONFIG_H -I. -I.. -I../sic -I.. -I../sic -g -c sic_repl.c
     gcc -DHAVE_CONFIG_H -I. -I.. -I../sic -I.. -I../sic -g -c sic_syntax.c
     gcc -g -O2 -o sic sic.o sic_builtin.o sic_repl.o sic_syntax.o \
     ../sic/libsic.a ../replace/libreplace.a -lreadline
     make[2]: Leaving directory `/tmp/sic/src'
     ...
     $ ./src/sic
     ] pwd
     /tmp/sic
     ] ls -F
     Makefile      aclocal.m4   config.cache    configure*    sic/
     Makefile.am   bootstrap*   config.log      configure.in  src/
     Makefile.in   config/      config.status*  replace/
     ] exit
     $

This chapter has developed a solid foundation of code, which I will return to in *Note A Large GNU Autotools Project::, when Libtool will join the fray. The chapters leading up to that explain what Libtool is for, how to use it and integrate it into your own projects, and the advantages it offers over building shared libraries with Automake (or even just Make) alone.

File: autobook.info, Node: Introducing GNU Libtool, Next: Using GNU Libtool, Prev: A Small GNU Autotools Project, Up: Top

10 Introducing GNU Libtool
**************************

Libtool takes care of all the peculiarities of creating, linking and loading shared and static libraries across a great number of platforms, providing a uniform command line interface to the developer.
By using Libtool to manage your project libraries, you only need to concern yourself with _Libtool's_ interface: when someone else builds your project on a platform with a different library architecture, Libtool invokes that platform's compiler and linker with the correct environment and command line switches. It will install libraries and library-using binaries according to the conventions of the host platform, and follow that platform's rules for library versioning and library interdependencies.

Libtool empowers you to treat a library as an implementation of a well defined interface of your choosing. This "Libtool library" may be manifest as a collection of compiler objects, a static `ar' archive, or a position independent runtime loadable object. By definition, native libraries are fully supported by Libtool since they are an implementation detail of the Libtool library abstraction. It's just that until Libtool achieves complete world domination, you might need to bear in mind what is going on behind the command line interface when you first add Libtool support to your project.

The sheer number of uses of the word `library' in this book could easily be very confusing. In this chapter and throughout the rest of the book, I will refer to various kinds of libraries as follows:

`native'
     Low level libraries, that is, libraries provided by the host
     architecture.

`Libtool library'
     The kind of library built by Libtool. This encompasses both the
     shared and static native components of the implementation of the
     named library.

`pseudo-library'
     The high level `.la' file produced by Libtool. The
     `pseudo-library' is not a library in its own right, but is treated
     as if it were from outside the Libtool interface.

Furthermore, in the context of Libtool, there is another subtle (but important) distinction to be drawn:

`static _library_'
     A Libtool library which has no shared archive component.

`static _archive_'
     The static component of a Libtool library.

Many developers use Libtool as a black box which requires adding a few macros to `configure.in' and tweaking a project's `Makefile.am'. The next chapter addresses that school of thought in more detail. In this chapter I will talk a little about the inner workings of Libtool, and show you how it can be used directly from your shell prompt - how to build various kinds of library, and how those libraries can be used by an application.

Before you can do any of this, you need to create a `libtool' script that is tailored to the platform you are using it from.

* Menu:

* Creating libtool::
* The Libtool Library::
* Linking an Executable::
* Linking a Library::
* Executing Uninstalled Binaries::
* Installing a Library::
* Installing an Executable::
* Uninstalling::

File: autobook.info, Node: Creating libtool, Next: The Libtool Library, Up: Introducing GNU Libtool

10.1 Creating `libtool'
=======================

When you install a distribution of Libtool on your development machine, a host specific `libtool' program is installed. The examples in the rest of this chapter use this installed instance of `libtool'.

When you start to use Libtool in the build process of your own projects, you shouldn't require that `libtool' be installed on the user's machine, particularly since they may have a different `libtool' version to the one used to develop your project. Instead, distribute some of the files installed by the Libtool distribution along with your project, and custom build a `libtool' script on the user's machine before invoking `./libtool' to build any objects.
If you use Autoconf and Automake, these details are taken care of automatically (*note Using GNU Libtool with configure.in and Makefile.am: Using GNU Libtool.). Otherwise you should copy the following files from your own Libtool installation into the source tree of your own project:

     $ ls /usr/local/share/libtool
     config.guess  config.sub  libltdl  ltconfig  ltmain.in
     $ cp /usr/local/share/libtool/config.* /usr/local/share/libtool/lt* .
     $ ls
     config.guess  config.sub  ltconfig  ltmain.in

You must then arrange for your project build process to create an instance of `libtool' on the user's machine, so that it is dependent on their target system and not your development machine. The creation process requires the four files you just added to your project. Let's create a `libtool' instance by hand, so that you can see what is involved:

     $ ./config.guess
     hppa1.1-hp-hpux10.20
     $ ./ltconfig --disable-static --with-gcc ./ltmain.sh hppa1.1-hp-hpux10.20
     checking host system type... hppa1.1-hp-hpux10.20
     checking build system type... hppa1.1-hp-hpux10.20
     checking whether ln -s works... yes
     checking for ranlib... ranlib
     checking for BSD-compatible nm... /usr/bin/nm -p
     checking for strip... strip
     checking for gcc... gcc
     checking whether we are using GNU C... yes
     checking for objdir... .libs
     checking for object suffix... o
     checking for executable suffix... no
     checking for gcc option to produce PIC... -fPIC
     checking if gcc PIC flag -fPIC works... yes
     checking if gcc static flag -static works... yes
     checking if gcc supports -c -o file.o... yes
     checking if gcc supports -c -o file.lo... yes
     checking if gcc supports -fno-rtti -fno-exceptions ... no
     checking for ld used by GCC... /opt/gcc-lib/hp821/2.7.0/ld
     checking if the linker (/opt/gcc-lib/hp821/2.7.0/ld) is GNU ld... no
     checking whether the linker (/opt/gcc-lib/hp821/2.7.0/ld) supports \
       shared libraries... yes
     checking how to hardcode library paths into programs... relink
     checking whether stripping libraries is possible... yes
     checking for /opt/gcc-lib/hp821/2.7.0/ld option to reload object \
       files... -r
     checking dynamic linker characteristics... hpux10.20 dld.sl
     checking command to parse /usr/bin/nm -p output... ok
     checking if libtool supports shared libraries... yes
     checking whether to build shared libraries... yes
     checking whether to build static libraries... yes
     creating libtool
     $ ls
     config.guess  config.sub  ltconfig
     config.log    libtool     ltmain.sh
     $ ./libtool --version
     ltmain.sh (GNU libtool) 1.3c (1.629 1999/11/02 12:33:04)

The examples in this chapter are all performed on a HP-UX system, but the principles depicted are representative of any of the platforms to which Libtool has been ported (*note PLATFORMS::).

Often you don't need to specify any options, and if you omit the configuration triplet (*note Configuration Names::), `ltconfig' will run `config.guess' itself. There are several options you can specify which affect the generated `libtool', *Note Invoking ltconfig: (Libtool)Configuring Libtool. Unless your project has special requirements, you can usually use the simplified:

     $ ./ltconfig ./ltmain.sh

With the current release of Libtool, you must be careful that `$CC' is set to the same value when you call `ltconfig' as when you invoke the `libtool' it generates; otherwise `libtool' will use the compiler specified in `$CC' currently, but with the semantics probed by `ltconfig' for the compiler specified in `$CC' at the time _it_ was executed.
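In practice, the simplest way to honour this requirement is to pin down `CC' explicitly for both steps. For example (an illustrative session, not taken from the book's sources):

     $ CC=gcc ./ltconfig ./ltmain.sh
     ...
     $ ./libtool gcc -c hello.c

Had the script been generated with `CC=cc' instead, you would need to invoke `./libtool cc ...' to match.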
File: autobook.info, Node: The Libtool Library, Next: Linking an Executable, Prev: Creating libtool, Up: Introducing GNU Libtool

10.2 The Libtool Library
========================

A Libtool library is built from Libtool objects in the same way that a native (non-Libtool) library is built from native objects. Building a Libtool library with `libtool' is as easy as building an old style static archive. Generally, each of the sources is compiled to a "Libtool object", and then these objects are combined to create the library.

If you want to try this to see what `libtool' does on your machine, put the following code in a file `hello.c', in a directory of its own, and run the example shell commands from there:

     #include <stdio.h>

     void
     hello (char *who)
     {
       printf ("Hello, %s!\n", who);
     }

The traditional way to make a (native) static library is as follows:

     $ gcc -c hello.c
     $ ls
     hello.c  hello.o
     $ ar cru libhello.a hello.o
     $ ranlib libhello.a
     $ ls
     hello.c  hello.o  libhello.a

Notice that even when I just want to build an old static archive, I need to know that, in common with most Unices, I have to _bless_(1) my library with `ranlib' to make it work optimally on HP-UX.

Essentially, Libtool supports the building of three types of library: shared libraries; static libraries; and convenience libraries. In the following sections I will talk about each in turn, but first you will need to understand how to create and use "position independent" code, as explained in the next section.

* Menu:

* Position Independent Code::
* Creating Shared Libraries with libtool::
* Creating Static Libraries with libtool::
* Creating Convenience Libraries with libtool::

---------- Footnotes ----------

(1) Generally this involves indexing the symbols exported from the archive for faster linking, and to allow the archived objects to reference symbols from other objects earlier in the same archive.

File: autobook.info, Node: Position Independent Code, Next: Creating Shared Libraries with libtool, Up: The Libtool Library

10.2.1 Position Independent Code
--------------------------------

On most architectures, when you compile source code to object code, you need to specify whether the object code should be "position independent" or not. There are occasional architectures which don't make the distinction, usually because all object code is position independent by virtue of the ABI(1), or less often because the load address of the object is fixed at compile time (which implies that shared libraries are not supported by such a platform).

If an object is compiled as position independent code (PIC), then the operating system can load the object at _any_ address in preparation for execution. This involves a time overhead, in replacing direct address references with relative addresses at compile time, and a space overhead, in maintaining information to help the runtime loader fill in the unresolved addresses at runtime. Consequently, PIC objects are usually slightly larger and slower at runtime than the equivalent non-PIC object. The advantages of sharing library code on disk and in memory outweigh these problems as soon as the PIC object code in shared libraries is reused.

PIC compilation is exactly what is required for objects which will become part of a shared library. Consequently, `libtool' builds PIC objects for use in shared libraries and non-PIC objects for use in static libraries.
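You can usually see the distinction for yourself before involving `libtool', by asking the compiler for each flavour of object directly. With GCC, for example (the flag is compiler and platform dependent, which is exactly the sort of detail `libtool' exists to hide):

     $ gcc -c hello.c -o hello.o            # non-PIC object
     $ gcc -c -fPIC hello.c -o hello-pic.o  # PIC object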
Whenever `libtool' instructs the compiler to generate a PIC object, it also defines the preprocessor symbol, `PIC', so that assembly code can be aware of whether it will reside in a PIC object or not. Typically, as `libtool' is compiling sources, it will generate a `.lo' object, as PIC, and a `.o' object, as non-PIC, and then it will use the appropriate one of the pair when linking executables and libraries of various sorts. On architectures where there is no distinction, the `.lo' file is just a soft link to the `.o' file.

In practice, you can link PIC objects into a static archive for a small overhead in execution and load speed, and often you can similarly link non-PIC objects into shared archives. If you find that you need to do this, `libtool' provides several ways to override the default behavior (*note Creating libtool::).

   ---------- Footnotes ----------

   (1) Application Binary Interface: the layout of the bytes that comprise binary objects and executables: 32 or 64 bit words; procedure calling conventions; memory alignment rules; system call interface; order and type of the binary sections (data, code, etc.) and so on.

File: autobook.info, Node: Creating Shared Libraries with libtool, Next: Creating Static Libraries with libtool, Prev: Position Independent Code, Up: The Libtool Library

10.2.2 Creating Shared Libraries
--------------------------------

From Libtool's point of view, the term `shared library' is somewhat of a misnomer. Since Libtool is intended to abstract away the details of library building, it doesn't matter whether Libtool is building a shared library or a static archive. Of course, Libtool will always try to build a shared library by default on the platforms to which it has been ported (*note PLATFORMS::), but will equally fall back to building a static archive if the host architecture does not support shared libraries, or if the project developer deliberately configures Libtool to build only static archives. These libraries are more properly called `"Libtool libraries"'; the underlying native library will usually be a shared library, except as described above.

To create a Libtool library on my HP-UX host, or indeed anywhere else that `libtool' works, run the following commands:

     $ rm hello.o libhello.a
     $ libtool gcc -c hello.c
     mkdir .libs
     gcc -c -fPIC -DPIC hello.c -o .libs/hello.lo
     gcc -c hello.c -o hello.o >/dev/null 2>&1
     mv -f .libs/hello.lo hello.lo
     $ ls
     hello.c  hello.lo  hello.o
     $ libtool gcc -rpath /usr/local/lib -o libhello.la hello.lo
     rm -fr .libs/libhello.la .libs/libhello.* .libs/libhello.*
     /opt/gcc-lib/hp821/2.7.0/ld -b +h libhello.sl.0 +b /usr/local/lib \
       -o .libs/libhello.sl.0.0 hello.lo
     (cd .libs && rm -f libhello.sl.0 && ln -s libhello.sl.0.0 libhello.sl.0)
     (cd .libs && rm -f libhello.sl && ln -s libhello.sl.0.0 libhello.sl)
     ar cru .libs/libhello.a hello.o
     ranlib .libs/libhello.a
     creating libhello.la
     (cd .libs && rm -f libhello.la && ln -s ../libhello.la libhello.la)
     $ ls
     hello.c  hello.lo  hello.o  libhello.la

This example illustrates several features of `libtool'. Compare the command line syntax with the static archive example earlier (*note The Libtool Library::); the two are very similar. Notice, however, that when compiling the `hello.c' source file, `libtool' creates _two_ objects. The first, `hello.lo', is the "Libtool object", which we use for Libtool libraries, and the second, `hello.o', is a standard object.
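If you are curious, you can verify that the two objects really were compiled differently; on an architecture which makes no PIC distinction the two files would instead be identical. The `cmp' output shown here is only illustrative:

     $ cmp hello.lo hello.o
     hello.lo hello.o differ: char 25, line 1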
On HP-UX, `libtool' knows that Libtool objects should be compiled with "position independent code", hence the extra switches when creating the first object.

When you run `libtool' from the command line, you must also specify a compiler for it to call. Similarly, when you create a `libtool' script with `ltconfig', a compiler is chosen and interrogated to discover what characteristics it has. *Note Creating libtool::.

Prior to release 1.4 of Libtool, `ltconfig' probed the build machine for a suitable compiler by searching first for `gcc' and then `cc'. The functionality of `ltconfig' is being migrated into the `AC_PROG_LIBTOOL' macro, such that there will be no `ltconfig' script in Libtool release 1.5. The current release is part way between the two. In all cases, you can specify a particular compiler by setting the `CC' environment variable.

It is important to continue to use the same compiler when you run `libtool' as the compiler that was used when you created the `libtool' script. If you create the script with `CC' set to `gcc', and subsequently try to compile using, say:

     $ libtool c89 -rpath /usr/local/lib -c hello.c

`libtool' will try to call `c89' using the options it discovered for `gcc'. Needless to say, that doesn't work!

The link command specifies a Libtool library target, `libhello.la', compiled from a single Libtool object, `hello.lo'. Even so, `libtool' knows how to build both static and shared archives on HP-UX - underneath the `libtool' abstraction both are created. `libtool' also understands the particulars of library linking on HP-UX: the static archive, `libhello.a', is blessed; the system (and compiler) dependent compiler and linker flags, versioning scheme and `.sl' extension are utilised for the shared archive, `libhello.sl'. On another host, all of these details may be completely different, yet with exactly the same invocation, `libtool' will call the native tools with the appropriate options to achieve the same result. Try it on your own machines to see any differences.

It is the `-rpath' switch that tells `libtool' that you want to build a Libtool library (with both the shared and static components where possible). If you omit the `-rpath' switch, `libtool' will build a convenience library instead; see *Note Creating Convenience Libraries: Creating Convenience Libraries with libtool. The `-rpath' switch is doubly important, because it tells `libtool' that you intend to install `libhello.la' in `/usr/local/lib'. This allows `libtool' to finalize the library correctly after installation on the architectures that need it; see *Note Installing a Library::.

Finally, notice that only the Libtool library, `libhello.la', is visible after a successful link. The various files which form the local implementation details of the Libtool library are in a hidden subdirectory, but in order for the abstraction to work cleanly you shouldn't need to worry about these too much.
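Incidentally, `libhello.la' itself is nothing more than a small text file of shell variable assignments, recording what was built and where it is to be installed. The exact fields vary between Libtool releases; abridged, and with illustrative values, it looks something like this on the HP-UX host:

     $ cat libhello.la
     # libhello.la - a libtool library file
     dlname='libhello.sl.0'
     library_names='libhello.sl.0.0 libhello.sl.0 libhello.sl'
     old_library='libhello.a'
     dependency_libs=''
     libdir='/usr/local/lib'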
File: autobook.info, Node: Creating Static Libraries with libtool, Next: Creating Convenience Libraries with libtool, Prev: Creating Shared Libraries with libtool, Up: The Libtool Library

10.2.3 Creating Static Libraries
--------------------------------

In contrast, `libtool' will create a static library if either the `-static' or `-all-static' switch is specified on the link line for a Libtool library:

     $ libtool gcc -static -o libhello.la hello.lo
     rm -fr .libs/libhello.la .libs/libhello.* .libs/libhello.*
     ar cru .libs/libhello.a hello.o
     ranlib .libs/libhello.a
     creating libhello.la
     (cd .libs && rm -f libhello.la && ln -s ../libhello.la libhello.la)

Note that since `libtool' will only create a static archive, the `-rpath' switch is not required: once a static library has been installed, there is no need to perform additional finalization for the library to be used from the installed location(1), or to track runtime search paths when installing a static archive.

When you link an executable against this `libhello.la', the objects from the static archive will be statically linked into the executable. The advantage of such a library over the traditional native static archive is that all of the dependency information from the Libtool library is used. For an example, *Note Creating Convenience Libraries: Creating Convenience Libraries with libtool.

`libtool' is useful as a *general* library building toolkit, yet people still seem to regress to the old way of building libraries whenever they want to use static archives. You should exploit the consistent interface of `libtool' even for static archives. If you don't want to use shared archives, use the `-static' switch to build a static Libtool library.

   ---------- Footnotes ----------

   (1) As is often the case, AIX is peculiar in this respect - `ranlib' adds path information to a static archive, and must be run again after the archive is installed. `libtool' knows about this, and will automatically _bless_ the installed library again on AIX.

File: autobook.info, Node: Creating Convenience Libraries with libtool, Prev: Creating Static Libraries with libtool, Up: The Libtool Library

10.2.4 Creating Convenience Libraries
-------------------------------------

The third type of library which can be built with `libtool' is the "convenience library". Modern compilers are able to create "partially linked" objects: intermediate compilation units which comprise several compiled objects, but are neither an executable nor a library. Such partially linked objects must subsequently be linked into a library or executable to be useful. Libtool convenience libraries _are_ partially linked objects, but they are emulated by `libtool' on platforms with no native implementation.

If you want to try this to see what `libtool' does on your machine, put the following code in a file `trim.c', in the same directory as `hello.c' and `libhello.la', and run the example shell commands from there:

     #include <string.h>

     #define WHITESPACE_STR  " \f\n\r\t\v"

     /**
      * Remove whitespace characters from both ends of a copy of
      * '\0' terminated STRING and return the result.
      **/
     char *
     trim (char *string)
     {
       char *result = 0;

       /* Ignore NULL pointers.  */
       if (string)
         {
           char *ptr = string;

           /* Skip leading whitespace, taking care not to run off the
              end of an all-whitespace string: strchr also matches the
              terminating '\0'.  */
           while (*ptr && strchr (WHITESPACE_STR, *ptr))
             ++ptr;

           /* Make a copy of the remainder.  */
           result = strdup (ptr);

           /* Move to the last character of the copy.  */
           for (ptr = result; *ptr; ++ptr)
             /* NOWORK */;

           /* Remove trailing whitespace.  */
           for (--ptr; ptr >= result && strchr (WHITESPACE_STR, *ptr); --ptr)
             *ptr = '\0';
         }

       return result;
     }
To compile the convenience library with `libtool', you would do this:

     $ libtool gcc -c trim.c
     rm -f .libs/trim.lo
     gcc -c -fPIC -DPIC trim.c -o .libs/trim.lo
     gcc -c trim.c -o trim.o >/dev/null 2>&1
     mv -f .libs/trim.lo trim.lo
     $ libtool gcc -o libtrim.la trim.lo
     rm -fr .libs/libtrim.la .libs/libtrim.* .libs/libtrim.*
     ar cru .libs/libtrim.al trim.lo
     ranlib .libs/libtrim.al
     creating libtrim.la
     (cd .libs && rm -f libtrim.la && ln -s ../libtrim.la libtrim.la)

Additionally, you can use a convenience library as an alias for a set of zero or more object files and some dependent libraries. If you need to link several objects against a long list of libraries, it is much more convenient to create an alias:

     $ libtool gcc -o libgraphics.la -lpng -ltiff -ljpeg -lz
     rm -fr .libs/libgraphics.la .libs/libgraphics.* .libs/libgraphics.*
     ar cru .libs/libgraphics.al
     ranlib .libs/libgraphics.al
     creating libgraphics.la
     (cd .libs && rm -f libgraphics.la && \
       ln -s ../libgraphics.la libgraphics.la)

Having done this, whenever you link against `libgraphics.la' with `libtool', all of the dependent libraries will be linked too. In this case, there are no actual objects compiled into the convenience library, but you can do that too, if need be.

File: autobook.info, Node: Linking an Executable, Next: Linking a Library, Prev: The Libtool Library, Up: Introducing GNU Libtool

10.3 Linking an Executable
==========================

Continuing the parallel between the syntax used to compile with `libtool' and the syntax used when building old static libraries, linking an executable is a matter of combining compilation units into a binary in both cases. We tell the compiler which objects and libraries are required, and it creates an executable for us.

If you want to try this to see what `libtool' does on your machine, put the following code in a file `main.c', in the same directory as `hello.c' and `libhello.la', and run the example shell commands from there:

     void hello ();

     int
     main (int argc, char *argv[])
     {
       hello ("World");
       exit (0);
     }

To compile an executable which uses the non-Libtool `libhello.a' library built previously (*note The Libtool Library::), I would use the following commands:

     $ gcc -o hello main.c libhello.a
     $ ./hello
     Hello, World!

To create a similar executable on the HP-UX host, using `libtool' this time:

     $ libtool gcc -o hello main.c libhello.la
     libtool: link: warning: this platform does not like uninstalled
     libtool: link: warning: shared libraries.
     libtool: link: hello will be relinked during installation
     gcc -o .libs/hello main.c /tmp/hello/.libs/libhello.sl \
       -Wl,+b -Wl,/tmp/hello/.libs:/usr/local/lib
     creating hello
     $ ls
     hello    hello.lo  libhello.la
     hello.c  hello.o   main.c
     $ ./hello
     Hello, World!

Notice that you linked against the Libtool library, `libhello.la', but otherwise the link command you used was not really very different from the non-Libtool static library link command used earlier. Still, `libtool' does several things for you: it links with the shared archive rather than the static archive; and it sets the compiler options so that the program can be run in place, even though it is linked against the uninstalled Libtool library. With a `make' rule that did not have the benefit of `libtool', it would be almost impossible to reliably link a program against an uninstalled shared library in this way, since the particular switches needed would be different between the various platforms you want the project to work with.
Also, without the extra compiler options `libtool' adds for you, the program would search only the standard library directories for a shared `libhello'. The link warning tells you that `libtool' knows that on HP-UX the program will stop working if it is copied directly to the installation directory. To prevent it from breaking, `libtool' will relink the program when it is installed; see *Note Installing a Library::.

I discussed the creation of static Libtool libraries in *Note Creating Static Libraries: Creating Static Libraries with libtool. If you link an executable against such a library, the library objects, by definition, can only be statically linked into your executable. Often this is what you want if the library is not intended for installation, or if you have temporarily disabled building of shared libraries in your development tree to speed up compilation while you are debugging.

Sometimes, this isn't what you want. You might need to install a complete Libtool library with shared and static components, but need to generate a static executable linked against the same library, like this:

     $ libtool gcc -static -o hello main.c libhello.la
     gcc -o hello main.c ./.libs/libhello.a

In this case, the `-static' switch instructs `libtool' to choose the static component of any uninstalled Libtool library. You could have specified `-all-static' instead, which instructs `libtool' to link the executable with only static libraries (wherever possible), for any Libtool or native libraries used.

Finally, you can also link executables against convenience libraries. This makes sense when the convenience library is being used as an alias (*note Creating Convenience Libraries: Creating Convenience Libraries with libtool.). Notice how `libgraphics.la' expands to its own dependencies in the link command:

     $ libtool gcc -o image loader.o libgraphics.la
     libtool: link: warning: this platform does not like uninstalled
     libtool: link: warning: shared libraries
     libtool: link: image will be relinked during installation
     gcc -o .libs/image loader.o -lpng -ltiff -ljpeg -lz \
       -Wl,+b -Wl,/tmp/image/.libs:/usr/local/lib
     creating image

You can also link against convenience libraries being used as partially linked objects, so long as you are careful that each is linked only once. Remember that a partially linked object is just the same as any other object, and that if you load it twice (even from different libraries), you will get multiple definition errors when you try to link your executable. This is almost the same as using the `-static' switch on the `libtool' link line to link an executable with the static component of a normal Libtool library, except that the convenience library comprises PIC objects. When statically linking an executable, however, PIC objects are best avoided; see *Note Position Independent Code::.

File: autobook.info, Node: Linking a Library, Next: Executing Uninstalled Binaries, Prev: Linking an Executable, Up: Introducing GNU Libtool

10.4 Linking a Library
======================

Libraries often rely on code in other libraries. Traditionally the way to deal with this is to _know_ what the dependencies are and, when linking an executable, be careful to list all of the dependencies on the link line in the correct order. If you have ever built an X Window application using a widget library, you will already be familiar with this notion.
Even though you only use the functions in the widget library directly, a typical link command would need to be:

     $ gcc -o Xtest -I/usr/X11R6/include Xtest.c -L/usr/X11R6/lib \
       -lXm -lXp -lXaw -lXmu -lX11 -lnsl -lsocket

On modern architectures, this problem has been solved by allowing libraries to be linked into other libraries, but this feature is not yet particularly portable. If you are trying to write a portable project, it is not safe to rely on native support for inter-library dependencies, especially if you want to have dependencies between static and shared archives. Some of the features discussed in this section were not fully implemented before Libtool 1.4, so you should make sure that you are using this version or newer if you need these features.

If you want to try the examples in this section to see what `libtool' does on your machine, you will first need to modify the source of `hello.c' to introduce a dependency on `trim.c':

     #include <stdio.h>

     extern char *trim ();
     extern void free ();

     void
     hello (char *who)
     {
       char *trimmed = trim (who);
       printf ("Hello, %s!\n", trimmed);
       free (trimmed);
     }

You might also want to modify the `main.c' file to exercise the new `trim' functionality to prove that the newly linked executable is working:

     void hello ();

     int
     main (int argc, char *argv[])
     {
       hello ("\tWorld \r\n");
       exit (0);
     }

Suppose I want to make two libraries, `libtrim' and `libhello'. `libhello' uses the `trim' function in `libtrim' but the code in `main' uses only the `hello' function in `libhello'. Traditionally, the two libraries are built like this:

     $ rm hello *.a *.la *.o *.lo
     $ gcc -c trim.c
     $ ls
     hello.c  main.c  trim.c  trim.o
     $ ar cru libtrim.a trim.o
     $ ranlib libtrim.a
     $ gcc -c hello.c
     $ ls
     hello.c  hello.o  libtrim.a  main.c  trim.c  trim.o
     $ ar cru libhello.a hello.o
     $ ranlib libhello.a
     $ ls
     hello.c  libhello.a  main.c  trim.o
     hello.o  libtrim.a   trim.c

Notice that there is no way to specify that `libhello.a' won't work unless it is also linked with `libtrim.a'. Because of this I need to list both libraries when I link the application. What's more, I need to list them in the correct order:

     $ gcc -o hello main.c libtrim.a libhello.a
     /usr/bin/ld: Unsatisfied symbols:
        trim (code)
     collect2: ld returned 1 exit status
     $ gcc -o hello main.c libhello.a libtrim.a
     $ ls
     hello    hello.o     libtrim.a  trim.c
     hello.c  libhello.a  main.c     trim.o
     $ ./hello
     Hello, World!

* Menu:

* Inter-library Dependencies::
* Using Convenience Libraries::

File: autobook.info, Node: Inter-library Dependencies, Next: Using Convenience Libraries, Up: Linking a Library

10.4.1 Inter-library Dependencies
---------------------------------

`libtool''s inter-library dependency support will use the native implementation if there is one available. If there is no native implementation, or if the native implementation is broken or incomplete, `libtool' will use an implementation of its own.
Build `libtrim' as a standard Libtool library (*note The Libtool Library::), as follows:

     $ rm hello *.a *.o
     $ ls
     hello.c  main.c  trim.c
     $ libtool gcc -c trim.c
     rm -f .libs/trim.lo
     gcc -c -fPIC -DPIC trim.c -o .libs/trim.lo
     gcc -c trim.c -o trim.o >/dev/null 2>&1
     mv -f .libs/trim.lo trim.lo
     $ libtool gcc -rpath /usr/local/lib -o libtrim.la trim.lo
     rm -fr .libs/libtrim.la .libs/libtrim.* .libs/libtrim.*
     /opt/gcc-lib/hp821/2.7.0/ld -b +h libtrim.sl.0 +b /usr/local/lib \
       -o .libs/libtrim.sl.0.0 trim.lo
     (cd .libs && rm -f libtrim.sl.0 && ln -s libtrim.sl.0.0 libtrim.sl.0)
     (cd .libs && rm -f libtrim.sl && ln -s libtrim.sl.0.0 libtrim.sl)
     ar cru .libs/libtrim.a trim.o
     ranlib .libs/libtrim.a
     creating libtrim.la
     (cd .libs && rm -f libtrim.la && ln -s ../libtrim.la libtrim.la)

When you build `libhello', you can specify the libraries it depends on at the command line, like so:

     $ libtool gcc -c hello.c
     rm -f .libs/hello.lo
     gcc -c -fPIC -DPIC hello.c -o .libs/hello.lo
     gcc -c hello.c -o hello.o >/dev/null 2>&1
     mv -f .libs/hello.lo hello.lo
     $ libtool gcc -rpath /usr/local/lib -o libhello.la hello.lo libtrim.la
     rm -fr .libs/libhello.la .libs/libhello.* .libs/libhello.*

     *** Warning: inter-library dependencies are not known to be supported.
     *** All declared inter-library dependencies are being dropped.
     *** The inter-library dependencies that have been dropped here will be
     *** automatically added whenever a program is linked with this library
     *** or is declared to -dlopen it.
     /opt/gcc-lib/hp821/2.7.0/ld -b +h libhello.sl.0 +b /usr/local/lib \
       -o .libs/libhello.sl.0.0 hello.lo
     (cd .libs && rm -f libhello.sl.0 && ln -s libhello.sl.0.0 libhello.sl.0)
     (cd .libs && rm -f libhello.sl && ln -s libhello.sl.0.0 libhello.sl)
     ar cru .libs/libhello.a hello.o
     ranlib .libs/libhello.a
     creating libhello.la
     (cd .libs && rm -f libhello.la && ln -s ../libhello.la libhello.la)
     $ ls
     hello.c   hello.o      libtrim.la  trim.c   trim.o
     hello.lo  libhello.la  main.c      trim.lo

Although, on HP-UX, `libtool' warns that it doesn't know how to use the native inter-library dependency implementation, it will track the dependencies and make sure they are added to the final link line, so that you only need to specify the libraries that you use directly.

Now, you can rebuild `hello' exactly as in the earlier example (*note Linking an Executable::), as in:

     $ libtool gcc -o hello main.c libhello.la
     libtool: link: warning: this platform does not like uninstalled
     libtool: link: warning: shared libraries
     libtool: link: hello will be relinked during installation
     gcc -o .libs/hello main.c /tmp/intro-hello/.libs/libhello.sl \
       /tmp/intro-hello/.libs/libtrim.sl \
       -Wl,+b -Wl,/tmp/intro-hello/.libs:/usr/local/lib
     creating hello
     $ ./hello
     Hello, World!

Notice that even though you only specified the `libhello.la' library at the command line, `libtool' remembers that `libhello.sl' depends on `libtrim.sl' and links that library too.

You can also link a static executable, and the dependencies are handled similarly:

     $ libtool gcc -o hello-again -static main.c libhello.la
     gcc -o hello-again main.c ./.libs/libhello.a \
       /tmp/intro-hello/.libs/libtrim.a
     $ ./hello-again
     Hello, World!

For your own projects, provided that you use `libtool', and that you specify the libraries you wish to link using the `.la' pseudo-libraries, these dependencies can be nested as deeply as you like. You can also register dependencies on native libraries, though you will of course need to specify any dependencies that the native library itself has at the same time.
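Because the dependency is recorded in the `.la' pseudo-library rather than in the native archives, you can inspect it with any text tool. As a quick check - the exact layout of the field varies between Libtool releases, so this output is only illustrative:

     $ grep dependency_libs libhello.la
     dependency_libs=' /tmp/intro-hello/libtrim.la'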
File: autobook.info, Node: Using Convenience Libraries, Prev: Inter-library Dependencies, Up: Linking a Library

10.4.2 Using Convenience Libraries
----------------------------------

To rebuild `libtrim' as a convenience library (*note Creating Convenience Libraries: Creating Convenience Libraries with libtool.), use the following commands:

     $ rm hello *.la
     $ ls
     hello.c  hello.lo  hello.o  main.c  trim.c  trim.lo  trim.o
     $ libtool gcc -o libtrim.la trim.lo
     rm -fr .libs/libtrim.la .libs/libtrim.* .libs/libtrim.*
     ar cru .libs/libtrim.al trim.lo
     ranlib .libs/libtrim.al
     creating libtrim.la
     (cd .libs && rm -f libtrim.la && ln -s ../libtrim.la libtrim.la)

Then, rebuild `libhello', with an inter-library dependency on `libtrim' (*note Inter-library Dependencies::), like this:

     $ libtool gcc -rpath `pwd`/_inst -o libhello.la hello.lo libtrim.la
     rm -fr .libs/libhello.la .libs/libhello.* .libs/libhello.*

     *** Warning: inter-library dependencies are not known to be supported.
     *** All declared inter-library dependencies are being dropped.
     *** The inter-library dependencies that have been dropped here will be
     *** automatically added whenever a program is linked with this library
     *** or is declared to -dlopen it.
     rm -fr .libs/libhello.lax
     mkdir .libs/libhello.lax
     rm -fr .libs/libhello.lax/libtrim.al
     mkdir .libs/libhello.lax/libtrim.al
     (cd .libs/libhello.lax/libtrim.al && ar x /tmp/./.libs/libtrim.al)
     /opt/gcc-lib/hp821/2.7.0/ld -b +h libhello.sl.0 +b /tmp/hello/_inst \
       -o .libs/libhello.sl.0.0 hello.lo \
       .libs/libhello.lax/libtrim.al/trim.lo
     (cd .libs && rm -f libhello.sl.0 && ln -s libhello.sl.0.0 libhello.sl.0)
     (cd .libs && rm -f libhello.sl && ln -s libhello.sl.0.0 libhello.sl)
     rm -fr .libs/libhello.lax
     mkdir .libs/libhello.lax
     rm -fr .libs/libhello.lax/libtrim.al
     mkdir .libs/libhello.lax/libtrim.al
     (cd .libs/libhello.lax/libtrim.al && ar x /tmp/hello/./.libs/libtrim.al)
     ar cru .libs/libhello.a hello.o .libs/libhello.lax/libtrim.al/trim.lo
     ranlib .libs/libhello.a
     rm -fr .libs/libhello.lax .libs/libhello.lax
     creating libhello.la
     (cd .libs && rm -f libhello.la && ln -s ../libhello.la libhello.la)
     $ ls
     hello.c   hello.o      libtrim.la  trim.c   trim.o
     hello.lo  libhello.la  main.c      trim.lo

Compare this to the previous example of building `libhello' and you can see that things are rather different. On HP-UX, partial linking is not known to work, so `libtool' extracts the objects from the convenience library, and links them directly into `libhello'. That is, `libhello' comprises its own objects _and_ the objects in `libtrim'. If `libtrim' had had any dependencies, `libhello' would have inherited them too.

This technique is especially useful for grouping source files into subdirectories, even though all of the objects compiled in the subdirectories must eventually reside in one big library: compile the sources in each subdirectory into a convenience library, and in turn link all of these into a single library which will then contain all of the constituent objects and dependencies of the various convenience libraries; there is a sketch of this arrangement at the end of this section.

When you relink the `hello' executable, notice that `libtrim' is *not* linked, because the `libtrim' objects are already present in `libhello':

     $ libtool gcc -o hello main.c libhello.la
     libtool: link: warning: this platform does not like uninstalled
     libtool: link: warning: shared libraries
     libtool: link: hello will be relinked during installation
     gcc -o .libs/hello main.c /tmp/intro-hello/.libs/libhello.sl \
       -Wl,+b -Wl,/tmp/intro-hello/.libs:/usr/local/lib
     creating hello
     $ ./hello
     Hello, World!
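The subdirectory arrangement promised above might be sketched like this, with each subdirectory's objects gathered into a convenience library and a single installable library collecting them all. Every directory and library name here is purely illustrative:

     $ libtool gcc -o parser/libparser.la parser/parse.lo parser/lex.lo
     $ libtool gcc -o engine/libengine.la engine/eval.lo
     $ libtool gcc -rpath /usr/local/lib -o libbig.la \
       parser/libparser.la engine/libengine.la

The final `libbig.la' contains all of the constituent objects, along with any dependencies the convenience libraries declared.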
File: autobook.info, Node: Executing Uninstalled Binaries, Next: Installing a Library, Prev: Linking a Library, Up: Introducing GNU Libtool

10.5 Executing Uninstalled Binaries
===================================

If you look at the contents of the `hello' program you built in the last section, you will see that it is not actually a binary at all, but a shell script which sets up the environment so that when the real binary is called it finds the shared libraries in the correct locations. Without this script, the runtime loader might not be able to find the uninstalled libraries. Or worse, it might find an old version and load that by mistake!

In practice, this is all part of the unified interface `libtool' presents, so you needn't worry about it most of the time. The exception is when you need to look at the binary with another program, to debug it for example:

     $ ls
     hello    hello.lo  libhello.la  main.c  trim.lo
     hello.c  hello.o   libtrim.la   trim.c  trim.o
     $ libtool gdb hello
     GDB is free software and you are welcome to distribute copies of it
      under certain conditions; type "show copying" to see the conditions.
     There is absolutely no warranty for GDB; type "show warranty" for
      details.
     GDB 4.18 (hppa1.0-hp-hpux10.20),
     Copyright 1999 Free Software Foundation, Inc...
     (gdb) bre main
     Breakpoint 1 at 0x5178: file main.c, line 6.
     (gdb) run
     Starting program: /tmp/intro-hello/.libs/hello

     Breakpoint 1, main (argc=1, argv=0x7b03aa70) at main.c:6
     6         return hello("World");
     ...

File: autobook.info, Node: Installing a Library, Next: Installing an Executable, Prev: Executing Uninstalled Binaries, Up: Introducing GNU Libtool

10.6 Installing a Library
=========================

Now that the library and an executable which links with it have been successfully built, they can be installed. For the sake of this example I will `cp' the objects to their destination, though `libtool' would be just as happy if I were to use `install' with the requisite long list of parameters.

It is important that the library be installed to the `-rpath' destination which was specified when it was linked earlier, or at least that it be visible from that location when the runtime loader searches for it. This rule is not enforced by `libtool', since it is often desirable to install libraries to a "staging"(1) area. Of course, the package must ultimately install the library to the specified `-rpath' destination for it to work correctly, like this:

     $ libtool cp libtrim.la /usr/local/lib
     cp .libs/libtrim.sl.0.0 /usr/local/lib/libtrim.sl.0.0
     (cd /usr/local/lib && rm -f libtrim.sl.0 && \
       ln -s libtrim.sl.0.0 libtrim.sl.0)
     (cd /usr/local/lib && rm -f libtrim.sl && \
       ln -s libtrim.sl.0.0 libtrim.sl)
     chmod 555 /usr/local/lib/libtrim.sl.0.0
     cp .libs/libtrim.lai /usr/local/lib/libtrim.la
     cp .libs/libtrim.a /usr/local/lib/libtrim.a
     ranlib /usr/local/lib/libtrim.a
     chmod 644 /usr/local/lib/libtrim.a
     ----------------------------------------------------------------------
     Libraries have been installed in:
        /usr/local/lib

     If you ever happen to want to link against installed libraries
     in a given directory, LIBDIR, you must either use libtool, and
     specify the full pathname of the library, or use the -LLIBDIR
     flag during linking and do at least one of the following:
        - add LIBDIR to the SHLIB_PATH environment variable
          during execution
        - use the -Wl,+b -Wl,LIBDIR linker flag

     See any operating system documentation about shared libraries for
     more information, such as the ld(1) and ld.so(8) manual pages.
     ----------------------------------------------------------------------

Again, `libtool' takes care of the details for you. Both the static and shared archives are copied into the installation directory and their access modes are set appropriately. `libtool' blesses the static archive again with `ranlib', which would be easy to forget without the benefit of `libtool', especially if I develop on a host where the library will continue to work without this step. Also, `libtool' creates the necessary links for the shared archive to conform with HP-UX's library versioning rules. Compare this to what you see with the equivalent commands running on GNU/Linux to see how `libtool' applies these rules according to the requirements of its host.

The block of text `libtool' shows at the end of the installation explains how to link executables against the newly installed library on HP-UX, and how to make sure that the executables linked against it will work. Of course, the best way to ensure this is to use `libtool' to perform the linking. I'll leave the details of linking against an installed Libtool library as an exercise - everything you need to know can be extrapolated from the example of linking against an uninstalled Libtool library, *Note Linking an Executable::.

On some architectures, even shared archives need to be blessed on installation. For example, GNU/Linux requires that `ldconfig' be run when a new library is installed. Typically, a library will be installed to its target destination after being built, in which case `libtool' will perform any necessary blessing during installation. Sometimes, when building a binary package for installation on another machine, for example, it is not desirable to perform the blessing on the build machine. No problem, `libtool' takes care of this too!
`libtool' will detect if you install the library to a destination other than the one specified in the `-rpath' argument passed during the archive link, and will simply remind you what needs to be done before the library can be used:

     $ mkdir -p /usr/local/stow/hello-1.0/lib
     $ libtool cp libtrim.la /usr/local/stow/hello-1.0/lib
     cp .libs/libtrim.sl.0.0 /usr/local/stow/hello-1.0/lib/libtrim.sl.0.0
     (cd /usr/local/stow/hello-1.0/lib && rm -f libtrim.sl.0 && \
       ln -s libtrim.sl.0.0 libtrim.sl.0)
     (cd /usr/local/stow/hello-1.0/lib && rm -f libtrim.sl && \
       ln -s libtrim.sl.0.0 libtrim.sl)
     chmod 555 /usr/local/stow/hello-1.0/lib/libtrim.sl.0.0
     cp .libs/libtrim.lai /usr/local/stow/hello-1.0/lib/libtrim.la
     cp .libs/libtrim.a /usr/local/stow/hello-1.0/lib/libtrim.a
     ranlib /usr/local/stow/hello-1.0/lib/libtrim.a
     chmod 644 /usr/local/stow/hello-1.0/lib/libtrim.a
     libtool: install: warning: remember to run
     libtool: install: warning:  libtool --finish /usr/local/lib

If you make the installed libraries visible in the destination directory with symbolic links (as the `stow' utility does, for example), you must first put the links in place, and then bless the library in _that_ location with the `libtool --finish /usr/local/lib' command:

     $ cd /usr/local/stow
     $ stow hello-1.0
     $ libtool --finish /usr/local/lib

If you are following the examples so far, you will also need to install the Libtool library, `libhello.la', before you move on to the next section:

     $ libtool cp libhello.la /usr/local/lib
     cp .libs/libhello.sl.0.0 /usr/local/lib/libhello.sl.0.0
     (cd /usr/local/lib && rm -f libhello.sl.0 && \
       ln -s libhello.sl.0.0 libhello.sl.0)
     (cd /usr/local/lib && rm -f libhello.sl && \
       ln -s libhello.sl.0.0 libhello.sl)
     chmod 555 /usr/local/lib/libhello.sl.0.0
     cp .libs/libhello.lai /usr/local/lib/libhello.la
     cp .libs/libhello.a /usr/local/lib/libhello.a
     ranlib /usr/local/lib/libhello.a
     chmod 644 /usr/local/lib/libhello.a
     ----------------------------------------------------------------------
     Libraries have been installed in:
        /usr/local/lib

     If you ever happen to want to link against installed libraries
     in a given directory, LIBDIR, you must either use libtool, and
     specify the full pathname of the library, or use the -LLIBDIR
     flag during linking and do at least one of the following:
        - add LIBDIR to the SHLIB_PATH environment variable
          during execution
        - use the -Wl,+b -Wl,LIBDIR linker flag

     See any operating system documentation about shared libraries for
     more information, such as the ld(1) and ld.so(8) manual pages.
     ----------------------------------------------------------------------

Once a Libtool library is installed, binaries which link against it will hardcode the path to the Libtool library, as specified with the `-rpath' switch when the library was built. `libtool' always encodes the installation directory into a Libtool library for just this purpose. Hardcoding directories in this way is a good thing, because binaries linked against such libraries will continue to work if there are several incompatible versions of the library visible to the runtime loader (say a Trojan `libhello' in a user's `LD_LIBRARY_PATH', or a test build of the next release). The disadvantage of this system is that if you move libraries to new directories, executables linked in this way will be unable to find the libraries they need. Moving any library is a bad idea, however, and doubly so for a Libtool library, which has its installation directory encoded internally; the way to avoid problems of this nature is simply not to move libraries around after installation!
   ---------- Footnotes ----------

   (1) When making a binary package from a virtual root directory, for example.

File: autobook.info, Node: Installing an Executable, Next: Uninstalling, Prev: Installing a Library, Up: Introducing GNU Libtool

10.7 Installing an Executable
=============================

Installing an executable uses exactly the same command line that I used to install the library earlier:

     $ libtool cp hello /usr/local/bin
     gcc -o /tmp/libtool-28585/hello main.c /usr/local/lib/libhello.sl \
       /usr/local/lib/libtrim.sl -Wl,+b -Wl,/usr/local/lib
     cp /tmp/libtool-28585/hello /usr/local/bin/hello
     $ /usr/local/bin/hello
     Hello, World!

As `libtool' said earlier, during the initial linking of the `hello' program in the build directory, `hello' must be rebuilt before installation. This is a peculiarity of HP-UX (and a few other architectures) which you won't see if you are following the examples on a GNU/Linux system. In the shell trace above, `libtool' has built an installable version of the `hello' program which runs correctly from the installed location, saving me the trouble of remembering (or worse, coding for) the particulars of HP-UX.

As a matter of interest, if you look at the attributes of the installed program using HP-UX's `chatr' command:

     $ chatr /usr/local/bin/hello
     /usr/local/bin/hello:
        shared executable
        shared library dynamic path search:
            SHLIB_PATH     disabled  second
            embedded path  enabled   first  /usr/local/lib
        internal name:
            /tmp/libtool-28585/hello
        shared library list:
            static    /usr/local/lib/libhello.sl.0
            static    /usr/local/lib/libtrim.sl.0
            dynamic   /lib/libc.1
        shared library binding:
            deferred
     ...

You can see that the runtime library search path for the installed `hello' program has been set to find the installed `libhello.sl.0' shared archive, preventing it from accidentally loading a different library (with the same name) from the default load path. This is a feature of `libtool', and a very important one at that. Although it may not seem like the right way to do things initially, it saves a *lot* of trouble when you end up with several versions of a library installed in several locations, since each program will continue to use the version that it was linked with, subject to library versioning rules; see *Note Library Versioning::.

Without the help of `libtool', and an intimate knowledge of the build host's architecture, it is very difficult to prevent programs and libraries in the build tree from loading earlier (compatible) versions of a shared archive that were previously installed. Making it work portably would be nigh impossible!

You should experiment with changes to the uninstalled library and satisfy yourself that the previously installed program continues to load the installed library at runtime, whereas the uninstalled program picks up the modifications in the uninstalled version of the library. Equally importantly, the uninstalled `hello' program continues to load the uninstalled shared archive. This allows me to continue developing in the source directories and perform test builds in the knowledge that `libtool' has built all of my executables, including the uninstalled executables in the build tree, to load the correct version of the library.
I can check with HP-UX's `chatr' command, like this:

     $ libtool --mode=execute chatr ./hello
     /tmp/hello/.libs/hello:
        shared executable
        shared library dynamic path search:
            SHLIB_PATH     disabled  second
            embedded path  enabled   first  /tmp/intro-hello/.libs:\
     /usr/local/lib
        internal name:
            .libs/hello
        shared library list:
            static    /tmp/intro-hello/.libs/libhello.sl.0
            static    /tmp/intro-hello/.libs/libtrim.sl.0
            dynamic   /lib/libc.1
        shared library binding:
            deferred
     ...

This example introduces the concept of Libtool modes. Most of the time `libtool' can infer a mode of operation from the contents of the command line, but sometimes (as in this example) it needs to be told. In *Note Executing Uninstalled Binaries:: we already used `libtool' in "execute" mode to run `gdb' against an uninstalled binary. In this example I am telling `libtool' that I want to pass the `hello' binary to the `chatr' command, particularly since I know that the `hello' file is a script to set the local execution environment before running the real binary.

The various modes that `libtool' has are described in the Libtool reference documentation, and are listed in the Libtool help text:

     $ libtool --help
     ...
     MODE must be one of the following:

           clean      remove files from the build directory
           compile    compile a source file into a libtool object
           execute    automatically set library path, then run a program
           finish     complete the installation of libtool libraries
           install    install libraries or executables
           link       create a library or an executable
           uninstall  remove libraries from an installed directory

     MODE-ARGS vary depending on the MODE.  Try `libtool --help --mode=MODE'
     for a more detailed description of MODE.

File: autobook.info, Node: Uninstalling, Prev: Installing an Executable, Up: Introducing GNU Libtool

10.8 Uninstalling
=================

Having installed all of these files to `/usr/local', it might be difficult to remember which particular files belong to each installation. In the case of an executable, the uninstallation requires no magic, but when uninstalling a Libtool library, all of the files which comprise its implementation must be uninstalled:

     $ libtool rm -f /usr/local/bin/hello
     rm -f /usr/local/bin/hello
     $ libtool rm -f /usr/local/lib/libhello.la
     rm -f /usr/local/lib/libhello.la /usr/local/lib/libhello.sl.0.0 \
       /usr/local/lib/libhello.sl.0 /usr/local/lib/libhello.sl \
       /usr/local/lib/libhello.a
     $ libtool rm -f /usr/local/lib/libtrim.la
     rm -f /usr/local/lib/libtrim.la /usr/local/lib/libtrim.sl.0.0 \
       /usr/local/lib/libtrim.sl.0 /usr/local/lib/libtrim.sl \
       /usr/local/lib/libtrim.a

Using `libtool' to perform the uninstallation in this way ensures that all of the files that it installed, including any additional soft links required by the architecture's versioning scheme for shared archives, are removed with a single command.

Having explored the use of `libtool' from the command line, the next chapter will discuss how to integrate `libtool' into the configury of your GNU Autotools based projects.

File: autobook.info, Node: Using GNU Libtool, Next: A Large GNU Autotools Project, Prev: Introducing GNU Libtool, Up: Top

11 Using GNU Libtool with `configure.in' and `Makefile.am'
**********************************************************

Although Libtool is usable by itself, either from the command line or from a non-`make' driven build system, it is also tightly integrated into Autoconf and Automake.
This chapter discusses how to use Libtool with Autoconf and Automake and explains how to set up the files you write (`Makefile.am' and `configure.in') to take advantage of `libtool'. For a more in depth discussion of the workings of Libtool, particularly its command line interface, *Note Introducing GNU Libtool::. Using `libtool' for dynamic runtime loading is described in *Note Using GNU libltdl::.

Using `libtool' to build the libraries in a project requires declaring your use of `libtool' inside the project's `configure.in' and adding the Libtool support scripts to the distribution. You will also need to amend the build rules in either `Makefile.am' or `Makefile.in', depending on whether you are using Automake.

* Menu:

* Integration with configure.in::
* Integration with Makefile.am::
* Using libtoolize::
* Library Versioning::
* Convenience Libraries::

File: autobook.info, Node: Integration with configure.in, Next: Integration with Makefile.am, Up: Using GNU Libtool

11.1 Integration with `configure.in'
====================================

Declaring your use of `libtool' in the project's `configure.in' is a simple matter of adding the `AC_PROG_LIBTOOL'(1) macro somewhere near the top of the file. I always put it immediately after the other `AC_PROG_...' macros. If you are converting an old project to use `libtool', then you will also need to remove any calls to `AC_PROG_RANLIB'. Since Libtool will be handling all of the libraries, _it_ will decide whether or not to call `ranlib' as appropriate for the build environment.

The code generated by `AC_PROG_LIBTOOL' relies on the shell variable `$top_builddir' to hold the relative path to the directory which contains the `configure' script. If you are using Automake, `$top_builddir' is set in the environment by the generated `Makefile'. If you use Autoconf without Automake then you must ensure that `$top_builddir' is set before the call to `AC_PROG_LIBTOOL' in `configure.in'. Adding the following code to `configure.in' is often sufficient:

     for top_builddir in . .. ../.. $ac_auxdir $ac_auxdir/..; do
       test -f $top_builddir/configure && break
     done

Having made these changes to add `libtool' support to your project, you will need to regenerate the `aclocal.m4' file to pick up the macro definitions required for `AC_PROG_LIBTOOL', and then rebuild your `configure' script with these new definitions in place. After you have done that, there will be some new options available from `configure':

     $ aclocal
     $ autoconf
     $ ./configure --help
     ...
     --enable and --with options recognized:
       --enable-shared[=PKGS]  build shared libraries [yes]
       --enable-static[=PKGS]  build static libraries [yes]
       --enable-fast-install[=PKGS]  optimize for fast installation [yes]
       --with-gnu-ld           assume the C compiler uses GNU ld [no]
       --disable-libtool-lock  avoid locking (might break parallel builds)
       --with-pic              try to use only PIC/non-PIC objects [both]

These new options allow the end user of your project some control over how they want to build the project's libraries. The opposites of each of these switches are also accepted, even though they are not listed by `configure --help'. You can equally pass `--disable-fast-install' or `--without-gnu-ld', for example.

* Menu:

* Extra Configure Options::
* Extra Macros for Libtool::

   ---------- Footnotes ----------

   (1) `AM_PROG_LIBTOOL' if you have an older `automake' or `libtool' installation.
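Pulling these pieces together, a minimal `configure.in' for an Automake-based project of this era might contain no more than the following. The package name, version and source file are illustrative:

     AC_INIT(hello.c)
     AM_INIT_AUTOMAKE(hello, 1.0)
     AC_PROG_CC
     AC_PROG_LIBTOOL
     AC_OUTPUT(Makefile)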
File: autobook.info, Node: Extra Configure Options, Next: Extra Macros for Libtool, Up: Integration with configure.in

11.1.1 Extra Configure Options
------------------------------

What follows is a list that describes the more commonly used options that are automatically added to `configure' by virtue of using `AC_PROG_LIBTOOL' in your `configure.in'. The Libtool Manual distributed with Libtool releases always contains the most up to date information about `libtool' options:

`--enable-shared'
`--enable-static'
     More often invoked as `--disable-shared' or, equivalently, `--enable-shared=no', these switches determine whether `libtool' should build shared and/or static libraries in this package. If the installer is short of disk space, they might like to build entirely without static archives. To do this they would use:

          $ ./configure --disable-static

     Sometimes it is desirable to configure several related packages with the same command line - from a scheduled build script, for example, or where subpackages with their own `configure' scripts are present. The `--enable-shared' and `--enable-static' switches also accept a list of package names, causing the option to be applied to packages whose names are listed, and the opposite to be applied to those not listed. By specifying:

          $ ./configure --enable-static=libsnprintfv,autoopts

     `--enable-static' would be applied only to the packages named "libsnprintfv" and "autoopts" in the current tree. Any other packages configured would effectively be passed `--disable-static'. Note that this doesn't necessarily mean that the packages must honour these options. Enabling static libraries for a package which consists of only dynamic modules makes no sense, and the package author would probably have decided to ignore such requests; see *Note Extra Macros for Libtool::.

`--enable-fast-install'
     On some machines, `libtool' has to relink executables when they are installed; see *Note Installing an Executable::. Normally, when an end user builds your package, they will probably type:

          $ ./configure
          $ make
          $ make install

     `libtool' will build executables suitable for copying into their respective installation destinations, obviating the need for relinking them on those hosts which would have required it. Whenever `libtool' links an executable which uses shared libraries, it also creates a "wrapper script" which ensures that the environment is correct for loading the correct libraries; see *Note Executing Uninstalled Binaries::. On those hosts which require it, the wrapper script will also relink the executable in the build tree if you attempt to run it from there before installation.

     Sometimes this behaviour is not what you want, particularly if you are developing the package and not installing between test compilations. By passing `--disable-fast-install', the default behaviour is reversed; executables will be built so that they can be run from the build tree without relinking, but during installation they may be relinked.

     You can pass a list of executables as the argument to `--enable-fast-install' to determine which set of executables will not be relinked at installation time (on the hosts that require it). By specifying:

          $ ./configure --enable-fast-install=autogen

     the `autogen' executable will be linked for fast installation (without being relinked), and any other executables in the build tree will be linked for fast execution from their build location. This is useful if the remaining executables are for testing only, and will never be installed.
     Most machines do not require that executables be relinked in this way, and in these cases `libtool' will link each executable once only, no matter whether `--disable-fast-install' is used.

`--with-gnu-ld'
     This option is used to inform `libtool' that the C compiler is using GNU ld as its linker. It is more often used in the opposite sense, as `--without-gnu-ld', when both `gcc' and GNU `ld' are installed but `gcc' was built to use the native linker. `libtool' will probe the system for GNU ld, and assume that it is used by `gcc' if found, unless `--without-gnu-ld' is passed to `configure'.

`--disable-libtool-lock'
     In normal operation, `libtool' will build two objects for every source file in a package, one "PIC"(1) and one non-PIC. With `gcc' and some other compilers, `libtool' can specify a different output location for the PIC object:

          $ libtool gcc -c shell.c
          gcc -c -fPIC -DPIC shell.c -o .libs/shell.lo
          gcc -c shell.c -o shell.o >/dev/null 2>&1

     When using a compiler that doesn't accept both `-o' and `-c' in the same command, `libtool' must compile the PIC and non-PIC objects one after the other, to the compiler's default destination file, moving the PIC object out of the way before the non-PIC object is compiled. This would be a problem for parallel builds, since one file might overwrite the other. `libtool' uses a simple shell locking mechanism to avoid this eventuality. If you find yourself building in an environment that has such a compiler, and you are not using parallel `make', then the locking mechanism can be safely turned off by using `--disable-libtool-lock' to gain a little extra speed in the overall compilation.

`--with-pic'
     In normal operation, Libtool will build shared libraries from PIC objects and static archives from non-PIC objects, except where one or the other is not provided by the target host. By specifying `--with-pic' you are asking `libtool' to build static archives from PIC objects, and similarly by specifying `--without-pic' you are asking `libtool' to build shared libraries from non-PIC objects. `libtool' will only honour this flag where it will produce a working library; otherwise it reverts to the default.

   ---------- Footnotes ----------

   (1) Position Independent Code - suitable for shared libraries, which might be loaded at different addresses when linked by the runtime loader.

File: autobook.info, Node: Extra Macros for Libtool, Prev: Extra Configure Options, Up: Integration with configure.in

11.1.2 Extra Macros for Libtool
-------------------------------

There are several macros which can be added to `configure.in' which will change the default behaviour of `libtool'. If they are used, they must appear before the call to the `AC_PROG_LIBTOOL' macro. Note that these macros only change the default behaviour, and options passed in to `configure' on the command line will always override the defaults. The most up to date information about these macros is available from the Libtool Manual.

`AC_DISABLE_FAST_INSTALL'
     This macro tells `libtool' that on platforms which require relinking at install time, it should build executables so that they can be run from the build tree at the expense of relinking during installation, as if `--disable-fast-install' had been passed on the command line.

`AC_DISABLE_SHARED'
`AC_DISABLE_STATIC'
     These macros tell `libtool' not to try to build shared or static libraries, respectively.
     `libtool' will always try to build _something_ however, so even if you turn off static library building in `configure.in', building your package for a target host without shared library support will fall back to building static archives.

     The time spent waiting for builds during development can be reduced a little by including these macros temporarily. Don't forget to remove them before you release the project, though!

In addition to the macros provided with `AC_PROG_LIBTOOL', there are a few shell variables that you may need to set yourself, depending on the structure of your project:

`LTLIBOBJS'
     If your project uses the `AC_REPLACE_FUNCS' macro, or any of the other macros which add object names to the `LIBOBJS' variable, you will also need to provide an equivalent `LTLIBOBJS' definition. At the moment, you must do it manually, but needing to do that is considered to be a bug and will be fixed in a future release of Autoconf. The manual generation of `LTLIBOBJS' is a simple matter of replacing the names of the objects mentioned in `LIBOBJS' with equivalent `.lo' suffixed Libtool object names. The easiest way to do this is to add the following snippet to your `configure.in' near the end, just before the call to `AC_OUTPUT':

          Xsed="sed -e s/^X//"
          LTLIBOBJS=`echo X"$LIBOBJS" | \
              [$Xsed -e "s,\.[^.]* ,.lo ,g;s,\.[^.]*$,.lo,"]`
          AC_SUBST(LTLIBOBJS)

     The `Xsed' is not usually necessary, though it can prevent problems with the `echo' command in the event that one of the `LIBOBJS' files begins with a `-' character. It is also a good habit to write shell code like this, as it will avoid problems in your programs.

`LTALLOCA'
     If your project uses the `AC_FUNC_ALLOCA' macro, you will need to provide a definition of `LTALLOCA' equivalent to the `ALLOCA' value provided by the macro:

          Xsed="sed -e s/^X//"
          LTALLOCA=`echo X"$ALLOCA" | [$Xsed -e "s,\.[^.]*$,.lo,"]`
          AC_SUBST(LTALLOCA)

     Obviously you don't need to redefine `Xsed' if you already use it for `LTLIBOBJS' above.

`LIBTOOL_DEPS'
     To help you write `make' rules for automatic updating of the Libtool configuration files, you can use the value of `LIBTOOL_DEPS' after the call to `AC_PROG_LIBTOOL':

          AC_PROG_LIBTOOL
          AC_SUBST(LIBTOOL_DEPS)

     Then add the following to the top level `Makefile.in':

          libtool: @LIBTOOL_DEPS@
                  cd $(srcdir) && \
                  $(SHELL) ./config.status --recheck

     If you are using `automake' in your project, it will generate equivalent rules automatically. You don't need to use this except in circumstances where you want to use `libtool' and `autoconf', but not `automake'.

File: autobook.info, Node: Integration with Makefile.am, Next: Using libtoolize, Prev: Integration with configure.in, Up: Using GNU Libtool

11.2 Integration with `Makefile.am'
===================================

Automake supports Libtool libraries in two ways. It can help you to build the Libtool libraries themselves, and also to build executables which link against Libtool libraries.
* Menu:

* Creating Libtool Libraries with Automake::
* Linking against Libtool Libraries with Automake::

File: autobook.info, Node: Creating Libtool Libraries with Automake, Next: Linking against Libtool Libraries with Automake, Up: Integration with Makefile.am

11.2.1 Creating Libtool Libraries with Automake
-----------------------------------------------

Continuing in the spirit of making Libtool library management look like native static archive management, converting a `Makefile.am' from static archive use to Libtool library use is a matter of changing the name of the library, and adding a Libtool prefix somewhere. For example, a `Makefile.am' for building a static archive might be:

     lib_LIBRARIES      = libshell.a
     libshell_a_SOURCES = object.c subr.c symbol.c

This would build a static archive called `libshell.a' consisting of the objects `object.o', `subr.o' and `symbol.o'. To build an equivalent Libtool library from the same objects, you change this to:

     lib_LTLIBRARIES     = libshell.la
     libshell_la_SOURCES = object.c subr.c symbol.c

The only changes are that the library is now named with a `.la' suffix, and the Automake primary is now `LTLIBRARIES'. Note that since the name of the library has changed, you also need to use `libshell_la_SOURCES', and similarly for any other Automake macros which used to refer to the old archive. As for native libraries, Libtool library names should begin with the letters `lib', so that the linker will be able to find them when passed `-l' options.

Often you will need to add extra objects to the library as determined by `configure', but this is also a mechanical process. When building native libraries, the `Makefile.am' would have contained:

     libshell_a_LDADD = xmalloc.o @LIBOBJS@

To add the same objects to an equivalent Libtool library would require:

     libshell_la_LDADD = xmalloc.lo @LTLIBOBJS@

That is, objects added to a Libtool library must be Libtool objects (with a `.lo' suffix). You should add code to `configure.in' to ensure that `LTALLOCA' and `LTLIBOBJS' are set appropriately; see *Note Extra Macros for Libtool::. Automake will take care of generating appropriate rules for building the Libtool objects mentioned in an `LDADD' macro.

If you want to pass any additional flags to `libtool' when it is building a library, use the `LDFLAGS' macro for that library, like this:

     libshell_la_LDFLAGS = -version-info 1:0:1

For a detailed list of all the available options, see *Note Link mode: (Libtool)Link mode.

Libtool's use of `-rpath' has been a point of contention for some users, since it prevents you from moving shared libraries to another location in the library search path. Or, at least, if you do, all of the executables that were linked with `-rpath' set to the old location will need to be relinked.

We (the Libtool maintainers) assert that always using `-rpath' is a good thing, mainly because you can guarantee that any executable linked with `-rpath' will find the correct version of the library, in the rpath directory, that was intended when the executable was linked. Library versions can still be managed correctly, and will be found by the runtime loader, by installing newer versions to the same directory. Additionally, it is much harder for a malicious user to leave a modified copy of a system library in a directory that someone might wish to list in their `LD_LIBRARY_PATH' in the hope that some code they have written will be executed unexpectedly.
The argument against `-rpath' arose when one of the GNU/Linux distributions moved some important system libraries to another directory to make room for a different version, and discovered that all of the executables that relied on these libraries and were linked with Libtool no longer worked.  Doing this was, arguably, bad system management - the new libraries should have been placed in a new directory, and the old libraries left alone.  Refusing to use `-rpath' in case you want to restructure the system library directories is a very weak argument.

The `-rpath' option (which is required for Libtool libraries) is automatically supplied by `automake' based on the installation directory specified with the library primary.

     lib_LTLIBRARIES = libshell.la

The example would use the value of the make macro `$(libdir)' as the argument to `-rpath', since that is where the library will be installed.

A few of the other options you can use in the library `LDFLAGS' are:

`-no-undefined'
     Modern architectures allow us to create shared libraries with
     undefined symbols, provided those symbols are resolved (usually
     by the executable which loads the library) at runtime.
     Unfortunately, there are some architectures (notably AIX and
     Windows) which require that _all_ symbols are resolved when the
     library is linked.  If you know that your library has no
     unresolved symbols at link time, then adding this option tells
     `libtool' that it will be able to build a shared library, even on
     architectures which have this requirement.

`-static'
     Using this option will force `libtool' to build only a static
     archive for this library.

`-release'
     On occasion, it is desirable to encode the release number of a
     library into its name.  By specifying the release number with
     this option, `libtool' will build a library that does this, but
     will break binary compatibility for each change of the release
     number.  By breaking binary compatibility this way, you negate
     the possibility of fixing bugs in installed programs by
     installing an updated shared library.  You should probably be
     using `-version-info' instead.

          libshell_la_LDFLAGS = -release 27

     The above fragment might create a library called
     `libshell-27.so.0.0.0', for example.

`-version-info'
     Set the version number of the library according to the native
     versioning rules based on the numbers supplied, *Note Library
     Versioning::.  You need to be aware that the library version
     number is for the use of the runtime loader, and is completely
     unrelated to the release number of your project.  If you really
     want to encode the project release into the library, you can use
     `-release' to do it.  If this option is not supplied explicitly,
     it defaults to `-version-info 0:0:0'.

Historically, the default behaviour of Libtool was as if `-no-undefined' was always passed on the command line, but it proved to be annoying to developers who constantly had to turn it off so that they could build ELF libraries which rely on symbols left undefined until runtime.  Now it has to be supplied explicitly if you need it.  There is a tradeoff:

   * If you don't specify `-no-undefined', then Libtool will not build
     shared libraries on platforms which don't allow undefined symbols
     at link time for such a library.

   * It is only safe to specify this flag when you know for certain
     that _all_ of the library's symbols are defined at link time,
     otherwise the `-no-undefined' link will appear to work until it
     is tried on a platform which requires all symbols to be defined.
Libtool will try to link the shared library in this case (because you told it that you have not left any undefined symbols), but the link will fail, because there *are* undefined symbols in spite of what you told Libtool.  For more information about this topic, see *Note Portable Library Design::.
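Pulling the pieces of this section together, a complete `Makefile.am' for the `libshell' example might look like the following sketch.  The `-no-undefined' and `-version-info' values are purely illustrative, and `xmalloc.lo' stands in for whatever extra objects your own `configure.in' arranges to substitute:

     ## Makefile.am -- Process this file with automake to produce Makefile.in

     ## Build and install the Libtool library, adding any extra
     ## Libtool objects substituted by configure.
     lib_LTLIBRARIES     = libshell.la
     libshell_la_SOURCES = object.c subr.c symbol.c
     libshell_la_LDADD   = xmalloc.lo @LTLIBOBJS@
     libshell_la_LDFLAGS = -no-undefined -version-info 1:0:1

     ## Link an executable against the library; libtool chooses the
     ## static or shared archive as appropriate for the target host.
     bin_PROGRAMS        = shell
     shell_SOURCES       = shell.c token.l
     shell_LDADD         = libshell.la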
File: autobook.info,  Node: Using libtoolize,  Next: Library Versioning,  Prev: Integration with Makefile.am,  Up: Using GNU Libtool

11.3 Using libtoolize
=====================

Having made the necessary edits in `configure.in' and `Makefile.am', all that remains is to add the Libtool infrastructure to your project.

First of all you must ensure that the correct definitions for the new macros you use in `configure.in' are added to `aclocal.m4', *Note Generated File Dependencies::.  At the moment, the safest way to do this is to copy `libtool.m4' from the installed `libtool' to `acinclude.m4' in the toplevel source directory of your package.  This ensures that when your package ships, there can be no mismatch between the M4 macros provided by the version of `libtool' you built the distribution with, and the version of Libtool installed in another developer's environment.  In a future release, `libtool' will check that the macros in `aclocal.m4' are from the same Libtool distribution as the generated `libtool' script.

     $ cp /usr/share/libtool/libtool.m4 ./acinclude.m4
     $ aclocal

By naming the file `acinclude.m4' you ensure that `aclocal' can see it and will use macros from it, and that `automake' will add it to the distribution when you create the tarball.

Next, you should run `libtoolize', which adds some files to your distribution that are required by the macros from `libtool.m4'.  In particular, you will get `ltconfig'(1) and `ltmain.sh', which are used to create a custom `libtool' script on the installer's machine.  If you do not yet have them, `libtoolize' will also add `config.guess' and `config.sub' to your distribution.

Sometimes you don't need to run `libtoolize' manually, since `automake' will run it for you when it sees the changes you have made to `configure.in', as follows:

     $ automake --add-missing
     automake: configure.in: installing ./install-sh
     automake: configure.in: installing ./mkinstalldirs
     automake: configure.in: installing ./missing
     configure.in: 8: required file ./ltconfig not found

The error message in the last line is an aberration.  If it were consistent with the other lines, it would say:

     automake: configure.in: installing ./ltconfig
     automake: configure.in: installing ./ltmain.sh
     automake: configure.in: installing ./config.guess
     automake: configure.in: installing ./config.sub

But the effect is the same, and the files are correctly added to the distribution despite the misleading message.

Before you release a distribution of your project, it is wise to get the latest versions of `config.guess' and `config.sub' from the GNU site(2), since they may be newer than the versions automatically added by `libtoolize' and `automake'.  Note that `automake --add-missing' will give you its own version of these two files if `AC_PROG_LIBTOOL' is not used in the project `configure.in', but will give you the versions shipped with `libtool' if that macro is present!

---------- Footnotes ----------

(1) The functionality of `ltconfig' is slated for migration into `libtool.m4' for a future release of `libtool', whereupon this file will no longer be necessary.

(2) `ftp://ftp.gnu.org/gnu/config/'


File: autobook.info,  Node: Library Versioning,  Next: Convenience Libraries,  Prev: Using libtoolize,  Up: Using GNU Libtool

11.4 Library Versioning
=======================

It is important to note from the outset that the version number of your project is a very different thing from the version number of any libraries shipped with your project.
It is a common error for maintainers to try to force their libraries to have the same version number as the current release version of the package as a whole.  At best, they will break binary compatibility unnecessarily, so that their users won't gain the benefits of the changes in their latest revision without relinking all applications that use it.  At worst, they will allow the runtime linker to load binary incompatible libraries, causing applications to crash.

Far better, the Libtool versioning system will build native shared libraries with the correct _native_ library version numbers.  Although different architectures use various numbering schemes, Libtool abstracts these away behind the system described here.  The various native library version numbering schemes are designed so that when an executable is started, the runtime loader can, where appropriate, choose a more recent installed library version than the one with which the executable was actually built.  This allows you to fix bugs in your library and, having built it with the correct Libtool version number, have those fixes propagate into any executables that were built with the old buggy version.  This can only work if the runtime loader can tell whether it can load the new library into the old executable and expect them to work together.  The library version numbers give this information to the runtime loader, so it is very important to set them correctly.

The version scheme used by Libtool tracks "interfaces", where an interface is the set of exported entry points into the library.  All Libtool libraries start with `-version-info' set to `0:0:0' - this will be the default version number if you don't explicitly set it on the Libtool link command line.  The meaning of these numbers (from left to right) is as follows:

CURRENT
     The number of the current interface exported by the library.  A
     CURRENT value of `0' means that you are calling the interface
     exported by this library _interface 0_.

REVISION
     The implementation number of the most recent interface exported
     by this library.  In this case, a REVISION value of `0' means
     that this is the first implementation of the interface.  If the
     next release of this library exports the same interface, but has
     a different implementation (perhaps some bugs have been fixed),
     the REVISION number will be higher, but the CURRENT number will
     be the same.  In that case, when given a choice, the library with
     the highest REVISION will always be used by the runtime loader.

AGE
     The number of previous additional interfaces supported by this
     library.  If AGE were `2', then this library can be linked into
     executables which were built with a release of this library that
     exported the current interface number, CURRENT, or any of the
     previous two interfaces.  By definition AGE must be less than or
     equal to CURRENT.  At the outset, only the first ever interface
     is implemented, so AGE can only be `0'.

For later releases of a library, the `-version-info' argument needs to be set correctly depending on any interface changes you have made.  This is quite straightforward when you understand what the three numbers mean:

  1. If you have changed any of the sources for this library, the
     REVISION number must be incremented.  *This is a new revision of
     the current interface*.

  2. If the interface has changed, then CURRENT must be incremented,
     and REVISION reset to `0'.  *This is the first revision of a new
     interface*.
  3. If the new interface is a superset of the previous interface
     (that is, if the previous interface has not been broken by the
     changes in this new release), then AGE must be incremented.
     *This release is backwards compatible with the previous release*.

  4. If the new interface has removed elements with respect to the
     previous interface, then you have broken backward compatibility
     and AGE must be reset to `0'.  *This release has a new, but
     backwards incompatible interface*.

For example, if the next release of the library included some new commands for an existing socket protocol, you would use `-version-info 1:0:1'.  *This is the first revision of a new interface.  This release is backwards compatible with the previous release*.

Later, you implement a faster way of handling part of the algorithm at the core of the library, and release it with `-version-info 1:1:1'.  *This is a new revision of the current interface*.

Unfortunately, the speed of your new implementation can only be fully exploited by changing the API to access the structures at a lower level, which breaks compatibility with the previous interface, so you release it as `-version-info 2:0:0'.  *This release has a new, but backwards incompatible interface*.

When deciding which numbers to change in the `-version-info' argument for a new release, you must remember that an interface change is not limited to the API of the library.  The notion of an interface must include any method by which a user (code or human) can interact with the library: adding new builtin commands to a shell library; the format used in an output file; the handshake protocol required for a client connecting over a socket, and so on.

Additionally, if you use a development model which has both a stable and an unstable tree being developed in parallel, and you don't mind forcing your users to relink all of the applications which use one of your Libtool libraries every time you make a release, then `libtool' provides the `-release' flag to encode the project version number in the name of the library, *Note Creating Libtool Libraries with Automake::.  This can save you library compatibility problems later if you need to, say, make a patch release of an older revision of your library, but the library version number that you should use has already been taken by another earlier release.  In this case, you could be fairly certain that library releases from the unstable branch will not be binary compatible with the stable releases, so you could make all the stable releases with `-release 1.0' and begin the first unstable release with `-release 1.1'.


File: autobook.info,  Node: Convenience Libraries,  Prev: Library Versioning,  Up: Using GNU Libtool

11.5 Convenience Libraries
==========================

Sometimes it is useful to group objects together in an intermediate stage of a project's compilation to provide a useful handle for that group without having to specify all of the individual objects every time.  Convenience libraries are a portable way of creating such a "partially linked" object: Libtool will handle all of the low level details in a way appropriate to the target host.  This section describes the use of convenience libraries in conjunction with Automake.  The principles of convenience libraries are discussed in *Note Creating Convenience Libraries: Creating Convenience Libraries with libtool.

The key to creating Libtool convenience libraries with Automake is to use the `noinst_LTLIBRARIES' macro; in outline, the idiom looks like the sketch below.
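This sketch uses the names from the example developed in the rest of this section, where each macro is explained in detail as the example unfolds:

     ## In a subdirectory Makefile.am: build `library.la' as an
     ## uninstalled (convenience) Libtool library.
     noinst_LTLIBRARIES = library.la
     library_la_SOURCES = source.c source.h

     ## In the parent Makefile.am: fold the convenience library
     ## into an installed Libtool library with LIBADD.
     lib_LTLIBRARIES     = liberror.la
     liberror_la_SOURCES = error.c
     liberror_la_LIBADD  = lib/library.la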
For the Libtool libraries named in this macro, Automake will create Libtool convenience libraries which can subsequently be linked into other Libtool libraries.  In this section I will create two convenience libraries, each in its own subdirectory, and link them into a third Libtool library, which is ultimately linked into an application.

If you want to follow this example, you should create a directory structure to hold the sources by running the following shell commands:

     $ mkdir convenience
     $ cd convenience
     $ mkdir lib
     $ mkdir replace

The first convenience library is built from two source files in the `lib' subdirectory.

  1. `source.c':

          #if HAVE_CONFIG_H
          #  include <config.h>
          #endif

          #include <stdio.h>

          #if HAVE_MATH_H
          #  include <math.h>
          #endif

          void
          foo (double argument)
          {
            printf ("cos (%g) => %g\n", argument, cos (argument));
          }

     This file defines a single function to display the cosine of its
     argument on standard output, and consequently relies on an
     implementation of the `cos' function from the system libraries.
     (The `stdio.h' header is included unconditionally, since `printf'
     is needed on every host.)  Note the conditional inclusion of
     `config.h', which will contain a definition of `HAVE_MATH_H' if
     `configure' discovers a `math.h' system header (the usual
     location for the declaration of `cos').  The `HAVE_CONFIG_H'
     guard is by convention, so that the source can still be compiled
     by passing the preprocessor macro definitions to the compiler on
     the command line - if `configure.in' does not use
     `AM_CONFIG_HEADER' for instance.

  2. `source.h':

          extern void foo (double argument);

     For brevity, there is no `#ifndef SOURCE_H' guard.  The header is
     not installed, so you have full control over where it is
     `#include'ed, and in any case, function declarations can be
     safely repeated if the header is accidentally processed more than
     once.  Listing the function parameters in the declaration, as
     here, lets the compiler do type checking, but limits the code to
     working only with ANSI compilers, unless you also use a `PARAMS'
     macro to conditionally preprocess away the parameters when a K&R
     compiler is used.  These details are beyond the scope of this
     convenience library example, but are described in full in *Note
     K&R Compilers::.

You also need a `Makefile.am' to hold the details of how this convenience library is linked:

     ## Makefile.am -- Process this file with automake to produce Makefile.in
     noinst_LTLIBRARIES = library.la
     library_la_SOURCES = source.c source.h
     library_la_LIBADD  = -lm

The `noinst_LTLIBRARIES' macro names the Libtool convenience libraries to be built in this directory, `library.la'.  Although not required for compilation, `source.h' is listed in the `SOURCES' macro of `library.la' so that correct source dependencies are generated, and so that it is added to the distribution tarball by `automake''s `dist' rule.  Finally, since the `foo' function relies on the `cos' function from the system math library, `-lm' is named as a required library in the `LIBADD' macro.

As with all Libtool libraries, interlibrary dependencies are maintained for convenience libraries so that you need only list the libraries you are using directly when you link your application later.  The libraries used by those libraries are added by Libtool.

The parent directory holds the sources for the main executable, `main.c', and for a (non-convenience) Libtool library, `error.c' & `error.h'.
Like `source.h', the functions exported from the Libtool library `liberror.la' are listed in `error.h':

     extern void gratuitous (void);
     extern void set_program_name (char *path);
     extern void error (char *message);

The corresponding function definitions are in `error.c':

     #include <stdio.h>
     #include <stdlib.h>
     #include "source.h"

     static char *program_name = NULL;

     void
     gratuitous (void)
     {
       /* Gratuitous display of convenience library functionality!  */
       double argument = 0.0;
       foo (argument);
     }

     void
     set_program_name (char *path)
     {
       if (!program_name)
         program_name = basename (path);
     }

     void
     error (char *message)
     {
       fprintf (stderr, "%s: ERROR: %s\n", program_name, message);
       exit (1);
     }

The `gratuitous()' function calls the `foo()' function defined in the `library.la' convenience library in the `lib' directory, hence `source.h' is included.  The definition of `error()' displays an error message to standard error, along with the name of the program, `program_name', which is set by calling `set_program_name()'.  This function, in turn, extracts the basename of the program from the full path using the system function `basename()', and stores it in the library private variable `program_name'.

Usually, `basename()' is part of the system C library, though older systems did not include it.  Because of this, there is no portable header file that can be included to get a declaration, and you might see a harmless compiler warning due to the use of the function without a declaration.  The alternative would be to add your own declaration in `error.c'.  The problem with this approach is that different vendors will provide slightly different declarations (with or without `const', for instance), so compilation will fail on those architectures which _do_ provide a declaration in the system headers that is different from the declaration you have guessed.

For the benefit of architectures which do not have an implementation of the `basename()' function, a fallback implementation is provided in the `replace' subdirectory.  The file `basename.c' follows:

     #if HAVE_CONFIG_H
     #  include <config.h>
     #endif

     #if HAVE_STRING_H
     #  include <string.h>
     #elif HAVE_STRINGS_H
     #  include <strings.h>
     #endif

     #if !HAVE_STRRCHR
     #  ifndef strrchr
     #    define strrchr rindex
     #  endif
     #endif

     char *
     basename (char *path)
     {
       /* Search for the last directory separator in PATH.  */
       char *basename = strrchr (path, '/');

       /* If found, return the address of the following character,
          or the start of the parameter passed in.  */
       return basename ? ++basename : path;
     }

For brevity, the implementation does not use any `const' declarations, which would be good style for a real project, but would need to be checked at configure time in case the end user needs to compile the package with a K&R compiler.

The use of `strrchr()' is noteworthy.  Sometimes it is declared in `string.h', otherwise it might be declared in `strings.h'.  BSD based Unices, on the other hand, do not have this function at all, but provide an equivalent function, `rindex()'.  The preprocessor code at the start of the file is designed to cope with all of these eventualities.  The last block of preprocessor code assumes that if `strrchr' is already defined as a macro, then it is a working macro, and should not be redefined.

`Makefile.am' contains:

     ## Makefile.am -- Process this file with automake to produce Makefile.in
     noinst_LTLIBRARIES    = libreplace.la
     libreplace_la_SOURCES =
     libreplace_la_LIBADD  = @LTLIBOBJS@

Once again, the `noinst_LTLIBRARIES' macro names the convenience library, `libreplace.la'.  By default there are no sources, since we expect to have a system definition of `basename()'.
Additional Libtool objects which should be added to the library based on tests at configure time are handled by the `LIBADD' macro: `LTLIBOBJS' will contain `basename.lo' if the system does not provide `basename', and will be empty otherwise.  This illustrates another feature of convenience libraries: on many architectures, `libreplace.la' will contain no objects at all.

Back in the toplevel project directory, all of the preceding objects are combined by another `Makefile.am':

     ## Makefile.am -- Process this file with automake to produce Makefile.in
     AUTOMAKE_OPTIONS    = foreign
     SUBDIRS             = replace lib .

     CPPFLAGS            = -I$(top_srcdir)/lib

     include_HEADERS     = error.h

     lib_LTLIBRARIES     = liberror.la
     liberror_la_SOURCES = error.c
     liberror_la_LDFLAGS = -no-undefined -version-info 0:0:0
     liberror_la_LIBADD  = replace/libreplace.la lib/library.la

     bin_PROGRAMS        = convenience
     convenience_SOURCES = main.c
     convenience_LDADD   = liberror.la

The initial `SUBDIRS' macro is necessary to ensure that the libraries in the subdirectories are built before the final library and executable in this directory.  Notice that I have not listed `error.h' in `liberror_la_SOURCES' this time, since `liberror.la' is an installed library, and `error.h' defines the public interface to that library.  Since the `liberror.la' Libtool library is installed, I have used the `-version-info' option, and I have also used `-no-undefined' so that the project will compile on architectures which require all library symbols to be defined at link time - the reason `program_name' is maintained in `liberror' rather than `main.c' is so that the library does not have a runtime dependency on the executable which links it.

The key to this example is that by linking the `libreplace.la' and `library.la' convenience libraries into `liberror.la', all of the objects in both convenience libraries are compiled into the single installed library, `liberror.la'.  Additionally, all of the inter-library dependencies of the convenience libraries (`-lm', from `library.la') are propagated to `liberror.la'.

A common difficulty people experience with Automake is knowing when to use a `LIBADD' primary versus an `LDADD' primary.  A useful mnemonic is: *`LIBADD' is for ADDitional LIBrary objects.  `LDADD' is for ADDitional linker (LD) objects.*

The executable, `convenience', is built from `main.c', and requires only `liberror.la'.  All of the other implicit dependencies are encoded within `liberror.la'.  Here is `main.c':

     #include <stdio.h>
     #include "error.h"

     int
     main (int argc, char *argv[])
     {
       set_program_name (argv[0]);

       gratuitous ();
       error ("This program does nothing!");

       return 0;   /* never reached -- error() calls exit */
     }

The only file that remains before you can compile the example is `configure.in':

     # Process this file with autoconf to create configure.
     AC_INIT(error.c)
     AM_CONFIG_HEADER(config.h)
     AM_INIT_AUTOMAKE(convenience, 1.0)

     AC_PROG_CC
     AM_PROG_LIBTOOL

     AC_CHECK_HEADERS(math.h)
     AC_CHECK_HEADERS(string.h strings.h, break)
     AC_CHECK_FUNCS(strrchr)
     AC_REPLACE_FUNCS(basename)

     Xsed="sed -e s/^X//"
     LTLIBOBJS=`echo X"$LIBOBJS" | \
         [$Xsed -e "s,\.[^.]* ,.lo ,g;s,\.[^.]*$,.lo,"]`
     AC_SUBST(LTLIBOBJS)

     AC_OUTPUT(replace/Makefile lib/Makefile Makefile)

There are checks for all of the features used by the sources in the project: `math.h' and either `string.h' or `strings.h'; the existence of `strrchr' (_after_ the tests for string headers); adding `basename.o' to `LIBOBJS' if there is no system implementation; and the shell code to set `LTLIBOBJS'.
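If you prefer not to type the regeneration commands by hand each time, they can equally be captured in a small `bootstrap' script, in the same style as the one used for the Sic project in the next chapter.  This is just a sketch, replaying the commands from the transcript that follows:

     #! /bin/sh
     # bootstrap -- regenerate all of the derived build files.
     set -x
     aclocal
     autoheader
     automake --add-missing --copy
     autoconf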
With all the files in place, you can now bootstrap the project:

     $ ls -R
     .:
     Makefile.am  configure.in  error.c  error.h  lib  main.c  replace

     lib:
     Makefile.am  source.c  source.h

     replace:
     Makefile.am  basename.c
     $ aclocal
     $ autoheader
     $ automake --add-missing --copy
     automake: configure.in: installing ./install-sh
     automake: configure.in: installing ./mkinstalldirs
     automake: configure.in: installing ./missing
     configure.in: 7: required file ./ltconfig not found
     $ autoconf
     $ ls -R
     .:
     Makefile.am   config.h.in   error.c     ltconfig    mkinstalldirs
     Makefile.in   config.sub    error.h     ltmain.sh   replace
     aclocal.m4    configure     install-sh  main.c
     config.guess  configure.in  lib         missing

     lib:
     Makefile.am  Makefile.in  source.c  source.h

     replace:
     Makefile.am  Makefile.in  basename.c

With these files in place, the package can now be configured:

     $ ./configure
     ...
     checking how to run the C preprocessor... gcc -E
     checking for math.h... yes
     checking for string.h... yes
     checking for strrchr... yes
     checking for basename... yes
     updating cache ./config.cache
     creating ./config.status
     creating replace/Makefile
     creating lib/Makefile
     creating Makefile
     creating config.h

Notice that my host has an implementation of `basename()'.  Here are the highlights of the compilation itself:

     $ make
     Making all in replace
     make[1]: Entering directory /tmp/replace
     /bin/sh ../libtool --mode=link gcc -g -O2 -o libreplace.la
     rm -fr .libs/libreplace.la .libs/libreplace.* .libs/libreplace.*
     ar cru .libs/libreplace.al
     ranlib .libs/libreplace.al
     creating libreplace.la
     (cd .libs && rm -f libreplace.la && ln -s ../libreplace.la \
       libreplace.la)
     make[1]: Leaving directory /tmp/replace

Here the build descends into the `replace' subdirectory and creates `libreplace.la', which is empty on my host since I don't need an implementation of `basename()':

     Making all in lib
     make[1]: Entering directory /tmp/lib
     /bin/sh ../libtool --mode=compile gcc -DHAVE_CONFIG_H -I. -I. \
       -g -O2 -c source.c
     rm -f .libs/source.lo
     gcc -DHAVE_CONFIG_H -I. -I. -g -O2 -c -fPIC -DPIC source.c \
       -o .libs/source.lo
     gcc -DHAVE_CONFIG_H -I. -I. -g -O2 -c source.c \
       -o source.o >/dev/null 2>&1
     mv -f .libs/source.lo source.lo
     /bin/sh ../libtool --mode=link gcc -g -O2 -o library.la source.lo -lm
     rm -fr .libs/library.la .libs/library.* .libs/library.*
     ar cru .libs/library.al source.lo
     ranlib .libs/library.al
     creating library.la
     (cd .libs && rm -f library.la && ln -s ../library.la library.la)
     make[1]: Leaving directory /tmp/lib

Next, the build enters the `lib' subdirectory to build `library.la'.  Only the `HAVE_CONFIG_H' macro is passed on the command line; the remaining preprocessor definitions discovered by `configure' are in the `config.h' created by `AM_CONFIG_HEADER':

     Making all in .
     make[1]: Entering directory /tmp
     /bin/sh ./libtool --mode=compile gcc -DHAVE_CONFIG_H -I. -I. -I./lib \
       -g -O2 -c error.c
     mkdir .libs
     gcc -DHAVE_CONFIG_H -I. -I. -I./lib -g -O2 -Wp,-MD,.deps/error.pp -c \
       -fPIC -DPIC error.c -o .libs/error.lo
     error.c: In function `set_program_name':
     error.c:20: warning: assignment makes pointer from integer without cast
     gcc -DHAVE_CONFIG_H -I. -I. \
       -I./lib -g -O2 -Wp,-MD,.deps/error.pp -c \
       error.c -o error.o >/dev/null 2>&1
     mv -f .libs/error.lo error.lo
     /bin/sh ./libtool --mode=link gcc -g -O2 -o liberror.la -rpath \
       /usr/local/lib -no-undefined -version-info 0:0:0 error.lo \
       replace/libreplace.la lib/library.la
     rm -fr .libs/liberror.la .libs/liberror.* .libs/liberror.*
     gcc -shared error.lo -Wl,--whole-archive replace/.libs/libreplace.al \
       lib/.libs/library.al -Wl,--no-whole-archive \
       replace/.libs/libreplace.al lib/.libs/library.al -lc -Wl,-soname \
       -Wl,liberror.so.0 -o .libs/liberror.so.0.0.0
     (cd .libs && rm -f liberror.so.0 && ln -s liberror.so.0.0.0 \
       liberror.so.0)
     (cd .libs && rm -f liberror.so && ln -s liberror.so.0.0.0 liberror.so)
     rm -fr .libs/liberror.lax
     mkdir .libs/liberror.lax
     rm -fr .libs/liberror.lax/libreplace.al
     mkdir .libs/liberror.lax/libreplace.al
     (cd .libs/liberror.lax/libreplace.al && ar x \
       /tmp/replace/.libs/libreplace.al)
     rm -fr .libs/liberror.lax/library.al
     mkdir .libs/liberror.lax/library.al
     (cd .libs/liberror.lax/library.al && ar x \
       /tmp/lib/.libs/library.al)
     ar cru .libs/liberror.a error.o .libs/liberror.lax/library.al/source.lo
     ranlib .libs/liberror.a
     rm -fr .libs/liberror.lax
     creating liberror.la
     (cd .libs && rm -f liberror.la && ln -s ../liberror.la liberror.la)

The `library.la' convenience library built in the `lib' subdirectory is an archive of PIC objects.  The inter-library dependency, `-lm', is passed to `libtool' and, although not needed to create the convenience library, _is_ stored in the pseudo-archive, `library.la', to be used when another object links against it.  You can also see the harmless compiler warning I mentioned earlier, due to the missing declaration for `basename()'.

Notice how `libtool' uses the `--whole-archive' option of GNU ld to link the convenience library contents directly into `liberror.so', but extracts the PIC objects from each of the convenience libraries so that a new `liberror.a' can be made from them.  Unfortunately, this means that the resulting static archive component of `liberror.la' has a mixture of PIC and non-PIC objects.  In a future release of `libtool', this will be addressed by tracking both types of objects in the convenience archive if necessary, and using the correct type of object depending on context.

Here, `main.c' is compiled (not to a Libtool object, since it is not compiled using `libtool'), and linked with the `liberror.la' Libtool library:

     gcc -DHAVE_CONFIG_H -I. -I. -I./lib -g -O2 -c main.c
     /bin/sh ./libtool --mode=link gcc -g -O2 -o convenience main.o \
       liberror.la
     gcc -g -O2 -o .libs/convenience main.o ./.libs/liberror.so -lm \
       -Wl,--rpath -Wl,/usr/local/lib
     creating convenience
     make[1]: Leaving directory /tmp/convenience

`libtool' calls `gcc' to link the `convenience' executable from `main.o' and the shared library component of `liberror.la'.  `libtool' also links with `-lm', the propagated inter-library dependency of the `library.la' convenience library.  Since `libreplace.la' and `library.la' were convenience libraries, their objects are already present in `liberror.la', so they are not listed again in the final link line - the whole point of convenience archives.
This just shows that it all works:

     $ ls
     Makefile      config.h       configure.in  install-sh   main.c
     Makefile.am   config.h.in    convenience   lib          main.o
     Makefile.in   config.log     error.c       liberror.la  missing
     aclocal.m4    config.status  error.h       libtool      mkinstalldirs
     config.cache  config.sub     error.lo      ltconfig     replace
     config.guess  configure      error.o       ltmain.sh
     $ libtool --mode=execute ldd convenience
             liberror.so.0 => /tmp/.libs/liberror.so.0 (0x40014000)
             libm.so.6 => /lib/libm.so.6 (0x4001c000)
             libc.so.6 => /lib/libc.so.6 (0x40039000)
             /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
     $ ./convenience
     cos (0) => 1
     lt-convenience: ERROR: This program does nothing!

Notice that you are running the uninstalled executable, which is in fact a wrapper script, *Note Executing Uninstalled Binaries::.  That is why you need to use `libtool' to run `ldd' on the real executable.  The uninstalled executable called by the wrapper script is named `lt-convenience', hence the program name shown in the output from `basename()'.

Finally, you can see from the output of `ldd' that `convenience' really isn't linked against either `library.la' or `libreplace.la'.


File: autobook.info,  Node: A Large GNU Autotools Project,  Next: Rolling Distribution Tarballs,  Prev: Using GNU Libtool,  Up: Top

12 A Large GNU Autotools Project
********************************

This chapter develops the worked example described in *Note A Small GNU Autotools Project::.  Again, the example is heavily colored by my own views, and there certainly are other, very different, but equally valid ways of achieving the same objectives.

I will explain how I incorporated `libtool' into the Sic project, and how to put the project documentation and test suite under the control of GNU Autotools.  I pointed out some problems with the project when I first introduced it - this chapter will address those issues, and present my favored solution to each.

* Menu:

* Using Libtool Libraries::
* Removing --foreign::
* Installing Header Files::
* Including Texinfo Documentation::
* Adding a Test Suite::


File: autobook.info,  Node: Using Libtool Libraries,  Next: Removing --foreign,  Up: A Large GNU Autotools Project

12.1 Using Libtool Libraries
============================

As you have seen, it is very easy to convert `automake' built static libraries to `automake' built Libtool libraries.  In order to build `libsic' as a Libtool library, I have changed the name of the library from `libsic.a' (the "old archive" name in Libtool terminology) to `libsic.la' (the "pseudo-library"), and must use the `LTLIBRARIES' Automake primary:

     lib_LTLIBRARIES   = libsic.la
     libsic_la_LIBADD  = $(top_builddir)/replace/libreplace.la
     libsic_la_SOURCES = builtin.c error.c eval.c list.c sic.c \
                         syntax.c xmalloc.c xstrdup.c xstrerror.c

Notice that the `la' in `libsic_la_SOURCES' is new too.

It is similarly easy to take advantage of Libtool _convenience_ libraries.  For the purposes of Sic, `libreplace' is an ideal candidate for this treatment - I can create the library as a separate entity from selected sources in their own directory, and add those objects to `libsic'.  This technique ensures that the installed library has all of the support functions it needs without having to link `libreplace' as a separate object.

In `replace/Makefile.am', I have again changed the name of the library from `libreplace.a' to `libreplace.la', and changed the Automake primary from `LIBRARIES' to `LTLIBRARIES'.  Unfortunately, those changes alone are insufficient.
Libtool libraries are compiled from Libtool objects (which have the `.lo' suffix), so I cannot use `LIBOBJS', which is a list of `.o' suffixed objects(1).  *Note Extra Macros for Libtool::, for more details.  Here is `replace/Makefile.am':

     MAINTAINERCLEANFILES  = Makefile.in
     noinst_LTLIBRARIES    = libreplace.la
     libreplace_la_SOURCES =
     libreplace_la_LIBADD  = @LTLIBOBJS@

And not forgetting to set and use the `LTLIBOBJS' configure substitution (*note Extra Macros for Libtool::):

     Xsed="sed -e s/^X//"
     LTLIBOBJS=`echo X"$LIBOBJS" | \
         [$Xsed -e "s,\.[^.]* ,.lo ,g;s,\.[^.]*$,.lo,"]`
     AC_SUBST(LTLIBOBJS)

As a consequence of using `libtool' to build the project libraries, the growing set of configuration files in the `config' directory will come to include `ltconfig' and `ltmain.sh'.  These files will be used on the installer's machine when Sic is configured, so it is important to distribute them.  The naive way to do it is to give the `config' directory a `Makefile.am' of its own; however, it is not too difficult to distribute these files from the top `Makefile.am', and it saves clutter, as you can see here:

     AUX_DIST             = $(ac_aux_dir)/config.guess \
                            $(ac_aux_dir)/config.sub \
                            $(ac_aux_dir)/install-sh \
                            $(ac_aux_dir)/ltconfig \
                            $(ac_aux_dir)/ltmain.sh \
                            $(ac_aux_dir)/mdate-sh \
                            $(ac_aux_dir)/missing \
                            $(ac_aux_dir)/mkinstalldirs
     AUX_DIST_EXTRA       = $(ac_aux_dir)/readline.m4 \
                            $(ac_aux_dir)/sys_errlist.m4 \
                            $(ac_aux_dir)/sys_siglist.m4
     EXTRA_DIST           = bootstrap

     MAINTAINERCLEANFILES = Makefile.in aclocal.m4 configure config-h.in \
                            stamp-h.in $(AUX_DIST)

     dist-hook:
             (cd $(distdir) && mkdir $(ac_aux_dir))
             for file in $(AUX_DIST) $(AUX_DIST_EXTRA); do \
               cp $$file $(distdir)/$$file; \
             done

The `dist-hook' rule is used to make sure the `config' directory and the files it contains are correctly added to the distribution by the `make dist' rules, *note Introduction to Distributions::.

I have been careful to use the `configure' script's location for `ac_aux_dir', so that it is defined (and can be changed) in only one place.  This is achieved by adding the following macro to `configure.in':

     AC_SUBST(ac_aux_dir)

There is no need to set the macro explicitly in the `Makefile.am', because Automake automatically creates macros for every value that you `AC_SUBST' from `configure.in'.

I have also added the `AC_PROG_LIBTOOL' macro to `configure.in' in place of `AC_PROG_RANLIB', as described in *Note Using GNU Libtool::.

Now I can upgrade the configury to use `libtool' - the greater part of this is running the `libtoolize' script that comes with the Libtool distribution.  The `bootstrap' script then needs to be updated to run `libtoolize' at the correct juncture:

     #! /bin/sh

     set -x
     aclocal -I config
     libtoolize --force --copy
     autoheader
     automake --add-missing --copy
     autoconf

Now I can re-bootstrap the entire project so that it can make use of `libtool':

     $ ./bootstrap
     + aclocal -I config
     + libtoolize --force --copy
     Putting files in AC_CONFIG_AUX_DIR, config.
     + autoheader
     + automake --add-missing --copy
     automake: configure.in: installing config/install-sh
     automake: configure.in: installing config/mkinstalldirs
     automake: configure.in: installing config/missing
     + autoconf

The new macros are evident from the new output seen when the freshly regenerated `configure' script is executed:

     $ ./configure --with-readline
     ...
     checking host system type... i586-pc-linux-gnu
     checking build system type... i586-pc-linux-gnu
     checking for ld used by GCC... /usr/bin/ld
     checking if the linker (/usr/bin/ld) is GNU ld... yes
     checking for /usr/bin/ld option to reload object files... -r
     checking for BSD-compatible nm... /usr/bin/nm -B
     checking whether ln -s works... yes
     checking how to recognise dependent libraries... pass_all
     checking for object suffix... o
     checking for executable suffix... no
     checking for ranlib... ranlib
     checking for strip... strip
     ...
     checking if libtool supports shared libraries... yes
     checking whether to build shared libraries... yes
     checking whether to build static libraries... yes
     creating libtool
     ...
     $ make
     ...
     gcc -g -O2 -o .libs/sic sic.o sic_builtin.o sic_repl.o sic_syntax.o \
       ../sic/.libs/libsic.so -lreadline -Wl,--rpath -Wl,/usr/local/lib
     creating sic
     ...
     $ src/sic
     ] libtool --mode=execute ldd src/sic
             libsic.so.0 => /tmp/sic/sic/.libs/libsic.so.0 (0x40014000)
             libreadline.so.4 => /lib/libreadline.so.4 (0x4001e000)
             libc.so.6 => /lib/libc.so.6 (0x40043000)
             libncurses.so.5 => /lib/libncurses.so.5 (0x40121000)
             /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
     ] exit
     $

As you can see, `sic' is now linked against a shared library build of `libsic', but not directly against the convenience library, `libreplace'.

---------- Footnotes ----------

(1) Actually, the suffix will be whatever is appropriate for the target host: `.obj' on Windows, for example.