makepp takes over

Since my project was progressing very slowly, I took a new look at its closest competitor, makepp.  It is so much superior to GNU make that I am putting up with make's strange syntax.  With that, this project is dead, and I will contribute my builtin commands to makepp.

I had not seen the drawbacks of my 100% Perl syntax in a makefile.  It makes a makefile rather hard to read, and forces build-system maintainers to know Perl.  The same is true of an up-and-coming hybrid, PBS, which writes the makefiles in Perl, yet puts the commands in strings, like make.

SYNOPSIS

    [ option ...][ target ...]

The makefile can contain, besides normal Perl code:

    rule undef, qw(main.o x.o y.o) => 'prog';

or

    rule {
        my $target = pop;
        sh qw'gcc -o', $target, @_;
    } qw(main.o x.o y.o) => 'prog';


OPTIONS

    -B, --no-builtin-commands
    Eliminate the built-in commands.  You can still define your own. 
    -d, --debug
    Output diagnostic messages, useful for tracing which rules are being defined, tried and applied.  Since they are compiled Perl code, you don't see the commands, but you can compare the addresses. 
    -f FILE, --file=FILE, --makefile=FILE
    Use FILE instead of the default for giving rules, variables and anything else. 
    -i, --ignore-errors
    Ignore all errors in commands executed to remake files. 
    -k, --keep-going
    Continue as much as possible after an error.  While the target that failed, and those that depend on it, cannot be remade, the other prerequisites of these targets can be processed all the same. 
    -n, --just-print, --dry-run, --recon
    Print the commands that would be executed, but do not execute them. 
    -r, --no-builtin-rules
    Eliminate use of the built-in implicit rules.  You can still define your own. 
    -R, --no-builtin-variables
    Eliminate use of the built-in rule-specific variables.  You can still define your own. 
    -s, --silent, --quiet
    Silent operation; do not print the commands as they are executed. 
    -v, --version
    Show the installed version. 


    Deferred Variables

    Unlike Perl variables, which get evaluated the moment you use them, traditional make variables get evaluated lazily, at the last possible moment.  Like GNU make, this tool allows both kinds of variables.

    The evaluation of deferred variables is performed through the rule function only when applying the rule.  It also happens when calling external and builtin commands.  Deferred variables are effectively references which are dereferenced when used.  This happens recursively until they no longer contain references.

    Unblessed references are the simplest form of deferred variables. 
    determine environment-variable, [command, ..., ]default-value
    The first time this is evaluated, the first applicable value is determined.  If $ENV{environment-variable} is defined, its value is parsed by shellparse.  Otherwise the commands are searched for until one of them is found in $ENV{PATH}.  Each command may be a command-name, or an array consisting of the command-name and initial arguments.  If no command is given or found, default-value is returned.  Later uses return the same value, which is cached automatically. 
    once { Perl code }
    The first time this is evaluated, Perl code will be called in list context.  Later uses return the same value, which is cached automatically. 
    shellparse string
    This does not create a deferred variable!  But it is handy when once or sub deferred variables return a command(-fragment) from an external source.  It parses string like the Bourne Shell would and returns it as a list.  It not only splits on whitespace but also understands escaped characters (\.), single- ('...') and double-quoted ("...") strings.  Variables ($var or ${var}) are taken from the environment – take care that Perl doesn't interpolate its variables for you!  Command substitutions (`...`) also work. 
    sub { Perl code }
    Code references without a prototype will be called in list context without an argument, and the return value will be further dereferenced. 
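    The recursive dereferencing described above can be illustrated in plain Perl.  This is a minimal sketch of the idea; flatten() is a hypothetical helper written for this illustration, not part of the tool:

```perl
use strict;
use warnings;

# Sketch of how deferred variables might be dereferenced recursively:
# chase references until only plain values remain.  Illustrative only.
sub flatten {
    map {
        my $v = $_;
        ref $v eq 'SCALAR' ? flatten($$v)      # follow scalar references
      : ref $v eq 'ARRAY'  ? flatten(@$v)      # expand array references
      : ref $v eq 'CODE'   ? flatten($v->())   # call code refs in list context
      :                      $v                # plain value: keep as is
    } @_;
}

my $do  = 'd.o';
my $n   = 3;
my $eno = sub { "e$n.o" };                     # evaluated only when used
my @objects = (qw(a.o b.o), \$do, $eno, [qw(x.o y.o)]);
print join(' ', flatten(@objects)), "\n";      # a.o b.o d.o e3.o x.o y.o
```

    Note that $eno picks up the current value of $n only when it is finally dereferenced, which is the point of deferred evaluation.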

    Predefined Variables

    The keys are the names of languages, like C or 'C++'.  Each value is an array reference (so you can simply push onto it) containing the command to compile that language.  It is used by the builtin rules; for consistency it is suggested that you use it too. 
    To be implemented. 
    The keys are the names of languages, like C or 'C++'.  Each value is an array reference (so you can simply push onto it) containing the compiler options for that language.  It is used by the builtin rules; for consistency it is suggested that you use it too. 
    Where makefiles and prerequisites are searched for.  This allows using a different directory tree as a reference and remaking locally only those files which are newer. 
    Any key that has a true value is not searched for in the file system.  Anything to be made that depends on a phony rule, as well as all prerequisites of those rules, will always be made.  This defaults to all, clean, install, test and distclean.


    Rules tell what a target depends on and how to make it once all prerequisites are there.

    The prerequisites and target or pattern may also be variable references, which get expanded only when applying the rule.  See Deferred Variables.

    Static Rules

    rule { Perl code } target;
    Such a rule tells what Perl code is needed to make target.
    rule undef, prerequisite, ... => target;
    Such a rule is the opposite: it tells what target depends on, not how to make it.  Every prerequisite is made first, and then another rule (builtin or supplied by you) is implicitly used to make target.
    rule { Perl code } prerequisite, ... => target;
    This combines the two other kinds, first making every prerequisite and then performing Perl code to make target.

    Pattern Rules

    Pattern rules are tried when no static rule applies, or when the static rule has no Perl code.

    rule { Perl code } qr/pattern/;
    rule { Perl code } prerequisite, ... => qr/pattern/;
    These do the same as the corresponding static rules, but the targets are only matched when applying the rule.  Each prerequisite is taken as a substitution string and may contain (literally) $1 and so forth if the pattern has the corresponding grouping.  If a prerequisite contains braces, they must be paired properly or escaped, as in {} or \{.
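    The substitution of a literal $1 into a prerequisite string can be sketched in plain Perl.  expand_prereq() below is a hypothetical illustration of the documented behaviour, not the tool's actual implementation:

```perl
use strict;
use warnings;

# Sketch: expand literal '$1', '$2', ... in a prerequisite string from
# the groups captured by a pattern rule's regexp.  Hypothetical helper.
sub expand_prereq {
    my ($target, $pattern, $prereq) = @_;
    return undef unless my @groups = $target =~ $pattern;
    (my $expanded = $prereq) =~ s/\$(\d+)/$groups[$1 - 1]/g;
    return $expanded;
}

# Single quotes keep the $1 literal, as the description above requires.
print expand_prereq('main.o', qr/(\w+)\.o$/, '$1.c'), "\n";   # main.c
```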

    Rule Utilities

    Normally a rule will overwrite an existing rule that is the same.  Static rules are the same when they have the same target.  Pattern rules are the same when they have the same target regexp and the same list of prerequisite patterns.

    extend { Perl code };
    You can place rules into such a block.  A rule for an existing target then extends that rule, rather than overwriting it. 
    include file, ...;
    Read more rules and variables (and generally any Perl code) from each file. 


    Commands are functions with special behaviour.  Some of the builtins take a code block as their first argument.  Their other arguments may be strings, given literally or in variables, but they may also be scalar, array or code references.  See Deferred Variables.

    Unless a command is silent (see Options or io), it will be echoed before being performed.

    Unless a command is being ignored (see Options or Command Modifiers), the return value will be analyzed and acted on.  When failure is ignored, commands return undef if and only if they fail.  Otherwise a failing command aborts the whole run.

    Some commands take options which are shown here without quotes, though in real use they are Perl strings.

    Builtin Commands

    These behave similarly to their Unix or GNU counterparts, without some of the bells and whistles.  But they are more efficient, since they use Perl's capabilities rather than forking a process.  Those that talk about operating on lines of the file contents actually mean chunks as defined by $/.  When echoed they show a function prefix &.
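    The effect of $/ on what counts as a "line" can be shown with core Perl alone.  This sketch reads from an in-memory filehandle with a non-default record separator:

```perl
use strict;
use warnings;

# With $/ set to ';', each "line" read is a ';'-terminated chunk.
my $text = "one;two;three";
my @chunks;
{
    local $/ = ';';                      # temporary record separator
    open my $fh, '<', \$text or die $!;  # in-memory filehandle
    @chunks = <$fh>;
}
print scalar @chunks, "\n";              # 3
print $chunks[0], "\n";                  # one;
```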

    cat[ file, ...];
    Concatenate the contents of all the files. 
    chmod mode, file, ...;
    Sets the mode of each file to mode, a number or string.  A string is parsed like a literal Perl number, i.e. as decimal unless there is a leading 0 (octal), 0x (hexadecimal) or 0b (binary), unlike the normal function, where '0644' != 0644.  If you would like mode not to be echoed as decimal, pass it as an octal string.  Use CORE::chmod if you really need the original.
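    The number parsing described for mode can be emulated with Perl's oct.  parse_mode() is a hypothetical helper illustrating the documented behaviour, not the builtin's actual code:

```perl
use strict;
use warnings;

# Sketch: parse a mode string "like a literal number" -- decimal unless
# it has a leading 0 (octal), 0x (hex) or 0b (binary).  Hypothetical.
sub parse_mode {
    my $mode = shift;
    return $mode =~ /^0/ ? oct($mode)   # oct() handles 0..., 0x..., 0b...
                         : $mode + 0;   # plain decimal
}

printf "%o\n", parse_mode('0644');    # 644 (octal 0644 == decimal 420)
printf "%o\n", parse_mode('420');     # 644 too: plain decimal 420
printf "%o\n", parse_mode('0x1a4');   # 644 again, parsed as hex
```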
    chown user[:group], file, ...;
    chown :group, file, ...;
    Sets the user and/or group (given as name or numeric id) for all the files.  Use CORE::chown if you really need the original.
    cp[ option, ...,] file, ... destination;
    ln[ option, ...,] file, ... destination;
    mv[ option, ...,] file, ... destination;
    Copy, link or move (rename) the files to destination.  If more than one file is given, destination must be a directory.  You can also symlink or move directories.  Options are:
    -b, --backup move existing destination files to the same name with a suffix appended
    -f, --force first remove files that can't be overwritten
    --parents preserve each file's given parent directories under the destination directory
    -S, --suffix=str use str as backup suffix instead of ~
    -s, --symbolic create a symbolic instead of a real link (ln only)
    egrep { Perl code }[ file, ...];
    Output those lines from all files, for which Perl code returns true. 
    Simply fail. 
    fsort undef[, -r|--reverse][, -u|--uniq|--unique][, file, ...];
    fsort { Perl code }[ -r|--reverse,][ -u|--uniq|--unique,][ file, ...];
    Sort the lines of all files together.  In the first case the default sort order is used.  In the second, a programmed order is used, as with a comparison block for the Perl sort function. 
    head number[, file, ...];
    head qr/regexp/[, file, ...];
    tail number[, file, ...];
    tail qr/regexp/[, file, ...];
    Output the lines from one to number (head), or from number to the end (tail) of each file.  If number is negative, it is counted from the end of each file.  The regexp variants instead go up to, or start from, the first matching line. 
    mkdir directory, ...;
    Creates each directory and any required parent directories.  (cf. Unix mkdir -p)  Use CORE::mkdir if you really need the original.
    paste[ -d|--delimiter=delimiter,] file, ...;
    Paste the lines of all files side by side, separated by delimiter, which defaults to a tab. 
    pod2html[ option, ...];
    pod2usage[ option, ...];
    Identical to the respective external Perl programs. 
    rev[ -l|--lines,][ file, ...];
    Reverse the characters on each line, or, with the option, the order of the lines among all files. 
    rm file|directory, ...;
    Removes each file, or each directory with its contents.  Doesn't complain about missing files.  (cf. Unix rm -fr)
    sed { Perl code }[ file, ...];
    Performs Perl code for each line of each file, and outputs the result.  It should modify $_, like s/// or tr///.  (cf. Unix sed or perl -pe)
    template %{{ key => value, ... }}[, option, ...,][ file, ...];
    template hash[, option, ...,][ file, ...];
    Find all the occurrences of the keys from the hash in either form, delimited by an immediately leading and trailing @.  Replace all these occurrences, in all the files, with the associated value.  Options are:
    -p, --prefix=str use str as key prefix, instead of @
    -s, --suffix=str use str as key suffix, instead of @
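    The replacement that template is documented to perform can be sketched with a single substitution in plain Perl.  This is an illustration of the idea, not the builtin itself:

```perl
use strict;
use warnings;

# Sketch: replace @key@ occurrences with values from a hash, keeping
# anything that is not a known key.  Illustrative only.
my %subst = (VERSION => '1.2', AUTHOR => 'me');
my ($prefix, $suffix) = ('@', '@');     # cf. the --prefix/--suffix options

my $text = 'release @VERSION@ by @AUTHOR@, not @OTHER@';
$text =~ s{\Q$prefix\E(\w+)\Q$suffix\E}
          {exists $subst{$1} ? $subst{$1} : "$prefix$1$suffix"}ge;
print "$text\n";    # release 1.2 by me, not @OTHER@
```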
    touch file, ...;
    Sets the access and modification time for each file to now, creating those that don't exist. 
    uniq[ file, ...];
    Output those lines from all files which differ from the preceding one. 

    External Commands

    sh[ -f|--first|-l|--last,] command, ...;
    This can take one or more arguments, and Perl will execute the list.  To eliminate repeated arguments, use --first to keep only the first occurrence of each, or --last to keep only the last. 
    sh -c|--command => command, ...;
    In the second form the arguments are joined into one string, separated by semicolons.  This string is passed to a shell for execution.  If it contains no shell metacharacters, Perl emulates the shell.  Do not call something like
    sh 'ls -l';

    when you want one of

    sh qw'ls -l';
    sh -c => 'ls -l';

    Adding Commands

    use command name => functionreference;
    use command name => sub { Perl code };
    These create a function of name name, which calls functionreference or Perl code with command semantics.  The functionreference or Perl code can warn, die or return.  Dying is turned into an error. 
    use command name;
    use command name => qw(command arg ...);
    These create a function of name name, which is an alias to an external command.  In the first form the external command name is identical to the function name.  In the second case it may be different, and/or may have additional arguments. 

    Command Modifiers

    io { Perl code } mode, ...;
    You can place builtin or external shell commands (e.g. within rules) into such a block, to redirect their I/O.  The old settings get restored afterwards.  Note that this is technically not a block, but a closure.  This means that the outer @_ is not available, but my variables are. 

    The mode arguments are performed back to front, allowing you to do pipelines.  If mode is the empty string, this prevents the commands from being echoed, as with the --silent command line option.

    As in Shell, a mode of '<&-', '>&-' and '2>&-' will close STDIN, STDOUT and STDERR respectively.

    The other modes are almost as for the open function.  A mode starting with '>', '+>', '+<' or '|' will apply this to STDOUT.  Using '+<' allows some inplace editing of files, as they remain readable, see EXAMPLES.  A mode starting with '2>', '2+>', '2+<' or '2|' will apply this to STDERR.  The '2' will not be passed to open.  All others are applied to STDIN.

    You can also give mode as [mode, argument ...], to use the multiple argument form of open.

    ignore_errors { Perl code }[ level];
    You can place builtin or external shell commands (e.g. within rules) into such a block.  The optional level is either 0, which overrides an enclosing ignore_errors; 1, the default, which prevents the commands' return value from being acted on; or 2, where an error return code is not even shown.  This is equivalent to preceding commands with - in make.


EXAMPLES

    This makes any intermediate .o file when needed, from the corresponding .c source:

    rule {
        my $source = $_[0];
        my $target = $_[-1];    # may have more than 2 args from static rules
        sh -c => "gcc -c -o $target $source";
    } '.c' => qr/\.o$/;

    This makes myprog from all present object files:

    rule undef, <*.o> => 'myprog';              # wrong, maybe none are built yet
    rule undef, sub { <*.o> } => 'myprog';                      # closer but wrong
    rule undef, sub { grep s/c$/o/, <*.c> } => 'myprog';        # correct for C

    This makes myprog from a.o, b.o, c.o, d.o, e3.o, which are themselves implicitly made from any available sources:

    rule undef, \@more_C_objects, \$eno => 'myprog';
    @more_C_objects = (qw(a.o b.o c.o), \$do);
    $do = 'd.o';
    $eno = sub { "e$n.o" };
    $n = 3;

    This defines analogous static rules with a varying component, here in the maximum number of places (the Perl code, several prerequisites and the target):

    rule { print "$_.a\n" } "$_.b", "$_.c", 'config.h' => "$_.z"
        for qw(src1 src2 srcn);

    This gives you another builtin command, and two internal aliases to external ones:

    use command nop => sub { print "I do nothing\n" };
    use command ls;
    use command ll => qw(ls -l);

    This will first print to the file, and then edit the file in place, without echoing the sed command:

    io { print 'hallo' } ">$file";
    io { sed { tr/a/e/ } $file } '', ['+<', $file];

    Accumulate io redirections from back to front, creating a pipeline:

    io { ... } '| 1st-pipe', ['|-', '2nd-pipe'], ..., '> outfile';

    This will extract the header and signature of each file in email format:

    egrep { 1../^$/ or /^-- /..eof } file, ...;


TODO

    Precompile files and extract the dependency list (cpp-style, and open for handling anything).

    Try to figure out which files from the same directory can be compiled with one call to the compiler (idea thanks to Nadim Khemir).  I'm not sure how this fits in with my rules, nor with using an intermediate directory...

    Optionally stop after reading all makefiles (and autoloading every command), waiting for targets from a fifo and forking (idea from the ant server).

    Have more predicates for eliminating compilations and getting away from only files:

    Handle parallelization (hopefully easy with threads) and subdirectories (package and threads vs. fork?  chdir probably forces us to fork).

    Extend sh() to transparently distribute to various machines.

    Have a builtin install command, since that is not standard and, when available, is named differently on different machines.  Also:

    sh -i|--install=produced-target-file, producing-command
    cut undef|{ split /regexp/ } [-o outfile1,] [-o outfile2,] ..., infile, ...;
    ncp url|file ... url|directory # with LWP::UserAgent
    $COMMAND{yacc} = determine YACC => 'yacc', [$COMMAND{bison}, '-y'];
    $COMMAND{bison} = determine BISON => 'bison';
    # fsort:
    my @data = map  $_->[0],
               sort { $a->[1] <=> $b->[1] ||     # second column, numeric
                      $a->[2] <=> $b->[2] ||     # third column, numeric
                      $a->[3] <=> $b->[3] }      # fourth column, numeric
               map  [ $_, (split)[1, 2, 3] ], <DATA>;

    Command make for building self-contained subprojects.

    Give it all the rules and external variables (CFLAGS, ... – optionally?) of GNU make.

    Turn it into one or two modules, so that a makefile can itself be a standalone Perl program.

    Somebody should write a simple makedepend-style makefile parser.

    Somebody might use Perl's XML-capabilities for writing an ant-makefile parser.

    Teach ExtUtils::MakeMaker to generate this.

    Maybe support an interpreter linked to gcc, performing almost everything in a single process.

    Last modified: 2003-06-27
    Powered by the GPL and the Artistic License