Using Make – writing less Makefile (causal.agency)
217 points by todsacerdoti on Dec 26, 2023 | 196 comments


Make is the grilling of the hacker world. You just want to light up some quality lumpwood charcoal and get a nice smokey crust on some steaks, cook up some hamburgers, and crisp up some chicken for your guests over the course of an afternoon…

…but oh no there is a constant stream of inquisitive folk coming over, beer in hand, suggesting cool adjustments / hacks / “improvements” to your setup. With grilling it’s actually pretty easy to defend oneself from this unsolicited advice. There’s only one grill after all. It’s not like it’s some company wide grill that anyone can make changes to solely by convincing one other person to sign off on their PR / MR / changeset.

I’ve seen some very “clever” makefiles (and CMakeLists.txt files) over the years. It is hacker catnip for all manner of Jackson Pollock ad hoc scripting rubbish and I lament every commit that did anything except make the makefile less complex! All new software engineers should be given a shiny ball to play with when they get bored with their day job, lest they decide to take out their playful creativity on the build system instead.


make is very useful. The idea that things depend on each other and are built as a hierarchy is important.

But it is the opposite of beauty, or elegance, or organization.

Honestly all the millions of man months that have gone into writing makefiles should have gone into making make more usable.

for example:

all the implicit rules in this article? Nobody [1] uses them; they should just be explicit.

You should explicitly distinguish between a target and a file and a directory. Don't let them depend on each other helter-skelter.

indent, tabs, spaces. what a mess.

requiring continuation lines and juggling return codes?

macros that break with whitespace?

I could go on, but ugh.

I actually like cmake better than make. Just having IF statements and a flow makes things more understandable. But yeah, you trade one set of problems for another.

look up "modern cmake" and you'll see the gap between nice theory and ugly practice.

[1] for practical purposes


All I can say about these critiques: I learned make a few years ago and it wasn't nearly as bad as it was made out to be.

As someone who cares about the aesthetics of code, I thought I would hate using make but I ended up enjoying writing modern makefiles.

When make was created back in the day, I get why certain choices were made, like with virtually all Unix tools of that era. But we don't have to use them in the same way. In general, our tooling is so much better that the so-called warts of make are more like minor annoyances than show stoppers.


The beauties of make are:

* It actually does the complicated job correctly, or at least provides the means to. A lot of things that claim to be a "new better make" are simpler by being simply incomplete and/or internally more magic and less correct and less deterministic.

* Single simple binary. No freaking python or ruby of just a certain vintage and 500 modules but maybe not the system ones or maybe only the system ones...

* Always available everywhere, not just in the middle of the bell curve of modern and popular systems, and will be tomorrow exactly the same as yesterday. Deviations like gmake vs bsdmake are essentially trivial and don't really break anything or present any real problem. IE: portable across both space and time in all directions of all dimensions.

Make is sane for the job it does. It's a great example of a correct use and application of a DSL.

And all those complaints are still true. I address them by:

* Simply adopt a habit or discipline to voluntarily avoid the implicit features. Don't use globbing either. Just don't use those features that are unwise. They are still there, but you don't have to invoke them.

* Just suck it up and deal with the tabs and whitespace and line-continuation and quirks like how each line of shell in a target is a new shell environment unless you build the annoying long single command, capturing exit values so they don't reach the parent make process and get treated as an error, etc. Whatever. To me that's all just normal shell scripting requirements in any context, make not especially different from cron or cgi or other. Rather that than passing around powershell objects or something.

Could be better but ninja or whatever is not the answer to those minor annoyances.


Agree on implicit behavior.

Actually I think it's fine for the implicit automatic features to exist, I just think no one should use them in most cases.

It's fine for a tool to try to be as magic as possible, as an optional thing you can do when you decide you want it. And for practically any makefile that you are bothering to actually write into a file that you are keeping and running again tomorrow, and giving to someone else to run some other day on some other machine... everything should be explicit. Just like the project's code, it's documentation as much as function, and no part of either the documentation or the function should be left unexplained and undetermined.

But that's a separate issue from the tool having the ability. I think it's not a tool's job to tell me what I want and why and what is and is not valid. If make never had the magic features, I would not miss them or feel any need to invest in adding them, but if they are there anyway somehow, I also don't see any reason to remove them or create a new tool just to have something that doesn't have them.

I should be free to write a short super convenient fully magic 3-line makefile that just says "do the likely thing with whatever is in this directory", and at the same time, no project should ever have that as their real makefile. Both stances are valid at the same time, I think.


Apparently the reasoning behind a bunch of Make’s warts is that by the time Stuart Feldman realized the problem, it already had half a dozen users, and, well, he wasn’t gonna go and break it for them!


I don't see why there couldn't be (even today!) some statement like "require version XYZ", which Make could notice, which would enable new features and nicer syntax. Something like "editions" in Rust, though the idea is much older.

For some reason, people prefer to add more layers on top of Make rather than improving the core language and tool. Maybe there's some good reason?


> rather than improving the core language and tool

GNU Make has been steadily gaining new functions though. I've been reading the manual again recently and there were a few functions I did not recognize: let and intcmp. The new integer comparison function is particularly notable since people have gone to some rather incredible lengths to implement arithmetic in make:

https://gmsl.jgc.org
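
For example, something like this (a small sketch; both functions need GNU Make 4.4 or newer):

    # numeric comparison: expands to "newer" because 3 > 2
    result := $(intcmp 3,2,older,same,newer)
    # let binds list words to variables: "first" becomes "a", "rest" becomes "b c"
    head := $(let first rest,a b c,$(first))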

It's also possible to write native plugins for GNU Make to load at runtime. New functions can be implemented this way. It works just like C extensions for other scripting languages. There's also Guile extension support but unfortunately it doesn't seem to allow new function definitions.


> arithmetic in make

Well that’s horrifying. What is the use case? Or is it just hackers gonna hack?


idk about anyone else but I have auto incrementing build numbers in several makefiles

That would seem to be the most obvious and trivial example, and I just assume there must be countless others that I'd agree are valid without having to actually see them.

Wait, how could I forget the other most obvious example, comparing version numbers of requirements/dependencies?

The Vala preprocessor can't compare numbers, so I have to implement "if libfoo < x then add -D OLDFOO" outside in the makefile and can only have a boolean #if OLDFOO in the code. (Libs don't always provide good enough equivalents: maybe only a numerical value that you are assumed to be able to compare, or maybe nothing at all, or a set of booleans intended to be used for this that aren't granular enough, etc.)
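
A rough sketch of what that makefile-side check can look like (the library name, the version threshold and the VALAFLAGS variable are all hypothetical here):

    # ask pkg-config for libfoo's major version and define OLDFOO
    # for the Vala compiler when the installed library is older than 2.x
    LIBFOO_MAJOR := $(shell pkg-config --modversion libfoo | cut -d. -f1)
    ifeq ($(shell [ $(LIBFOO_MAJOR) -lt 2 ] && echo old),old)
    VALAFLAGS += -D OLDFOO
    endif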


Incrementing a counter would be one example. Like maybe you want to produce file-00.txt through file-99.txt.


I think people treat make like a dmv trip or taxes.

They want to get in, get it done, and get out.

But who wants to improve the DMV?


Quasi-tangent: I was once hit by a coworker doing first code review on a new repo with "hmm...I'm not familiar with using Makefiles as a project management tool. So something something [don't remember] we should replace that." It struck me as weird because I don't see `make` as a build tool so much as automating shell script snippets you would/could type at the command line. From that POV I conceptually see Makefiles as like a `bin/` folder full of little scripts, honestly the main advantage for me being every command runs with working dir being where the Makefile is; no need to write a `set -euo pipefail; HERE="$(dirname "$0")"` dingleberry at the top of every hypothetical `bin/` script. And having incremental rebuild is sugar on top when it's possible to write :)

I don't see `make` as a build tool because I believe from experience that is where the road to hell begins. But it is convenient to have `make` call CMake/Cargo/pip/whatever. Plain `make` can build, `make test` can test, `make fmt` can auto-format... Even better if you alias `m` to `make` in your shell.
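
For what it's worth, a minimal sketch of that style of Makefile (the cargo commands are only stand-ins for whatever CMake/Cargo/pip invocation sits behind each target, and recipe lines are tab-indented):

    .PHONY: all test fmt

    all:
            cargo build

    test:
            cargo test

    fmt:
            cargo fmt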


Your coworker's experience is more principled: Make is a mediocre tool for executing commands. It wasn't ever designed for that. Although it is pretty common to see what you are mentioning in projects because it doesn't require installing a dependency.

For a repo where an easy to install (single binary) dependency is a non-issue, consider using just. [1] You get `just -l` where you can see all the commands available, the ability to use different languages, and overall simpler command writing.

[1] https://github.com/casey/just


With all due respect, I don't understand the first part of your comment. Make's core purpose is to execute commands, isn't it? How was it not designed to execute commands?


No, no. Make's principal purpose is to put a set of files into a desired state. It can "make" a particular file by invoking a dependency graph of commands that produce that file from other files. It checks timestamps and only runs steps where the resulting files are older than some of the (transitive) source files.

You can invoke it by naming a named rule, not a file, but the logic will remain.

If this is not what you want to be doing, and if your Makefile is full of .PHONY targets, you likely need a Justfile instead, or (worse) plain shell scripts.
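
A minimal illustration of that model (the file names and the pandoc command are just placeholders): the target is a real file, and make re-runs the recipe only when the prerequisite is newer than it.

    report.pdf: report.md
            pandoc report.md -o report.pdf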


I tell people not to use make if there's no need for a dependency graph.

Just don't.

And do just :-)


Yeah, use make -j8 instead.


I also don't like Make for running commands.

I was using Make for running commands, but it interprets all command line parameters as build targets, which was annoying. And Bash scripts in Makefiles aren't type safe, and they became messy.

Nowadays I use Make only for building. And Deno + Typescript instead, for running commands & scripts.

(Hadn't heard about Just — the scripts aren't type safe though?)


Depends on how you define type-safe. Even in TypeScript you are still executing strings when running commands. You can turn those strings into variables and share them as a form of type safety: the type check will fail if you mis-type the variable name. Just supports that: you will get an error if a variable that is being inserted doesn't actually exist. The error will occur at startup, before usage, or can be checked ahead of time with just --check --fmt. So for quick usage there's not much benefit to TS, but over time in TS you could build up type-safe interfaces to commands, and obviously if you want to run TS scripts instead of shell scripts that's going to have more opportunities for strong typing than invoking TS from just.


> Hadn't heard about Just — the scripts aren't type safe though?

Just runs code written in other languages. It's `sh` by default, but you can define anything (e.g. `python3 -c` to run Python scripts).
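
For instance, a recipe in a justfile can carry its own shebang line, so its body runs under that interpreter (a small sketch, not from the thread):

    build:
        #!/usr/bin/env python3
        print("building...")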


> Make's core purpose is to execute commands, isn't it? How was it not designed to execute commands?

Some people on HN believe that if you use make primarily as a task runner that you're doing it wrong and should use something else. I don't agree.

Most users are using multiple aspects of make: as a task runner and as something that handles the dependency graph when building something digital.

As a web developer, I used npm scripts, grunt, gulp, etc. to manage web builds but now I only use make and have gotten off the merry-go-round of web build tools.


What’s the point of using another tool to manage your shell scripts though? Why not just use shell scripts?


Makefiles coordinate your shell scripts, using a dependency tree you'd have to reimplement if you did it purely with shell scripts.

For example:

A) You have a recipe that installs dependencies if they're not up-to-date (such as "pip install").

B) You have a recipe that compiles stuff if it's not up-to-date; it's made to depend on A.

C) "make" can be an interface recipe that depends on B and does nothing else.

D) "make test" can be an interface recipe that depends on B, then runs tests.

E) "make run" can be an interface recipe that depends on B, then actually runs the code / a webserver to interact with.

Nowadays whenever I put one of these together there's 2 versions of A (one for python, one for node), 1+ versions of B (webpack build without watch, maybe others depending on the project), 3 versions of D ("test-js", "test-python", and "test" that does both, and each of them only requires the relevant parts of A and B).

Makes it trivial to ensure you're up-to-date after a "git pull" without having to waste time waiting on things that don't need updating.
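
A rough sketch of that shape (every file name and command here is hypothetical); the stamp file gives make a timestamp to compare for the dependency-install step:

    .PHONY: test run

    deps.stamp: requirements.txt package.json
            pip install -r requirements.txt
            npm install
            touch $@

    bundle.js: deps.stamp $(wildcard src/*.js)
            npx webpack

    test: bundle.js
            pytest

    run: bundle.js
            python -m app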


> What’s the point of using another tool to manage your shell scripts though? Why not just use shell scripts?

When you create a Makefile, you're describing a dependency graph of what depends on what and the order tasks need to take to produce a particular set of digital artifacts--you're not writing shell scripts.

Turns out there are all kinds of edge cases and foot guns that make handles for you that you'd otherwise have to deal with if you wrote a bespoke shell script.

Plus make is battle tested and has been around since the dawn of Unix--there's literally no build scenario it can't handle.

I hadn't used make until a few years ago but it's been one of the best investments I've made in my workflow.


> there's literally no build scenario it can't handle.

Well except this little case where some files or folders in your paths have spaces in them...


Out of the box with no configuration, Fish shell provides target completions for a Makefile in the current directory [1].

[1]: https://fishshell.com/docs/current/interactive.html


A similar tool is `task` https://taskfile.dev/ . It is quite capable and also a single executable. I've grown to quite like it.


I prefer task over just. While I am not a huge fan of YAML, we now use it everywhere, so it makes sense not to learn yet another DSL for just and to use YAML instead.


How do task and just compare to SCons?

Although SCons is Python (which is a pro or con depending upon your perspective), it has strong dependency management. Or is the argument that dependency management is part of build, not general project maintenance?


Starting out from the blog post, it talks essentially about Make as it was intended: as a build system to compile programs with. Make maps this to the task of producing a file from input files, which are written down in the form of rules in a Makefile. A key ingredient of make is that it checks the timestamps of the source files on disk and updates targets only if the source files have been modified after the targets were built.

If you go a bit further down this route, you end up with build tools that generate the compilation rules for you in some form: these are Automake/CMake/Meson and SCons. I did use SCons years ago and it was nice, but it's definitely lost its market share. IIRC SCons does this without generating Makefiles.

Task and Just follow a different route. They address the problem people have solved with a "hack" in Makefiles (.PHONY targets), so that you can easily run "sub-commands" in Make (make install_deps, etc.). It would never occur to me to use SCons in that space.

Btw, a third option is to use a shell script like the following (POSIX-shell compatible, actually).

  #!/bin/sh
  # dispatches sub-commands: "./script foo args..." runs the function sub_foo
  prog_name=$(basename "$0")

  sub_help() {
   echo "Usage: ${prog_name} <subcommand> [args...]" >&2
  }

  sub_install_deps() {
   set -e -x
   # ...
  }

  sc=$1
  case $sc in
  "" | "-h" | "--help")
   sub_help
   ;;
  *)
   shift
   "sub_${sc}" "$@"
   if [ $? = 127 ]; then
    echo "Error: '${sc}' is not a known subcommand." >&2
    echo "       Run '${prog_name} --help' for a list of known subcommands." >&2
    exit 1
   fi
   ;;
  esac


just is great, I add it everywhere, just test, just run, just fix, just shell. just works ;-)


Honestly Make is a pretty decent tool for executing shell snippets plus some lacklustre dependency resolution stuff on top.


This looks really nice. But then should `j` alias to `just` or to `jobs`? Non-starter. /s


I really like GNU Make because it has a hidden superpower: the "-j" parameter enables instant, easy parallelization. I have a project with many subprojects and I use "-j16" to invoke the same command in all of the subprojects, 16 at a time. It saves a lot of time and it works for all commands that don't touch other subprojects.

Like you, I use Make as a front end to lower level build tools. It seems to fit that role well.
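
Roughly like this sketch (the subproject names are made up): each subdirectory becomes a phony target, so "make -j16" runs up to 16 sub-makes at once.

    SUBDIRS := libfoo libbar app

    .PHONY: all $(SUBDIRS)
    all: $(SUBDIRS)

    $(SUBDIRS):
            $(MAKE) -C $@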


I think this is a great example of tools being stuck in the 90's, actually. Make should be parallel by default.

I get it, I do. I know there's all sorts of legacy reasons, but my 4k resolution screen with 12GB vram and 32 cores still runs make build serially, outputs to an 80x24 window and is bottlenecked by stdout flushing, all because we can't change defaults.


And annoyingly, I don’t think there’s a way to tell `make` that a particular `Makefile` should be built with multiple threads by default. So you have to specify `-j` every time.


Yep. If it was opt-in in a per-makefile way, it would be ok (or preferably opt-out).

I think cmake got this absolutely right, fwiw. You define the version of cmake you're writing against, and they go to great pains to ensure that they preserve behaviour even in newer versions of cmake. As an example, we could have a special variable:

    make_ver = 4.5
And even running with a future version of make, the behaviour would be that of 4.5. If it's not present, it defaults to the latest version before versions were introduced.

This would let make do things like accept spaces instead of tabs, be parallel by default, etc in future versions while presenting old behaviour.


‘-j -l $MAXLOAD’ is even more polite on shared systems. Avoids most runaways, but not all.


Make is declarative - that's roughly the point of it. You give it rules and dependencies and variables and it's supposed to work out what to do.

If your build is extremely simple and fast then it cannot add value.


Nice to see some love for make.

Sure, make syntax sucks. It's arcane and hard to google much of what is going on. (Shower thought... a tool that lets you view a Makefile, giving explanatory tooltips for syntax elements would be great).

But make is widespread, ancient and eternal. It isn't going anywhere, despite decades of new tools trying to take its place. It does many jobs relatively well.

I'd go so far as to say that the platonic ideal of development is:

make # Build the project

make install # Install the project

make clean # Clean the project build dir(s)

and whatever new tool you use to actually build things should live behind make. You should have a really good reason to deviate from this. It's approachable and familiar to most devs (well, ok, older C/C++ devs). Sure, write your own custom tool that builds everything in a chroot jail or docker container, has sub tools for installing, running tests, etc... but please, put it behind a make frontend.


> But make is widespread, ancient and eternal.

Yeah. GNU Make is literally everywhere; it's too useful and has too many good features. GNU Make is worth learning and using just because of this. Having to deal with non-make systems is seriously annoying.

> It does many jobs relatively well.

I wonder if anyone else other than me was insane enough to try and use GNU Make to manage dotfiles? I even blogged about it.

https://www.matheusmoreira.com/articles/managing-dotfiles-wi...

Gotta be really careful with stuff like that. GNU Make is simultaneously a rather lisplike metaprogrammable language and a turing tarpit. I screwed up a personal project once because I got sidetracked essentially reinventing a fraction of autoconf in pure GNU Make code.

Now I try to keep things as simple as possible. I still understand the makefile so I suppose it's simple enough.


> I wonder if anyone else other than me was insane enough to try and use GNU Make to manage dotfiles?

Not a bad idea IMHO… thanks for the blog post. I'm tempted to try something similar.


The real makefile that I use and wrote about has some features that I didn't get around to describing in the blog post. Also has a ton of comments.

https://github.com/matheusmoreira/.files/blob/master/GNUmake...

The metaprogramming template I described is used to implement XDG Base Directories. The links take the XDG variables into account while the real files live in their default locations inside the repository.


> the platonic ideal of development is:

> make # Build the project

> make install # Install the project

> make clean # Clean the project build dir(s)

IMO, that last step of cleaning is very not platonic ideal. However... maybe a reality of incremental development?

The first step for me in a lot of cases is just trying a project to see if it's worth my time, typically that would just have been:

    cd /tmp
    git clone http://foo/some/project.git
    cd project
    # build steps, perhaps in a good case your make example above, a dockerfile, or nix flake

These days though I'm ecstatic when I see a Nix flake because I can reduce that to:

    nix run github:some/project
Then for development I could even avoid cloning things myself (and sometimes do) with:

    nix develop github:some/project
    cd /tmp
    unpackPhase
    
Make some change:

    genericBuild
I guess all this is to say my platonic ideal includes getting more busy work out of the way and, at least in the case of building, not having to worry about cache affecting reproducibility.

I guess all of the above is to try and express my different idea of platonic ideal here and how Nix gives it to me (and could perhaps to you, for a cost).


If nothing else, make clean is useful to recover disk space not needed anymore, which can be substantial. It’s like deleting a cache or a tmp directory. It can also help to fix the build process by resetting it if something got stuck in the wrong state, but that’s not necessarily the primary purpose.


For make there is much better option than googling: https://www.gnu.org/software/make/manual/make.html and hit Ctrl+F. You can even easily search for things like $<. Nothing else is really needed. And for BSD make the man page is all you need.


> But make is widespread, ancient and eternal. It isn't going anywhere, despite decades of new tools trying to take its place. It does many jobs relatively well.

Agreed--make is eternal. One thing that'll probably be true: in a post-apocalyptic future, make will still be around and still be useful.


gradle is just a makefile with bells on and you’ll never convince me otherwise


The article reads

  I think an important thing to know about make(1) is that you don't need to write a Makefile to use it.  There are default rules for C, C++ and probably Fortran.
And then it goes against its own spirit... You don't need this:

  OBJS = foo.o bar.o baz.o

  foo: $(OBJS)
      $(CC) $(LDFLAGS) $(OBJS) $(LDLIBS) -o $@
Just do:

  foo: foo.o bar.o baz.o
It respects LDFLAGS and LDLIBS.
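
For reference, the built-in link rule is roughly the following (paraphrased from the GNU make manual's catalogue of implicit rules), which is why LDFLAGS and LDLIBS get picked up without writing a recipe:

    %: %.o
            $(CC) $(LDFLAGS) $^ $(LOADLIBES) $(LDLIBS) -o $@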


But the OBJS allows for an easy `clean` target, defined further down the article.

Personally, I'd have gone with

    foo: $(OBJS)
            $(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)
to put all flag options first, then the output option, then using `$^` for the compiler inputs, and additional linker inputs last.

But that is somewhat idiosyncratic, and based on some half-remembered ideas of second-hand stories of how compilers used to process their command parameters back in the '90s (and before). Those ideas probably aren't relevant now (and may not have been entirely correct originally), so if the recipe in the original article works for the author, I'm not going to say they're "wrong" for doing it the way they did.


You do not need to write any project-specific definition for OBJS.

You can include in the project Makefile a generic Makefile that may contain a definition like:

  OBJS := $(CPP_FILES:.cpp=.o) $(CXX_FILES:.cxx=.o) $(CC_FILES:.cc=.o) \
   $(C_FILES:.c=.o) $(SS_FILES:.S=.o) $(S_FILES:.s=.o) $(F_FILES:.f=.o) \
   $(L_FILES:.l=.o) $(Y_FILES:.y=.o) $(RC_FILES:.rc=.o) $(O_FILES)
which builds an OBJS list from all the lists of source files that "make" has previously gathered from all the directories with source files that belong to the project.

The automatically defined OBJS can be used for linking, and a similar list without $(O_FILES), i.e. without .o files that are source files, not intermediate files, can be used for cleaning.


This is simple enough if you're okay with intermediary files littering the entire codebase. Targets and their intermediary files will be written right next to their sources and the whole tree will be a huge mess.

The makefile quickly gets complicated if one wants an organized source tree such as:

  source/module/file1.c
  source/module/file2.c
  source/main.c
  source/tools/tool.c
To be transformed into an architecture specific build tree whose organization automatically matches that of the source tree:

  build/$arch/prerequisites/module/file1.d
  build/$arch/prerequisites/module/file2.d
  build/$arch/prerequisites/main.d
  build/$arch/prerequisites/tools/tool.d
  build/$arch/objects/module/file1.o
  build/$arch/objects/module/file2.o
  build/$arch/objects/main.o
  build/$arch/objects/tools/tool.o
  build/$arch/executables/main
  build/$arch/executables/tool
The general solution to that is the explicit path method:

https://make.mad-scientist.net/papers/multi-architecture-bui...
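
One way to get that layout (a sketch only, not necessarily the exact technique from the linked paper; $(ARCH) and the directory names just follow the example above) is a pattern rule that maps source paths onto build paths:

    OBJDIR := build/$(ARCH)/objects

    $(OBJDIR)/%.o: source/%.c
            @mkdir -p $(dir $@)
            $(CC) $(CFLAGS) -c $< -o $@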


Lists of files are usually used more than once for more than one task.

This would only make sense if the only rule that needed the list was the one that builds, with no install/uninstall/package/modify/audit/hash/clean/etc., which is never the case. You also don't want to use globbing in place of an explicit list either.


Sure, I'm just pointing out you don't need to type out the C compiler invocation again.

  FOOOBJS = foo.o bar.o baz.o
  foo: $(FOOOBJS)


So:

> The example Makefile in its entirety:

           CFLAGS += -Wall -Wextra
           LDLIBS = -lcurses
           OBJS = foo.o bar.o baz.o

           foo: $(OBJS)
                   $(CC) $(LDFLAGS) $(OBJS) $(LDLIBS) -o $@

           foo.o bar.o: foo.h

           clean:
                   rm -f $(OBJS) foo
Become:

           CFLAGS += -Wall -Wextra
           LDLIBS = -lcurses
           OBJS = foo.o bar.o baz.o
?

           foo: $(OBJS)

           foo.o bar.o: foo.h

           clean:
                   rm -f $(OBJS) foo


Ah got it. Agreed.


With gnu make you don't even need OBJs for a simple directory in which all the .c files are compiled:

    .SECONDEXPANSION:

    foo: $$(patsubst %.c,%.o,$$(wildcard *.c))
         $(CC) $(LDFLAGS) $^ $(LDLIBS) -o $@
Admittedly you can end up with short, very general Makefiles that look like they were a Prolog program written in TECO. But the advantage is that they are general so don't need to be fiddled with as your project grows.

I make small Makefiles like this that ensure my local build environment is correct for the package, then call cmake.


Using these and other similar features of GNU make it is possible to write a generic Makefile that works for any software project.

It appears that almost nobody reads the manual of GNU make, despite the fact that it is extremely instructive.

I have read the GNU make manual once, about 25 years ago. Then I have written a set of small Makefiles that I have used in all my software projects forever, until now, with only extremely small changes during the years, e.g. when new compiler options have appeared, or when I have added compilers for additional CPU targets or operating system targets or additional programming languages.

It is possible to make such generic Makefiles that will work in any software project.

For example, in the simplest case, when the source files are in a single directory and you use that directory also for building the project, copying a small template Makefile that includes the appropriate generic Makefile for the target CPU and operating system is enough so that "make" will identify all the kinds of source files that exist in the directory, generate the necessary dependence rules, invoke the appropriate compilers and build the executable.

For any more complex project, only a minimum of information needs to be added to the template Makefile. For instance, when the build is done in another directory than the one with sources, or when there are multiple directories with source files, a list of directories with source files must be added into the template Makefile. Lists of non-standard libraries, non-standard library directories and non-standard include directories may be added. Additional options can be specified, e.g. building a shared library, not an executable file.

Besides these, nothing specific to the project needs to be written. Adding or deleting source files or renaming source files do not need any changes in the Makefile, but after such changes a "make clean" is recommended.
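
As a purely hypothetical sketch (the variable names and the included file are made up, not a real system), such a template Makefile might contain little more than:

    SRC_DIRS     := source source/module source/tools
    LIBS         := curses m
    INCLUDE_DIRS := /usr/local/include

    include ../../generic/rules.mk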

Instead of this simple and obvious approach, more than 99% of the software projects that I have seen use absolutely horrible huge and unmaintainable Makefiles, or they use "make" replacements like "cmake", which are much worse than the traditional "make", so I fail to understand why anyone would want to use them.


I think you’re overstating the ease of use of make.

How does one discover dependencies in a cross platform way with Make without writing the logic themselves to stay up to date with platform changes? Or picking up configuration options from dependencies.

How does one make use of Ninja with make? Or discover changes to sdk paths for new platforms?

You end up having to duplicate that logic across every repo that needs it. Along the way you accrue lots of variance.

In the end you get something that is difficult to debug and scale. That’s why so many projects moved to cmake. It scales better. In much the same way that you can technically do anything you need with C, but other languages add ergonomics and scalability that people value more.


Funny to mention cmake when GNU Autotools handles all of this dependency resolution with grace using only a few lines of autoconf.

CMake script syntax is not great, and debugging is no better than sprinkling printf's throughout... So, why not use Autotools? /bin/sh has been the norm for most build processes, to the point where the later-designed YAML syntax is _effectively_ the SAME as a Makefile with /bin/sh statements running the pipeline procedures.

CMake came along and tried to fix complexity in Autotools while adding kitchen-sink baggage along the way.


I’m not specifically advocating for CMake , though it is my preference for many reasons like faster uptake of multiple platform and compiler features.

I was merely pushing back on the person puzzling over why people don't just use Make.


Why cmake and not meson?


Covered in my statement above but to expand a bit:

meson lags behind for supporting new features like different languages (Swift, ObjC etc), Xcode/msvc updates, IDE integration (CLion is amazing), build outputs like frameworks.

Basically, Meson's better ergonomics lose out to practicality for my use cases. To get it to do what I need, I'd be making my own meta system around it.


I recommend taking a look at Autotools, if anything for historical context. It quickly becomes apparent that many features in CMake (and likely meson, though I have no experience there) are derived from some functionality in Autotools. It also helps one appreciate the package maintainer's role in the software development lifecycle, as many of the GNU coding standards have influenced software packaging/distribution, be it RPM, Deb, Pacman, etc.

While there is certainly value in using CMake (as so many projects have chosen it as their build system), it also becomes a sort of cautionary tale when projects try to reinvent the wheel to fix some deficiency, only to build a complex system with its own set of deficiencies. Obviously, part of this is due to the fact that commercial entities (cough Microsoft) deviated from any sort of standards-based approach to building software.

But of all the build-configuration systems, Autotools seems to be the only one which requires merely that a developer machine have POSIX tools installed. Nowadays, that covers all the big players, considering that MSYS2 on Windows is better than ever. And it supports many languages to boot: C/C++, Objective C/C++, Go, Fortran, Erlang.


Discovering the dependencies depends on the compilers, not on the platform.

I stopped using MSVC many years ago, so I do not know how it handles dependencies, but with gcc or clang that works regardless of platform, with compiler-specific options.
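
For what it's worth, the usual gcc/clang way to do that looks roughly like this (variable names are assumptions, and SRCS is assumed to hold the list of .c files):

    # -MMD writes a .d file of header dependencies next to each object;
    # -MP adds phony targets so deleted headers don't break the build
    CFLAGS += -MMD -MP
    OBJS   := $(SRCS:.c=.o)
    # pull the generated dependency files in when they exist
    -include $(OBJS:.o=.d)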

Moreover, software for one platform can be built on another platform. What matters is only the platform used for building, where the compilers are hosted, not the target platform for the built software.

In general, I extract all platform-specific definitions, like the names of the compilers and their command-line options, into platform-specific Makefiles, which include a generic Makefile with platform-independent definitions, rules and make targets. Any platform-specific details, like SDK paths, are encapsulated in the definitions of certain "make" variables.

While I have not needed to update the platform-independent generic Makefile for decades, the platform-dependent sections need updates from time to time, but that is not done for every software project, but perhaps once per year or more seldom, when significant new compiler versions, standard libraries or other new software tools become available.

I have not yet seen any reason for using Ninja. Perhaps it may be faster when building something like chrome, but because chrome is likely to be the slowest-compiling software project known to mankind, it is hard to tell how much is gained by Ninja. For software projects of more typical sizes I have not seen noticeable advantages of Ninja. Traditional make can keep all the cores of a CPU 100% busy, so there is no way any other building system can be appreciably faster. There are some very bad Makefiles that are intrinsically slow, but that is a problem of those Makefiles, not of "make" in general.

Using a minimal project-specific Makefile section can only simplify debugging, not make it difficult.

I have never seen any evidence for the claim that cmake scales better. On the contrary, at least in all the open-source software projects that I have ever seen cmake is a major source of bugs in the building process that nobody knows how to fix. (i.e. after some software updates cmake fails to find files that exist, presumably due to some errors in the cmakelists that appear to be arcane enough that they are hard to find) I have never seen in projects that use traditional make such errors that are so frequent in all the projects that use cmake.

I have a lot of experience in building software projects, because for many decades, with the exception of a few professional programs without alternatives, I have been using only programs that I compile from sources. Most projects have very bad Makefiles, which are very hard to maintain, i.e. to change when files are added, deleted, moved or renamed, but at least they build the software projects reliably. Whenever cmake is used, building a new version is always an adventure with unpredictable outcome.


I should rephrase: by dependencies I mean third-party dependencies. That is on the build system, not the compiler.

And make tends to fall significantly behind Ninja in my experience for cold builds. Here’s a post from someone else that echoes my experience https://david.rothlis.net/ninja-benchmark/ but for a sufficiently complex project even a well tuned make build is about 20% slower on CI, which is about 10-20m per build.

It also seems like you’re giving make the benefit of the doubt when it comes to “most being badly written but my own is great”, while simultaneously deriding cmake without giving it the same benefit.

But to your larger point, and going back to my C vs other languages comparison (let’s pick C++ for arguments sake). Yes Make can do it all, but even by your own statements it requires doing everything correctly and there’s significant room for error. CMake makes the trade off of abstraction for a more consistent experience with minimal work.

This is similar to C and C++, where yes C can (1) do much better in the very very specific right hands, but (2) it can also become an unwieldy mess. C++ with RAII may seem perplexing from the lens of only accepting the former and not the latter. It lets people spend their time elsewhere on the system.


Ninja excludes a lot of features of make which are expensive to implement and cause it to be slower to parse, so naturally in certain situations Ninja runs faster than make. If your build time is actually affected by how long it takes to read the makefiles then this is important (e.g. building Android).


When the makefiles are well organized, all of them have just a few lines, with a few definitions, perhaps at most ten lines, but usually less, and they include a single bigger common Makefile, with most of the definitions, all the rules and all the make targets.

So reading all the makefiles should take a negligible time, even in a big project.

I have seen too many projects with huge makefiles and with many more makefiles than necessary, but all of those are examples of misuse of make, which would be better solved by efficient use of make than by replacing make with another tool.

For instance, there are many projects with a Makefile in each directory with source files, which is a very bad choice that multiplies the number of makefiles. Makefiles should exist only in a dedicated build directory, not in source directories, except for very small projects with a handful of source files.

The best way, IMO, is to have a single very small Makefile for each final file that must be built, e.g. executable file or library, in a dedicated subdirectory of the build directory. This allows for a maximum simplification of the Makefiles, and their number and size are independent of the number of source directories and of the number of source files.


Make does a lot more than just reading makefiles - sometimes it re-reads them as a result of something that happened in the makefile itself.

It's also doing a lot of variable expansion and if the makefile uses macros like define....endef to generate rules it has to expand those.

Make is also often trying all sorts of pattern rules to see if they will allow it to fulfill dependencies. This takes time.

If makefiles are small and you run make on each one separately then you miss dependencies across modules and you get the syndrome where program A doesn't get rebuilt when library B changes.

So there's a price to pay for everything. Ninja is a lot faster because it just doesn't have some of these features and you can make a big makefile where dependencies work properly with it and not pay the price of slow parsing.

I worked on builds that took 12 hours on a large cluster of build machines where parsing was 45 minutes - so it matters. The more general solution would be to use a packaging system like the Linux distributions but even that doesn't save you from every problem as I also found out when working with OBS on a linux phone distribution.


The link provided by you shows correctly that Ninja is indeed faster in the cases when almost nothing is recompiled, so the command execution time is dominated by the dependency evaluation time, which is faster in Ninja.

However, the same link shows that even for relatively large projects the time difference is less than a second, which matches my experience.

Ninja requires much more effort to use, as it is not a complete build system but just the execution component, and except for projects of chrome's size that effort is not worthwhile for a subsecond gain in certain scenarios.

I cannot imagine how a "tuned" make build can be 20% slower, except when the project is not really tuned but is badly organized. A "tuned" Makefile means that the execution time for "make" is negligible in comparison with the execution times of the compilers and linkers. The time spent in "make" should be measured in seconds at most, never in minutes.

Only a very slow file system can cause "make" to be much slower than Ninja, because the dependencies for "make" are stored in many ".d" files, equal in number with the source files. On modern SSDs or RAM disks, that is never a problem.

Perhaps you are right about CMake. I cannot be certain whether CMake is good or bad, because I have never created a CMake project myself; I have only built CMake projects created by others. Nevertheless, there certainly is a problem with the CMake users, in that I have not yet seen one who can explain what the advantages of CMake are. All the tutorials that I have ever seen about CMake were showing how to do in a complex way things that can be done in a simple way with GNU make, so I never had any reason to investigate any further.

What I know for sure is that much fewer understand how to use CMake than how to use make, because the frequency of bugs in CMake projects is much greater.

I would like to see how CMake can provide minimal work. With GNU make, creating a new very simple project needs only copying a template Makefile in the source directory. For a more typical software project, the template Makefile must be edited to add a list of directories, those that must be searched for source files (I actually write the list as a prefix directory plus a list of subdirectories for it, as that is how I normally structure the projects).

For external dependencies, up to 3 lists may be added, of libraries, include directories and perhaps of library directories, which should suffice.

That is all. Nothing more. I normally build each target file of a project, e.g. executable or library, in a separate subdirectory of the build directory, so by default it is not necessary to write it in the Makefile, because the name of the subdirectory will be used for the target. The root of the build directory contains a Makefile that I never change and which descends into each subdirectory to build its target.

How can CMake be simpler to use than this? All CMake projects that I have ever seen were much more complicated and very difficult to modify, with a very large number of project-specific details that have no place in the project configuration files, because the building system should be able to deduce them.

All the examples of using CMake that I have ever seen have demonstrated a tool with a much lower level of abstraction than GNU make and which is tedious to use.

GNU make does not need any such thing like the CMakeLists that are ubiquitous with CMake.


> GNU make does not need any such thing like the CMakeLists that are ubiquitous with CMake.

Then what is a Makefile?

And if your dependencies only amount to looking for source files, then that’s fine. Many are more complex than that and include configuration options themselves.

Again, I think you’re over-indexing on just your own setup and ignoring anything outside of that, while simultaneously hand-waving away the number of bad makefiles out there and any shortcomings in order to drive home that Make is simplest.

Meanwhile by your posts, you’ve had to create an entire scaffolding of support around your makefiles to make it appear that simple. How is that any different than CMake or any other build system/generator? In the end you’ve just orchestrated your own version of such a thing and are decrying any other system as overly complex or just needing adjustments.

I believe we’re just talking past each other, but I’d really encourage you to consider that your confusion about why people don’t just use Make may well be rooted in the number of things you have outright dismissed or don’t care about for your own projects. I’m not even specifically advocating for CMake, but there’s no great mystery why so many projects use it or other build systems; it mostly lets them shift the responsibility of configuration to the system and focus on their task at hand. Again, exactly the same as C vs C++/Rust/etc…


In this kind of situation the templates tend to become complicated stores of knowledge and choices about how to build something for a particular platform and there are usually some complications about maintaining them.

It all depends on what you want out of the system - the ability to build against multiple versions of a specific linux distribution or compatibility across unix or even to non-posix operating systems. So the complexity exits from the generic parts where you say:

  PROGRAM:= bob
  SOURCE:=fred.c
  LIBRARIES=gtk
  include build_program_template_$(PLATFORM).mk
and enters the templates for building on a specific platform where you have to decide if that's gtk4 or gtk2 and how to tell the program what it can/cannot do on that specific platform.

If you're the one building the code and deciding where to do it that's cool. For people who want to port your code to some other platform, cmake or autoconf will help discover what needs to be done on some specific platform, check that the needed features/libraries are on the platform and construct a build with all the features it can support on that platform turned on and the ones it cannot cope with turned off.

They automatically do part of the job you might be doing manually when maintaining templates. This is useful when you're giving the build to someone who just wants it to work and doesn't know the ins and outs like you do. But of course they are complicated and for all the great functionality they are far more complicated to fix when they don't work :-D


Yeah Make can be used for "any software project" if your idea of a complex project is that it has multiple source directories!


I have a couple of old blog articles that cover the basics of writing a generic makefile for C++ here: https://latedev.wordpress.com/2014/11/08/generic-makefiles-w...


Love this, I aspire to build a similar system myself one day.


> a Prolog program written in TECO

I know we don't do general-purpose "I like this" comments on HN, but, as a Christmas treat to myself, I'm just going to say how happy this little sentence fragment makes me.


Thanks! I aim to please.

As it happens I used to program in both Prolog and TECO, in the same period of time but of course not on the same code bases.


Just do:

    .SECONDEXPANSION:
    foo: $$(patsubst %.c,%.o,$$(wildcard *.c))
No need to tell Make how to invoke a C compiler for the billionth time.


But then you need to name it GNUmakefile when you use incompatible GNU extensions.

BSD makefiles do look better.


Globbing is possible but unwise. You do want a variable with an explicit list of sources, even while trying to make things as automatic and dynamic as possible.


Why would you want that?

In decades of programming in very varied environments I have never compiled any software project other than by using globbing, so I have never needed to waste time writing the name of any source file in a Makefile.

I have never seen any reason to do otherwise.

For instance, using special compilation flags for a certain source file, different from the others, is something that I consider a mistake. Whichever is the goal, it certainly can be achieved by other means (e.g. pragmas or attributes).

Moreover, the dependencies for any file must always be generated automatically; they must never be written explicitly, which removes the main reason why in bad Makefiles the names of the source files are written individually.


Globbing means that the directory has to be scanned on every build

Once you start getting to multiple directories and recursive scans, the build time begins to grow enormously and usually unexpectedly.

Not a problem at all for small or even medium size projects on a fast machine.

But once you start building very large projects with multiple sub-projects, it generally slows the machine to a crawl on every build.


When the project is so big that this is a problem, you can make an additional make target, e.g. "make file_lists" that would use globbing and store the lists into a file that will be used by any other make commands.

The bigger a project is, the more important is to use only globbing and never write explicitly any file name lists.

However, I believe that this, the slowness of globbing for big directories, might be a strictly Windows-specific problem, if it exists. At least on Linux and with XFS (which has B-tree directories) I have never seen globbing to take any noticeable time, even on directories with ten thousand files or more, where it still appears to be instantaneous at the human reaction time scale.

There is no reason to ever do recursive scans. All the directories with source files of a project must be scanned to search for any source files located there, but only once.


I said unwise not inefficient.

Globbing is a security and reliability hole. You don't ever really know what got included or will get included; you just assume you do, and you are right only most of the time, and only by luck, not by actually ensuring it.

Saying things like "only the files I expect should ever be in this dir" is just wrong swiss cheese thinking. (you never said that, but another commenter essentially did)


The problem with globbing is gratuitous matches. If a.cc depends on l.h and m.h, and b.cc only depends on m.h, doing a glob in the dependencies ( $(wildcard…) ) will unnecessarily rebuild b.o if l.h changes.


globs don't even prevent you from special casing one file to have different flags
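
In GNU make that's typically done with a target-specific variable, e.g. (the file name and flag here are hypothetical):

    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)

    # only this one object gets the extra flag
    legacy.o: CFLAGS += -Wno-deprecated-declarations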


True, but this is better avoided, because in such cases reading the source files is not enough to understand what they do.

You must also be aware that they are compiled in a different way, and you must search an often oversized and obfuscated Makefile to discover which compilation rule applies.

Unfortunately I have seen many projects that had used such tricks and it was always painful for maintenance to discover how they were supposed to work.


Globbing is how you avoid pointless busy work. If there's a file in the directory that shouldn't be built, it shouldn't be there.


When you want to exclude a file from compilation without deleting it, it is enough to change its extension, e.g. from ".c" to ".c_".

Alternatively, you could move such a file to a subdirectory "Attic" or the like.

Any of these solutions would ensure that globbing will no longer include that file in the list of source files, without needing to update any project configuration files.


And nothing that shouldn't happen ever does. I guess you also include "." in $PATH, even as root.


I admittedly don't have a ton of experience with make, but what's the point of doing this vs just throwing the compile commands in a shell script?


Imagine you had all your build commands in a shell-script.

Now imagine that you want to run individual steps from time-to-time, instead of running your whole script start to finish. So maybe you write functions that represent each group of commands, and call the function based on an input argument.

Now imagine that some of your build-steps have dependencies on other build-steps, and you want to be able to say "run step X, plus whatever else you have to do first." Or maybe you want to retry a failed build from wherever it left off, instead of having to re-run the entire thing.

Now imagine that you'd like to have some level of isolation between build-steps, so that one step doesn't mess the other one up.

As you add more and more things to your build script, you'll find yourself writing an implementation of Make pretty quickly.

Now, sometimes, your build fails halfway through and you want to resume from wherever it failed.


The biggest practical difference is that make can test to see if a file needs to be rebuilt before proceeding.


Yes, exactly this.

When compiling a project once from a clean state make has no advantage over a shell script that would use a tool like "parallel" to distribute compilation jobs over all available CPU cores.

However writing such a script for parallel compilation is more complex than writing a simple Makefile executed with "make -j N".

As you say, the main reason of existence of "make" is the acceleration of the recompilation of a project after incremental changes in the source files, when only the files that depend on the changes are recompiled, saving time.


Make is declarative. You tell it how to make an A from a B and a B from a C or a D and it then works out that it needs to do "CBA" or "DBA" depending on whether it finds a D or a C in the filesystem.
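
Concretely, that A-from-B-from-C-or-D chain can be just a few pattern rules; make chains them itself depending on which source file it finds (pandoc here is only a stand-in converter, and the file extensions are arbitrary):

    %.html: %.md
            pandoc $< -o $@

    %.md: %.rst
            pandoc $< -o $@

    %.md: %.textile
            pandoc $< -o $@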

The more possible variations you have to cater for the worse build scripts get.

If you try not to redo work that you did 5 minutes ago that's still valid then build scripts become even more complicated whereas the makefile doesn't change at all.

Other tools do declarative even more than make does but as they get easier to understand they tend to lose abilities.


Make will automatically do all the steps, and stop if any one fails. Of course you can do it manually, but it's much more work.


The irony is that the same argument would work against Make in the description further up vs other build systems like CMake etc.

Not to take away from your correct answer of course


That’s because CMake calls make or an equivalent (ninja, xcodebuild…) that actually does all the dependency checking!


Gonna get downvoted to oblivion for saying this but I haven't hated a tool more than make.


Not downvoting you. Just curious to know why? In my experience, as long as the Makefile is less than about 100 lines, it's the most useful workflow system ever. Because it's available almost everywhere. After about 100 lines, yeah, it is not great. And that idiotic tab character. I know the historical reason why tab was used. I wish they had fixed it to accept both tabs and spaces a long time ago.


It's a poor, verbose, arcane solution in every problem space it purports to solve.

It was fine in its era, but that's a long time ago. It is completely unsuitable as a project management tool and a poor task runner.

Also its ubiquity is overhyped. There's no build platform where I can't download the tools I need, that's the entire point of a build platform, and most such platforms come with far more advanced tools pre-installed.


Make has few dependencies so it's a way to build an operating system up from ground 0. Any tool/language that wishes to be generally useful in an OS doesn't want to be built by something that is only available much further up the tree like python.

So it gets used a lot and it's only a language for dependencies and rules. It doesn't do "project management". It's the simplicity of what it's trying to do that saves it from ever becoming irrelevant - because it can be made to fit almost any use case. It's not a special tool for enforcing one structure or building only one language - something that seems highly regressive to me but which is adopted by many languages now.


> Any tool/language that wishes to be generally useful in an OS doesn't want to be built by something that is only available much further up the tree like python.

This is a non-priority, an unreal use case. I can't build GCC without a functioning C++ compiler and a fairly sophisticated OS environment and that's fine. Real-world use cases for bootstrapping a build environment from rubbing two sticks together are so exceptionally rare as to be near fictional.

Even if we allow that such cases exist, they are unicorns, not a thing to design tools around.

> It's not a special tool for enforcing one structure or building only one language - something that seems highly regressive to me but which is adopted by many languages now.

This is what makes it so unsuitable. A tool that can make no assumptions about its application is less and less useful a tool. A bread knife is better at cutting bread than a plain 10" kitchen knife, a boning knife better for deboning, etc.

We live in a world where for every language there are build tools that know far, far more about the needs of that language environment than make, and thus are far better suited.


> We live in a world where for every language there are build tools that know far, far more about the needs of that language environment than make, and thus are far better suited.

Languages don't exist in a vacuum.

A big reason for the resurgence of make by developers is every language/environment comes with something new that tries to recreate make. They all have issues and limitations where make ends up being a better tool.

Also, when you need to deal with multiple languages in a project, these tools tend not to be that helpful.

As a web developer, there have been so many attempts to make tools like gulp [1], grunt [2], npm scripts [3], webpack [4], etc., and a dozen more, to do a part of what make has been doing for decades. And literally every 6–12 months, there's a new, hot tool that everyone gets excited about. It's just reinventing the wheel over and over.

Now that I've settled on make for web projects, it's no longer a concern.

I'd much rather describe my dependencies using make than JavaScript (ugh), which many of these tools use. Some tools fall out of favor; developers stop creating plugins or whatever for them. As a user of the tool, you can find yourself shit out of luck for your particular project or use case.

The beauty of make is it can handle the tasks of building a web site or web app using modern tools that didn't exist when it was created. As someone mentioned further up, make is eternal; it's not going anywhere.

[1]: https://gulpjs.com

[2]: https://gruntjs.com/

[3]: https://docs.npmjs.com/cli/v9/using-npm/scripts

[4]: https://webpack.js.org


Make exists and is used widely and not because everyone's too inept to understand your points.

The "assumption-making" build tools are the unicorns that inevitably cannot dominate because they're not generally applicable. The more they assume the more niche they become.


> The "assumption-making" build tools are the unicorns that inevitably cannot dominate because they're not generally applicable.

They already do dominate. In the last bastion where make can be said to be popular (C), it has already lost to CMake, and its usage shrinks every year [1]

The situation is moving even faster for C++ [2]

And for literally every other language in the systems programming space (C#, Swift, Rust, D, Go, Zig), make was never a player to begin with.

[1]: https://www.jetbrains.com/lp/devecosystem-2022/c/#which-proj...

[2]: https://www.jetbrains.com/lp/devecosystem-2022/cpp/#Which-pr...


cmake generates makefiles (or ninja) so it doesn't replace them. It replaces autotools - it works out what features a system has and what build flags to enable and from a generic description of your program it generates makefiles or ninja.build.

There's nothing fun whatsoever about fixing build problems with cmake - because you have to understand make/ninja AND cmake. It's slightly easier to understand than autotools, though, where one has the same problem.


> cmake generates makefiles (or ninja) so it doesn't replace them

That make happens to be a possible target of CMake is irrelevant to this discussion, it replaces the usage of make. make becomes an implementation detail.

No one should be writing Makefiles. You also shouldn't be using make as the underlying task runner but for a different set of reasons.

> There's nothing fun whatsoever about fixing build problems with cmake - because you have to understand make/ninja AND cmake

You need exactly zero understanding of make or ninja to use CMake. Ninja explicitly is not meant to be understood by end users, and only as a target for generators like CMake:

>Where other build systems are high-level languages Ninja aims to be an assembler ... it is designed to have its input files generated by a higher-level build system [1]

You cannot name a single use case where it is beneficial to know the mechanics of make when using CMake, much less the mechanics of Ninja.

[1] https://ninja-build.org/


When something doesn't build properly, you start needing to know the details. And your logic about make being an implementation detail is true of everything in the end: nobody cares how builds are done compared to the thing they build, and often they don't care about that thing either, except that it is needed by something they do need.


> When something doesn't build properly then you start needing to know the details and your logic

You would use compile_commands.json for this, which would show you the literal commands being invoked. What program is doing the invoking is 100% irrelevant. If the invoking program is what is breaking your build, your build is screwed beyond comprehension.
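For example, a rough sketch of getting that file out of CMake (the build directory name is arbitrary; jq is only used here for pretty-printing):

    # ask CMake to emit compile_commands.json alongside the build files
    cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
    cmake --build build
    # each entry records the exact compiler invocation for one source file
    jq '.[0]' build/compile_commands.json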

You don't need to know LLVM IR or the GCC intermediate representations, you don't need to know the internals of the ELF format, and you don't need to know make.

If you want to know these things more power to you, but don't insist that everyone should be writing LLVM IR because it's a more general building block than C++ or that anyone should use make for the same reason.


Make is popular because many old projects use it and it comes pre-installed on Linux. It's a cultural thing.


It works on an enormous number of platforms with few dependencies. People have got these projects which build on everything from embedded platforms to mainframes and it has taken effort from many different people to achieve that.

So you might come up with some new thing that "Works For Me" and now all those people who are peacefully using the tool on their whatever platform have to fiddle with it to make it work again - or the person who made it work is no longer around and that platform loses support for the new versions.

When you say "cultural" you make it sound like people do things because they are ignorant and resistant to change but it's worth at least considering that their point of view is quite different.


You see, this is why I personally hate make: automake.

As you say, a small elegant makefile is lovely.

That is not the sort of makefile you get from the autoconf/automake family, and (ha!), good luck if something goes wrong.

The reality is that a small simple makefile is not sufficient to build real software.

The problem is fundamentally that make doesn’t compose well.

Most real build systems such as cmake, scons, cargo, etc. provide primitives for dealing with complex messy situations like “oh today I’m using visual studio not clang” or “does this platform have stdbool.h?” or “is the flag for disabling a warning different on this compiler?”

…and a way of composing those tasks out of smaller ones (eg. add_subdirectory() and find_package()).

Make does not.

Make is a simple tool for simple tasks.

The reason people don’t like it, in my experience, is they have had to work with a make based project that has grown beyond the trivial size and it’s become a nightmare.

It makes easy things easier, and hard things much harder.

I've worked on SPAs where they used make instead of webpack; but it's a stupid solution. It doesn't do live reloading, or any of the other pipeline stuff… but by gosh they tried!

You know what I like about clojure? It’s a simple tool that scales well for both simple and complex tasks.

Make simply doesn't.

…and that is reflected in reality: build tools are increasingly moving away from it and towards others, even for the low-level tasks that people used to "generate makefiles" for, e.g. ninja.

I really am not a cmake fan; but there's absolutely no denying it works very well in many real world situations, where trivial "-lsqlite" flags in a naive makefile don't and can't work.



You can find an exception to any rule, but you'll find, in general, that it's both self-evident and unsurprising that make is on the decline, and being replaced by other tools.

plan9 exists in an enviable position of only needing to build in a few controlled environments.

For example, http://9p.io/sources/plan9/sys/src/libmach/mkfile reads:

    CFLAGS=$CFLAGS -I/sys/src/cmd
Cute. Non-portable. A perfect example of how make is useful in trivial or contrived circumstances only.

/shrug

Try reading the llama.cpp makefile, which is an example of an excellent cross platform makefile that works very well: https://github.com/ggerganov/llama.cpp/blob/master/Makefile

^ this is what many real Makefiles look like, and this one is excellent and well maintained.

Many are worse.

Make is good when you only have easy problems to solve (like, specifically, only having to build one specific constrained environment for one specific compiler).


Woah, what is that @{ construct in that mkfile? I cannot find it in the GNU Make Manual (https://www.gnu.org/software/make/manual/make.html). I use backslash-before-newline for multi-line loops like that.


> I wish they had fixed it to accept both tabs and spaces a long time ago.

I have good news for you: this is already the case! In all modern versions of make you can use semicolons instead of tabs.

You just write

    target : dependencies ; rule
instead of

    target : dependencies
    ^Irule
if, for some reason, you hate the beautiful tab characters.


Ha, what if the 'rule' is 5 lines long?

Many times, a Makefile snippet in an HTML page or a PDF file has had the tab character auto-magically morphed into spaces. So when it is copied, it will be a syntax error in a Makefile. Even copying Makefile snippets from one terminal window to another will eat those tab characters. Fortunately my vim editor is configured to handle most of those edge cases now, but it took me... I don't know... 10-15 years to get that right.

My personal rule is that control characters which are visually indistinguishable from other characters have no place in source code which is meant to be written, read, and copied by human beings. So yeah, that includes the <tab> character. And yes, I do think the Golang people got that wrong, although the 'go fmt' command largely solves that problem. Except when reading Go source code on web sites which don't handle tabs well, and you are nested 6 levels deep, and half the code is unreadable because it's bled off the right side of the page.


> control characters which are visually indistinguishable from other characters

This is a text editor configuration issue, not a character choice issue.


In my editor, tabs are shown with reddish background, so they cannot be mismatched with spaces. Fix your editor.


If you want to write less makefile, you don't need to define foo. `foo` is an implicit target if foo.c is present.

You can use make in this form without a makefile even:

    ~ % echo '#include <stdio.h>\nvoid main() { printf("OHAI\\n"); }' > ohai.c
    ~ % make CFLAGS=-DDEFINE_ME_A_MAKEFILE ohai
    cc -DDEFINE_ME_A_MAKEFILE    ohai.c   -o ohai
    ~ % ./ohai
    OHAI


In that case why call make at all?


CC, CFLAGS and LDFLAGS conventions, and freshness checks (if you do it twice, it won't build again as it performs an mtime check on the source).


Make is fantastic. I’m using it to rebuild and run containers in my development environment, and to build a full repository of RPM packages (that my project needs and which CentOS Stream lacks). I suppose that if javascript tooling was more amenable to process one file at a time, I could do parallel transpiling and bundling of a hefty frontend with make as well.

I also use its metaprogramming and late evaluation capabilities a lot, heavily inspired by the tricks Buildroot is doing.

If you are mindful of its restrictions (it works with “words” separated by spaces, thus paths with spaces in them are a problem unfixable without severely breaking backward compatibility; and your targets need to be expressible as files with meaningful modification timestamps), it’s a very powerful tool. It deserves more appreciation.
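A rough sketch of the container use case under those restrictions, using a stamp file to give the image a meaningful modification timestamp (the image name, Dockerfile, and src/ layout are hypothetical):

    # rebuild the image only when the Dockerfile or anything in src/ changes
    .image.stamp: Dockerfile $(wildcard src/*)
        docker build -t myapp:dev .
        touch $@

    .PHONY: run
    run: .image.stamp
        docker run --rm -it myapp:dev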


I too have found it handy for managing containers, container networks, and images. It's prevented me from using docker-compose as much as I probably should.


This page just makes me miss manpages in general.

Not every manpage was great, but many were, and many could also be formatted, printed out, and placed into a binder for easier reading.

What I enjoyed most about manpages is that people cared to write good documentation for things like bash.

If you have groff installed you can:

    man -Tpdf man >man.pdf
The only problem with groff is that it doesn't support system standard fonts.


Why do you speak in the past tense? Man pages are alive and well, and we use them every day.


On recent vim, :Man crisply delivers within the same editor tab.


The author should mention the automatic variables: https://web.mit.edu/gnu/doc/html/make_10.html#SEC94

Specifically, in one of the examples, to list all dependencies as the input to `cc`, you can write `$^` instead of `$(OBJS)`.


OBJS is a little clearer about what's happening than $^

I'm not a fan of magic symbols. Hard to remember, especially across languages.


Make actually has some of the more sensible shortcuts, which are easy to remember

$@ -> the target (like an email address)

$^ -> the dependencies (look up)

$< -> only the first dependency (the look-up sign, rotated 90° left to point at the first one)
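For example, a link rule and a compile rule written with the automatic variables instead of spelling the names out (the file names are hypothetical):

    prog: main.o util.o
        $(CC) $(LDFLAGS) -o $@ $^   # $@ = prog, $^ = main.o util.o

    main.o: main.c common.h
        $(CC) $(CFLAGS) -c -o $@ $<   # $< = main.c, the first prerequisite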


For larger programs, the compiler can also generate the dependency tree for incremental compilation: https://www.gnu.org/software/make/manual/html_node/Automatic...
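A common sketch of that approach with GCC/Clang and GNU make (file names are hypothetical): -MMD asks the compiler to write a .d dependency fragment next to each object, and -MP adds phony targets so a deleted header doesn't break the build.

    SRCS = main.c util.c
    OBJS = $(SRCS:.c=.o)
    CFLAGS += -MMD -MP

    prog: $(OBJS)
        $(CC) $(LDFLAGS) -o $@ $^

    # pull in the per-object dependency fragments the compiler generated
    -include $(OBJS:.o=.d)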


Outside of simple C projects, a Makefile should be just a “here’s how to hold it” reference for the rest of your build system. Shoehorning in all of the build complexity is going to be overwrought, but a good “make dev; make clean; make build; make release” is a very welcome entry point for new or infrequent contributors
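Something like this, for instance - thin phony targets that just delegate to the real tooling (the npm commands and script path are placeholders):

    .PHONY: dev clean build release

    dev:
        npm run dev

    clean:
        rm -rf dist

    build:
        npm run build

    release: clean build
        ./scripts/publish.sh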


I wish people actually defined the dependencies between .c and .h files and .h and other .h files. When a project doesn't do this, I always end up doing a `make clean && make` after getting stuck on a weird bug, because I just assume it's not rebuilding something it should.

Nowadays I use CMake and am really happy with it.


It's unfortunate that most programmers' only experience with make is automake/autoconf. This is fantastic.


Autoconf, that's the thing that spends time making sure that my software can be built on BeOS and A/UX?


I have a vivid memory of autoconf. I set it up best as I could. Someone ran ./configure. It triggered a new run of autoconf that failed because of mismatching autoconf version. Why it triggered a new run? Because of mtime checks. Why did they say it needed to be rerun? Because git doesn't preserve mtime. I guess it's just assumed that everyone distributes their sources as a tarball. I spent many hours trying to figure out how to bypass those issues, and I can't remember if I ever succeeded.

These days I just use Rust instead.


> I spent many hours trying to figure out how to bypass those issues, and I can't remember if I ever succeeded.

Might I suggest “got-restore-mtime” for this, upstream in Debian and Ubuntu already.

Did you really spend hours on this?


Well I wanted to create an easy and familiar process for people installing from source, so I wanted it to be roughly git pull && ./configure && make && sudo make install. So I didn't consider restoring mtimes into that flow, just looked for options for autoconf to not regen.

Also I think it took quite a while to even realize the mtimes were the issue from the error messages, as I was not very knowledgeable about autoconf / automake.



Sure. Are you trying to imply that the existence of a manpage means something is ubiquitous?


I was wondering if “got-restore-mtime" was a typo or a cleverly named similar project to “git-restore-mtime".


More like some obscure pre-c89 stuff. But it takes actual work and testing to get that working, so most people have a configure script that checks for all that stuff but when it's time to compile it only actually works on their specific Linux distro.


autoconf's principal purpose is to waste time.


Make's killer feature was conditionally rebuilding based on changed dependencies. Back in the day, it was easy for a medium to largish software project to take many hours, or even days to fully rebuild. C and C++ were especially bad, especially "every file transitively includes every header" type projects. Make saved a lot of that pain, even if linking still sucked.

I love make and still spin up a minimalist Makefile like in the article regularly for tiny projects, but I'd hate to have to actually maintain one for a real world project.


You could do something like this to get dependencies almost for free:

    .depends: $(SRCS)
        $(CC) $(CFLAGS) -MM $(SRCS) -o .depends

    -include .depends


I see people now with long javascript builds that waste time doing many things repeatedly - it's quite ironic. Eventually people find crap to fill up the performance that's available.

The problem is that now the tools aren't designed as composable bits that you can really parallelise with a makefile. C/C++ are designed that way and if they didn't use header files I'd consider them perfect from the build system's point of view :-). Golang and java for example are a nuisance.


A lot can be said about js tooling, but honestly it works well once it is set up. Most modern js tooling provides --watch flags and hot module reloading. Vite is one popular option. Just save one file and the change instantly takes effect in the browser, even on a large project. Never seen any makefile-based C++ project come close to that experience.


I'm glad you had success. It wouldn't be difficult to do that with inotify to run make when a file changed. OTOH with webpack and transpiling and various other things I have watched people in my company take quite a long time to see a single change. I think they haven't sorted out the build right and I'm not enough of a javascript expert to do it for them.

They prioritise various features over productivity and I think that's a mistake.
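For what it's worth, the inotify idea is roughly this (assumes inotify-tools is installed and that sources live under src/ and include/):

    # re-run make every time a watched file is written
    while inotifywait -qq -e close_write -r src include; do
        make
    done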


30 year veteran here; funny enough, I barely know anything about make. At my first job we were building our own build utility because we had to build on Windows, NetWare, DOS, Linux… soon after that it was all cmake, and I never had to use a makefile for much of anything.

Only recently have I come to be kinda fond of its insanely simple syntax, but really make hasn't been a requirement for native development for as long as I've been around getting paid.


I say this as someone with 20+ years of experience in C and C++: Using make is a huge waste of time in 2023 (and forward). Learning the intricacies of make actively blocks you from Getting Things Done.

Yeah, make is kind of neat because it has this functional what_I_want : how_to_get_it syntax, but it doesn't actually work that way. It's a huge waste of time to try to get it working so unless you meet the following caveats, make is not for you:

- you love trivia like "who was the 19th president of the US?"

- on a road trip, you stop to look at historical markers

- you love steampunk/tube amplifiers/making pasta from scratch/listening on vinyl/home brewing beer

If those sound like you, you're the type who won't regret wasting a couple days getting make running and getting cut on the sharp edges. Most of us who just want to get there will be better off with cmake, meson, or hand-rolling a script using a modern language (inb4 "make does more than compile code" - yeah, so does ruby or python or any other script, without making false promises or wasting your time)


3 paragraphs of rambling for "make is a waste of time because of sharp edges"

Can you actually articulate what those sharp edges are?


yes, I can


If meson fits your project it's worth using it - it could turn into a straightjacket later but changing to something else isn't the end of the world if it cost little effort to implement it in the first place.

As for trying to use a "modern" language to write scripts.......well if they're not declarative and don't understand dependencies then how can they be better?

make is no paragon - it just contains solutions to lots of problems that exist in build systems. The fact that those problems exist and make has ways to solve them isn't make's fault. Everyone wants builds to "just work" so they can ignore them; such people write scripts, lean heavily on "clean and rebuild", and it works until the build gets big and slow.

"modern" tools get written with many simplifying assumptions and become slowly complicated as they hit all the problems make hit long ago and try to find ways to solve them.


> if they're not declarative and don't understand dependencies then how can they be better?

They are both a wash: you're spinning your wheels trying to get a simple task done. My claim is that you might as well do it in python instead of wading through the cesspool of hidden rules or trying to debug why your makefile silently fails. It's the same amount of time spent; you might as well check the date stamp in python and rebuild the source if it's newer than the output. They both suck, so you might as well suck in a modern language rather than go through trial by fire with make.
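To make the comparison concrete, the date-stamp check really is a one-liner in any language - a sketch in plain shell rather than Python, with hypothetical file names (the -nt test is supported by bash and most other shells):

    # rebuild only if the source is newer than the output
    if [ main.c -nt main.o ]; then
        cc -c main.c
    fi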


Oh, I think that's OFTEN true. When your build gets big though - when you're building multiple packages for example of which most are not yours - the benefits can start to become more worthwhile.

I currently work with people who think a huge build.sh is the way to go - so the build does a lot of unnecessary things just to be sure everything is up to date, and it's slow. It's still not bad enough to kick up a fuss, but it's going that way.


"Make actively blocks you from Getting Things Done." --Quote from someone suggesting CMake.


Lost me at "making past from scratch".


In some sense, don't we all make the past from scratch? [stares into middle distance]


If you're looking to an alternative, you could take a look at Task:

https://taskfile.dev/ https://github.com/go-task/task


+1 to this recommendation. Everyone likes to point to ‘just’ as a makefile replacement, but having tried both I slightly prefer Task.


I like the man page format each post has on this website. A little hard to read, but very unique.


Yeah, 2023 and reflowing documents still isn't a thing. Yay


I read this on mobile and it felt pretty tedious. I wonder if maybe some minimal HTML could make the paragraphs reflow properly while still looking like a manpage everywhere.


Which is ironic, given that man pages do reflow on different-width terminals and have done since the '90s.


What sort of keyhole are you peering through that needs to reflow 72-character text lines?


A phone? Your comment is 3 lines for me.


Just checked; both my comment and TFA appear full width on my phone. Bog-standard Android.


Maybe other people do not use the same setup as you do.


That much is obvious given that a vanilla phone setup has no trouble with even 90-character lines. Not my business to stop anyone making things hard on themselves, though, so good luck you all!

(do we have anyone who reads HN over a morse clacker?)


I have vision problems, I increase zoom a lot in desktop and mobile.

What do you have against people different from yourself?


Nothing; I wished you luck! I imagine people with screenreaders have an even tougher time...

(mobile I agree would be hopeless with zoom [I avoid reading on mine even without zoom], but does your desktop zoom mean even TFA's vt100-friendly 72-chars needed reflowing?)


I don't always increase zoom, but TFA is def too small on desktop. I just increase until it's readable; that being said, the article is comfortable for me at about 200-250%. Reflow issues seem to occur at 300% - so no issue for me, but I could see some people going that high.


This is supposed to be functional, not art work. Make it legible; easy to digest.

I guess it's fitting for a tool (make) that is also unergonomic by today's standards.


I wouldn't use this format myself either, but that's the fun with personal websites: you can do whatever you want. There's no law that says stuff needs to be functional. People who dislike it will just move on.


Typography isn’t just aesthetics. It _is_ about legibility and therefore proper typography _is_ functional.

This site simply isn’t as legible as it could be. I am not suggesting it needs to be “pretty”. A monospaced Markdown file would be more legible.


There’s so much of it in this thread I couldn’t possibly reply to all of it, but wow. Lot of make/shell wankery going on. No one is impressed with your pipe.


I honestly wish that make's default rules were available to import and view the source of, but they're disabled by default.

Then a beginner could see what happens in a straightforward way, and then import the C rules. And we could all stop importing the VCS rules which aren't useful anymore.


I don't know about POSIX make, but with GNU make you can run "make -p -f /dev/null" to get a dump of all the built-in rules it has. And if you don't want to run those, you can always use "make -r" to have it ignore built-in rules.


You can use Autotools to generate your Makefiles. Autotools gets a bad rep, but I've found it perfectly fine for simple projects. To get started you only need two files, configure.ac and Makefile.am, you can ignore all the autogenerated stuff.
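For reference, a minimal sketch of those two files for a single-binary C project (the hello names are hypothetical):

    # configure.ac
    AC_INIT([hello], [0.1])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am
    bin_PROGRAMS = hello
    hello_SOURCES = hello.c

Running autoreconf --install then generates configure and Makefile.in from these.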


Back in the 2000s I was all in on the bash/make/unix camp. That lasted for ten years or so. Now I realize that it was just a religion, at least on my part. Python and Node and their libs are thousands of times better than all that nonsense (including arcana like CMake).

I have a whole folder of personal tools I’ve written in bash, and had a similar set for make. Some of them I don’t want to touch, but when I have to, I just rewrite them in Node with a little io/ipc helper library I created and it feels like a breath of fresh air.


Is there any reason to use make over ninja and gn?


I'm a seasoned developer for the past 15-ish years and I haven't heard of ninja or gn before.

I tried typing them in my command line (macos 12.4) and I would need to install them to use them.

I was taught make in CS101 in college and the majority of other people's projects I've looked at use it.

So the advantage is ubiquity and familiarity. Which can be useful if your code base is going to have a large number of people working on it.


> I'm a seasoned developer for the past 15-ish years and I haven't heard of ninja or gn before

Then you also stopped learning about developments in your field 15 years ago.

> I tried typing them in my command line (macos 12.4) and I would need to install them to use them.

Ok? And? We don't pre-install dev tools on consumer operating systems. You won't find valgrind or vcpkg either.

> I was taught make in CS101 in college

The tools taught to beginners over a decade ago are perhaps not suitable in all cases; in fact, they are suitable in very few cases. Not much Pascal usage either anymore.

> the majority of other people's projects I've looked at use it.

Then you live in a very tiny bubble.


You seriously compare whatever this ninja thing is to valgrind with a straight face?

There’s value in not overloading your cognitive capacity with the most recent fads, there’s enough serious tools and tech to learn.


> You seriously compare whatever this ninja thing is to valgrind with a straight face?

In so much as neither is pre-installed on a consumer operating system, yes, they are obviously equivalent.

> There’s value in not overloading your cognitive capacity with the most recent fads

Ninja has been the standard backend for CMake and other meta-generators for 11 years, it is hardly recent or a fad.

Moreover, you don't need to know anything about it, since it's a task-running backend format. You need to "know" it as much as you know the ELF object format or LLVM IR, your tools use it not you personally.


No need for personal attacks here, I'm just answering based on my experience.


You're the one who made your personal experience the grounding for your defense of make, not me.

There's no way to respond other than to point out you've had a very bizarre, isolated 15 years if the only thing you encounter regularly is make. In the one language ecosystem where make is still popular (plain C), it's a minority choice that comes in only second overall among build tools [1]

[1]: https://www.jetbrains.com/lp/devecosystem-2022/c/#which-proj...


Ninja was super easy to pick up even after using make for some time (10+ years). GN is just a ninja generator that is optional.

https://gn.googlesource.com/gn/+/main/docs/quick_start.md

https://ninja-build.org/


Ninja files are not designed to be edited by hand; ninja expects saner, language-aware tooling to generate the dependency graph for it to execute. https://ninja-build.org/manual.html#_design_goals

Though you could also argue that writing makefiles by hand is equally foolish. Use a tool designed for the job and not some duct tape that “works” everywhere but barely keeps things together. The example in the OP is missing handling of headers #including other headers, just to give one obvious example.



Yeah, if you have no effing idea what ninja or gn are


ninja is faster at parsing large build files than make is at parsing equivalent makefiles, and I'm reasonably certain it's because it lacks certain expensive features of make.

Do you need those features? You'd have to read about make to find out, but it boils down to there being not much performance difference for small builds, so make might have an edge there. When you get to Android-sized builds, the reading of makefiles can take minutes, and ninja wins there by enough to make it worth sacrificing the features.


nice: friendly but also succinct.

how many people are using systems like cmake (or worse) who would be fine with just make (used right)?


Things get messier when you start to add automatic dependencies.



