In my projects, whether the language is Go, Rust, Python, TypeScript, or even Android, there are standard make commands that, where applicable, always work:
- make format
- make lint
- make build
- make docker_build
- make docker_run
One might migrate from one build system to another, e.g. from pipenv to poetry to uv, but the high-level `make format` command doesn't change.

Also, how do you solve the problem of actually bootstrapping across different versions of docker/podman/make? Do you have some sort of ./makew, like the wrapper Maven uses to bootstrap itself?
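A minimal sketch of that stable interface, assuming a Python project on uv and ruff (both tool names are my assumption, not from the comment above):

```make
# Stable entry points; the commands behind them can change freely
# when the project migrates toolchains.
.PHONY: format lint build
format:
	uv run ruff format .
lint:
	uv run ruff check .
build:
	uv build
```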
But it is closer to being nice aliases, as opposed to fetching the toolchain.
You would want to mark the dependencies anyway, since when you update your compiler, IMHO, that invalidates the objects compiled with that compiler, so the compiler is a dependency of the object.
That said, I don't distribute much software. What works for personal software and small team software may not be effective for widely distributed software that needs to build in many environments.
Especially given how bad the defaults in bash are: https://ashishb.net/programming/better-bash/
> I do the same except I use shell scripts. script/fmt, script/lint
Do you create a separate file for every single target then?
It’s not; you need to start thinking about .PHONY targets and other details quite quickly.
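The classic pitfall here: if a file or directory with the target's name exists, make considers the target up to date and does nothing. A minimal illustration (the script name is invented):

```make
# Without this line, creating a file called "lint" in the repo makes
# `make lint` print "make: 'lint' is up to date." and run nothing.
.PHONY: lint
lint:
	./run-linters.sh
```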
From what I see it is almost* strictly less useful than BSD make, which (IMO) is strictly less useful than Solaris make, all of which are strictly less useful than GNU make. It only might be more useful than SysV make.
* Regex support is unique, but you know what they say about solving problems with regexes, and you can usually do any useful thing with `subst` and `word`. The `P` flag (`Pcmp -s`) at first seems unique, but can easily be replaced by some stamp/dep-file logic, with much better performance.
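For instance, most of the pattern manipulation that regex targets would handle can be done with GNU make's text functions (a sketch; the file names are made up):

```make
SRCS  := src/foo.c src/bar.c
OBJS  := $(SRCS:src/%.c=build/%.o)   # pattern substitution
NAMES := $(subst src/,,$(SRCS))      # plain string substitution
FIRST := $(word 1,$(SRCS))           # pick the first word of a list
```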
Don't get confused by the fact that `automake` chooses to maintain compatibility with versions of `make` more than 4 decades old.
- Readability: $target, $prereq, and $stem are much easier to read than make’s $@, $<, and $*. Any whitespace can be used to indent.
- Flags after the target do useful things such as auto-deleting the target on error (no more confusion about partial files), controlling verbosity, and marking targets as virtual.
- Stricter when dealing with overlapping targets.
- Has a flag for printing why each target is being generated.
- Regex in targets is sometimes really useful! Taken together with the strictness towards overlapping targets this leads to less confusion overall.
That said, the automatic variables ($target, $stem, etc.) are more descriptive than those of make ($@, $*, etc.), since the former are just normal shell variables passed in at rule runtime, I think.
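A sketch of what mk(1) rules look like, reconstructed from memory of the Plan 9 manual, so the details may be off: `V` marks a virtual (phony-like) target, `D` deletes the target when the recipe fails, and the recipe reads ordinary variables instead of cryptic automatics.

```mk
all:V: prog
prog:D: main.o util.o
	cc -o $target $prereq
%.o: %.c
	cc -c $stem.c
```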
So you get a working build example, and if you’re lucky and the base image exists for your architecture, it’ll compile a binary for your system too (assuming static linking I guess).
I’ve tried as you said and I’ve found that this is not a good assumption to make
glibc doesn’t fully support static linking, so this doesn’t work in the general case.
On the contrary it has many. My favorite is apt but maybe you prefer cargo or homebrew or rubygems.
That doesn’t make sense. C absolutely does not have a dependency manager.
So in that sense dnf/apt/zypper/pacman are all “C” package managers (not sure I agree with the OP but I think this is what they meant)
It will likely have fewer packages than the other established ones, though.
I use CMake - not a great system, but at least it isn't hard and it always works. I've heard good things about a few others as well. There is no reason to use autotools.
I always had the idea that Make reads a Makefile, so when I saw this for the first time it blew my mind.
So if you have file.c you can just call “make file” and that is it.
And the file it processes can be called Makefile or makefile or even other names (see the -f option).
https://man7.org/linux/man-pages/man1/make.1.html
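What `make file` does for a lone file.c is roughly equivalent to this built-in GNU make pattern rule (reconstructed from the manual, so treat the exact variable set as approximate):

```make
%: %.c
	$(CC) $(CFLAGS) $(CPPFLAGS) $(LDFLAGS) $< $(LDLIBS) -o $@
```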
And that applies even to pre-GNU Linux make.
Source: been using Unix since some years before Linux was created.
I have not read the GNU docs much, only as needed, now and then, because, as I said, my usage of Unix predates the creation of GNU / Linux by some years. I had spent tons of time reading Unix man pages and other docs earlier, and tons of time actually using various Unix versions too, in production, over multiple years: the shell (sh before bash, even), awk, sed, (e)grep, and friends, aka lots of Unix filters, including custom hand-written ones in C, and combinations of all of the above, as all red-blooded Unix folks did for years, and still do :)
See "The Unix Programming Environment" book by Kernighan and Pike for the best experience of the gestalt of Unix, other than real life usage with the guidance of a friendly expert.
Nothing against GNU, they have done tons of really good work.
And, for example, there is a lot of appreciation and regard for GNU, Linux and other free software and open source stuff in India, and interestingly, somewhat more so in Kerala, an Indian state with a more communist (politically) and hence somewhat / sometimes more egalitarian approach than other states. And they don't just appreciate, many of them also contribute to such efforts. People in other parts of India also do, some, of course.
I'm not from there, but I have family and friends from there, which is one reason I know about this.
Cheers.
make
in many cases. For example, google:
./configure; make; make install
to check out a common method of building packages from source on Unix.
Done it dozens of times, for packages like Oracle, MySQL, Python, and many others.
Although readymade binary packages exist for many such products, what you get by the above method is a good amount of configurability, for special needs or environments (hence the name of the configure command).
Somewhat confusingly, there's also a special target, ".DEFAULT", which you can define as a catchall; it's run for any requested target that has no definition of its own.
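A small illustration of .DEFAULT (the echoed text is made up):

```make
.DEFAULT:
	@echo "no rule for '$@', falling back to the catchall"
```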
Also, built-in rules like
.c.o
as well as similar user-definable ones, are useful.

* build-essential - this actually installs what is considered "essential" for building Debian packages. This happens to include some common compilers, other tools, and libraries, but also a bunch of junk you probably don't need. Still, this is harmless.
* There is `pkg-config` for detailing dependencies that are already installed, there's just no standard way to automatically download and run code from the internet (and good riddance)
* Sometimes you have to run `autoreconf -i` first to generate `configure` in the first place. This normally happens if you're running from a git clone rather than a tarball. Autotools is mostly useless if you're only working on Linux, marginally useful if you're also working on a different modern OS (especially since the first step is "install the GNU version of all the build tools"), but extremely useful if you're targeting MyWeirdPrePosixProprietaryUnix. Despite this, `./configure` remains the single best interface for end-user installing - cmake in particular is an atrocity.
* There is a little confusion about "build/host/target", due to the question "what machine am I running on, at what time?" If you aren't cross-compiling you can ignore all this though.
* For more about `./configure` and `make` options, see https://www.gnu.org/prep/standards/html_node/Managing-Releas... - there's also some good reference material in the documentation for `autotools` and `make`. I won't repeat all the things there, many of which people really need to learn before posting.
* One annoying thing is that `./configure` and `make` disagree about whether libraries go in `LIBS` or `LDLIBS`. You should normally set variables when invoking `./configure` so you don't have to remember them; one exception is `DESTDIR`.
* `as` and `ld` are called internally by the compiler; you should never call them directly, since the compiler needs to mangle flags.
* The `checkinstall` tool, if supported for your distro, can help call `make install` in a way that makes uninstallation reliable.
* rpath is the correct solution instead of `LD_LIBRARY_PATH`, and it can be automated by copying `-L` arguments to `-R`. There's some FUD floating around the internet, but it only affects `setuid` (and other privileged) packages on multiuser systems, and even then only if they set it to certain values.
https://oneofus.la/have-emacs-will-hack/files/1978_Feldman_M...
Some already do
https://github.com/tsoding/nob.h
https://youtu.be/D1bsg8wkZzo?t=97
https://youtu.be/mLUhSPD4F2k?t=73
https://youtu.be/eRt7vhosgKE?list=PLpM-Dvs8t0Va1sCJpPFjs2lKz...
kazinator•6mo ago
Don't be fooled by the convention used in some necks of the woods of a .cpp suffix for C++ files; CPPFLAGS have to do with the "cpp" program (the C preprocessor), not the .cpp suffix.
LDLIBS is sister to LDFLAGS. Both these variables hold options for the linker command line destructured into two groups: LDFLAGS are the early options that go before the object files. LDLIBS are the -l options that give libraries, like -lssl -ldl -lcrypto ... these go after the object files.
If you're writing a Makefile, with your own custom recipes for linking, be sure you interpolate both LDFLAGS and LDLIBS in the right places.
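A sketch of such a custom link recipe, interpolating both variables in their conventional positions (the target and object list are placeholders):

```make
prog: $(OBJS)
	$(CC) $(LDFLAGS) $(OBJS) $(LDLIBS) -o $@
```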
zabzonk•6mo ago
wpollock•6mo ago
bluGill•6mo ago
hopefully this is useless trivia.
o11c•6mo ago
But you really, really should be using `pkg-config` (and no other obsolete `foo-config`) so you don't have to worry about it regardless.
(If you really want to do it yourself, learn about the `tsort` program first. Although allegedly obsolete for its original purpose, it is still useful to automate some sanity, and applies just as well between libraries as within them.)
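A quick illustration of `tsort` for link-order sanity: each input pair "A B" means A must come before B, so feeding it "library X uses symbols from library Y" pairs yields a valid static-link order (the library names are invented):

```shell
# app uses libssl and libcrypto; libssl uses libcrypto.
# tsort prints a link order consistent with all three constraints.
printf '%s\n' 'app libssl' 'app libcrypto' 'libssl libcrypto' | tsort
```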
cozzyd•6mo ago
wpollock•6mo ago
<https://wpollock.com/AUnix2/dll-demo.tgz>
(I wrote this long ago, as a linking demo for system administration students.)
hedora•6mo ago
https://www.gnu.org/software/make/manual/make.html#Implicit-...
emmelaich•6mo ago