
I see no way to get a hold of that particular soundtrack and searching the web failed me. How did you get a copy? Any tips?


After much searching I found a copy (FLAC) on MEGA, but I can't find my way back to it. No search engine seems to index MEGA


Would also highlight Martin Owens' Video: https://www.youtube.com/watch?v=TMHnyL4cRYA

He has been consistently putting out videos on his process and has been a (the?) main driving force behind the new pages addition.

He's also on patreon: https://www.patreon.com/doctormo



I just wish someone would RIIR GNU make.

I use it heavily and think it's extremely underappreciated, so instead of reinventing it, I would like to build on it. But - trying to extend the old C codebase is daunting. I'd even be happy with a reduced featureset that avoids the exotic stuff that is either outdated or no longer useful. The core syntax of a Makefile is just so close to perfect.

(I wrote about some of this in the remake repo: https://github.com/rocky/remake/issues/114 )


You had me until

> The core syntax of a Makefile is just so close to perfect.

I'd argue what is close to perfect is much of the underlying model/semantics, and what is terrible is very much the syntax. I've long wanted to make something similar to Make but simply with better syntax...


To me, the core of the syntax is:

    target: prereq | ooprereq
        recipe
In my eyes, the most important thing when building something complex is the dependency graph, and it makes sense to make the syntax for defining that graph as simple as possible. I think the make syntax just nails it, and most of the other approaches I have seen so far add complexity without any benefit. In fact, most of the complexity they introduce seems to stem from confusion on the side of the developer, who is unable to simplify what they're trying to express.
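
For a concrete illustration of that shape (hypothetical file names; recipe lines would be tab-indented in a real Makefile), with the build directory as an order-only prerequisite after the '|':

    # The objects trigger a rebuild of build/prog when they change; build/ is an
    # order-only prerequisite, so its timestamp never causes a rebuild on its own.
    build/prog: build/main.o build/util.o | build
        cc -o $@ $^

    build/%.o: %.c | build
        cc -c -o $@ $<

    build:
        mkdir -p build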

At the level of variable handling and so forth, make is slightly annoying but manageable.

Anything beyond that - yeah, I'm with you, a lot of that is terrible.


The syntax has one major insurmountable problem - it uses space as a separator, so it doesn't allow spaces in target names, and therefore doesn't allow spaces in filenames.

Yes, there are workarounds for some situations, but none of them work in all cases and no-one ever applies them consistently.
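
A quick illustration of the underlying problem (hypothetical names): make splits on whitespace, so the rule below is parsed as two targets, "my" and "file.o", each depending on "my" and "file.c":

    my file.o: my file.c
        cc -c -o "$@" "$<"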


That isn't a "major insurmountable" problem. It's a minor annoyance - just don't have spaces in your file names.


Or in your directory names, both in your project and in any parent folder. There's absolutely no reason why Windows shouldn't be able to call the folder "My Documents", but it broke a whole generation of builds that people tried to do in their home directory.

Spaces in filenames are a perfectly reasonable thing to want. They are supported on all three major platforms, but make just throws up its hands and doesn't care. Yes, it can be worked around, but it would be nice if it didn't have to be.

It's an annoyance with make overall, but if we're talking about the syntax specifically then yes I would call that a "major insurmountable problem".


One time, I had a work-issued laptop where they set up my user account for me. They used spaces in my username. This meant that I had a space in every single path.

It caused so many issues. But mostly with developer tooling, like Make, that assumes these sorts of things. Most programs worked just fine.


> One time, I had a work-issued laptop [which I did not have root on].

Well there's your problem right there. That kind of shit needs to stop, and making companies that perpetrate it less able to develop software (and thus less profitable) is one of the only things people who don't directly interact with such a company can do to discourage it. (Not that that's in any way a good strategy, but to the tiny extent it has any bearing on makefile syntax in the first place, it's an argument against supporting spaces in filenames.)


I did have root on it. It was just set up for me by IT. Having root doesn’t change my account name.


You may be interested in [fac](http://sites.science.oregonstate.edu/~roundyd/fac/introducin...)

It's a build-system in Rust with some really cool features (and an even simpler syntax).


you might like ninja build then


I really like the idea of Ninja; I really want Ninja to be a single C(++) file that I can build with a trivial command line. I don't want a single C(++) file because of "elegance" so much as because I want to ship Ninja as an inline dependency; otherwise I've got two problems: I have to build my code and I have to build Ninja. If Ninja were a single file: boom! Trivial build line.

Other than that, I think Ninja is perfect.


You could probably post-process samurai (a rewrite of ninja in C) into a single file: https://github.com/michaelforney/samurai


Ninja precompiled is a self-contained 200 kB single-file binary.

Small enough to even check in to git. Your compiler, cmake, and all the other tools you use are going to be a much bigger problem.


ninja is a normal part of Linux distros, so at least there you don't have to worry.


Ninja's stated goal is to be simple and fast, but not necessarily convenient for humans to write; it's in the second paragraph of their home page.

While ninja is great for many uses, I would not recommend using it for hand-written rules. Any simple dependency resolver like make or ninja lacks a lot of language context about includes and other transitive dependencies, so you always want some higher-level abstraction, closer to the language, as your porcelain.


ninja specifically handles includes automatically.

Anyway, the absence of any other automagical behavior or hidden rules, plus the fact that it can do incremental and correct job re-runs even when the rules change, is exactly why I like to use ninja.

Here's one such use case, maybe not the cleanest:

https://megous.com/git/p-boot/tree/configure.php

I especially love it in projects involving many different compilers/architectures/sdks at once (like when doing low level embedded programming), where things like meson or autotools or arcane Makefile hacks become harder to stomach.


That's quite a stretch of what "handles" means. It supports integrating such use cases, but you still have to do the gcc -M dance with https://ninja-build.org/manual.html#ref_headers.

As opposed to, say, cmake, bazel, or other higher-level abstractions, where you just ask them to take all the C files in a directory and solve the rest.


Take meson. It's at the same level as cmake, but what handles the C include dependencies for it is ninja. cmake also has a ninja backend. Not sure how it works exactly, because I don't use cmake, but I assume ninja handles the include deps in that case too.

Yes, ninja basically handles it for you, compared to what you have to go through to get autogenerated dependencies when using Makefiles.
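
For comparison, a rough sketch of the usual hand-rolled Makefile approach (assuming a GCC/Clang-style compiler that supports -MMD): the compiler writes .d fragments as a side effect, and they get re-included on the next run.

    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)

    prog: $(OBJS)
        cc -o $@ $^

    # -MMD writes a .d file listing the headers each object actually included.
    %.o: %.c
        cc -MMD -MP -c -o $@ $<

    # Pull the generated dependency files back in (silently, if they don't exist yet).
    -include $(OBJS:.o=.d)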


I will check it out, thank you for the suggestion - what I glanced at seems very interesting.


Shake would also be interesting to look at. It has much more sophisticated dependency graph handling.

GNU Make is almost pathetic by comparison.

See eg http://simonmar.github.io/bib/papers/shake.pdf


That was a great read - thanks for linking.

I had trouble with the one class in uni that used Haskell and I've been working in a Python shop for some years now, so I kept expecting to encounter some impenetrable section that would make my eyes glaze over and my hand close the tab. I was probably closest around page 9!

But the writing was excellent, and clear, and satisfying, and little concepts I was so close to grokking kept catching my eye and pulling me back in until I understood them, then their neighbors, then the section, then I was done.

Guess I've picked up some things since college. Wish I could go back and take that class again. I think learning to use Rust's Option type in anger on a personal project helped me understand monads more than anything in that class.

Also, I'm happy to see the method described by the paper does seem to have become the official GHC build system [0].

0. https://gitlab.haskell.org/ghc/ghc/-/wikis/building/hadrian


This is their minimal example to compile a single C file:

    module Main(main) where
    import Development.Shake
    import System.FilePath

    main :: IO ()
    main = shake shakeOptions $ do
        want ["foo.o"]

        "*.o" %> \out -> do
            let src = out -<.> "c"
            need [src]
            cmd "gcc -c" src "-o" out

Maybe there is a need for super sophisticated graph handling, but for most use cases the way more complicated syntax of Shake is not a worthy tradeoff.


https://shakebuild.com/ agrees with you:

> Large build systems written using Shake tend to be significantly simpler, while also running faster. If your project can use a canned build system (e.g. Visual Studio, cabal) do that; if your project is very simple use a Makefile; otherwise use Shake.

For what it's worth, if I remember right, Shake has some support for interpreting Makefiles, too.

> [...] the way more complicated syntax of Shake [...]

For context, Shake uses Haskell syntax, because your 'Shakefile' is just a normal Haskell program that happens to use Shake as a library and then compiles to a bespoke build system.

Also:

> The original motivation behind the creation of Shake was to allow rules to discover additional dependencies after running previous rules, allowing the build system to generate files and then examine them to determine their dependencies – something that cannot be expressed directly in most build systems. However, now Shake is a suitable build tool even if you do not require that feature.


> I've long wanted to make something similar to Make but simply with better syntax...

If you haven't, I'd suggest having a look at pmake (sometimes packaged as 'bmake', since it is the base 'make' implementation on the BSDs) - the core 'make' portions are still the same (like gmake extends core 'make' as well), but the more script-like 'dynamic' parts are much nicer than gnumake's in my opinion. It also supports the notion of 'makefile libraries', which are directories containing make snippets that can be #include'd into other client projects.

freebsd examples:

manual: https://www.freebsd.org/cgi/man.cgi?query=make&apropos=0&sek...

makefile library used by the system to build itself (good examples): https://cgit.freebsd.org/src/tree/share/mk


Looks interesting, thanks for sharing!


I had a go at making an experimental tool with a better syntax, fixing a lot of the obvious problems (whitespace, $ having a double meaning, etc). https://rwmj.wordpress.com/2020/01/14/goals-an-experimental-...


Wow, cool! Thanks for sharing! I'll have to take a look when I get the chance.


So you want ninja?


Yeah, GNU Make is actually a functional programming language. When I discovered this, I almost reinvented parts of autoconf in GNU Make. It's like trying to write a complex project with C preprocessor macros! It was a lot of fun.

GNU Make is hurt by its minimalism. Doing anything interesting in pure GNU Make is a herculean effort. Even something simple like recursing into directories in order to build up a list of source files is extremely hard. Most people don't even try; they would rather keep their source code tree flat than deal with GNU Make. I wanted to support any directory structure, so I implemented a simple version of find as a pure GNU Make function!

At the same time, a reduced feature set would be nice. GNU Make ships with a ton of old rules enabled by default. The no-builtin-rules and no-builtin-variables options let you disable this stuff. Makes it a lot easier to understand the print-data-base output.
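
For example, something like this near the top of a Makefile should turn the built-ins off (a small sketch; passing -rR on the command line is the other route, and behaviour can differ slightly between GNU Make versions):

    MAKEFLAGS += --no-builtin-rules --no-builtin-variables
    .SUFFIXES:    # also clear the old-style suffix rules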

> The core syntax of a Makefile is just so close to perfect.

It's a very simple syntax, but it has its pain points. Significant whitespace effectively rules out spaces in file names. It also makes it much harder to format custom functions.

Speaking of functions, why can't we call user-defined functions directly? We're forced to use the call function with the custom function's name as a parameter. Things quickly get out of hand when you're building new functions on top of existing ones. I actually looked at the GNU Make source code; I remember comments and a discussion about this... It was possible to do, but they didn't want to, because then they'd have to think about users when introducing new built-in functions. Oh well...
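
A tiny sketch of the asymmetry (hypothetical function name):

    # Built-ins can be invoked by name:
    count := $(words a b c)

    # ...but a user-defined function only runs through $(call):
    obj_of = $(patsubst %.c,%.o,$(1))
    objs := $(call obj_of,main.c util.c)
    # $(obj_of main.c util.c) would not work; make just looks up a variable
    # literally named "obj_of main.c util.c", which is empty.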


> something simple like recursing into directories in order to build up a list of source files is extremely hard

What's wrong with shelling out to find?
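
For reference, the shell-out being suggested is essentially a one-liner (assuming a Unix find on PATH and a hypothetical src/ directory):

    SOURCES := $(shell find src -type f -name '*.c')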


The find program might not be available, right? The developer might be building on Windows or something. Also, spawning new processes is an expensive operation so doing it from inside GNU Make might be faster.

Okay, I just wanted to see if I could do it in pure GNU Make.

  # Boolean helpers: a non-empty string is "true", the empty string is "false".
  true := T
  not = $(if $(1),,$(true))

  # A path is a directory if appending "/." still matches; a file exists but is not a directory.
  directory? = $(if $(1),$(wildcard $(addsuffix /.,$(1))))
  file? = $(and $(wildcard $(1)),$(call not,$(call directory?,$(1))))

  # Sorted wildcard expansion; glob.directory lists the entries inside each given directory.
  glob = $(sort $(wildcard $(or $(1),*)))
  glob.directory = $(call glob,$(addsuffix /$(or $(2),*),$(or $(1),.)))

  # Generic recursion: $(1) expands a node into children, $(2) tests whether it is a leaf.
  recurse = $(foreach x,$(3),$(if $(call $(2),$(x)),$(x),$(x) $(call recurse,$(1),$(2),$(call $(1),$(x)))))

  file_system.traverse = $(call recurse,glob.directory,file?,$(or $(1),.))

  # find: walk the tree rooted at $(1), keeping entries that match predicate $(2) (default: keep everything).
  find = $(strip $(foreach entry,$(call file_system.traverse,$(1)),$(if $(call $(or $(2),true),$(entry)),$(entry))))

  sources := $(call find,src,file?)
Yes.

Then I discovered GNU Make supports C extensions. It will even automatically build them due to the way the include keyword works. It might actually be easier to just make a plugin with all the functionality I want...


Have you seen Just? https://github.com/casey/just

It's make-like but supposedly improves on some of the arcane aspects of make (I can't judge how successful it is at that as I've never used make in anger).


We use it a bit for https://github.com/embedded-graphics/embedded-graphics and I have to say I love it. It's just the right balance between configuration variables/constants and "a folder full of bash scripts". I highly recommend giving Just a try.


One of the reasons I still rely on Make instead of Just is how widely available make is; pretty much any tool (be it a new implementation of make in Rust, or Just) will have to deal with the fact that make is already a default on so many systems.

That makes make a great entry point for any project. Does anyone have a general suggestion for how to work around this? The goal being: sync a project and not need to install anything to get going. One thing I do sometimes is to use make as the entry point and give it an init target that installs the necessary tools (depends on who I know the target audience to be).


This sounds like what I do: Makefile with distinct steps: install, build, test, lint, deploy. Those steps call underlying tools, depending on the platform, framework, language.

E.g. in one project it will install rbenv, bundle, npm, yarn etc. Call Rake, npx, that odd docker wrapper, etc. Or deploy with git in one project and through capistrano or ansible in another.

As a dev, in daily mode, all you need is 'make lint && make test && make deploy'. All the underlying tools and their intricacies are only needed when it fails or when you want to change stuff.
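
As a rough sketch of that pattern (the underlying commands are hypothetical placeholders; swap in whatever the project actually uses):

    .PHONY: install lint test deploy

    install:
        bundle install
        yarn install

    lint:
        npx eslint .

    test:
        bundle exec rake test

    deploy:
        bundle exec cap production deploy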


Yeah. Exactly. I do this when I think there will be users who don’t already have all the dev tools installed, and I don’t want them to have to download and install everything.


Yeah, until you meet a system with a different make, and now you have to install GNU Make.


I love Just, but I love it because it's not a build system — I already have a build system, I just need a standard "entry point" to all my projects so each command a user could use to interact with the actual build system is in the same place for each project. I wouldn't build, say, a C project with Just and Just alone, where it tracks the dependencies between files and only rebuilds what's necessary.

So while it's replaced one use of Make for me, I can't rightly call it a Make replacement.


Would you build a C project with make? One might argue that C could do with a better build system like other languages have. I don't use C much, but I hear that these exist.


I won't use this because the author sucks up all justfiles with no way for a user to opt out and isn't fully clear on this. Use it on some private or company thing that accidentally goes public, and oops, he has your file now in his repo.

If it were opt-in to his monitoring I might be more open. Maybe I should fork and change the filename to keep this from happening.


What are you talking about? Where does he "suck up" all Justfiles?



I see. Interesting.


I've seen it before but discarded it because, since my workflow is heavily file-based, I DO need make's capacity as a build system, not just a task runner. Will check it out to see whether that has changed.

edit: Yeah, I can't see how I can make it work for a file-based approach.


There is kati from Android:

https://github.com/google/kati/blob/master/INTERNALS.md

This takes your existing Makefile(s) and produces a ninja build file. Not sure if Android still uses it or if it is all soong now.


>The core syntax of a Makefile is just so close to perfect.

That's about the last thing I would call it.


Allow me to suggest build2 (the build system part) as another alternative. You would probably be first interested in ad hoc recipes: https://build2.org/release/0.13.0.xhtml#adhoc-recipe


Very VERY interesting!

The one thing that I'm currently struggling to find information on is dynamic dependency handling (dyndep in ninja terms).

Is that something that build2 covers as well? Any resource you could point me to?


Dynamic dependencies are supported by build2 (this is, for example, how we handle auto-generated C/C++ headers) but it's not something that is exposed in ad hoc recipes that are written in Buildscript yet (but is exposed in those written in C++). Our plan is to start with the ability to "ingest" make dependency declarations since quite a few tools are able to produce those. Would something like this cover your use-cases?


Very likely, yes. I'm generating my own makefiles anyway, so generating something that build2 could consume should be possible, too.

Would it be appropriate to open a GitHub issue for discussing this further? I would like to share some examples of how my current setup works, and having the GitHub syntax available would be helpful.


Yes, please, that would be very helpful: https://github.com/build2/build2/issues


"issue" filed. My apologies for the lack of brevity.

https://github.com/build2/build2/issues/135



That looks very good! I wonder if I can make it fit my file-based approach.

edit: Yeah, doesn't seem like it.


Someone is working on this, and could use help: https://github.com/michaelmelanson/fab-rs

I would love to see this completed to the point of passing the GNU make testsuite. Having make as a modular library would be wildly useful.


There's always ninja.

https://ninja-build.org/


In case anyone attempts this: please get quoting right.


6 years into writing my own kind of SSC, based entirely on make.

AMA ;-)


I do understand you're not reading the replies and I'm trying to not make the same mistake that you did, so I'm not responding to the emotional content (which I could, at length, in kind) of your message, but what I think is the meat of your misunderstanding.

I find it odd that you're focussing on making people understand that many in your profession care. It has been my experience that most health care providers… care.

Let's suppose that health care is like any other profession - let's say it's just like programming, something most people understand deeply on here.

We all know programmers who care and programmers who don't care. The best programmers I know are the ones who care, are competent, but also able and willing to have their understanding of the subject matter challenged. They're not married to their craft - they keep the right distance to excel at it.

The worst ones I know are those who care just as much, but are utterly incompetent. The ones who are so deep into it that they literally cannot see any other way. To them, everything is clear and obvious. They're willing, enthusiastic and overjoyed to be working. And they're wrong. In every codebase they touch, every hour of work they put in produces ten or twenty times as much work for others later on.

I think caring about your craft is a force multiplier and it's a necessary one. You don't want a mindless drone who doesn't give a damn about either your health or your codebase, no matter how competent they are. To care means to have a vital part of you engaged in such a way that all that you could bring to the table does get there. But it has to be lightly held and accepting of being challenged.

You care. I get it. But that doesn't mean everybody cares. (Again, really struggling here to not respond in kind with examples of people I have met who clearly did not care.) And even if everybody did care - some people are actually more useless the more they care.

But - it's up to the customers (or patients) to navigate that landscape. There has to be a mutual understanding that all parties involved are just human and can make mistakes. Our understanding of health care moves forward and things that are standard care today had to be invented, sometimes just a few years ago, often replacing things that used to be just as firmly a standard, before. Challenging what is accepted practice should be a healthy part of this interaction.

There's a limit to how many people care. There is a limit to how useful it is that people care. There is a limit to how effective you can be for your patients when you walk around with a chip on your shoulder, thinking that people just don't get how much you care.


Thiel actually lampshades this himself:

> Warp drive is in fact hard to take seriously because its basic physics are so far beyond the furthest reaches of our knowledge as to debilitate would-be researchers—not to mention reasonable doubts about the friendliness of faraway foreign species. (Some of them, I assume, are good aliens.)


Ouch, that "Subscribe today" banner should have its fonts either embedded on the page or rendered into paths. Without that, it looks like this: https://imgur.com/a/kxie7Cb


Goodhart's law - "When a measure becomes a target, it ceases to be a good measure."

I used to work for a company that bills their customers for dev hours spent. The software they put together worked fabulously well - in the production of billable dev hours.


Since some people might pay again to see the "improved" version (after some of them likely saw the original version to see for themselves what all the screaming was about) - is this a film industry version of a DLC?

