> Yes there are GCs for C, but is anyone successfully doing "systems programming" (whatever that may be) in C with GCs?
Actually, yes. But it's hackish (it relies on some fairly complex macros) and requires you to adopt certain conventions. Still, it's doable, and a whole lot safer than managing your memory directly in terms of leaks and use-after-free. The real cost to me is that macro magic, which shouldn't be required, but it was the only thing I could think of to make this work. To give you an idea of just how ugly it is: I redefined 'return'. Any C hacker will be able to deduce the rest from that one hint ;)
On another note, I felt - and feel - that this was not the proper solution, but the various policy choices made it pretty much the only way it could be done. And it works.
In three letters: NIH. The management decision was that all IP had to be 100% owned by the company and had to be in C, in spite of an enormous amount of friction between C and the project, and in spite of a bunch of work by others that could have been leveraged had we decided to use outside code. I was called in long after these decisions were made, and it was very clear they weren't going to budge on them. There is a lot more to this story, but I'm not at liberty to tell it. Let's just say I learned a lot.
So you had to write your own compiler, operating system, and runtime libraries too?
-- I know, you didn't think it made any sense either. I'm just pointing out that a line has to be drawn somewhere; where it gets drawn is actually arbitrary.
Yeah, I've had to deal with ridiculous mandates from on high too, though none anywhere near that onerous. In my previous job, we were writing a compiler. It was mandated to be in C++ -- the first mistake -- and we had to use smart pointers instead of GC -- also a mistake. But the completely idiotic thing was that we were not allowed to declare any exception classes. The VP of Engineering -- a very smart and experienced but very arrogant guy -- had seen exception hierarchies get out of control before and decided the solution was to ban them.
But that's on a pretty small scale compared to what you're talking about.
Reminds me of my last^2 job: we were using Scala and the "technical architect" made two decisions that were mildly wrong in isolation but interacted rather badly: we'd make heavy use of monads for core functionality, and we wouldn't use scalaz. So I spent a while reimplementing parts of scalaz, and learned quite a lot (though I doubt I was adding much business value while doing so).
No, no need to apologize. It's one of the strangest assignments I've ever had and there were a few twists to the whole story that would make for a good book.