
I just refuse to use any software that balloons its file size. Not because I can't afford storage, but because there are always alternatives with similar features packed into a fraction (usually less than 1%) of the size. If one of them can do it and the other can't, the other is a bad product that I have no intention of supporting.

We should strive to write better software that is faster, smaller and more resilient.

"Storage is cheap" is a bad mentality. This way of thinking is why software only gets worse with time: let's have a 400mb binary, let's use javascript for everything, who needs optimization - just buy top of the shelf super computer. And it's why terabytes of storage might not be enough soon.



I can empathize with how lazy some developers have gotten about program sizes. I stopped playing CoD because I refused to download their crap 150+ GB games with less content than a lot of other titles that are much smaller.

That said, storage is cheap; that's not a mentality but a simple statement of fact. You think Zed balloons its file size because the developers are lazy. That's not true. It's because the users have become lazy: no one wants to spend time downloading the correct libraries to use software anymore. We've seen a rise in binary sizes in most software because of a rise in static linking, which does increase binary size but makes using and testing the actual software much less of a pain. Not to mention the reduced memory overhead.
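
To make the size difference concrete, here's a minimal sketch (my own illustration, not anything from Zed's build): the same trivial C program built both ways, with the comparison commands in the comments. Exact sizes depend entirely on your toolchain and libc.

  /* hello.c -- compare a dynamically and a statically linked build.
   *
   * Build and inspect (output will vary by system):
   *   cc hello.c -o hello_dynamic          # libc stays a shared library
   *   cc -static hello.c -o hello_static   # needed libc code is copied in
   *   ls -lh hello_dynamic hello_static    # the static binary is much larger
   *   ldd hello_dynamic                    # lists the shared libraries it needs
   *   ldd hello_static                     # "not a dynamic executable"
   */
  #include <stdio.h>

  int main(void) {
      puts("hello, linking");
      return 0;
  }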

VSCode and other editors aren't smaller because the developers are somehow better or more clever. They're using dynamic linking to call into libraries on the OS. That linking is a small overhead, but overhead nonetheless, and all so they can use Electron + JavaScript, the real culprits that made people switch to Neovim and Zed in the first place. 400 MB is a cheap price to pay for a piece of software I use daily.
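
For what it's worth, the "small overhead" is the indirection of resolving symbols in a library the OS already ships. A rough sketch of that mechanism, done explicitly with POSIX dlopen/dlsym (the loader does the same kind of lookup implicitly for normal dynamic linking; assumes zlib is installed as libz.so.1, as on most Linux systems):

  /* dlopen_demo.c -- call into a shared library provided by the OS.
   * Build: cc dlopen_demo.c -o dlopen_demo -ldl
   */
  #include <dlfcn.h>
  #include <stdio.h>

  int main(void) {
      void *lib = dlopen("libz.so.1", RTLD_NOW);   /* resolve symbols up front */
      if (!lib) {
          fprintf(stderr, "dlopen failed: %s\n", dlerror());
          return 1;
      }

      /* Look up zlibVersion() by name -- the same indirection the dynamic
       * loader performs for every imported symbol. */
      const char *(*version)(void) =
          (const char *(*)(void))dlsym(lib, "zlibVersion");
      if (version)
          printf("zlib version: %s\n", version());

      dlclose(lib);
      return 0;
  }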

I'm not here to convince you to use Zed or any editor for that matter. Use what you want. But you're not going to change this trend by dying on this hill, because unless you're working with actual hardware constraints, dynamic linking makes little sense nowadays. There's no such thing as a silver bullet in software. Everything is a tradeoff, and the resounding answer has been that people are more than happy to trade disk space for lower memory and CPU usage.


Does static linking really reduce memory and CPU usage significantly compared to dynamic linking?



