> The world moved into dynamic linking in the 1980s for a reason.
Reasons that no longer exist. Storage is cheap, update distribution is free, time spent debugging various shared lib versions across OSes is expensive.
Tbh, the rights and wrongs aside, I suspect "everyone" is complaining about it because it's the easiest thing to talk about. Much like how feature discussions tend towards bikeshedding.
My /usr is 15G already, and /var/lib/docker isn't that far off despite people's obsession with Alpine images. If more people dismiss storage as cheap, it'll quickly become expensive, just not per GiB.
> update distribution is free
I wouldn't be surprised if at some point GitHub started restricting asset downloads for very popular projects, simply because of how much traffic they generate.
Also, there's still plenty of places on the planet with relatively slow internet connectivity.
Storage doesn't really feel cheap. I'm considering buying a new laptop, and Apple charges $600 per TB. Sure, it's cheaper than it was in the '80s, but wasting a few gigabytes here and a few gigabytes there adds up quickly enough to force you from a 500GB drive to a 1TB drive, which costs $300.
It's the reality of storage pricing. The general statement "storage is cheap" is incorrect. For some practically relevant purposes, such as Apple laptops, it's $600/TB. For other purposes, it's significantly below $50/TB.
You could say "just don't buy Apple products". And sure, that might be a solution for some. But the question of what laptop to buy is an extremely complicated one, where storage pricing is just one of many, many, many different factors. I personally have landed on Apple laptops, for a whole host of reasons which have nothing to do with storage. That means that if I have to bump my storage from 1TB to 2TB, it directly costs me $600.
If you're buying Apple then you should expect inflated prices.
I got a 4TB NVMe SSD for like 350€; a 2TB one goes for 122-220€ depending on read/write speeds.
I don't check the installation size of applications anymore.
I'm just saying that $600/TB is a real storage price that lots of people deal with. Storage isn't universally cheap.
This feels especially relevant since we're discussing Zed here, the Mac-focused developer tool, and developers working on Mac are the exact people who pay $600/TB.
A 2TB SSD for the Framework 13 cost me 200 euros. But I agree that it's not cheap: files are getting bigger, games are big, apps are huge, and then you need backups, external storage, and always some free space as temp storage so you can move files around.
I don't need to "get far in the Apple universe", I need a laptop. My current MacBook Pro cost about the same as the Dell XPS I was using before it. I like nice laptops.
RAM isn't cheap (it may be for your tasks and wallet depth, but generally it isn't, especially since DDR5). Shared objects also get "deduplicated" in RAM, not just on disk.
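You can actually measure this on Linux: every mapping's record in /proc/<pid>/smaps carries a Shared_Clean field, and summing it over the .so mappings shows roughly how many library pages a process shares with everyone else who has those libraries mapped. Quick toy sketch (mine, nothing Zed-specific; the header/field parsing is deliberately simplistic):

    // Toy Linux-only sketch: total up Shared_Clean for every .so mapping
    // in /proc/<pid>/smaps -- the pages this process shares with any other
    // process that has the same libraries mapped. Field names per proc(5).
    use std::{env, fs};

    fn main() {
        // Inspect the PID given on the command line, or ourselves.
        let pid = env::args().nth(1).unwrap_or_else(|| "self".to_string());
        let smaps = fs::read_to_string(format!("/proc/{pid}/smaps"))
            .expect("needs Linux and permission to read the target's smaps");

        let mut in_so = false; // are we inside a .so mapping's record?
        let mut shared_kb = 0u64;

        for line in smaps.lines() {
            let first = line.split_whitespace().next().unwrap_or("");
            if first.contains('-') {
                // Mapping header: "start-end perms offset dev inode [path]".
                in_so = line.contains(".so");
            } else if in_so && first == "Shared_Clean:" {
                // Attribute line: "Shared_Clean:    1536 kB".
                shared_kb += line
                    .split_whitespace()
                    .nth(1)
                    .and_then(|v| v.parse::<u64>().ok())
                    .unwrap_or(0);
            }
        }
        println!("{shared_kb} kB of clean, shared .so pages");
    }

Point it at a Zed PID versus, say, a GNOME app's PID and compare; for a mostly statically linked binary you'd expect little beyond libc and the loader.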
What objects is the Zed process using that would even be shared with any other process on my system? Language support is mostly via external language servers. It uses its own graphics framework, so the UI code wouldn't be shared. A huge amount of the executable size is tree-sitter related.
I 100% agree.
As soon as you step outside the comfort of your Linux distribution's package manager, dynamic linking turns into dependency hell.
And the magic solution to that problem our industry has come up with is packaging half an OS inside of a container...
OSes don't load the full executable into physical RAM, only the pages in the working set. Most of the Zed executable's size is tree-sitter code for all the supported languages, which only needs to page in if those languages are actually being used in a project.
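You can watch that lazy paging happen. Here's a little Rust toy (a Linux-only sketch; it uses a large anonymous allocation as a stand-in for the binary's file-backed pages, and reads the resident-set size from /proc/self/statm):

    // Rough sketch of demand paging with anonymous memory (Linux-only;
    // the effect for a binary's file-backed pages is analogous). Assumes
    // the default allocator hands large zeroed requests back as untouched,
    // zero-filled mappings, which glibc's calloc/mmap path does.
    use std::fs;
    use std::hint::black_box;

    /// Resident pages of this process: second field of /proc/self/statm.
    fn resident_pages() -> usize {
        fs::read_to_string("/proc/self/statm")
            .unwrap()
            .split_whitespace()
            .nth(1)
            .unwrap()
            .parse()
            .unwrap()
    }

    fn main() {
        const LEN: usize = 512 * 1024 * 1024; // 512 MiB of address space
        const PAGE: usize = 4096;

        let start = resident_pages();
        // vec![0; n] goes through a zeroed allocation: address space is
        // reserved, but almost no physical pages are resident yet.
        let mut buf = vec![0u8; LEN];
        let after_alloc = resident_pages();

        // Dirty one byte per page; each write faults a real page in.
        for i in (0..LEN).step_by(PAGE) {
            buf[i] = 1;
        }
        let after_touch = resident_pages();
        black_box(&buf); // keep the writes from being optimized away

        println!("resident pages: {start} -> {after_alloc} -> {after_touch}");
    }

With glibc's malloc, the allocation itself should barely move RSS; it's only the page-touching loop that makes it jump by roughly LEN/4096 pages.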