
Always host your dependencies yourself. It's easy to do, and even in the absence of a supply chain attack it helps protect your users' privacy.
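For example, instead of pointing at a public CDN, copy the file into your own static assets and reference it from your own origin. A minimal sketch (the paths and version number here are just placeholders):

  <!-- before: a third-party CDN sees every visitor's IP and referrer -->
  <script src="https://cdn.example.com/libs/jquery/3.7.1/jquery.min.js"></script>

  <!-- after: the same file, served from your own origin -->
  <script src="/vendor/jquery-3.7.1.min.js"></script>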


But if the dependency from a CDN is already cached, the browser skips an extra request and the site loads faster.

I agree with the points though.


That’s not been true since Site Isolation IIRC

e: Not sure it’s Site Isolation specifically, but it’s definitely not true anymore: https://news.ycombinator.com/item?id=24745748

e2: listen to the commenter below, it's Cache Partitioning: https://developer.chrome.com/blog/http-cache-partitioning


Right - and site isolation is about five years old at this point. The idea that CDNs can share caches across different sites is quite out of date.


Didn't know that. Then why are we (I mean, many web devs) still using it?

Just an old convention, simplicity, or saving bandwidth?


If that's true, this is a wild microcosm example of how the web breaks in ways we don't expect.


> this is a wild microcosm example of how the web breaks in ways we don't expect.

I think the performance characteristics of the web are subject to change over time, especially to allow increased security and privacy.

https is another example of increased security and privacy at the cost of being slightly slower than plain http, because of the extra round trip or two needed to establish the connection.

The lesson I take from it is: don't use complicated optimization techniques that might get out of date over time. Keep it simple instead of chasing every last bit of theoretical performance.

For example, "Domain Sharding" used to be considered good practice because it let browsers download more files in parallel, but it was made obsolete by HTTP/2, and domain sharding now has a net negative effect, especially with https.

https://developer.mozilla.org/en-US/docs/Glossary/Domain_sha...

Now they're realizing that HTTP/2's multiplexing over a single TCP connection can hurt performance on lossy wireless links (one dropped packet stalls every stream), so they're working on HTTP/3 to solve that.

Also don't use polyfills. If your supported browsers don't support the feature then don't use the feature, or implement the fallback yourself. Use the features that are actually available to you.
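Feature detection plus a hand-rolled fallback is usually only a few lines. A rough sketch, assuming you're lazy-loading images with IntersectionObserver (the fallback is whatever degraded behaviour you decide is acceptable):

  <script>
    // Use the native API when it exists...
    if ('IntersectionObserver' in window) {
      const observer = new IntersectionObserver(entries => {
        for (const entry of entries) {
          if (entry.isIntersecting) entry.target.classList.add('visible');
        }
      });
      document.querySelectorAll('img[data-lazy]').forEach(el => observer.observe(el));
    } else {
      // ...otherwise degrade gracefully: show everything immediately
      // instead of shipping a polyfill.
      document.querySelectorAll('img[data-lazy]')
        .forEach(el => el.classList.add('visible'));
    }
  </script>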


In addition to Cache Partitioning, it was never really likely that a user had already visited another site that loaded the exact same library versions from the same CDN your site uses.

Making sure all of your pages are synchronized on the same versions, and bundling them into appropriately shared chunks, makes sense; at that point you may as well serve them from your own domain. I think serving static resources from your www server is fine now, but back in the day there were benefits to using a separate hostname for them, and maybe that still applies (I'm not as deep into web stuff anymore, thank goodness).


Because of modern cache partitioning, HTTP/2+ multiplexing, and sites themselves being served off CDNs, external CDNs are now also worse for performance.

If you use them, though, use subresource integrity.
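That looks roughly like this; the hash is a placeholder you'd generate from the exact file you reference, e.g. with "openssl dgst -sha384 -binary some-lib.min.js | openssl base64 -A":

  <script src="https://cdn.example.com/libs/some-lib/1.2.3/some-lib.min.js"
          integrity="sha384-(base64 SHA-384 digest of that exact file)"
          crossorigin="anonymous"></script>

If the fetched file doesn't match the hash, the browser refuses to execute it, which is exactly the protection you want against a compromised CDN.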


> and sites themselves being served off CDNs

Funnily enough, I can't set up a CDN on Azure at work because it's not approved, but I could link whatever random-ass CDN I want for external dependencies if I were so inclined.


This shared-cache theory gets parroted endlessly, but it almost never actually happens in reality.



