apatheticonion's comments | Hacker News

Same. I started writing one as a weekend project and got as far as having a GH repo with releases that mirror the binaries of the tools that I use, normalizing the archives so they can be installed in the same way. It auto-generates a release whenever the project authors update their projects.

https://github.com/alshdavid/install-scripts/releases?q=node...

https://github.com/alshdavid/install-scripts/releases

All of the binaries here are expected to be standalone/portable installations, so you can download/extract the archive and just run the binary.

    curl -L --url https://github.com/alshdavid/install-scripts/releases/download/terraform-1.14.1/terraform-1.14.1-linux-amd64.tar.gz | tar -xvzf - -C $HOME/.local/bin
    $HOME/.local/bin/terraform --help

I haven't written the package manager itself yet, but I was planning for it to do the same thing as above, except it would figure out your OS/ARCH, handle extraction, and offer a PATH update system so you can run `eval $(xpkg env)` and have your PATH updated automatically.
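
For a sense of what I had in mind for `xpkg env`, here's a minimal Node/TypeScript sketch (xpkg doesn't exist yet, so every name below is illustrative):

    // Hypothetical `xpkg env` subcommand: print an export line for the shell to eval.
    import os from "node:os";
    import path from "node:path";

    // Map Node's platform/arch names onto the OS/ARCH suffixes used in the
    // release archive names (e.g. "linux-amd64").
    const osMap: Record<string, string> = { linux: "linux", darwin: "macos", win32: "windows" };
    const archMap: Record<string, string> = { x64: "amd64", arm64: "arm64" };
    const target = `${osMap[os.platform()] ?? os.platform()}-${archMap[os.arch()] ?? os.arch()}`;

    // `eval $(xpkg env)` in a shell profile picks this up.
    const binDir = path.join(os.homedir(), ".local", "xpkg", target, "bin");
    console.log(`export PATH="${binDir}:$PATH"`);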

I agree. Incoming hot take.

IMO, a big part of it is the lack of competition (in approach), exacerbated by the inability to provide alternatives due to technical/syntactic limitations of JavaScript itself.

Vue, Svelte, Angular, Ripple - anything other than a React-y, JSX-based framework requires custom compilers, custom file types, and custom LSPs/extensions to work with.

React/JSX frameworks get preferential treatment, with pre-processors essentially baking in a crude compile-time macro for JSX transformations.

Rust solved this by having a macro system that facilitates language extension without external pre-processors - e.g. Yew and Leptos implement Vue-like and React-like patterns, including support for JSX-style and HTML templating natively inside standard .rs files, with standard testing tools and standard LSP support:

https://github.com/leptos-rs/leptos/blob/main/examples/count...

https://github.com/yewstack/yew/blob/master/examples/counter...

So either the ECMAScript folks figure out a way to have standardized runtime & compilable userland language extensions (e.g. macros) or WASM paves the way for languages better suited to the task to take over.

Neither of these cases are likely, however, so the web world is likely destined to remain unergonomic, overly complex and slow - at least for the next 5 - 10 years.


OK I got my own extremely hot take.

In my opinion, the core functionality of React (view rendering) is actually good and is why it cannot be unseated.

I remember looking for a DOM library:

- dojo: not for me

- prototype.js: not for me

- MooTools: not for me

- jQuery: finally, something I liked

Well, guess what library won. After I adopted jQuery, I completely stopped looking for other DOM libraries.

But I still needed a template rendering library:

- Mustache.js: not for me

- Handlebars.js: not for me

- Embedded JavaScript Templates: not for me

- XML with XSLT: not for me

- AngularJS: really disliked it SOO much*

- Knockout.js: not for me

- Backbone.js with template engine: not for me and actually it was getting popular and I really wished it would just go away at the time**

- React: something I actually liked

You must remember that when React came out, you needed a JSX transpiler too, at a time when few people even used transpilers. This was a far bigger obstacle than it would be these days, IMO.

Which leads to my hot take: core React is just really good. I really like writing core React/JSX code and I think most people do too. If someone wrote a better React, I don’t think the problem you mentioned would hamper adoption.

The problems come when you leave React’s core competency. Its state management has never been great. I hated Redux (not a React project itself, but closely associated with it) from just reading its docs. I think RSC at the current moment is a disaster — so many pain points.

I think that’s where we are going to see the next innovation. I don’t think anyone is going to unseat React or JSX itself for rendering templates. No one unseated jQuery for DOM manipulation — rather we just moved entirely away from DOM manipulation.

*I spent 30 minutes learning AngularJS and then decided “I’m never going to want to see this library again.” Lo and behold, they abandoned their entire approach and rewrote Angular for v2, so I guess I was right.

**It went away and thankfully I avoided having to ever learn Backbone.js.


Does transpilation not cover this? That's how they did JSX.

Transpilation of anything other than JSX requires a complex toolchain with layers of things like LSPs, compilers, IDE plugins, bundler plugins, etc.

Frameworks that go that route typically activate this toolchain by defining a dedicated file extension (.vue, .svelte).

This custom toolchain (LSP, IDE plugins) presents a lot of overhead to project maintainers and makes it difficult to create a viable alternative to the JSX-based ecosystem.

For instance, both Vue and Svelte took years to support TypeScript, and their integrations were brittle and often incompatible with test tooling.

Angular used decorators in a very similar way to what I am describing here. It's a source-code annotation in "valid" ECMAScript that is compiled away by their custom compiler. Though decorators are now abandoned and Angular still requires a lot of custom tooling to work (e.g., try to build an Angular project with a custom rspack configuration).

JSX/TSX has preferential treatment in this regard, as it's a macro that's built into tsc - no other framework has this advantage.
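
To make that concrete: with tsconfig's `"jsx": "react-jsx"` option, tsc rewrites a .tsx file roughly like the sketch below (the emitted output is an approximation from memory and varies by version):

    // counter.tsx - what you write:
    import { useState } from "react";

    export function Counter() {
      const [count, setCount] = useState(0);
      return <button onClick={() => setCount(count + 1)}>{count}</button>;
    }

    // ...and roughly what tsc emits:
    //
    //   import { jsx as _jsx } from "react/jsx-runtime";
    //   export function Counter() {
    //     const [count, setCount] = useState(0);
    //     return _jsx("button", { onClick: () => setCount(count + 1), children: count });
    //   }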


Chicken and egg problem. JSX is supported because it's popular. If React decides to push a new syntax I don't see why everyone wouldn't reasonably quickly adapt and support it.

This only applies to TS, not JS, right? Cause afaik JSX isn't getting any special treatment from Babel, but TSX has tsc support like you said.

Exactly this. "AI usage is 20% of our customer base" "AI usage has increased 5% this quarter" "Due to our xyz campaign, AI usage has increased 10%"

It writes a narrative of success even if it's embellished. Managers respond to data, and the people collecting the data are incentivised to indicate success.


Sadly, almost none of my friends care or understand (nor do older family members or non-tech people). If I tried to convince friends to move to Signal because of my disdain for AI profiteering, they'd react as if I were trying to get them to join a church.

I don't understand why it takes 5 seconds for Chrome to open on my MBP while it's near instant on my Linux and Windows PC.

Why is everything so slow on new macOS?


It’s not everything, it’s just Chrome. Chrome is 1.6GB including all its dependencies. It’s going to be slow to start on any system if those dependencies aren’t preloaded.

Most Mac software I use (I don’t use Chrome) starts quickly because the dependencies (shared libraries) are already loaded. Chrome seems to have its own little universe of dependencies which aren’t shared and so have to be loaded on startup. This is the same reason Office 365 apps are so slow.


It's not just Chrome, it's everything, though apps that have a large number of dependencies (including Chrome and the myriad Electron apps most of us use these days) are for sure more noticeable.

My M4 MacBook Pro loads a wide range of apps - including many that have no Chromium code at all in them - noticeably slower than exactly the same app on a 4 year old Ryzen laptop running Linux, despite being approximately twice as fast at running single-threaded code, having a faster SSD, and maybe 5x the memory bandwidth.

Once they're loaded they're fine, so it's not a big deal for the day to day, but if you swap between systems regularly it does give macOS the impression of being slow and lumbering.

Disabling Gatekeeper helps but even then it's still slower. Is it APFS, the macOS I/O system, the dynamic linker, the virtual memory system, or something else? I dunno. One of these days it'll bother me enough to run some tests.


Somewhere around 2011, when I switched my MBP to an SSD (back when you could upgrade the drives, and memory, yourself), Chrome opened in 1-2 bounces of the dock icon instead of 12-14 seconds.

People used to make YouTube videos of their Mac opening 15 different programs in 4-5 seconds.

Now, my Apple Silicon MacBook Air is very, very fast but at times it takes like 8-9 seconds to open a browser again.


I loved the MBPs from that era. That was my first (easy) upgrade as well, in addition to more memory. Those 5400 RPM hard drives were horrible. Also, another slick upgrade you could do back then was to swap out the SuperDrive for a caddy to hold a second SSD/HDD.

It still works fine today, though I had to install Linux on it to keep it up to date.


I'm running the latest macOS right now on a modest M4 Mini and it doesn't seem slow to me at all. I use Windows for gaming and Linux for several of my machines as well, and I don't "feel" like macOS is slow.

In any case, Chrome opens quickly on my Mac Mini: under a second when I launch it by clicking its icon in my Dock or from Spotlight (which is my normal way of starting apps). When Chrome is idle with no windows, opening Chrome seems even faster, almost instant.

This made me curious so I tried opening some Apple apps, and they appear to open about the same speed as Chrome.

GUI applications like Chrome or Keynote can be opened from a terminal command line using the `open` command, so I tried timing this:

    $ time open /Applications/Google\ Chrome.app

which indicated that `open` finished in under 0.05 seconds total. So this wasn't useful; it appears to time only part of the work involved in getting the first window up.

It's always been that way. Even when I had a maxed-out current-gen Mac Pro in 2008, it still launched and ran faster in Windows than in macOS.

I have seen people suggesting that it's because of app signature checks choking on Internet slowness, but 1. those are cached, so the second run should be faster, and in non-networked instances the speed is unchanged, and 2. I don't believe those were even implemented back in 2002 when I got my iMac G4, and it was likewise far quicker in Linux than in OS X.

At the time (2002), I joked that it was because the computer was running two operating systems at once: NeXTSTEP and FreeBSD.


Do you by chance still run an Intel version of Chrome on an Apple Silicon device?

Our work laptops have antivirus and other verification turned on, which imposes a 4-16x penalty on IO.

The CPU, memory, and SSD are blazing fast. Unfortunately, they are hamstrung by bad software configuration.


The better question is why Chrome is so much slower and more of a battery drainer than Safari on a Mac.

I was interested in trying to make a DIY thermal battery as a hobby experiment. Other than using thermal energy directly, I couldn't find a way to effectively convert the heat energy to electrical energy.

Peltier modules can be used to generate electricity, but they are crazy inefficient.

An efficient steam turbine is largely inaccessible to hobbyists, and I am scared of steam/pressure. Though I did look at repurposing a car turbo for this purpose. There were additional issues with regulating the amount of heat you wanted to extract (load matching) and with recycling waste heat.

I wondered if it was possible to use a Stirling engine, but you can't buy anything other than very small toys online, and I don't have the facilities to machine my own.

Haha, would love to get something working, but I suppose I'm not smart enough to figure out an effective way to get that heat back out as usable/controlled electricity.


The answer in almost all electricity production boils down to spinning a turbine with steam (or wind). Nuclear does it, all the fossil fuels do it, and ultimately heat batteries do it too. The alternative is photovoltaics or direct nuclear-to-electricity conversion, and then storage with chemical batteries or massive capacitors.

Most of our electricity production is based on a solution found several hundred years ago; we just made it really big and worked out how to control the heating and pressure of the steam well.


Non-steam turbines have been operated (e.g. https://en.wikipedia.org/wiki/Mercury_vapour_turbine), but… steam is just so much easier to work with.


Come on, mercury vapour sounds like sooooo much fun! Where's your sense of adventure?

/s


You missed thermoelectric generators, which use the Seebeck effect to generate a current across a temperature differential. It's terribly inefficient, unfortunately.


> An efficient steam turbine is largely inaccessible to hobbiests and I am scared of steam/pressure.

Thermal electricity generation really benefits from scale and extremes. The Carnot limit (1 - T_cold/T_hot, in absolute temperatures) grows with the temperature differential between hot and cold. Even the so-called "low quality" heat from a standard nuclear reactor design is far hotter than anybody should deal with at home, and it only gets ~1/3 efficiency. And dealing with small turbines is really inefficient too.
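
To put rough numbers on that (my own quick sketch, temperatures in kelvin):

    // Carnot limit: eta = 1 - Tcold/Thot (absolute temperatures).
    const carnot = (tHotK: number, tColdK: number) => 1 - tColdK / tHotK;

    console.log(carnot(573, 303).toFixed(2)); // ~0.47 ideal for ~300C reactor steam; real plants land near 1/3
    console.log(carnot(363, 303).toFixed(2)); // ~0.17 ideal for 90C hobby-scale heat, before real-world losses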

This is where batteries and solar really shine. They scale so well, and are extremely economical and electrically efficient.

Heat storage works well when you get beyond the scale of individual homes, but it's hard to make it work at house scale. I'd love to see something related to heat pumps for homes in the future, but district heating, such as could be accomplished by converting natural gas systems to heat delivery, is probably required for it to make sense.


Yeah, sadly, it seems almost impossible to get anything higher than 30% efficiency (theoretically with a Stirling engine, if you can find one, haha) out of a thermal battery without extreme pressures and temperatures.

Back-of-the-napkin math felt promising. A 1kg block of sand heated to 500 degrees Celsius should hold about 100Wh of thermal energy. Scaling that capacity up is easy, as it's just a matter of adding sand or temperature (+ an effective method of transporting heat through the sand - maybe sand + used motor oil?).

Assuming 80% efficiency, tariff arbitrage (buy electricity during off-peak hours and use it during high-price hours) would pay off very quickly. In my area (Australia) it would be a matter of months - but the low real-world efficiency and lack of parts make it impossible.
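
Checking my own arithmetic (specific heat of sand assumed at ~0.8 J/(g*K); the tariffs are made up purely for illustration):

    // Stored heat per kg of sand, 20C -> 500C:
    const joulesPerKg = 800 * 480;              // c [J/(kg*K)] * deltaT [K] = 384 kJ
    console.log(joulesPerKg / 3600);            // ~107 Wh of heat per kg

    // Arbitrage per cycle at 80% round-trip efficiency, hypothetical tariffs:
    const storedKwh = 10, offPeak = 0.20, peak = 0.50; // $/kWh, illustrative only
    console.log(storedKwh * 0.8 * peak - storedKwh * offPeak); // $2 saved per cycle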

It could work for heating during winter, though perhaps an AC/heatpump with the condenser a couple metres underground would be better value for money.


Heat storage can work for individual homes on shorter timescales. If you heat your home with in-floor heating (lower temperature requirements), you can have a ~1-2 m³ buffer tank that you heat up during the night and then use the stored heat during the day to heat your home. Works very well.


This project is for district heating, not producing electricity.

In general it is true that low-grade heat is difficult to convert to electricity, and there isn't any existing mass-market device that does it. You'll have to make your own, which involves learning to machine and responding to your perfectly reasonable fear of steam and pressure with proven safety measures.


Every couple of years I look around to see if anyone is selling Stirling-cycle engines in the 5-10 hp range. I always find a couple of neat projects, but nowhere can you just buy an engine.

I assume that because there is no current market for small Stirling generators, nobody wants to invest in the tooling to make one, and because there are no small Stirling generators, there is no market for them.


In the article's case, the end use of the energy is household heating, so there is no need to convert back to electricity. The whole beauty of thermal energy storage is that the end use of energy in many cases is... heat: heating buildings, cooking, industrial heating (from food processing to iron smelting), producing steam, etc.


If you need to use heating in a cold climate, you could use your stored energy to heat the radiator of a heat pump, which would then be drastically more efficient than using normal air on the radiator.

There's a video of people doing this on YouTube. They use the ground as their heat source. https://youtu.be/s-41UF02vrU


https://en.wikipedia.org/wiki/Thermoelectric_generator

Seebeck generator, generally. Peltier goes the opposite way. But basically the same thing.


If your hot source is really hot, thermophotovoltaic (https://en.wikipedia.org/wiki/Thermophotovoltaic_energy_conv...) makes sense and can offer much better efficiency...


LFP is so cheap that a small-scale thermal battery makes no sense for electricity generation. Even at large scale (like the OP), it mostly makes sense for heat, e.g. district heating systems, industrial process heat, etc.


How far away are we from being able to use the new Snapdragon laptops with Linux?

I'm pretty keen to play around with Proton and FEX on a laptop that rivals the MBP.


> slack clones

As an anecdote - I really want to see consolidation here. All my chat services under one parent application. WhatsApp, Slack, Discord, Messenger.

I really don't care about the vendor for chat services; I'm just exhausted from installing multiple clients, many of which are pretty garbage.

I would also like to say the same for many services. Online banking is top of mind for me right now as I have several bank accounts chasing competitive savings rates.


I'm pretty sure you don't want Teams to be the winner of consolidation. Unfortunately, it has the advantage of being included for free for every big company using M365. We are fighting a losing battle to keep Slack.


A funny workaround I employed is running Beeper. It's a Matrix client that also provides chat mirroring for other platforms. The sync is slightly jank, but it works for what I want to achieve.

The mirroring stuff is FOSS, and I think the client is too; the financial model is that you're limited to a fairly low number of proxied services at once without a paid plan.


> All my chat services under one parent application. WhatsApp, Slack, Discord, Messenger.

There was a time when one application for multiple chat services was a thing, e.g. Pidgin, Trillian, or Miranda. With the death of ICQ, AIM, and MSN, this is pretty much history.


Tauri is pretty awesome. Rust backend, WebView front end. Nothing uses native desktop elements of course.

To be fair, there is no practical way to write native desktop applications using stylistically consistent UI elements AND have it be portable AND in a language that you enjoy using.

As far as I can tell, Windows 11 doesn't even have a toolkit with platform UI elements.

GTK on GNOME is pretty okay, and gtk-rs is not dissimilar to React. Who knows what macOS uses, but something something Swift and Xcode.

But I agree, just use web technologies. Write once, ship everywhere (and hum loudly when people complain about poor performance - joking, it's the vendors' fault we have to use web technologies).
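
To illustrate the Rust backend / WebView frontend split from the web side (a sketch from memory of the `@tauri-apps/api` package; check the import path against your Tauri version):

    // Frontend (TypeScript): ordinary web code calling into Rust over Tauri's IPC bridge.
    import { invoke } from "@tauri-apps/api/core"; // "@tauri-apps/api/tauri" on v1

    // Invokes a backend function registered as `#[tauri::command] fn greet(name: String) -> String`.
    export async function greet(name: string): Promise<string> {
      return invoke<string>("greet", { name });
    }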


In my experience with Tauri, it's pretty good on Windows, but not so much on other platforms, especially Linux. The decision to target different browser engines on each operating system means you still have to deal with a bunch of different OS-specific bugs.

For Windows you're dealing with Edge (so Chromium), on macOS you have Safari, and on Linux you have WebKitGTK. WebKitGTK has honestly abysmal performance, and you're missing a lot of modern standards.

The Tauri devs are looking at bundling a Chromium browser with it to deal with that, but that's still some time off, and leads to the same issue Electron has, where you have large bloated applications.

https://github.com/tauri-apps/wry/issues/1064


> As far as I can tell, Windows 11 doesn't even have a toolkit with platform UI elements.

They do; it's called WinUI 3. It's barely used, for all of the aforementioned reasons.


I tried this. Their examples don't even compile lol


> Rust backend, WebView front end.

I don't know much about it, but it seems like a weird combination. If you want high performance and low memory usage, you don't want HTML; if you want fast code writing, you don't want Rust.


I don't believe you generally end up writing a lot of Rust with something like Tauri; it is mostly web dev. While it is true that browser-based UIs are slower than native, it isn't clear that .NET-based UIs would be any faster, and they'd be very niche.


There's Wails for a Go backend with a webview frontend: https://github.com/wailsapp/wails


Using slower but easier-to-write languages for the frontend is the norm for complex apps. Many apps that have passed the test of time are like that.

Blender: frontend Python, backend C++.

Houdini: frontend Python (PyQt), backend C (presumably)

SimCity: frontend JavaScript, backend C++

The reason is very simple: the frontend is more error-tolerant, but less resistant to the product designer's whims (or the users' desire to customize).


> Blender: frontend Python, backend C++.

Blender's frontend is pretty much exclusively C++? https://github.com/blender/blender/tree/main/source/blender/...

> Houdini: frontend Python(PyQt),

I would be infinitely surprised if Houdini's frontend wasn't also a majority C++. Likewise consider large apps such as Ardour, Krita, etc.


I could see a case where the core logic needs to be performant but the UI does not. The frontend could be some menus, displaying (not a giant amount of) data, and a progress bar, while the backend does the heavy computing.

And furthermore, if you want fast code writing, you write in the language you already know. For some people, that is Rust.


A very reasonable combination:

- HTML is the main way of designing interfaces (whether we like it or not)

- Rust is the main language promoted by intelligence agencies with multi-billion dollar budgets because it is laced with their backdoors (whether we like it or not)


I'd been taking 3mg of slow-release melatonin daily for years, up until a few months ago. To be honest, I'm not sure it has any significant effect.

Regular exercise and a consistent sleep routine (cardio, weightlifting, going to bed early, and waking up early) have been more effective for me.

According to my Fitbit, my average sleep duration is 6hr 30min over the last 2 years, down from 7hr 30min. When I wake up, there's no going back to sleep.

The biggest contributor to my reduction in sleep is my job, which in the last few years added stack ranking and biannual performance reviews, which require daily bookkeeping of my "company impact".

I also got an echocardiogram last week (unrelated) and it came back in top shape (I have a calcium score test coming up). I'm not saying melatonin isn't a risk for cardiovascular health, but as a male in his early 30s with a family history of heart disease, nothing seems to indicate increased damage in my case.

