
> What makes it orders of magnitude more performant than Ruby

Source for the "orders of magnitude" anywhere in the last year or so? Ruby with EventMachine outperformed Node significantly when I tested websocket chat servers.
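For a sense of what such a test involves, here's a minimal sketch of the Node side, a tiny broadcast server using the `ws` package; the port and broadcast details are illustrative assumptions, not the actual benchmark code:

    const { WebSocketServer, WebSocket } = require('ws')

    // Relay every incoming message to all connected clients.
    const wss = new WebSocketServer({ port: 8080 })
    wss.on('connection', (ws) => {
      ws.on('message', (data) => {
        for (const client of wss.clients) {
          if (client.readyState === WebSocket.OPEN) client.send(data.toString())
        }
      })
    })

The EventMachine side would be an equivalent handful of lines on top of the em-websocket gem.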



Though now you're in EventMachine world and can't use the rest of the ecosystem.

One of the frequently overlooked properties of Javascript/Node is that it's async-everything.

I'd be curious to hear more about this benchmark.


This might come as a surprise, but async everything is not a feature for a lot of people. It’s a bug.


Yes, of course. Everything is a trade-off. Why the condescension?

But there are almost zero async-everything options in the space. And being confined to a second-class async subworld inside a synchronous ecosystem is a classic, error-prone challenge whether you're using Twisted, EventMachine, Tokio, or Netty.

It's a pretty big downside of using EventMachine, which even created its own networking primitives instead of using those in the Ruby stdlib.


Sorry, I guess on a second reading you were specifically talking about situations in which you want to do things async. In that case, I can see that everything being async from the start is preferred.

I was talking about the general case of everything being async in JS. That's frequently touted as a benefit of JS, but it's utterly maddening to workaday web developers. You want your requests to be served async (which should just be handled by whatever framework you're using), but inside of a single request you mostly just want to write synchronous code, even when you're dealing with IO. It's a lot easier to reason about.


I'd assert that inside routes you actually don't want to be sync, and async/await lets you write async code with the simplicity of writing sync code.

Consider the simple example of running two unrelated database/network queries at the same time, which is a near-ubiquitous need when writing a web service:

    const [user, stats] = await Promise.all([
      db.getUser(42), 
      cache.fetchStats()
    ])
And now consider a case where you want to issue four database queries, but you don't want the route to take four connections out of the pool at once; instead you want it to use at most two at a time:

    // Promise.map is built into Bluebird; a small plain-promise implementation
    // is sketched below. getA..getD are just functions that return promises,
    // so the queries can be created lazily.
    const [a,b,c,d] = await Promise.map([getA, getB, getC, getD],
      (fn) => fn(),
      { concurrency: 2 }
    )
What I'd assert is that these sorts of tools are really nice to have in your toolbox when writing I/O-heavy code like a networked program, and I don't think a simpler async abstraction for them exists than Node's. I certainly would not have said that before Node had promises and async/await.
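For reference, here is a sketch of that plain-promise version of Promise.map, assuming you'd rather not pull in Bluebird. The function name and signature are my own, and Bluebird's real Promise.map also handles rejection and other edge cases this ignores:

    async function mapWithConcurrency(items, mapper, { concurrency }) {
      const results = new Array(items.length)
      let next = 0
      // Each worker repeatedly claims the next unprocessed index until the
      // input is exhausted, so at most `concurrency` mappers run at once.
      const worker = async () => {
        while (next < items.length) {
          const i = next++
          results[i] = await mapper(items[i], i)
        }
      }
      await Promise.all(Array.from({ length: concurrency }, worker))
      return results
    }

    // Usage mirroring the example above:
    // const [a, b, c, d] = await mapWithConcurrency(
    //   [getA, getB, getC, getD], (fn) => fn(), { concurrency: 2 })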


99% of the endpoints I write are two steps: load some data from the database, then use that to render HTML, JSON, CSV, etc. I really do not want to encounter any of the myriad bugs that asynchronous execution makes possible. And the best way to ensure that is for nothing to be async.


Or just use gevent and monkey patch the synchronous world to make it asynchronous :)


We had huge performance issues with ActionCable that basically made it unusable for more than 100 concurrent users. We considered EventMachine, but ended up going with AnyCable in conjunction with AnyCable-Go, which got us to 1000 concurrent users without any performance impact.


Sources? Seriously, I'd really like to check this EventMachine out.


I ran my own benchmark a while ago to decide between the two, so I don't have a source in the sense of a published benchmark. YMMV.

https://github.com/eventmachine/eventmachine



