
They're still around, most of them having landed in the Rationalist/LessWrong community along with Yudkowsky.

And then some of them, like Anissimov, et al, moved on to the Post-Rationalist community, with a small percentage of those moving on to Neoreaction, and a small percentage of those moving on to radical Accelerationist politics.

I think what happened is that a lot of these bright-eyed and bushy-tailed youngsters who saw the Singularity and/or radical life extension happening in their lifetimes eventually came to accept that they weren't totally wrong, but that they'd be long dead by the time it happens.

There was a sort of demotivation that pushed them toward more tangible efforts, even ones as prosaic as politics.



lesswrong (specifically eliezer) doesn't push enough people into hard sciences; worrying about AI x-risk is not a recipe for innovating in genetics, neurobiology, life extension, or anything else extropian.


Yudkowsky likely doesn't think those things will matter if the AI risk isn't handled properly.



