Very few utilitarians or affective altruists (including me) consider that ethical.


> To maximize your expected value, you must aim for it and then march blindly forth, acting as if the fabulously lucky SBF of the future can reach into the other, parallel, universes and compensate the failson SBFs for their losses. It sounds crazy, or perhaps even selfish—but it’s not. It’s math. It follows from the principle of risk-neutrality. [0]

I think he figured that across a million realities the expected value is very large; he just happens to live in the one where it collapsed. But how much of his wealth was due to these games? I don't know how common this probabilistic reasoning is among EAs. For instance, if you gave me the chance to bet my entire net worth on a game with 51% odds, sure, the expected return is positive, and if I had a million realities and could net everything out, I should take the bet. But most people would see that as insane.

https://web.archive.org/web/20221027181046/https://www.sequo...
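
To make that concrete, here's a toy simulation of that all-in 51% bet (purely illustrative numbers, not anything from the article): the arithmetic expected multiple after 20 all-in rounds is about 1.5x, yet almost every path goes bust.

    import random

    # Toy model of the 51% all-in bet above: each round stakes the
    # entire bankroll; a win (p = 0.51) doubles it, a loss wipes it out.
    P_WIN, ROUNDS, TRIALS = 0.51, 20, 1_000_000

    survivors = 0
    for _ in range(TRIALS):
        bankroll = 1.0
        for _ in range(ROUNDS):
            bankroll = bankroll * 2 if random.random() < P_WIN else 0.0
            if bankroll == 0.0:
                break
        if bankroll > 0.0:
            survivors += 1

    # Arithmetic EV per path is (2 * 0.51)**20 ~ 1.49 (positive), but the
    # chance of surviving all 20 rounds is 0.51**20 ~ 1 in 700,000.
    print(f"surviving paths: {survivors} of {TRIALS}")
    print(f"expected multiple: {(2 * P_WIN) ** ROUNDS:.2f}")
    print(f"survival probability: {P_WIN ** ROUNDS:.2e}")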


Seems like they should at least consider the Kelly criterion.

https://news.ycombinator.com/item?id=30265797
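
As a rough sketch of what that means for the toy 51% even-money bet above (illustrative only): with win probability p and net odds b, the Kelly criterion stakes f* = (b*p - (1-p))/b of your bankroll each round, which here works out to 2%, not 100%.

    def kelly_fraction(p: float, b: float = 1.0) -> float:
        """Kelly stake as a fraction of bankroll for win probability p
        and net odds b (profit per unit staked on a win)."""
        return (b * p - (1 - p)) / b

    # The toy even-money bet from above: 51% to win, b = 1.
    print(f"Kelly stake: {kelly_fraction(0.51):.2%} of bankroll")  # 2.00%
    # Staking 100% maximizes arithmetic EV, but long-run (geometric)
    # growth goes to zero the first time a bet loses.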


Taking a risk-neutral approach to altruism isn't the problem; misusing customer funds is.


Affective altruism is quite a different thing, practically the opposite of effective altruism. ;-)


Downsides of dictation!



