
You are assigning value to "us" beyond what might actually be there. What if our only evolutionary value is to give rise to machine intelligence [that will go on to explore the universe] and die off?

We are, after all, primitive forms of life, and we are at the point where we should be able to say with certainty that machine intelligence will be objectively superior and something we should feel OBLIGATED to give rise to, regardless of the consequences.



> What if our only evolutionary value is to give rise to machine intelligence

There is no prescribed script about what our evolutionary value should be. We will be whatever we ourselves make of us. Unless something unexpected happens (more advanced aliens intervening with our species, for example), our human instinct is to grow, explore, discover, compete, and share as much as we can. AI should be seen as a tool toward our goals, not as an end in itself. There is of course the possibility that we may lose control to the AI. That is not our desired goal, but it could happen. In that case our future will not be our decision; we will be at the mercy of whatever the AI chooses for us. That will depend on what kind of AI we build, and on how that runaway AI chooses to modify itself. But we will strive to reach our human goals, because that is human nature.


Just because you or I see no prescribed script doesn't mean it isn't there. You make a philosophical point here, but my point is more than that.

"We will be whatever we ourselves make of us" sounds meaningless to me. The universe obeys certain physical laws, evolution is directed and follows a specific pattern. We have up to this point been bound by both. Occam's razor says this will continue to be the case.

I think we are already at a crossroads where we can do more than speculate about the function of machine intelligence in evolutionary terms, but hopefully, sooner rather than later, we will have actual _evidence_ that we can look at and use to refine our assumptions.



