
I'm not sure how the authors arrive at the idea that this agent is embodied or open-ended. It is sending API calls to Minecraft; there's no "body" involved except as a symbolic concept in a game engine, and the fact that Minecraft is a video game with a limited variety of behaviors (and the authors give the GPT an "overarching goal" of novelty) precludes open-endedness. To me this feels like an example of the ludic fallacy. Spitting out "bot.equip('sword')" requires a lot of non-LLM work on the back end of that call to actually translate it into game mechanics, and it doesn't indicate that the LLM understands anything about what it "really" means to equip a sword, or that it would be able to navigate a real-world environment with swords etc.
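
To give a rough sense of how much machinery hides behind that one line, here's a minimal TypeScript sketch of the kind of scaffolding layer involved. The names (GameBot, Item, the string argument to equip) are hypothetical stand-ins for illustration, not Mineflayer's or the paper's actual API:

    // Hypothetical scaffolding between the LLM's text output and the game.
    interface Item {
      name: string;
      slot: number;
    }

    class GameBot {
      constructor(private inventory: Item[]) {}

      // The one line the LLM emits expands into several concrete steps
      // that the harness, not the model, is responsible for.
      equip(itemName: string): boolean {
        // 1. Resolve the symbolic name to an actual inventory entry.
        const item = this.inventory.find((i) => i.name.includes(itemName));
        if (!item) {
          // 2. Handle failure; the model only ever sees a text error fed back to it.
          console.log(`equip failed: no '${itemName}' in inventory`);
          return false;
        }
        // 3. Issue the low-level game command (stubbed here); the engine does the
        //    animation, hitboxes, and state updates the model never observes.
        console.log(`moving item in slot ${item.slot} to hand`);
        return true;
      }
    }

    // The LLM's "bot.equip('sword')" reduces to a lookup plus a command
    // the game engine interprets.
    const bot = new GameBot([{ name: "stone_sword", slot: 3 }]);
    bot.equip("sword");

All of that resolution and error handling lives in the harness; the model only ever sees text going in and out.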


Though, I also don't individually track muscle fibers, and there are strong indications that a lot of my own behaviors are closer to API calls than direct control.



