I have done quite a bit of game dev with LLMs and have very rarely run into the problem you mention. I've been surprised by how readily LLMs will produce even harmful narratives if I ask them to implement those narratives as a game.
