
This repo [1] is a brilliant illustration of the copium going into this.

Third line of the Claude prompt [2]:

  IMPORTANT: You must NEVER generate or guess URLs for the user

Who knew solving LLM hallucinations was just that easy?

Another gem:

  IMPORTANT: DO NOT ADD ***ANY*** COMMENTS unless asked

Guess we need triple bold to make it pay attention now?

It gets even more ludicrous when you see the recommendation that you should use an LLM to write this slop of a .cursorrules file for you.
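For anyone who hasn't seen one: a .cursorrules file is just plain text dropped in the repo root that Cursor feeds into the model's context. A minimal made-up sketch (hypothetical, not from the linked repo), in exactly the ALL-CAPS style being mocked above:

  # .cursorrules -- hypothetical example, not from the linked repo
  You are an expert TypeScript developer.
  IMPORTANT: NEVER invent package names or APIs.
  IMPORTANT: DO NOT add comments unless asked.
  Prefer small, pure functions over shared mutable state.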

[1]: https://github.com/x1xhlol/system-prompts-and-models-of-ai-t...

[2]: https://github.com/x1xhlol/system-prompts-and-models-of-ai-t...


