
It's possible that LLMs build something like ASTs internally when they work with code. I have no first-hand data on this, but it would not surprise me at all.


LLMs don't have memory, so they can't build anything. Insofar as they produce correct results, they have implicit structures corresponding to ASTs baked into their networks during training.
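For concreteness, here's what an explicit AST looks like when you build one deliberately, using Python's built-in ast module. This is just my sketch of the data structure being discussed, not a claim about what LLMs literally contain:

    import ast

    # Parse a one-line program into an explicit abstract syntax tree.
    tree = ast.parse("x = f(1) + 2")
    print(ast.dump(tree, indent=2))
    # Prints (abbreviated):
    # Module(
    #   body=[
    #     Assign(
    #       targets=[Name(id='x', ctx=Store())],
    #       value=BinOp(
    #         left=Call(func=Name(id='f', ctx=Load()), args=[Constant(value=1)], ...),
    #         op=Add(),
    #         right=Constant(value=2)))], ...)

Whatever an LLM encodes would be a distributed approximation of a structure like this, not a data structure it constructs step by step at inference time.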


"LLMs don't have memory"

That's interesting. Is there research into adding memory, or has it been shown that dedicated memory provides no practical value over whatever context the model produces as output?
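There is active work on this (retrieval-augmented generation, memory-augmented agents). A minimal sketch of the common pattern, assuming hypothetical embed() and llm_complete() functions standing in for whatever embedding and completion APIs you actually use:

    import math

    def cosine(a, b):
        # Cosine similarity between two embedding vectors.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    memory = []  # list of (embedding, text) pairs persisted across calls

    def remember(text):
        # embed() is a hypothetical stand-in for your embedding API.
        memory.append((embed(text), text))

    def answer(question, k=3):
        q = embed(question)
        # Retrieve the k stored snippets most similar to the question
        # and prepend them to the prompt; the model itself stays stateless.
        nearest = sorted(memory, key=lambda m: -cosine(q, m[0]))[:k]
        context = "\n".join(text for _, text in nearest)
        reply = llm_complete(f"Context:\n{context}\n\nQuestion: {question}")
        remember(f"Q: {question}\nA: {reply}")  # write the exchange back to memory
        return reply

Note that the "memory" here lives entirely outside the network; the weights never change at inference time, which is the parent's point.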



