
Most, if not all, LLMs have been able to cite sources for quite some time now.


Citing isn't just dropping in links; it's finding primary sources and fact-checking them against the claim. Kurzgesagt [1], an educational YouTube channel, has run into LLMs citing LLM-generated content.

[1]: https://youtube.com/watch?v=_zfN9wnPvU0&t=175


LLMs aren't arbiters of truth - they're just natural language search engines. In that way they're quite similar to Wikipedia. Something being on Wikipedia doesn't mean it's true; it means a "reliable source" (which is often less reliable than the label suggests) said so. And even that often doesn't hold, because there's another layer of indirection: an editor's summary of what the source said, which is itself frequently inaccurate. And then there's 'citogenesis' [1], where imaginary facts get circularly whisked into existence.

This is why Wikipedia is not a source, but can provide links to sources (which then, in turn, often send you down a rabbit hole trying to find their sources), and it's up to you to determine the value and accuracy of those sources. For instance, I enjoy researching historic economic issues, and you'll often find there are five or so layers of indirection before you finally reach a first-party source; at each step along the way, a game of telephone is being played. It's exactly the same with LLMs.

[1] - https://xkcd.com/978/


I've had LLMs cite bullshit at me many times - links that don't exist while the model insists they do. One even produced a very realistic-looking git commit log entry for a feature that never existed.

Haven't yet had the same issue with Wikipedia.
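
For what it's worth, the dead-link failure mode is the easiest one to screen for mechanically. Here's a minimal Python sketch (the requests dependency and the check_citation helper are assumptions for illustration, not part of any existing tool). It only confirms that a cited URL resolves to something - it says nothing about whether the page actually supports the claim it was cited for:

    import requests

    def check_citation(url: str, timeout: float = 10.0) -> bool:
        """Return True if the cited URL resolves to a live page.

        A passing check means the link exists, not that its content
        backs up the claim - that part still needs a human.
        """
        try:
            resp = requests.head(url, allow_redirects=True, timeout=timeout)
            if resp.status_code == 405:  # some servers reject HEAD; retry with GET
                resp = requests.get(url, stream=True, timeout=timeout)
            return resp.status_code < 400
        except requests.RequestException:
            return False  # DNS failure, timeout, etc. - treat as dead

    # Example: filter an LLM's cited links down to ones that at least exist.
    citations = [
        "https://xkcd.com/978/",
        "https://example.com/this-page-was-hallucinated",
    ]
    for url in citations:
        status = "reachable" if check_citation(url) else "dead or fabricated"
        print(url, "->", status)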



