> Isn't the answer on SO the result of a human intelligence writing it in the first place, and then voted by several human intelligencies to top place? If an LLM was merely an automated "equivalent" to that, that's already a good thing!
The word "merely" is doing all of the heavy lifting here. Having human intelligence in the loop providing and evaluating answers is what made it valuable. Without that intelligence you just have a machine that mimics the process yet produces garbage.
>The word "merely" is doing all of the heavy lifting here
That's not some new claim that I made. My answer accepts the premise already asserted by the parent that an LLM is "equivalent to printing out a Stackoverflow answer and manually typing it into your computer instead of copy-and-pasting".

My point: if that's true, and an LLM is "merely that", then that's already valuable.
>Having human intelligence in the loop providing and evaluating answers is what made it valuable. Without that intelligence you just have a machine that mimics the process yet produces garbage.
Well, what we actually have is a machine that, even without AGI, has more practical intelligence than to merely produce garbage.

A machine that programmers who run circles around you and me still use, and find produces acceptable code, fit for the purpose, and one that can be asked to fix any initial issues from its first iteration, too.
If it merely produced garbage or was no better than random chance we wouldn't have this discussion.