>> Honest question: Why do so many people attribute "thinking", "knowing", "understanding", "reasoning", "extrapolating" and even "symbolic reasoning" to the outputs of the advanced token-based probabilistic sequence generators, also known as LLMs?
It's very confusing when you come up with some idiosyncratic expression like "advanced token-based probabilistic sequence generators" and then hold it up as if it is a commonly accepted term. The easiest thing for anyone to do is to ignore your comment as coming from someone who has no idea what a large language model is and is just making it up in their mind to find something to argue with.
Why not just talk about "LLMs"? Everybody knows what you're talking about then. Of course I can see that you have tied your "definition" of LLMs very tightly to your assumption that they can't do reasoning etc., so your question wouldn't be easy to ask unless you started from that assumption in the first place.
Which makes it a pointless question to ask, if you've answered it already.
The extravagant hype about LLMs needs to be criticised, but coming up with fanciful descriptions of their function and attacking those fanciful descriptions as if they were the real thing is not going to be at all impactful.
Seriously, let's try to keep the noise down in this debate we're having on HN. Can't hear myself think around here anymore.
Hang on, how is it fair to ask me why I "add nothing to the discussion" when all your comment does is ask me why I add nothing to the discussion? Is your comment adding something to the discussion?
I think it makes perfect sense to discuss how we discuss, and even try to steer the conversation to more productive directions. I bet that's part of why we have downvote buttons and flag controls. And I prefer to leave a comment than to downvote without explanation, although it gets hard when the conversation grows as large as this one.
Also, can I please separately bitch about how everyone around here assumes that everyone around here is a "he"? I don't see how you can make that guess from the user's name ("drbig"). And then the other user below seems to assume I'm a "him" also, despite my username (YeGoblynQueenne? I guess I could be a "queen" in the queer sense...). Way to go to turn this place into a monoculture, really.
Not him but I am also extremely frustrated by the fact it is impossible to have a real discussion about this topic, especially on HN. Everyone just talks past each other and I get the feeling that a majority of the disagreement is basically about definitions, but since no one defines terms it is hard to tell.