You have a biased view of the definition of "concept", based on the English language and logic.
In Chinese, "concept" is 概念.
In Chinese, "happy dog" is 快乐的狗.
Notice it has an extra particle, "的", that has no counterpart in English. This tells you that you can't treat English grammar and structure as the formal definition of "concept". Some languages have no word for happiness, or for dog, but that doesn't mean the concept of happiness or of dog does not exist.
The reverse is also true: you can't claim a concept does not exist just because it has no word in English. A concept is something beyond any particular language, logical construct, or notation that you invent.
> But that doesn't mean the concept of happiness or dog does not exist.
That would be a consequence of your position.
The person who wrote the article is English. The claim being evaluated here is from the article. The term "concept" is English; the *meaning* of that term isn't English, any more than the meaning of "two" is English.
My analysis of "concept" has nothing to do with the English language. "Happy" here stands in for any property-concept, and "dog" for any term which can be an object-concept, a property-concept, or something else. If some other language's terms translate into terms that do not function in the same way, then that is simply a bad translation for the purpose of discussing the structure of concepts.
It is you who are hijacking the meaning of "concept": ignoring the meaning the author intended, substituting one made up 5 minutes ago by self-aggrandising, poorly read people in XAI -- and then going off about irrelevant translations into Chinese.
The claim the author made has nothing to do with XAI, nor Chinese, nor English. It has to do with mental capacities to "bring objects under a concept", to partition experience into its conceptual structure ("conceptualise"), to simulate scenarios based on compositions of concepts ("the imagination"), and so on. These are mental capabilities that a wide class of animals, who know no language, possess, and that LLMs do not.
Okay, maybe I need to make myself clearer, and start from your claim:
> It has to do with mental capacities to "bring objects under a concept", to partition experience into its conceptual structure ("conceptualise"), to simulate scenarios based on compositions of concepts ("the imagination"), and so on. These are mental capabilities that a wide class of animals, who know no language, possess, and that LLMs do not.
Assume it is true that humans have these capacities. Why do you think LLMs do not? We don't know whether they are capable of them, and finding out is what explainable AI is for.
Now take a step back and assume that LLMs do not have these capacities. How do you prove that these are fundamental concepts of the universe, rather than projections of higher-level concepts from a higher-dimensional space that humans and animals are not aware of? What if there exists a more universal set of concepts, in a higher dimension, containing all the concepts we know and others, and both LLMs and humans are just working with the lower-dimensional projections of those concepts?
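The projection intuition can be made concrete with a toy sketch (purely illustrative, with made-up numbers; not a claim about how minds or LLMs actually represent anything): two distinct points in a higher-dimensional space can become indistinguishable once projected down, so anyone who only ever sees the projection cannot tell them apart.

```python
import numpy as np

# Two distinct hypothetical "concepts" in a 3-D concept space.
a = np.array([1.0, 2.0, 5.0])
b = np.array([1.0, 2.0, -3.0])

# A projection onto the first two dimensions discards the third.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

print(P @ a)  # [1. 2.]
print(P @ b)  # [1. 2.] -- identical after projection, despite a != b
```

In this sketch, an observer living in the 2-D projection has no way to recover the dimension along which `a` and `b` differ.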