It depends on whether they are just repeating things without understanding, or whether they actually understand. My issue with people who say "I asked GPT" is that they often do not have any understanding themselves.
Copy and pasting from ChatGPT has the same consequences as copying and pasting from StackOverflow, which is to say you're now on the hook supporting code in production that you don't understand.
If you had actually used ChatGPT to learn the topic, you'd be writing in your own words.
Starting an answer with "I asked ChatGPT and it said..." almost always means the poster did not double-check.
(This is the same with other systems: If you say, "According to Google...", then you are admitting you don't know much about this topic. This can occasionally be useful, but most of the time it's just annoying...)
I like to ask AI systems sports trivia. It's something low-stakes, easy-to-check, and for which there's a ton of good clean data out there.
It sucks at sports trivia. It will confidently return information that is straight up wrong [1]. This should be a walk in the park for an LLM, but it fails spectacularly at it. How is this useful for learning at all?