Hacker News

It depends on whether they are just repeating things without understanding, or whether they actually understand. My issue with people who say "I asked GPT" is that they often don't have any understanding themselves.

Copying and pasting from ChatGPT has the same consequences as copying and pasting from StackOverflow, which is to say you're now on the hook for supporting code in production that you don't understand.



We cannot blame the tools for how they are used by those wielding them.

I can use ChatGPT to teach me and help me understand a topic, or I can use it to give me an answer that I copy-paste without double-checking.

It just shows how much you care about the topic at hand, no?


If you used ChatGPT to teach you the topic, you'd write your own words.

Starting the answer with "I asked ChatGPT and it said..." almost 100% means the poster did not double-check.

(This is the same with other systems: If you say, "According to Google...", then you are admitting you don't know much about this topic. This can occasionally be useful, but most of the time it's just annoying...)


How do you know that ChatGPT is teaching you about the topic? It doesn't know what is right or what is wrong.


It can consult any source on any topic. ChatGPT is only as good a teacher as the pupil's ability to ask the right questions, if you ask me.


I like to ask AI systems sports trivia. It's something low-stakes, easy-to-check, and for which there's a ton of good clean data out there.

It sucks at sports trivia. It will confidently return information that is straight up wrong [1]. This should be a walk in the park for an LLM, but it fails spectacularly at it. How is this useful for learning at all?

[1] https://news.ycombinator.com/item?id=43669364


But just because it's wrong about sports trivia doesn't mean it's wrong about anything else! /s [0]

[0] https://en.m.wikipedia.org/wiki/Gell-Mann_amnesia_effect


It may well consult any source about the topic, or it may simply make something up.

If you don't know anything about the subject area, how do you know if you are asking the right questions?


LLM fans never seem very comfortable answering the question "How do you know it's correct?"


I'm a moderate fan of LLMs.

I will ask for all claims to be backed by cited evidence. And then I check those citations.

In other cases, like code generation, I ask for a test harness to be written, and I run the tests.

For foreign-language translation (High German to English), I ask for a sentence-by-sentence comparison in the style of a diff.
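The test-harness idea above can be sketched in a few lines. This is a minimal illustration, not the commenter's actual workflow: `slugify` here is a hypothetical stand-in for whatever function the model generated, and the point is that each assertion encodes an expectation you can verify by hand, so a wrong answer fails loudly instead of shipping.

```python
import re

def slugify(text: str) -> str:
    """Stand-in for LLM-generated code: lowercase the text and
    replace runs of non-alphanumerics with single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def run_harness() -> None:
    # Hand-written cases with outputs you verified yourself;
    # the model's code has to pass them before you trust it.
    cases = {
        "Hello, World!": "hello-world",
        "  spaced  out  ": "spaced-out",
        "already-clean": "already-clean",
    }
    for given, expected in cases.items():
        got = slugify(given)
        assert got == expected, f"slugify({given!r}) = {got!r}, want {expected!r}"

run_harness()
```

The key design choice is that the expected outputs come from you, not from the model, so the harness is an independent check rather than the model grading its own work.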


We can absolutely blame the people selling and marketing those tools.


Yeah, "marketing" always seemed to me like doublespeak for legal lies.

All marketing departments are trying to manipulate you into buying their thing; it should be illegal.

But just testing out this new stuff and seeing what's useful for you (or not) is usually the way.


This subthread was about blaming people, not the tool.


My bad, I had just woken up!


I see nobody here blaming tools and not people!



