
That's a reasonable first step, but a human can just as well feed your questions to the LLM and post the responses. So you really have to stay vigilant throughout your interactions with the author, or, if the PR is unsalvageable, save yourself the time and effort and reject it from the start.

> I'd rather talk with an intelligent LLM

There's no such thing.



The goal is not to filter out LLMs. The goal is to find and fix legitimate issues with my open-source software. I don't care if a human is feeding my questions to an LLM or an undead dog as long as they produce a decent signal.

If the noise increases to an uncomfortable level, of course, I may have to change my strategy. The point is that humans produce noise, too, sometimes even more than LLMs do.


> The point is that humans produce noise, too, sometimes even more than LLMs do.

The core issue here, and it's something I'm seeing at work as well with "less talented" colleagues, is that the kind of contributor who already produced noise now passes the minimal threshold to use LLMs. But they do so without understanding anything meaningful about the software they are contributing to, or whether what the LLM generated makes sense. That makes them a 0.1x engineer with a 100x multiplier (for quantity, not quality).


> The point is that humans produce noise, too, sometimes even more than LLMs do.

Humans can produce noise, but humans using LLMs can produce orders of magnitude more of it.

But your stance is reasonable. If it improves the product, who cares who/what produced it. I personally find reviewing machine-generated code, and the back-and-forth of code review with a machine, much more exhausting than interacting with humans, but you may feel otherwise.


> Humans can produce noise, but humans using LLMs can produce orders of magnitude more of it.

Absolutely agreed. Which is why I'm more concerned with the qualities and intentions of the human using the LLM than with whether an LLM is being used at all. Like any technology, an LLM is a force multiplier. Garbage in, more garbage out.



