They're substantively different. Using a compiler requires you to have an internalized model of a state machine and, importantly, a formal language. C, assembler, Java, etc. are all essentially different from using the softness of the English language to coerce results out of a black box.
In both cases, all you need is the ability to communicate with the machine in a way that lets it convert your ideas into actions.
The restricted language of a compiler is a handicap, not evidence of a skill. We've been saying forever that "Natural Language" compilers would be a game changer, and that's all an AI really is.
Edit: It appears that this discussion is going to end up with a definition of "coding"
Is it coding if you tell the computer to perform some action, or is it only coding if you tell it how to do that in some highly optimised way (for varying definitions of optimised, e.g. memory efficient, CPU efficient, dev-time efficient, etc.)?
No one is skeptical of compilers?! I guess you haven’t met many old-fashioned C systems programmers, who go out of their way to disable compiler optimisations as much as they can because “it just produces garbage”.
Every generation, we seem to add a level of abstraction, because for most of us it enhances productivity. And every generation, there is a crowd who rails against the new abstraction, mostly unaware of all of the levels of abstraction they already use in their coding.
Luxury! When I were a lad we didn't have them new fangled compilers; we wrote ASM by hand, because compilers could not (and, I suspect, still cannot to this day) optimise ASM as well as a human.
Abstractions and compilers are deterministic, no matter if a neckbeard is cranky about the results. LLMs are not deterministic, they are a guessing game. An LLM is not an abstraction, it's a distraction. If you can't tell the difference, then maybe you should lay off the "AI" slop.
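The determinism distinction can be made concrete with a toy sketch (the "compiler" and "LLM" below are made-up stand-ins, not real tools): a compiler is a pure function of its input, while decoding from an LLM at non-zero temperature is a draw from a distribution over outputs.

```python
import random

def compile_expr(src):
    # Toy "compiler": a pure function, so the same source text
    # always produces the same output.
    return src.replace("plus", "+")

def llm_complete(prompt, temperature=1.0):
    # Toy stand-in for LLM decoding: at temperature > 0 it samples
    # among several plausible completions; at temperature 0 it is greedy.
    candidates = ["x + y", "y + x", "add(x, y)"]
    if temperature == 0:
        return candidates[0]
    return random.choice(candidates)

# Deterministic: identical output on every call with the same input.
assert compile_expr("x plus y") == compile_expr("x plus y")

# Stochastic: repeated calls with the same prompt can disagree,
# even though every candidate is valid code.
outputs = {llm_complete("add two numbers") for _ in range(200)}
print(outputs)  # some subset of the candidates, usually more than one
```

Greedy (temperature-zero) decoding does make a single model deterministic in principle, but in practice batching and floating-point nondeterminism on real serving stacks can still vary the output, which is why the "guessing game" framing has some teeth.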
I've been thinking about this - you're right that LLMs are not going to be deterministic (AIUI) when it comes to producing code to solve a problem.
BUT neither are humans: give two different humans the same task and, unless they copy one another, you will get two different results.
Further, as those humans evolve through their careers, the code that they produce will also change.
Now, I do want to point out that I'm very much still at the "LLMs are an aid, not the full answer.. yet" point, but a lot of the argument against them seems to be (rapidly) coming to the point where it's no longer valid (AI slop and all).
You keep making these claims as though you are some sort of authority, but nothing you have said has matched reality.
I mean full credit to you with your disingenuous goalpost shifting and appeals to authority, but reality has no time for you (and neither do I anymore).
Second Edit: Adding the following paragraph from the Wikipedia page for emphasis
Researchers have started to experiment with natural language programming environments that use plain language prompts and then use AI (specifically large language models) to turn natural language into formal code. For example Spatial Pixel created a natural language programming environment to turn natural language into P5.js code through OpenAI's API. In 2021 OpenAI developed a natural language programming environment for their programming large language model called Codex.
I think after all the goalpost moving, we have to ask - why the bitflip does it matter what we call it?
Some people are getting a lot of work done using LLMs. Some of us are using it on occasion to handle things we don't understand deeply but can trivially verify. Some of us are using it out of laziness because it helps with boilerplate. Everyone who is using it outside of occasional tests is doing it because they find it useful for writing code. If it's not coding, then I personally couldn't care less. Only a True Scotsman should care.
If my boss came to me and said "hey, we're going to start vibe coding everything at work from now on. You can manually edit code, but Claude Code needs to be your primary driver from now on" I would quit and find a new career. I enjoy coding. I like solving puzzles using the specifics of a language's syntax. I write libraries and APIs, and I put a great deal of effort into making sure the interface is usable by a human being.
If we get to the point where we are no longer coding, where we just describe things in product language to a computer and let it do all the real work, then I will find a more fulfilling career, because this ain't it.
By the time it works flawlessly, it won't be your career anymore, it'll be the product manager's. They will describe what they want and the AI will produce it. You won't be told to "use Claude all the time".
I personally hate coding, but it's a means to an end, and I care about the end. I'm also paranoid about code I don't understand, so I only rarely use AI and even then it's either for things I understand 100% or things that don't matter. But it would be silly to claim they don't produce working code, no matter what we want to call it.
This is the core of the issue. You hate coding, and I love it. I chose to be a software engineer not because I like using software, but because I like writing software.
If we get to a point where the engineers are replaced by machines, I would hope that the project managers were replaced years before that, as a final act of revenge
I enjoy a lot of things (Software Engineering is one of them), but that in NO way determines whether or not AI is coding, nor does it guarantee me a career (just ask all the blacksmiths who disappeared once cars became the mass transport vehicle).
The fact that people are (possibly) going to be able to instruct a computer to do whatever they wish, without the need for a four-year degree and several years of experience, scares you. I get that, but that's not going to have any effect on reality.
Edit: Have a look at all the people whose careers have ended because software took over.
And more importantly, perhaps, to u/shortrounddev2: if they enjoy coding so much, they'll still be able to do it as a hobby! It's just that there may not be anybody willing to pay a slow, lumbering human to work their way through the problem.