The devs have been co-opted into marketing roles now, too - they have to say it's that good to keep the money coming in. IMO this reinforces the original post - this all feels like a scramble.
Whether it's indicative of patterns beyond OpenAI remains to be seen, but I don't expect much originality from tech execs.
You're not wrong about funding, but OP is correct about industry influence.
It varies a little by division/subject matter, but they basically have to run everything by industry and are subject to FOIAs and public shaming by senators and representatives beholden to industry.
Source: long-term partner of FDA employee, though this stuff is pretty widely understood.
Those FOIA requests are filed by politicians owned by industry, in an effort to scrape whatever leverage they can. FOIA itself is not problematic, obviously, but its application isn't limited to do-gooders.
I can't imagine using MS products for anything important these days. There are more reliable options for servers, databases, and operating systems, and those options all have better docs.
Agreed. Additionally, negative sanctions have been part of human life since the beginning. Anyone who has raised a child or pet understands this.
This discussion is far more nuanced than many of the comments in this post acknowledge. It's true people are often swiftly found guilty in the public eye without due process - see most true crime - but it's also true that such sanctions have their place.
The answers suggesting you stay until you find something else are correct.
I would also add that it's probably pretty easy to fake it - in my experience, management, especially at the executive level, has no idea how things actually get done.
If they aren't prescribing a very specific workflow, you can create your own/install whatever tooling you want.
It's also worth pointing out that if you really are sufficiently experienced, these tools could prove to be a force multiplier and may actually be worth investigating. You still have to review the code and provide clear specifications in discrete, easily digestible chunks.
Hard not to feel this way sometimes, given the quality of some job listings and the negativity bias here and in the broader media. In such moments of despair I recall two axioms: change is inevitable and ongoing, and talent rarely resides in the C-suite.
I'm only 10 years in, currently at a science non-profit using a dead/toy framework, and honestly woefully unprepared for the market at the moment. I'm constantly looking at job listings, though, and engaging with scads of recruiters to maintain a good feel for the market and inform my next steps. I see plenty of ads that are hyper-specific about the tooling du jour, but a non-trivial percentage of the listings make it clear that higher-level prowess, like understanding a language and its best practices, is more important than what ultimately boils down to the ability to RTFM for whatever widgets the CTO/CE is currently enamored with. Those are the jobs I'm looking at. Sure, this could narrow your pool during what appears to be a tight market, but you're more likely to have worthwhile interviews. I'll still apply to less intriguing jobs to avoid getting rusty at interviewing, though.
This kind of funk also inevitably drives me harder to just _do what I want to do_. What language and tools _do I want to use_? _What kind of problems do I want to solve_ moving forward? If you've sorted these out, great. Sure, this could _also_ narrow your pool even more, but you're more likely to find a high-quality match.
Finally, all of these companies foaming at the mouth to replace people with AI will regret it; it's already happening, in fact. It's happening in less-technical/non-technical jobs (lol Klarna), so I'm not worried about coding jobs at all in the long run (not to diminish any current or short-term turbulence, though). Smart execs/founders will see AI for what it is: a force multiplier, only as good as your existing staff. That said, I think it behooves devs to get right with AI/chat-assisted development. Of all the buzzy tools people fall in love with, this has the highest ROI I've seen yet.
TL;DR: I'm just not going to apply to jobs that don't give me "smart exec" smells, and I'm only applying if it really looks like something I'll care about doing. I realize this exudes some degree of privilege, hubris, and/or naivete, but I work my ass off and you only live once.
Lots of good answers here about part of your question: is it hype or not? Insofar as it likely isn't going away and is becoming a valid force multiplier, it's not.
However, this question is better answered by asking yourself what you're interested in. Do you _want_ a deeper understanding of AI/ML? If so, jump in. If you're not genuinely interested it'll be an interminable slog, and you'll revert to doing whatever you actually want to do eventually.
Nothing wrong with continuing to develop web/full stack apps while leveraging the new tools; that's also quite interesting.