Wish her the best with this. Intel staying competitive in GPUs can only benefit the consumer. Those of us who want a mid-tier graphics card, without paying to compete with AI use cases, may not be a huge group, but we do exist! Those who use desktop Linux may be a small group within that small group, but we do exist!
Thank Jesus it's Intel and not Apple. Intel has been extremely good at working upstream and has made immense contributions to the Linux kernel, Mesa, and elsewhere. Wasting such talent on Apple would make the world worse for us all.
There are myriad companies that have thrived in "IP locked" environments, and a host that have failed too. Equally, there are heaps that have thrived and failed in "IP open" environments.
I think at best you could say it's more challenging, or perhaps riskier, to be somewhat restricted with IP, but I'd call that miles away from a "graveyard".
You can hardly call Intel/AMD/Qualcomm etc. all struggling due to their architectures being locked down.
Look at PowerPC / the Power ISA. It's (entirely?) open and hasn't really done any better than x86.
Fundamentally you're going to be tied to backwards compatibility to some extent. You're limited to evolution, not revolution. And I don't think x86 has failed to evolve (e.g. AVX10 is very new).
Apple doesn't have any contributions to the Linux kernel or other parts of the Linux graphics stack. They are unlikely to hire someone who wants to work on open source.
To be fair, they don't have anything to do with Linux, so there is nothing to contribute back to. They use BSD-licensed software for a reason.
Apple does have open source projects (https://opensource.apple.com), but the scope is rather limited. For someone with Alyssa's skillset there really isn't anything there.
From what I recall, Apple forbids its employees from participating in open source work it doesn't approve of. And given Apple's culture of secrecy, its agenda of maintaining a walled garden around its products, and her work basically contradicting both, I doubt her being hired by Apple would benefit anyone other than Apple.
Apple is too much about being closed and creating barriers; not sure that would have been a good fit. Plus, it's a good way to flee a country that's quickly degrading.
Honestly if Apple had embraced Linux, the Apple Silicon CPUs would have been amazing for all sorts of server, scientific, and AI/LLM work. Too bad they are clamping down on the walled garden to focus on consumer toys instead.
The real shame is the longevity: the M1 Pro and M1 Max were discontinued two and a half years ago, so they're on their way to the vintage list and could be entirely obsoleted by the end of this decade! Linux support is the only thing that will keep these machines usable after that.
I do have an M2 MacBook running Asahi, which works amazingly well for my casual use, but I think there is no way anyone will use last-last-gen hardware on a volunteer-developed OS for any actual work, server use cases, and so on.
My point was that the graphics division itself will still be around, as integrated mobile SoCs are basically the only revenue stream Intel still has a good handle on. That requires a graphics core, and all of the other usable options are either not for sale to Intel, have burned Intel in the past, or are owned by Arm.
Intel’s core competence is squandering talent by having finance managers and outside consultants make technology business decisions. Something happened to their culture a few decades ago and they forgot that revenue is a trailing indicator of good decisions and you can’t just decide you want to make a lot of money and trust the product strategy to materialize from that.