Search does not necessarily mean "retrieval". It means search over some space. That space can include ideas or options that are recombinations of other ideas and options. And I have no doubt the search heuristics can be learned, since those can essentially be pretrained.
They may not be shipping good enough products, but on the flip side it still feels like they have almost no competition outside of coding and image gen. The EOL of GPT-4o should be some evidence of this.
OTOH, why don't they ship good enough products? To me, all of OpenAI's recent investments strongly suggest they've hit a dead end with their current LLM approach. After all, if they knew the path ahead for GPT looked great, why wouldn't they invest in training the next big thing instead of building datacenters with the intention of renting them out?
But this is not a bank, or an airline, or a real estate giant.
If OpenAI goes bankrupt, what happens? People won't be able to write their precious slop, oh no, and serious professionals will just switch to any other LLM provider.