ARM didn't "come out of nowhere"; it's been used in embedded and mobile systems as a low-power CPU for decades now. The earliest mobile use I can find is the Apple Newton in 1993, and DEC was marketing its StrongARM product as a low-power CPU back in 1996.
And it goes even further back than that: when Acorn got first silicon back for the ARM1, they were hoping for anything less than 1W so that affordable plastic packaging would be sufficient. When they booted up that processor, their multimeter on the power supply line didn't register anything. Further investigation showed that the supply line wasn't actually connected, and the processor had booted using a total of about a tenth of a watt of leakage from other components on the board.
Intel was almost dumb enough to fab those on hand-me-down manufacturing lines from its x86 chips... almost. Then Intel "knifed the baby" by moving those designers to Atom and selling off its ARM business... At one point StrongARM was at the top of the hobbyist boards available, long before the Raspberry Pi came along.
They're not, and never will be. Especially if there were a concerted effort to put together a GNU/Linux-like open-source hardware stack to get rid of the fear, uncertainty and doubt about what's inside commercial processors and chipsets. There also needs to be more decapping of commercial chips to see what's actually in them.
Yeah, but that's strongly in the direction of the "zero cost" category of microprocessors, isn't it?
An IEEE article, I think, laid them out nicely into 4 categories:
Zero cost: what you put in microwaves, every cent counts.
Zero power: used to be obscure; the mobile market is of course making it very much less so, although those designs also have elements of:
Zero time: speed is what counts, and plenty are willing to pay a hefty premium for it.
Zero units: say the military needs a CPU for a combat airplane that won't be made in more than hundreds of units. Perhaps worth it for the prestige, and for getting the government to pay you to figure out neat things that might be usable in your bread-and-butter channels.
One modern example, though not so much with custom designs, is radiation-hardened CPUs for space applications: a unit cost of, say, $100K is rather small in the bill of materials, while the cost of failure starts in the high millions at minimum and could top a billion or more. It would be very, very bad if the computing subsystem(s) of the James Webb Space Telescope failed on its way to the Earth-Sun L2 point....