This is true of ANNs and deep learning. They are mathematical models of learning that are finally practical after a couple of decades (not to diminish what the researchers have accomplished, which is incredible).
Then there are biologically inspired neural networks, like Hierarchical Temporal Memory (HTM), that are claimed to correlate directly with how the mammalian cortex works. These have also demonstrated learning capabilities and seem a lot more promising on the road map to general AI, in my opinion, because after all we should be piggy-backing on evolution (not that we can't find a mathematical model first).
So yeah, the hype is just hype, but it could end up justified for the wrong reasons if we see breakthroughs in biologically inspired AI (the Human Brain Project, to name another example).
Demonstrated learning capabilities? I have not seen HTM models make any breakthroughs on any benchmarks. It's also stretching the facts to say HTM directly correlates with how the mammalian cortex works. At best, you could say it directly correlates with some theories of how the mammalian cortex works - neuroscience has an incredibly poor understanding of brains in general.
Before anyone believes the hype, they should read all the MIT research papers from the mid-1990s that mention the term "emergent intelligence". This was one of the biggest wastes of research money in the history of AI.