My comment can be read two ways, and neither is wrong. I wasn't really expressing an opinion so much as bringing up relevant facts: people label things we don't yet know how to do "AI", and then, once those hard problems are solved, the solutions no longer seem "intelligent".
This leads both to overuse and trivialization of the word and to moving goalposts for the field. Actual progress isn't taken seriously, because nothing feels like intelligence once you understand it.
It's all a Mysterious Answer (http://lesswrong.com/lw/iu/mysterious_answers_to_mysterious_...), a codeword for magic (http://lesswrong.com/lw/ix/say_not_complexity/).