
When you say America you certainly mean USA? Or is America a country now?


Technically, America is neither a country nor a continent. But the USA is the only country on either of the continents of the Americas to have the word "America" in its official name. Give the Americans a break.


I think this is a cultural difference around the world. In my country, people call it America. "USA" sounds a bit wanky, or like what Americans themselves call it. It doesn't matter, because everyone knows what you mean.


It's the only country in the Americas that US citizens think is important.


America is colloquially the USA. The Americas is something else. South America is something else. North America is something else.


That time has passed.


Now you have me really curious. When did "that time" start, and when did it end?


Maybe it started when the USA became the global power in the mid-20th century, and it stopped when it ceased being the free and aspiring country that other countries think they should model themselves after, which might be in 2001, or when it openly started extorting its (former) allies, which is under Trump.


Well, in the 90s there was almost total adoration of the US in Russia, and "America" meant only one thing: the US.

A decade later, the US lost most of its soft power due to the abuse of its dominant military and economic position in the world, and then lost its dominant position too with the rise of economies around the world.

People now see the US as a strong state on the lands of native Americans on the continent called North America.


Interesting perspective. In contrast, people see Russia as a miserable shithole just like they did in the 90s and generally throughout history. Basically both Europe's and Asia's backwater at the same time, which is remarkable.



