In all seriousness, I'm just curious why people from the USA refer to their country as America. It's not that I hate it, but being from America (the continent) makes me wonder... Are people from the USA self-centered enough to refer to themselves as a whole continent??
[b]Edit: Grammar correction. (Thanks Forum Therapist.)[/b]
-
Because the continents are the Americas, north and south. "America" on its own refers to the United States of America. Anybody with basic comprehension skills can distinguish between America and the Americas.