In all seriousness, I'm just curious why people from the USA refer to their country as America. It's not that I hate it, but being from America (the continent) makes me wonder... Are people from the USA self-centered enough to refer to themselves as a whole continent??
[b]Edit: Grammar correction. (Thanks Forum Therapist.)[/b]
-
Great question, and one that I believe has deep roots going back hundreds of years. Being from this country, I agree that it should be referred to as the USA or the United States.