Old 06-01-2002, 01:46 AM   #54
Moni
Quote:
Originally posted by The Hierophant:
I'm curious as to what a lot of Americans perceive 'America' to be. In many cases that I have enquired into (but certainly not all), the primary image that comes to mind when people talk about 'America' is the collective of white, middle-class capitalism. I'm just wondering what most people think (I mean REALLY, honestly and instinctually think, with no artificially rhetorical explanations) America in and of itself means. Does your ideal perception of America and the rights of the citizen (whether given or earned) include within it the multitudes of destitute ethnic minorities or displaced Amerindian tribes? Is America the land of the Free? Or the land of the equitably privileged?

Please do not take offence at these questions; they are not intended to be insulting or abrasive. As a history student, I am merely eager to obtain primary resources of input on the phenomenon of 'America'.
MY opinion of America is that of a nation where everyone has opportunity, whether they were born here or not, and where all who ARE born here have equal rights. Granted, the rich have a better chance of becoming richer, but I've known dirt-poor farmers' sons who today are millionaires, and poor people of "other than Caucasian" backgrounds who are today successful business people with bank accounts to envy.
Even those who can prove a percentage of Native heritage are given money and higher education by the Government.
It is normally the people in the middle who get stuck, if they can't afford to better their status in order to raise their incomes...the poor are taken care of, and the rich are given the best tax write-offs.
I may not agree with the politicians, but I think this is the best country to be in no matter what color you are...you just have to be cautious about which state and which area you live in...racism is part of America the same way it is part of the rest of the world.