Quote:
Originally posted by Tala
Since when has the US been a democracy?
Up until the South lost the Civil War. That was the worst thing that ever happened to this country. The issue of slavery aside, the federal government dictating what we do in our own states is not what was intended when the Union was formed.
The federal government was formed so we could unite against outsiders with a common army, a common monetary standard, etc. Now look what has happened: Blacks won the freedom to be forced into "ghettos" and poverty, and we all lost most of our rights to a select few who hold power.