American programs always show America winning every war. Watch WWII stuff on the military channel and they make it seem like America single-handedly beat the Nazis. Who? What? The Russians? I don't think the Russians were even involved. Of course they're going to change history and say they won in Vietnam, that's what America does.
Quote:
Originally Posted by Relish XXX
Is all American media censored to show lies?
Censored is not really the right term, although there are plenty of things you can't mention in US media. It's a very complicated, highly effective propaganda machine, engineered to paint one image of America and another of everyone else. And it's not just the media; the bigger problem is that they do it in schools. Read an American high school history textbook one of these days and you'll see what I mean. They paint what I like to call the story of America: a nation above all others, blessed by God, that does nothing but good in the world and wins every war it enters. Kids are taught that everything their government does serves the greater good.
They tell kids that the Revolutionary War was the people rising up and fighting for their own interests. They don't mention that it was orchestrated for their own benefit by slave-owning elites, who went on to become the ruling class and form the two-party system that has kept the super rich unchallenged and in control of the rest of the population for over two hundred years.
They tell them that the Civil War was fought because people woke up one day and decided to end slavery. They don't explain that it was actually a complicated power struggle between northeastern and southern elites, and that slavery only became an issue because the bankers in the north no longer needed slaves.
They tell them that the Mexican-American War was started by Mexicans, when in fact it was an offensive invasion. "We take nothing by force, thank God." (Famously said right before the US invaded Mexico and took more than half of its land, which it occupies to this day.)
They tell them that Pearl Harbor was a surprise, unprovoked attack. They don't mention the sanctions the US imposed on Japan, which would have essentially put an end to its empire. They also call WWI a people's war, with no mention of the fact that people who merely dissented against the draft in speeches or writings were given decade-long prison sentences.
They don't teach about America's empire and wars of conquest, or any of the horrible things it has done over the years in places like South America or Southeast Asia. The vast majority of Americans can't name half of America's colonies; the really stupid ones even think America isn't an empire at all. Most Americans never get a real education, which is why you end up with a population ignorant enough to believe every lie their leaders tell them. This is why they're able to invade and occupy countries while the population at home thinks they're doing "humanitarian work" or "ridding the world of evil" or whatever nonsense they're feeding the sheep this week. It's also why some Americans have a hard time understanding why the rest of the world does not like them.