Sunday, August 7, 2011

Why do many Americans refer to people on the political left as "liberal"?

In politics, left-wing, the political left, or the Left are terms that refer to politics seeking to reform or abolish existing social hierarchies and to promote a more equal distribution of wealth and privilege. In general, the left advocates a society in which all people have equal opportunity, often described as a "level playing field". Toward this end, most people who consider themselves left-wing support labor unions.[1] The term "the Left" can encompass a number of ideologies, including progressivism, social liberalism, social democracy, left-libertarianism, socialism, syndicalism, Marxism, communism, autonomism, and mainstream anarchism.
