‘Liberal’ has become a term of derision in US politics – the historical reasons are complicated

Understanding the Complex History and Criticisms of Liberalism in America

There's a peculiar paradox in the American political landscape: many citizens align with liberal policies, yet shy away from using the term 'liberal' to describe themselves. This article delves into the historical shifts that have shaped the perception and reception of liberalism in the United States.

The Origin and Evolution of Liberalism

Liberalism as a political philosophy has its roots in the debates over religious tolerance in early modern Europe, particularly during the 16th and 17th centuries. It developed further during the 18th-century Enlightenment and played a crucial role in both the American and French Revolutions.

From the 19th century onward, liberalism took on more distinct forms in response to a changing socio-political landscape marked by population growth, industrialization, and the rise of corporate capitalism. Shaped by a variety of groups, including free-market economists, religious non-conformists, and European thinkers, liberalism was never a single ideology but a political tendency that evolved with circumstances.

Despite this chameleon-like quality, core liberal values have remained constant: individual freedom, social pluralism, and a cautious optimism about intellectual and social progress. So have the specific political principles long associated with liberalism, such as free speech, secular government, and the rule of law.

The American Perception of Liberalism

The term 'liberal' gained traction in U.S. politics during the Depression era of the 1930s, when presidential rivals Herbert Hoover and Franklin D. Roosevelt both proclaimed themselves liberals. Roosevelt ultimately redefined the term in American politics as a willingness to use new forms of government intervention to solve societal problems.

Under his leadership, liberalism came to stand for policies of wealth redistribution and economic intervention. This New Deal conception of liberalism enjoyed several decades of prestige and broad appeal across the political spectrum.

The Decline of Liberalism's Image

By the late 1950s and early 1960s, however, some on the left grew impatient with what they saw as a conservative, bureaucratic, and politically timid liberal establishment. The term 'white liberal' in particular became a target of criticism from Black radicals, who viewed their white liberal allies as lacking commitment and sincerity.

On the other side of the political spectrum, conservatives began to associate liberals with radical politics, communism, and cultural elitism. They weaponized this narrative to achieve political ends, most notably in the 1972 presidential election, which resulted in a crushing defeat for the Democratic Party.

Subsequent decades saw the term 'liberal' used ever more often as a pejorative, culminating in the present day, when even mildly progressive figures can be painted by conservatives as radical liberals deserving of mockery.

Unraveling the Paradox

Despite the perceived toxicity of the term 'liberal', policies rooted in historic liberal reforms, such as Social Security and government involvement in healthcare, remain popular with the American public. This discrepancy points to the complex challenges facing American liberalism, which stem not only from changing socio-political climates but also from ideological rifts within the liberal community itself.

These challenges reveal a fundamental tension within liberalism: between upholding traditional liberal ideals of individual freedom and accommodating non-liberal ideologies that may appear more conducive to social change. This tension, together with shifting perceptions of liberalism in America, helps explain the paradoxical situation in which citizens support liberal policies yet distance themselves from the label 'liberal'.

 
The way "liberal" gets tossed around as an insult nowadays really does have deep roots, as you outlined. It's interesting (and more than a little ironic) that so many folks reject the label while happily supporting ideas like Social Security or Medicare—clear products of liberal reform. The politicization of language over the decades has warped public perception so much that the word itself seems divorced from its actual principles. Has anyone else noticed younger generations using "progressive" instead, or do you think that term might eventually meet the same fate?
 
You’re right, the shift to “progressive” is everywhere now—almost like people think it’s safer, but I wonder how long before that word gets twisted too. Do you think any label stays “clean”?