Reposted from Shea Magazine: http://www.sheamagazine.com.
In a February syndicated column, Pulitzer Prize-winning author Cynthia Tucker called for an end to Black History Month. Separating black history from American history, she said, minimizes “the myriad ways in which black Americans’ accomplishments are part of the national mosaic [by making] the contributions of a few well-known black men and women seem like a historical exception.”
That someone of Tucker’s stature thinks it’s time to end the month-long recognition of black history indicates how far our country has come in weakening its racist heritage. And yet ending Black History Month won’t improve the fit of black history into the national mosaic, because so much of it is exceptional. Only blacks lived under the worst America had to offer, maintained faith in America’s promises, and pushed our country to live up to its noblest ideals as articulated in the Declaration of Independence.
Without Black History Month, the major source of information would be textbooks. But no American history textbook to date has managed to weave into one comprehensive national story the different but related experiences of slavery and segregation, the legal struggle to eliminate them, and the creation of an America where truly all human beings are created equal. Among the better attempts, America and Its People: A Mosaic in the Making (many editions) often makes good on its promise “to underscore the pivotal role that ethnicity, race, and religion have played in our nation’s social, cultural, and political development.” Yet the black story vanishes for a hundred-plus pages — more than once! — suggesting how difficult it is to braid the various strands.
Only recently did American history textbooks incorporate black history. Until the 1960s, the narratives were essentially exclusionary. Where blacks did appear, it was usually to describe their inferior social position. The omission was not accidental. Blacks had no place in a grand narrative of an America whose democratic Teutonic origins gave rise to the “village heroes [at Lexington and elsewhere] who were more than of noble blood, proving by their spirit that they were of a race divine.” (George Bancroft, History of the United States from the Discovery of the American Continent, 1858)
Nor did blacks have a place in the frontier experience lauded by historian Frederick Jackson Turner for establishing the quintessential American characteristics: toughness, resourcefulness, individualism, and a predisposition for democracy. (“The Significance of the Frontier in American History,” 1893)
Nor was there any place for blacks in the accounts provided by Progressive historians in the early twentieth century who presented the story of America’s march toward freedom as a whites-only endeavor. One of the best known, Charles Beard, mocked “the Negroes’” part in this splendid endeavor as having been “ludicrous if they had not been pitiable.” (American Government and Politics, 1911)
Only in the late 1940s did historians begin to study slavery from the slave’s point of view and to incorporate into their assessments slave narratives written before and during the Civil War, Harriet Jacobs’s Incidents in the Life of a Slave Girl (1861) among them. On the one hand, these narratives emphasized the deprivations of slavery in lurid scenes of horror and violence and provided grist for abolitionists in their fight against the “peculiar institution.” On the other, they revealed a resilient culture in the slave quarters, a people who managed to build a vibrant society, hidden from whites, with its own music, religious practices, and loving relationships to sustain their dignity and hopes. (John Hope Franklin’s From Slavery to Freedom, 1947)
Despite these revelations, and the 2,300 oral slave narratives collected by the Federal Writers’ Project in the 1930s, old stereotypes persisted. Rejecting portrayals of slavery as a system in which fellowship between master and slave was “characterized by propriety, proportion, and cooperation” (Ulrich Phillips, American Negro Slavery, 1918), Stanley Elkins erred in the opposite direction by asserting that slaves came to assume their masters’ view of them as “Sambos.” (Slavery, 1959)
The 1960s saw the study of black/African American history become an academic staple. At its worst, it encouraged reverse separatism, for example, the idea that only blacks were qualified to teach black studies. At its best, it refuted older notions that equated black history with what whites had come to believe was true, that blacks were passive objects to whom things happened. (C. Vann Woodward, Presidential Address to the Organization of American Historians, 1969)
Today, the writing of black history remains a work in progress. To end Black History Month and leave it to textbooks alone to convey the tribulations and the triumphs that make up the historical experiences of this group of Americans would be unwise. Black History Month provides information and affirmation via lectures, workshops, concerts, plays, poetry readings and remembrances, many led by prominent blacks from all walks of American life, an appreciation unmatched by any American history textbook.
Abolishing Black History Month will not lead to a more integrated national mosaic, but will put out of sight and out of mind this still incompletely understood American story. In the transition of the United States from a racist to a racialist society, one that tolerates and even celebrates differences, black visibility and acceptance have increased significantly. Despite these changes, ours is still a country where a radio host described the Rutgers women’s basketball team with a racial slur and a university president touted the Three-Fifths Compromise, in which slaves were counted as three-fifths of a person for purposes of representation and taxation, as a shining example of how a polarized people could come together (New York Times, February 24, 2013).
We need more, not fewer, ways to keep the black experience in the public spotlight from which it was excluded for far too long.
Reposted from Shea Magazine, http://www.sheamagazine.com.
The American debate on abortion will not be decided by legal rights or moral wrongs. These kinds of arguments, strident though they may be, have varied at different points in American history. Answers to “the abortion question” in our country have always been influenced by a less audible background chorus of influences: shifting economic, psychological, and political considerations.
Until the mid-nineteenth century, abortion was legal, at first in the colonies, and later, in the states, a choice supported by the general population, politicians, and most churches. Colonial Americans, including seventeenth century Puritans, made little distinction between spontaneous and induced abortions before quickening, defined as the moment when the mother first felt the fetus move, the only sure way to tell if a woman was pregnant. According to an article by James Mohr in Women’s America: Refocusing the Past, after quickening, usually at the midpoint of gestation, “the expulsion and destruction of a fetus without due cause was considered a crime because the fetus itself had manifested some semblance of a separate existence: the ability to move.”
Abortions, performed primarily by midwives, took place covertly not because the termination of pregnancy itself was deemed blameworthy but because it was seen as an extreme action designed to hide a prior sin—sex outside of marriage. Strikingly absent from public opinion was outrage over the destruction of the fetus or denunciations of those who would arrest nature’s course. Adultery, on the other hand, was cause for damnation.
Beginning in the 1820s, state laws made abortion illegal, a process complete by the closing decades of the century. Doctors supported the bans on the grounds that abortions were immoral and dangerous. Less touted was their interest in preserving their exclusive rights to practice medicine, a monopoly that was being increasingly challenged by midwives and homeopaths.
Anti-abortion sentiment also came from the growing fear that higher birthrates of newly arrived immigrant women threatened to overwhelm the Anglo-Saxon population. And yet, almost in defiance of the prohibition, between 1840 and 1880 the number of abortions among married, native-born Protestant wives of middle- and upper-class standing markedly increased. These women turned to abortion—usually with the agreement of the spouse—to postpone family responsibilities. They could afford to pay midwives and doctors willing to perform the now illegal procedure.
Criminalization of abortion and of information about it did not reduce the number of women who sought to end an unwanted pregnancy; it just denied them access to the services readily available to their colonial sisters. On the eve of the Civil War, around 160,000 abortions were undertaken among a population of 30 million (James Mohr, Abortion in America, 1978; Marvin Olasky, Abortion Rites, 1995); proportionately higher than the number of abortions performed today, which has leveled off at 1.2 million per year out of a population of 307 million.
The majority of nineteenth century abortions involved poor women. Many of them were self-induced or performed by back alley practitioners operating under unsafe and unsanitary conditions. Mortality rates were high. Techniques included the use of knives, knitting needles, and sticks, untested drugs, herbs, and chemicals, and horseback riding. It was also believed abortion could be induced by jumping high enough so that one’s heels touched one’s buttocks. Moral and legal considerations – and even the possibility of death — were of minor importance to those in desperate circumstances undertaking desperate measures.
In 1873 Congress passed An Act for the Suppression of Trade in and Circulation of Obscene Literature and Articles of Immoral Use, better known as the Comstock Act, named after its chief lobbyist. The legislation, which outlawed the dissemination of information on any form of birth control, reflected growing worries about increases in pre-marital and extra-marital sex and the changing status of women, more and more of whom were entering the workforce and challenging their role as exclusively wives and mothers. Moreover, as mentioned above, middle class, white women were just not producing enough “true American” babies.
For the next hundred years or so, abortion remained illegal despite the efforts of birth control advocates like Margaret Sanger, a tireless crusader for women’s health issues from the 1920s until her death in 1966. The sixth of eleven children born to a mother who went through eighteen pregnancies and died at the age of fifty, Sanger founded the American Birth Control League in 1921 to distribute information (illegally) on contraception to doctors and social workers, but also to mostly poor women who had no choice about carrying through unwanted pregnancies forced on them by their husbands’ sexual demands. Even the courageous Sanger had her shadow motives: she was sympathetic to the eugenics movement’s efforts to prevent the wrong kinds of women – immigrants, blacks, and the poor – from diluting Anglo-Saxon dominance of the population.
Birth control and abortion issues were not the priority of early twentieth century feminists. Obtaining the right to vote dominated their agenda, while the Great Depression and WWII led them to focus on economic matters, including jobs lost after the “boys” returned home. The big call to legalize abortion came from the feminist movements of the 1960s as one item on a lengthy to-do list. The right of women to control their own bodies is best seen as part of an ongoing, broader agenda of economic and legal equality. Their efforts influenced the landmark 1973 decision in Roe v. Wade, whereby the Supreme Court allowed unrestricted abortions in the first trimester. (Only as recently as 1965 had it struck down state laws prohibiting married couples from practicing birth control, and in 1972, afforded the same freedom to unmarried couples.)
Opponents of Roe v. Wade saw it as the final blow in a series of attacks against the familiar way of life in which women played their roles as wives and mothers, and shunned divorce, promiscuity, and non-traditional family structures. A common response when one’s belief system is threatened is not to rethink the belief system but to defend it more and more vigorously.
In the 2011 Republican presidential debates, the candidates emphasized their “pro-life” credentials, and nearly all of them pledged to cut off funding for Planned Parenthood, the nation’s largest abortion provider (despite the fact that 97% of its budget goes to providing general healthcare services for its often poor clients). One of them went as far as to condemn the practice of contraception as a “license to do things in a sexual realm that is counter to how things are supposed to be.”
The Republicans lost the election of 2012 largely because women voters rejected the hard line they took on women’s rights, especially on abortion, including requiring mandatory invasive ultrasound examinations for those choosing to terminate a pregnancy. Given the tendency of factors other than legal and moral arguments to determine America’s stand on abortion, we can predict that politicians running for office in 2016, including those on the right, will be heeding instead the subtler beat of women marching toward the voting booths.
The following post is reprinted from Shea Magazine, http://www.sheamagazine.com.
The problem with many of the current attempts to interpret the Second Amendment is that nothing in it actually applies to the arguments we’re having today. Can individuals own guns? If so, which individuals, which guns, for what purpose, and how many? Can governments control the buying and selling of guns? If so, should that be done by the States or the Feds? Both? Neither?
The Framers said nothing in the Constitution or in the Bill of Rights that speaks to any of that because they weren’t thinking about any of that. In their time, guns were accepted as a part of life, same as shoes. There was hunting to do if one was to eat, there were Indians to fend off if one was to survive, there was land to grab away from said Indians if — well, that’s another story.
At the time of the Constitutional Convention in May of 1787, the Framers had their own problems to deal with, and they had nothing to do with the buying and selling of guns. The Articles of Confederation were not working. The Federalists saw the need for a stronger central government. They wanted a Constitution and they were willing to compromise to get it. One significant example of the need for a stronger central government still present in their minds was that while Washington begged for an army adequate to the defense of the new nation against the British, state militias showed up or not on whim, with or without sufficient training. And no one had the power to do anything about that. Another, more immediate example was Shays’ Rebellion of 1786. A complex uprising in western Massachusetts, it was seen as a threat to the young country’s existence and could not be put down by the available local militias.
The anti-Federalists feared too strong a central government. They – and the Federalists as well – bore the memory of Britain’s standing army in the colonies, an army that confiscated Revolutionary arms in Concord. Many of the Framers had a hand in the Declaration of Independence’s assertion that citizens must be prepared to overthrow any and all tyrannical governments. Hence, while the Federalists had good reason to want to centralize power, the States had equally good reason to want control over their own militias. (That those militias were sometimes indistinguishable from slave patrols in the Southern states is true, but the perceived need for state militias went well beyond this one use.)
An agreement was worked out. In simple terms: They all accepted the fact that the country needed armed and well-regulated militias available on call to serve in the national interest, “to insure domestic tranquility” and “provide for the common defense.” They all understood that there were States that would not agree to a nationally-controlled militia. To push the Constitution through, the approval of those States was necessary. So the deal was that the States would train, officer, and maintain their own well-regulated militias, but would make them available nationally as needed. The national government, in turn, would have no power to mess with State control of those militias by calling them up for duty and then disbanding or disarming them from on high.
The Framers fussed with the language: “Father of the Constitution” James Madison’s early wording read, “The right of the people to keep and bear arms shall not be infringed; a well armed and well regulated militia being the best security of a free country, but no person religiously scrupulous of bearing arms shall be compelled to render military service in person.” The eventually ratified version states, “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.”
The right to “keep arms” referred to owning them, while the right to “bear arms” referred to carrying them in the context of serving in a well-regulated militia. Why was the exemption for conscientious objectors stricken from the final version? We don’t know for sure, but it’s likely there was no perceived need for it in solving the problems of that time. The deal being worked out was between the federal government and the States, not between government and individual citizens.
The placement of the deal was also debated. The Constitution spelled it out in Article I, Section 8, Clauses 15 and 16. Madison thought that was enough, and that any problems that came up with that issue and others would be worked out over time through the ingenious system of checks and balances already inherent in the Constitution itself. But wary anti-Federalists wanted more absolute confirmation of who could and should, who couldn’t and wouldn’t, do what and to whom. Eager for the Constitution to pass, the Framers agreed to add the Bill of Rights, including the Second Amendment.
Our current National Guard is an offshoot of the militias of that era. The States run their own units and provide them for national purposes when called upon to do so. The national government cannot disband or disarm those units.
So what in all that debate about, compromise on, and final wording of the Second Amendment says anything as to where the arms should come from and who should have the right to buy them, sell them, collect them, or trade them – or what types of arms might or might not be borne and for what purposes other than well-regulated militias? Nothing. Not even one word about hunting. Or crime-fighting. Or skeet shooting.
Nothing. Not their problem.
Please note: Here are a couple of short essays adapted from guest posts in Shea Magazine, http://www.sheamagazine.com. “Playing with Fire: Paranoid Politics American Style” is the final “official” post in the America – The Owner’s Manual series. A collection of those posts (minus the August election prediction, “It’s Not Just the Economy, Stupid” and the two essays below, but plus additional sources and resources for each essay) is now available as an ebook for only 99c:
For Kindle — http://amzn.com/B009GPIPA2
For Nook — http://www.barnesandnoble.com
For all other readers, libraries, any computer, to give as a gift, and more — https://www.smashwords.com/books/view/238443 .
If you’ve enjoyed these posts, I would very much appreciate your “like,” “share,” and/or review of the book on your blog, Facebook, Twitter, and/or the above sites.
Many thanks to all of my followers, here in America and abroad. You’ve made this a gratifying and enjoyable experience.
And now, my post-election thoughts: TIP ME OVER AND POUR ME OUT: THE INEVITABLE FALL OF THE TEA PARTY and THE VOTERS HAVE SPOKEN, BUT WHAT DID THEY SAY?
“TIP ME OVER AND POUR ME OUT”: THE INEVITABLE FALL OF THE TEA PARTY
On November 2, 2010, in the aftermath of the 2010 mid-term elections, an essay in the Christian Science Monitor, similar to others that appeared in the media, concluded that “the emergence of the Tea Party movement is arguably the most dynamic element of the 2010 mid-term elections.” After all, Tea Party-backed candidates like Rand Paul and Jim DeMint won Senate seats, Nikki Haley became governor of South Carolina, and twenty-eight House members benefited from its stamp of approval.
The problem with the Christian Science Monitor analysis was that it inflated the power of the Tea Party because it failed to place its triumph in historical perspective. It ignored the fact that Tea Party appeal, even at its zenith, stopped outside the borders of the most densely-populated states and metropolitan areas.
Nor did it observe parallels with the Gingrich Revolution that took place early in President Clinton’s first term. The House Speaker’s “Contract with America” pledged objectives strikingly similar to those offered by Obama’s Tea Party critics: a balanced budget amendment, cuts in capital gains and personal income taxes, and limitations on corporate liability. In his bellicose partisan rhetoric, the Georgian attacked liberals as “pathetic,” “corrupt,” “left wing elitists.” In the 1994 mid-term elections, Republicans gained fifty-two seats in the House and eight in the Senate.
Subsequently, the news media announced the end of the New Deal and the completion of the Reagan Revolution. Yet two years later Clinton easily won re-election, and after the 1998 mid-terms Gingrich was forced to give up his position as Speaker of the House. Clearly the Gingrich Revolution and its politics of outrage lacked staying power. It was better at engendering anger than it was at sustaining positive enthusiasm.
The Tea Party (more accurately, local Parties) was largely a white grassroots movement begun by those unhappy with the direction the country was going and lacking confidence in either of the two mainstream parties to change it. Supposedly, it drew inspiration from the events of December 16, 1773, when several dozen laborers, artisans, and apprentices went to Boston harbor and dumped overboard more than three hundred chests of tea stored on British ships. Their action that day was a continuation of colonial challenges against the mother country’s right to impose taxes without representation. However, the Tea Party directed its protests against the right of its own government to raise taxes to reduce the size of the nation’s deficit.
The dire economic landscape following the 2008 crash spurred fears of an even more dismal future, thereby widening (for the moment) the appeal of the Tea Party message beyond its typical narrow outreach. It used its electoral clout and threats of political retaliation to transform the Republican Party into the “party of no,” one that automatically rejected Democratic ideas for righting the economy. It eschewed compromise as, for example, when it forced Speaker John Boehner to pull back on a $4 trillion deficit reduction plan the two sides had agreed on.
In the 2012 elections, it bullied the GOP into not opposing far-out choices whose election prospects were dubious, such as those of Senate candidates Todd Akin (Missouri) and Richard Mourdock (Indiana), who by “misspeaking about rape” lost all but certain seats in these conservative states.
To win its endorsement, during the Republican presidential debates Mitt Romney veered sharply to the right and dug a hole he could not get out of; for example, advocating self-deportation as the best way for handling the problem of illegal immigrants. His selection of Paul Ryan, a darling of the Tea Party, as his running mate, was a bid to shore up his credentials with the group.
Throughout American history, third party movements have had short shelf lives, among them the anti-Catholic, anti-immigrant Know-Nothings of the 1850s and the Populists of the 1890s, who called for reforms such as the direct election of senators to make government more democratic. Eventually, third party agendas were modified and absorbed by the mainstream parties – Republican and Democratic – and their progenitors vanished from the political scene.
The Tea Party’s aversion to centralization meant that it never became a third party with national reach. What it did was temporarily sabotage the traditional Republican Party with an in extremis, anti-government agenda that was overwhelmingly rejected in the 2012 elections. In the wake of that defeat, if the Republican Party wanted to remain viable, it had no choice except to bridle its aging “young guns” and not try to recruit more of them.
Don’t expect the tea kettle to whistle loudly again anytime soon.
THE VOTERS HAVE SPOKEN, BUT WHAT DID THEY SAY?
In the weeks after the 2012 elections, a deluge of newspaper columnists and talking-head commentators has been confidently predicting what the outcome means for the future of American politics. Unfortunately, as happened with their pre-election predictions, the pundits are misreading the evidence. They carelessly project trends, if not permanent realignments, that are far from certain.
From the Right, we’ve heard from Rush Limbaugh that “we’ve lost the country,” and from Stanley Kurtz that “the existence of America as we know it is in doubt.” Star Parker announced the triumph of those “not having traditional values on family, sex, and abortion.” These plaintive cries reflect shock, anger, and nostalgia about the passing of an imagined America rather than serious analysis of what just transpired.
On the question of what to do in the wake of defeat, Charles Krauthammer insisted on “no reinvention when none is needed,” and that the party “required only a single policy change: border defense plus amnesty” to get back into the business of winning. Kurt Schlichter’s laughable solution was for Republicans “to do what guerrillas do and infiltrate into the enemy’s turf, slipping conservatives into mainstream media, academia, and the entertainment world.” (Apparently, he missed Clint Eastwood’s conversation with an empty chair.)
Rick Santorum was more in tune with the evidence when he called on Republicans “to build a new box and offer Americans a broader, bolder and more inclusive vision of freedom and opportunity, as well as the tools to use them,” exactly what he was unable to do in his own presidential campaign.
From the Left, we’ve gotten exaggerations about the scope of the Obama consensus.
Liberal activist Robert Creamer interpreted the vote as approval “for a society where everyone gets a fair share and plays by the same rules…. whether you are a man or a woman, a gay or a straight.” Joan Walsh considered Obama’s reelection “a victory for the Democratic ideal of activist government, and a mandate for more of it.” Paul Krugman thought the win showed “a lot of liberal ideas have become perfectly mainstream.” Eugene Robinson referred to a “multihued, multicultural future.” Maureen Dowd rhapsodized how supporters “lifted up Obama” with the hope that he would now be more amenable to dramatic change.
Drowned out in the euphoria and the rhetoric of wish fulfillment was the president’s more modest and realistic post-election response: “I’ve got one mandate. I’ve got a mandate to help middle class families and families that are working hard to get into the middle class.”
Democrats were understandably overjoyed at the results of the 2012 elections. They gained two Senate seats and five new women senators joined their ranks. In the House, Democrats added eight seats, while there were eleven fewer Tea Party-supported candidates from among those who sought re-election.
Understated in the post-election celebrations were several important inconvenient truths:
- On election eve, a majority of Americans did not favor Obamacare, the President’s approval rating stood at only 50%, and a majority of Americans wanted to “drill, baby, drill.”
- Voter turnout for the “critical election” was appreciably lower than it was for those of 2008 and 2004 – 57.5%, versus 62.3%, and 60.4%, respectively.
- Obama took the popular vote by only 3.2%, about half of what he received in 2008, against a weak opponent who was not enthusiastically embraced by his own party.
- In the swing state of Ohio, Obama’s margin of victory was less than 2%; in Virginia around 3%, while in Florida, it came in at under 1%. The close results cannot be blamed on voter suppression, though a number of Republican legislators worked toward that goal under the pretext of preventing voter fraud.
If Romney had won some of these swing states and/or matched the 44% share of the Hispanic vote that George W. Bush once captured, and if the Republican Senate nominees from Missouri and Indiana had not self-destructed by making idiotic comments about rape, we could be having a very different conversation about the meaning of the 2012 contest.
As things stand, the election results endorsed President Obama governing from the center-left, not the left-center. The wealthy will pay more, but not on the scale of the massive wealth transfers that characterized the Great Society of the 1960s. The president also received confirmation to use the federal government for job creation, for continuing to offer programs for the disadvantaged, and for acting as a first responder to weather-related catastrophes.
The Republicans got a mandate to modify an agenda based exclusively on competition/individualism without compassion, and to offer more to people of color and women.
The mandate for both parties was to work together to end gridlock in Washington and to find a balanced solution to deficit reduction.
As I write these words, there are signs that elected officials from both parties understand the real mandates they have been given.
My goal was to get all the essays written and posted before the election on November 6. Please share them with the voters in your life, and please GET THEE TO A VOTING BOOTH!
But first, “Playing with Fire” . . .
Recently AmericanDoctors4Truth ran an ad showing an actor playing President Obama pushing an elderly woman in a wheelchair off a cliff.
An Agenda Project ad showed another woman suffering the same fate, only this time the heartless shove was delivered by an actor playing the part of vice-presidential candidate Paul Ryan.
There is nothing subtle here about the objective of these ads: to heighten and exploit for electoral advantage the concerns elderly voters have about the future of Medicare and Medicaid. This kind of shameful and calculated behavior – portraying an opponent as the Devil Incarnate — has been going on forever in American politics. The consequences are divisive, destructive, and dangerous.
Throughout history, players of this nefarious game have concocted sinister conspiracies whereby they charge machinery has been set in motion, often secretly, to undermine our religious freedom, our democratic government, and/or — most difficult to pin down — our American way of life. These demonizers operate from a stance of righteousness and indignation, declaring with certitude that time is running out for them to put our country back on course. They portray the enemy as ruthless and offer an honorable, no-holds-barred fight to the finish against those who would destroy America.
In 1965, Richard Hofstadter looked closely at those fear mongers in his brilliant essay “The Paranoid Style in American Politics.” Most who indulge the paranoid style, he tells us, are not paranoid in the clinical sense. Outside of the hunt for the devil’s disciples, they can be kind and helpful neighbors, committed spouses and parents, and loving grandparents. They come from both the left and right of the political spectrum.
While their writings and rhetoric may contain defensible assumptions and facts, they push for unrealistic goals bereft of sensible judgment. If, for example, they name government programs as problematic, the solution is not to improve the programs, but to weaken, undermine, or destroy “big government” itself. When pressed by evidence and experience at odds with their assumptions – anti-government business leaders begging the federal government to help, for instance, in times when the economy tanked (in the late nineteenth century, and the Great Depression, and the 2008 housing crash) – these crusaders keep up their selective paranoia for their own political reasons.
What makes these techniques palatable to the American electorate? We can’t say for certain what triggers receptivity to the paranoid style. There is, however, a human propensity toward orienting our lives locally that leads us to be suspicious of anyone seen as an outsider. There is a corresponding propensity among Alpha politicos to exploit that instinctive wariness. Nothing unites a group faster than fear of a common enemy.
Does the politics of fear work? Sometimes, it does. But not always.
It did not work in the elections of 1800, 1824, and 1828, when Federalist and National Republican opponents claimed victories by Thomas Jefferson and Andrew Jackson would open the floodgates to mob rule and violence. Jefferson won in 1800, and Jackson, denied the presidency in 1824 despite his popular-vote plurality, won in 1828.
Fear tactics gave the Know-Nothing Party more than its fifteen minutes of fame, but not much more. Founded in 1854, the party played on anti-Catholic and anti-immigrant fears to capture control, a year later, of many New England legislatures and become the dominant opposition party to the Democrats in more than a few states. It even ran a candidate for president in 1856, former President Millard Fillmore, who received 20% of the popular vote. Know-Nothing politicians accused the Catholic Church of plotting to overthrow the government of the United States and replace it with papal despotism. They worked to bar (immigrant Irish) Catholics from holding public office and to lengthen the residency requirement for becoming an American citizen from five years to twenty-one. The failed presidential run proved to be the party’s swan song.
Southern politicians fought Republican Reconstruction policies by playing the race card to intensify white fears of free blacks posing a serious threat to the Southern social order. Successful fear-monger Ben Tillman, multi-term governor and senator from South Carolina, proclaimed he “. . . would willingly lead a mob in lynching a Negro who had committed an assault against a white woman.”
The Ku Klux Klan, a paramilitary white supremacist organization founded after the Civil War to terrorize blacks, eventually added immigrants, Jews, and Catholics to its list of undesirables. By the 1920s, it had over a million members (a conservative estimate). Five U.S. senators and four state governors were Klansmen. Mayors of Philadelphia, Washington, DC, and San Francisco received KKK endorsements, and that approval helped cement their elections. Score 12 for the fear-mongers.
The politics of paranoia appeared in the hysterical atmosphere whipped up by Attorney General A. Mitchell Palmer in the aftermath of WWI to advance his presidential ambitions by showing his toughness. However, when his prediction of a violent attempt by “socialists and communists” to overthrow the government on May 1, 1920, did not materialize, his political career vanished, as did the phantom enemy he pursued. His defeat did not, however, undo the damage wreaked on thousands of falsely accused “revolutionaries.”
In many ways, Palmer’s siege mentality only took to the next level attitudes encouraged by the Woodrow Wilson administration after the US entered WWI. To rally citizens around the flag, the Committee on Public Information, following Wilson’s lead, stressed the danger posed by German-American traitors in our midst and warned the public to be vigilant in reporting suspicious activities by the enemy within. The Wilson-sponsored Espionage and Sedition Acts led to the incarceration of many war dissenters, including national labor leader Eugene Debs. In this case, fear fueled by portrayals of “the Huns” as an inhumane and merciless common enemy united the country in its war fervor while depriving citizens of their basic civil liberties.
At the end of January 1942, a government report prepared by Supreme Court Justice Owen Roberts alleged without a shred of evidence that Hawaii-based Japanese-American spies had abetted the December 7 attack on Pearl Harbor. Other unsubstantiated reports of Japanese-American agents on the West Coast communicating with the enemy on both sea and land soon followed. Public fear swelled along with a clamoring for draconian measures. On February 19, 1942, Franklin Delano Roosevelt issued Executive Order 9066, which authorized the removal of all who posed a danger to national security. As a result, more than 100,000 innocent Japanese-Americans living in the continental United States were evicted from their homes and placed in relocation camps. Once again, a president and his military advisors used the politics of fear to bolster a war effort. Ironically, this was the president who had told us that the only thing we have to fear is fear itself.
Hofstadter’s essay focused on Cold War paranoia, and how it ate away at old American virtues like fairness and freedom of expression. The Cold War saw Harry Truman join the Republican chorus to “scare the hell” out of the American people. He created Loyalty Review Boards to ferret out possible communist operatives inside the government. With Republicans charging that he was soft on communism, a show of strength became essential for his run at the presidency in 1948.
On the way to his 1946 election to the House of Representatives, Richard Nixon painted California Congressman Jerry Voorhis “Red.” Four years later, in his campaign for the Senate seat from California, Nixon tarred Democratic candidate Helen Gahagan Douglas with allusions to her Red sympathies and associations with communist fellow travelers. “Tricky Dick,” as she dubbed him, won the contest. None of the accusations stuck, but the nickname did.
Hence, when Wisconsin Senator Joe McCarthy was looking for an issue to enhance his reelection prospects, he was aware that Red-baiting worked. He settled on fueling suspicions about an omnipresent conspiracy of “communists” working in the State Department, whom he accused of serving the world policy of the Kremlin and of delivering China to the Russians. When opponents demanded McCarthy show proof, he refused to release any incriminating documents on the grounds they were secret.
The fact is McCarthy never identified a single subversive. His widely scattered accusations led to the dismissals or resignations of scores of talented and loyal Foreign Service employees. His downfall came in 1954, when he went after the Army for promoting a dentist who as a youth had flirted with leftist groups. After Army lawyer Joseph Welch, during nationally televised hearings, confronted McCarthy with the words, “Have you no sense of decency?”, McCarthy’s stock plummeted. Only after untold damage was done was he censured by his Senate colleagues for conduct unbecoming a United States senator.
Believing that more arms were better, Senator John F. Kennedy and his supporters in the 1960 election trumped up public fears by claiming the existence of a dangerous “missile gap” favoring the Soviet Union. The opposite was true. Knowing full well they were behind the United States and fearing the gap would now increase, the Soviets reacted by installing missiles in Cuba. Both sides acting out of fear brought about the most serious threat since the end of WWII – the Cuban Missile Crisis of 1962, whereby the USA and the USSR went to the brink of nuclear war.
Bent on ending the brutality of Saddam Hussein’s murderous regime, President George W. Bush asserted the existence of dubious “weapons of mass destruction” to gain public support for the invasion of Iraq.
More recently, we have “birthers” tapping anti-Muslim and xenophobic passions on the rise since 9/11 to raise doubts about the loyalty and patriotism of Barack Obama and his legitimacy as president. Other fear-mongers bent on denying Obama another four years in office accuse him of being a radical or socialist.
The politics of paranoia advances causes, fills campaign chests, and ignites the passions of a candidate’s supporters. Wild accusations reap the bonus of larger press coverage. Later retractions, when forthcoming, get buried in the back pages.
The paranoid style in American politics has stopped short of the extremes elsewhere – Nazi Germany, the Balkans, and Rwanda, for instance — because our long democratic traditions as expressed in our Declaration of Independence, Constitution, and Bill of Rights remain a constant counterweight. It’s hard to muster support for killing fellow Americans in the name of political disagreement. Quotas, blacklists, suspension of civil liberties, and false imprisonment eventually come up against our image of America as the land of the free.
Sometimes the fear factor is so blatant that hardly anyone falls for it – the wheelchair examples, one would hope. Sometimes it’s cleverly couched in democratic or legal language, making it difficult to detect: stricter voter registration laws meant to limit the turnout of certain groups are disguised as a guard against (non-existent) election fraud.
History suggests that when politicians who know better fail to resist gutter politics, it continues to fester. When President Eisenhower refused to confront McCarthy’s demagoguery, not wanting to stoop to his level and give him even more publicity, the situation went from bad to worse. Eventually, no one was safe from McCarthy’s scrutiny: teachers, librarians, postmen, the butcher, the baker – anyone rumored to be a security risk.
Contrast that with Senator John McCain’s behavior when a supporter at a town hall meeting said she didn’t like Obama because he was an Arab, therefore aligned with terrorists. McCain forcefully replied: “No, he is a decent person and you do not have to be scared of him as president of the U.S.A.”
To combat paranoid politics, individuals must speak up, and newspapers and other media must stop presenting both sides of a story in the name of balance when one of them is demonstrably wacky. The value of the Fourth Estate would be markedly enhanced if it focused more on ferreting out the truth than on giving equal time to all points of view. It does that to some extent in lengthy, award-winning exposés. What we are talking about here, however, is on-the-spot, short and swift rebuttal.
Most of us lack the time, money, knowledge, and know-how to confront irrational fears promulgated by authority figures, and we may run the risk of physical harm, economic retaliation, or psychological retribution. But we are not powerless. When politicians issue extreme and dire warnings, we can take the time to ask, “What’s in this for you?” We have shining examples of individuals, grass roots groups, and national organizations protesting against the purveyors of fear. And we have the one weapon that most strikes fear – realistic fear – into the hearts of all who would lead us astray for their own advantage: the ballot box.
SOME SOURCES AND RESOURCES
Richard Hofstadter, The Paranoid Style in American Politics, and Other Essays (1965)
David Kennedy, Freedom from Fear: The American People in Depression and War, 1929-1945 (1999)
Lawrence Davidson, “Islamophobia as a Form of Paranoid Politics,” Logos, vol. 10, issue 1 (2011)
Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion (2012)
Daniel Kahneman, Thinking, Fast and Slow (2011)
Steven Greenhut, “Fear Pols, Don’t Let Them Scare You,” http://www.lewrockwell.com/greenhut/greenhut55.html
Democracy requires concerned, thoughtful citizens and well-informed voters. Yet the media bombard us with indiscriminate information, producing a feeling of being hurtled forward helplessly without rhyme or reason. Readers, viewers, and surfers need to separate the wheat from the chaff.
Here’s a quick guide to really “reading” the news:
1. Be aware that most oddball stories serve primarily to entertain or shock. They’re marketing tools. Events of long-term significance remain in the headlines for weeks, not days.
2. Be cautious of generalizations or assertions that defy logic. Proponents of the Vietnam War warned if we didn’t defeat the enemy on his turf, we’d have to do it on ours. What were the odds of the minuscule North Vietnamese navy or air force transporting its army to our shores?
3. Be alert to ideological bias and objectives — to proselytize, defend the status quo, gain political office, cover up mistakes, and/or denigrate individuals or groups. Was the agenda offered up by a liberal, a conservative, a libertarian, a feminist or a misogynist, or on Fox or MSNBC? Easiest to recognize are blatantly one-sided polemics, especially when they use inflammatory language to stir emotions and inspire fear.
4. Be certain that the best way to assess a news story’s content is to consult different reputable sources on the same topic – on TV, on-line, AND in print.
5. Be skeptical of expert predictions. In his 2005 book Expert Political Judgment: How Good Is It? How Can We Know?, Philip E. Tetlock showed that when experts were asked to pick one of three options about the probabilities of the United States going to war in the Persian Gulf, they “performed worse than they would have if they had simply assigned equal probabilities to each of the three potential outcomes.”
6. Be careful about accepting statistics at face value or as irrefutable proof. Mark Twain in his inimitable fashion reminds us: “There are lies, damned lies, and statistics.” A Gallup Poll taken March 20-26 showed a 9% shift toward Obama among women 18 to 49. A second Gallup Poll taken at roughly the same time reported Obama’s favorability numbers for the same group fell by 4%. At best, statistics provide a still life of what the subjects thought at the moment. At worst, they frame questions to get the answers desired.
7. Be ready to decode agendas concealed by language, such as the phrase “the American people believe.” “The American people” refers to a non-existent, like-minded group. The phrase avoids serious discussion of disagreements. Individuals “believe,” “understand,” and “demand.” The American people rarely even “agree.”
8. Be wary of linguistic minefields consisting of wishes and opinions stated as fact. Look for words such as would, should, ought and must; phrases and sentences beginning with the likes of it is our duty or our national interest requires; as well as adjectives, adverbs, and especially superlatives. “Ours is the greatest country on earth” is an opinion. “Ours is a country on earth” is a fact.
9. Be tuned to the mindset of those who cite small flaws in a policy to challenge all aspects of that policy. Mistakes made, say, in health care reform must be weighed against the alternative of doing nothing. Mistakes can be corrected. Throwing out the baby with the bathwater is reckless.
10. Be on guard for scapegoating, as when, for example, individuals or groups criticize the federal government non-stop for doing too much or too little, too late or too inefficiently, and/or single out government as responsible for all the country’s problems. Finding a scapegoat assures that the real problem will not be addressed.
11. Be dubious of those who claim that, left to their own devices, people will do the right thing for themselves, their neighbors, and the nation. This outlook perilously ignores American history and all literature on the dynamics of human behavior.
12. Be assured that simplistically diagnosed problems guarantee simplistic solutions. Simplicity may score political points with frustrated constituents, but quick fixes make less urgent the need – and the willingness — to search for more comprehensive and lasting solutions.
13. Be patient with those whose opinions differ from your own, even when what they have to say is upsetting. Pause. Reset. Listen. As has been said about all conversation, the purpose is not to win but to understand.
Thoughtful analysis of the news enables us all to fight the right battles for the right reasons. Well-informed citizens are not swayed by their emotions – or by anyone else’s attempts to manipulate them. They’re too busy really reading the news.
Central to the myths about how the United States conducts its foreign policy is that of America as the divinely blessed, exceptional nation. Exceptionalism differs from patriotism and pride in country as normally expressed by saluting the flag, or singing the national anthem, or giving war heroes their due with parades and funeral honors. Exceptionalism insists on our moral mission to spread far and wide our divinely-anointed values – religious, political, and economic.
Exceptionalism also forces us to create reasons or rationalizations when our actions fall short of reaching the Holy Grail. In our darkest hours, we stubbornly cling to the benevolence of our intentions, even when others perceive them to be self-righteous, narcissistic, arrogant, aggressive, and/or hypocritical.
Replacing exceptionalism with a more realistic foundation for dealing with the realities of a messy world need not undermine our pride in the United States as a force for good in the international community.
1630 – 1980: The Rise of the City on the Hill
We came by the myth of exceptionalism honestly. It predates the birth of our nation. During the 1630 voyage of Puritans to the New World, their leader John Winthrop proclaimed, “We must consider that we shall be as a City on a Hill, the eyes of all people upon us; so that if we deal falsely with our God in this work…we shall open the mouths of enemies to speak evil of the ways of God.”
The Declaration of Independence, our articles of political faith, made the bold and unprecedented assertion that our ideals were universal: all men are “endowed by their Creator” with the rights of “life, liberty, and the pursuit of happiness.” (See last month’s post, “Divided We Stand,” for more about the Founders’ thoughts on religion.) America’s grand vision buttressed President Jefferson’s justification for the Louisiana Purchase (1803) as part of the creation of an “empire for liberty.”
Mid-19th century expansionism invoked “our manifest destiny to overspread and to possess the whole of the continent that Providence has given us.” That mission allowed us to rationalize provoking Mexico into a war that ended in the Treaty of Guadalupe Hidalgo, by which the United States confirmed its claim to Texas and acquired California and an area that included present-day Utah, Arizona, and New Mexico.
Late 19th century efforts at American overseas expansion, already claiming divine approval, added the dimensions of racial and cultural superiority. Clergyman Josiah Strong’s Our Country (1885) found audiences ready to extend Anglo-Saxon culture globally. Senator Albert J. Beveridge, in “March of the Flag” (1898), called for our “free institutions” to “broaden their blessed reign…until the empire of our principles is established over the hearts of all mankind.”
A combination of perceived Providential encouragement, rationalization, and raw power led to a rash of American conquests at the end of the 19th and into the first decade of the 20th century: the overthrow of Hawaii’s legitimate government prior to its annexation, the takeover of Samoa, and a war with Spain that led to the incorporation of Guam, Puerto Rico, and the Philippines. President McKinley defended his decision to take up arms against the Catholic Philippines (1898-1901) by promising to extend Christian influence there (presumably of the Protestant persuasion) in tandem with the blessings of democracy. For a variety of economic reasons, ours and theirs, many a Latin American country saw a visit by United States Marines into the 1930s: Nicaragua, Cuba, Honduras, and Mexico, among others.
Most Americans of the time had few qualms about planting the flag on foreign soil, although they derived no economic benefits from America’s empire. They responded enthusiastically to calls by the jingoistic yellow press to “Remember the Maine, to Hell with Spain,” a reference to an 1898 incident in which an American battleship exploded, supposedly sabotaged by Spain while stationed (illegally) in Havana Bay harbor.
In 1910, at the height of invasive exceptionalism, “America the Beautiful” became our national hymn. It celebrated a country on which “God shed his grace.”
Foreign policy decisions based on exceptionalism can impact the domestic situation in unforeseen ways, and not always for the better. More than once, we’ve been forced to go against our own ideals. Lands obtained by the Mexican Cession heightened sectional tensions over slavery and made it more difficult to retain political structures needed to preserve the Union. The Filipino-American War, in which our troops employed torture and a scorched earth policy and herded civilians into concentration camps, called into question our democratic discourse about freedom, choice, and elections. Wielding a “big stick” in Latin America cast the United States in the role of bully. Teddy Roosevelt fomented a revolution that gave Panama independence from Colombia as a prerequisite to constructing the Panama Canal on American terms, a manipulation that was incongruent with the idea of a nation based on the rule of law.
Consistent with the myth of exceptionalism, President Wilson’s declaration announcing America’s entry into World War I turned what had been a nearly three-year ethnic conflict into a crusade, for its final year and a half, “to make the world safe for democracy.” Ennobling our cause led to demonizing not only Germany but German-Americans and anti-war dissenters, a tendency toward suspicion that spilled over into the Red Scare following the Bolshevik Revolution of 1917. Two years later, the U.S. Justice Department rounded up more than four thousand “radicals” in thirty-three cities in a single day. Many were imprisoned or deported.
For the next 20 years, burned by the war’s failure to live up to our expectations, the United States moved into a neo-isolationist phase of no entangling military alliances, a policy that continued while fascist threats and aggression by Hitler and Mussolini accelerated. Isolationism was exceptionalism in reverse gear, holding America above the doings of the tainted rest of the world.
On February 17, 1941, “The American Century,” a widely-read magazine editorial in Life written by publisher Henry Luce, questioned the country’s retreat from international affairs by reviving traditional American exceptionalism. Luce explained “. . . there is no possibility of the survival of American civilization except as it survives as a world power.” Only the United States offered the world freedom of speech and religion and an end to poverty and misery, he argued. Its duty as a redeemer nation was to exert the full impact of its influence.
Nine months after the editorial appeared, the Japanese attacked Pearl Harbor. Four days later, Germany declared war on the United States. The debate between interventionists and isolationists came to an end.
Post WW II and Onward
Post World War II American foreign policy focused on containing the Soviet Union, which had annexed Latvia, Lithuania, Estonia, and large chunks of Poland. Our efforts were seen as crucial for stopping limitless expansion by a ruthless, atheistic, communist dictatorship driven by a secular messianic ideology. Strategic planner George Kennan’s famous article, “The Sources of Soviet Conduct” (Foreign Affairs, 1947), called on the USA to exhibit long-term, patient, non-histrionic containment of Soviet expansive tendencies.
Yet despite its cool detachment, the article ended with an apostrophe to manifest destiny: “The thoughtful observer of Russian-American relations will find no cause for complaint in the Kremlin’s challenge to American society. He will rather experience a certain gratitude to a Providence which . . . has made their entire security as a nation dependent on . . . accepting the responsibilities of moral and political leadership that history plainly intended them to bear.”
We thanked Providence by providing money and arms to Greece and Turkey to ward off communist subversion and, via the Marshall Plan, by rebuilding European economies as a buffer against communist victories at the ballot box. NATO was established, and NSC-68, a 1950 National Security Council policy paper, called for a major peacetime military buildup. The price of liberty now meant eternal vigilance everywhere, all the time.
The costs of the Cold War for the United States were not only financial. The crusader mentality unleashed a second Red Scare directed against alleged communist sympathizers and agents who had supposedly planted themselves in the State Department and throughout the foreign policy establishment. Civil liberties took another big hit from the tactics employed by the House Committee on Un-American Activities and the McCarthy hearings. Thousands of Americans, especially those in the entertainment industry, were blacklisted. Some never recovered from the emotional and financial devastation.
Fearful of nuclear attack and threatened by a “doomsday clock” creeping ever closer to midnight, children practiced duck-and-cover drills in schools, families constructed air-raid shelters, and the military-industrial complex grew and flourished, often to the detriment of domestic needs, an imbalance that remains in dispute to this day.
City on a Hill imagery continued through the late 20th century and into the 21st. Ronald Reagan spoke of America as “a shining city upon a hill whose beacon light guides freedom-loving people everywhere.” In addressing the nation after 9/11, President Bush suggested: “America was targeted for attack because we are the brightest beacon for freedom and opportunity in the world.” Mitt Romney, the 2012 Republican presidential contender, claimed: “In an American Century, America leads the free world and the free world leads the entire world.”
Of Silk Purses and Sows’ Ears
Some exceptionalists deny the United States was ever an empire. Or, at worst, it was a liberal one that promoted independence under representative forms of government and intervened for humanitarian reasons to alleviate what scholar Michael Mandelbaum referred to as the “palpable sufferings of peoples inflicted by their own governments or as a result of the absence of effective governments,” for example, in Iraq, Somalia, and Bosnia. Mandelbaum argues that “if America is a Goliath, it is a benign one.” The United States, he concludes, acts more like a world government than an empire by providing security, global access to oil, currency stability, and flourishing free trade that preserves peace.
Whatever the virtues of American foreign policy, and there are many, the myth of exceptionalism bites it in the ass when it encourages democratic nation-building in places lacking all of the prerequisites:
* a market-driven economy, in which a large middle class has sufficient economic and political clout to force the government to take its interests into account;
* past experience in representative government, in which the people voted for candidates from established political parties and accepted the legitimacy of opposition groups;
* a lengthy common history, shared culture, ethnic homogeneity, or a long tradition of peaceful tolerance.
Post WWII Germany and Japan each possessed some of these prerequisites; hence, their American-supervised transitions to democracy succeeded.
By contrast, Vietnam, Iraq, and Afghanistan were artificial entities, created by agreements among foreign occupiers. They endured centuries of rebellion, social unrest, coups, assassinations, and factions dominated by local warlords. Primary allegiances belonged to tribe, sect, or religious group, not nation. The U.S. plunged ahead anyway, confident our “can do” uniqueness could defy history. Truth took a back seat to rationalization in the form of dubious intelligence reports of unprovoked attacks on the USS Maddox in the Tonkin Gulf and the presence of weapons of mass destruction in Iraq.
Because the normal rules of war did not apply in these places, the use of napalm, indiscriminate bombing, IEDs, waterboarding, and other kinds of torture (Abu Ghraib) made it increasingly difficult to retain the myth of anything exceptional in America’s operations on these battlefields, Afghanistan among them, where we remain entrapped in the longest war of our nation’s history. Yet retain the myth we do, even while, as a nation and as individuals, we stagger under the domestic costs and repercussions.
History has been trying to teach us something, and it’s not that the United States is no different from other nations in conducting its foreign policy, or that it’s even worse. It’s that American exceptionalism has limits. Removing a murderous thug like Saddam Hussein from power, for instance, may be desirable on humanitarian grounds, but given the world’s inexhaustible supply of vicious tyrants, we cannot take on all of them. We cannot be all things to all people. We’re not infallible. Our resources are not unlimited.
And even when we succeed abroad, the cost may be too great. In a 1953 speech before the American Society of Newspaper Editors, President Dwight D. Eisenhower spelled it out clearly: “Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and not clothed. This world in arms is not spending money alone. It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children. This is not a way of life at all in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron.”
Not sending large numbers of troops off to war does not rule out other approaches to conflicts: diplomacy, for one, but also air power, including drones; economic boycotts; and marshalling support from the international community. These kinds of tools helped rid Libya of strongman Muammar Qaddafi.
We need to tone down references to our exceptionalism, which others see as bragging or condescension. We need to acknowledge that many of our foreign interventions rested not on a divine calling but on self-interest and/or racial and cultural ethnocentrism and did not improve the daily lives of the needy. We cannot assume “God’s blessing” for America exists in perpetuity no matter what we do.
Foreign policy is one area in which individual citizens can do little but react after decisions have already been made. Still, as citizens, we need to be wary of politicians who, in the name of exceptionalism, ignore the full story of American conduct in world affairs, its strengths and weaknesses, its virtues and vices. We can look for the motivation behind the exhortation. We can admit that no matter how divine the inspiration may be, we remain human, with all the potential for error that implies. We can choose to be clear-sighted in our humanity, and we can demand clear-sightedness of our leaders.
May the truth be with us.
SOME SOURCES AND RESOURCES
William Pfaff, The Irony of Manifest Destiny, 2010
Rajiv Chandrasekaran, Imperial Life in the Emerald City, 2006
Michael Mandelbaum, The Case for Goliath, 2005
George C. Herring, America’s Longest War, 1985