Government shutdowns, contrary to popular perception, are not rare occurrences. Between 1976 and 1996, the United States government partially shut down 16 times.
The 1995–96 shutdowns lasted a combined 27 days. They followed President Clinton’s veto of a Republican budget that called for substantial cuts to social programs, including Medicare and food stamps. Confrontations among politicians then exhibited the same fierce, bruising partisanship and rancor now in evidence. In the end, Republicans, who controlled both houses of Congress, blinked and accepted Clinton’s proposal to eliminate the federal deficit in seven years. They are in a weaker position this time around because they do not control the Senate. It’s likely they will blink again, but less likely they will publicly acknowledge doing so.
It makes sense to regard the current shutdown as a continuation of its 1995–96 predecessor. At that time, Republicans attacked the Democratic incumbent for epitomizing all that was wrong with a morally bankrupt, left-leaning boomer generation. His enemies depicted Clinton as a profligate big spender on wasteful social programs that fostered America’s dependence on big government at the cost of staggering, initiative-stifling deficits. They succeeded in killing Clinton’s plans for massive reform of health insurance.
Emboldened by sweeping gains in the 1994 midterm elections, Republicans signed on to Newt Gingrich’s “Contract with America,” which called for a balanced-budget/tax-limitation amendment. A poor showing in the 1998 midterm elections – Republicans lost seats in the House and gained none in the Senate – led to Gingrich’s resignation, first from his speakership and later from the House itself.
Somehow, that lesson was lost on Republicans. They had underestimated Clinton, who in 1995 surprised them and liberal Democrats alike by calling for a middle-class tax cut and a balanced budget within ten years, while speaking out against violent television content and calling for tough police responses to crime. Thus he blunted the Gingrich offensive, and by achieving a fiscal surplus in 1998 he won over business leaders and investors. Nor was he damaged by the Monica Lewinsky scandal, which irked Congressional Republicans more than it did the American public, which took a more permissive attitude.
The same kind of personal hostility has carried over to Obama, in part because he is a black man. His American birth, despite impeccable confirming evidence, has been challenged by Republicans, most vociferously by Donald Trump. It was not just Obamacare but his entire domestic legislative agenda that was deemed extremely leftist, if not downright socialist, from the start.
Obama’s 2012 victory included Democratic gains in both the House and Senate, in spite of non-stop criticism of Obamacare, already passed into law in 2010. Republicans refused to treat his decisive victory (by more than 4,000,000 votes) as any kind of mandate for the Affordable Care Act. The House passed more than 30 resolutions calling for its complete repeal, all of which died in the Senate. As justification for their intransigence, House members selectively cited polls showing that a majority of Americans did not support Obamacare, while ignoring those showing that even larger majorities did not believe the implementation of Obamacare justified shutting down the government.
Hostility to the man and the program led them to ignore or obfuscate the fact that Obamacare strongly resembles plans suggested by the conservative Heritage Foundation that called for the purchase of policies from private insurers, and were implemented (reasonably well) by a certain Republican governor of Massachusetts. Republican demands to attach the defunding or delay of Obamacare as a condition for approving a budget rested less on its cost (in actuality the program was designed to be self-funding) than on fear that its possible success would be a major victory for a Democratic president and an imagined Big Brother America.
Unlike the crisis of 1996, the present situation carries the added burden of dealing simultaneously with the budget and with raising the debt ceiling before the Treasury’s borrowing authority runs out in mid-October. Virtually all economists agree that the consequences of the country’s failure to pay its financial obligations would be catastrophic, both domestically and internationally. Even Speaker Boehner, who thus far has refused to allow a House vote on a clean budget bill, says he will not allow a credit default. On the other hand, Senator Tom Coburn asserted, “I would dispel the rumor that…if we don’t raise the debt ceiling, we will default on our debt.”
Most critics argue that a small group of radical Republicans – Ted Cruz (Texas), Mike Lee (Utah), and others – is responsible for the Republican strategy of refusing to pass a budget or raise the debt ceiling unless Obama abolishes, delays, or significantly alters his health insurance program. In this telling, the zealots have overcome the moderate wing of the party and forced it to go along with a strategy of holding the government hostage to major concessions on health care.
This assessment has validity, but it is not the whole story. Obstructionism antedates Obamacare’s passage into law. Senator Mitch McConnell, supposedly less extreme than Tea Party leaders, proclaimed that the party’s most important objective was to do anything possible to prevent Obama’s reelection. The 2012 Republican Party Platform, agreed to by all Republican members of both houses of Congress, stated, rather inelegantly, “In our view the entire act before us is invalid in its entirety.” It is hardly likely that the platform was drafted exclusively by radical Republicans.
The defunding strategy emerged shortly after Obama began his second term, when a loose-knit group of political activists and organizations, including former Reagan Attorney General Edwin Meese III (who resigned in 1988 amid an independent counsel investigation) and Americans for Prosperity, sponsored a “blueprint to defunding Obamacare.” (Sheryl Gay Stolberg and Mike McIntire, “A Federal Budget Crisis Months in the Making,” New York Times, October 5, 2013.) This influential group worked closely with Tea Party representatives to target “lukewarm” conservatives and bring them onto the team, while also working against the reelection of moderates who refused to play ball.
The distinction between radical and moderate Republicans is blurred right now, more one of tone than of substance. Even centrists, such as Pennsylvania Representative Pat Meehan, who urged Speaker Boehner to allow a vote on a clean budget, stopped short of working with Democrats on procedural maneuvers to require a clean spending bill. Moderates’ inaction and retirements have made it easier for the party’s radical wing to create the poisonous political atmosphere now prevailing and to implement the strategy of rendering the federal government irrelevant. (For more on the shifts, see Nate Silver, “Moderate Republicans Fall Away in Senate,” FiveThirtyEight, New York Times, May 8, 2012.) Too many moderates stood on the sidelines while extremists established the modus operandi of “just say no” to everything Obama requested.
That said, the most likely scenario is that Republicans will eventually agree to end the government shutdown and raise the debt ceiling. These “concessions” will be part of a broader deal that also ends sequestration (automatic spending cuts across categories of federal outlays) in return for Obama’s “concessions” on long-term changes to Social Security, Medicare, and the tax code, and, maybe, on taxing medical devices. The sad thing is, Republicans would have gotten the same give-backs even without the strategy of brinkmanship. Once again, that is not the lesson they will draw.
Short term, the Tea Party and its financiers (the Koch brothers and others) will continue to influence a small number of House and Senate elections. Longer term, demographics – the browning of America – will work against them. American history indicates that groups or movements dominated by ideology, either on the right or the left, have short shelf lives. Alienating tactics will not win over sufficient numbers of libertarians and independents to give the movement a stable, majority status. The pendulum swings, but Americans have always tended toward the center.
Neither Ted Cruz nor his clones will get the 2016 Republican presidential nomination. They have angered too many Republicans through personal attacks, and more importantly, the GOP knows the extremists can’t win. If this prediction is wrong, and an extremist does become the Republican nominee, look for an outcome similar to the 1964 Barry Goldwater bid – a Democratic Party landslide.
Unless Republicans chart a different course, one that offers positive appeal rather than mere repudiation of the other party’s programs, they will remain stuck in the wilderness of presidential politics. Sadly, until that time comes – and it will, if for no other reason than they want your vote – the American people will continue to bear the burdens of intransigence.
There are many thoughtful arguments for not going into Syria, but they are not beyond challenge.
Military actions, even the best ones, often do not go according to plan. True, but some limited military operations do achieve their objective. Witness the successful one to take out Osama bin Laden, despite the breakdown of one of the choppers used in the operation.
The defining line for the Obama administration is the use of chemical weapons. Why is this more objectionable than chopping off limbs with machetes, systematic rape, and shooting protestors? The Geneva Protocol of 1925, signed by nearly all nations, including Syria, banned the use (but not the manufacture) of chemical and biological weapons. Death by gassing has long been universally considered particularly gruesome and loathsome. The same sentiment persisted in the 1993 Chemical Weapons Convention, ratified by 189 states – not Syria – which banned the use, manufacture, and transfer of chemical weapons in the aftermath of the 1988 chemical bombardment of the Iraqi city of Halabja that killed between 3,200 and 5,000 people and wounded thousands more.
It is foolish to act in Syria to maintain the credibility of Obama because he drew a red line warning the Syrians of consequences if the regime used chemical weapons. He has also drawn a red line on Iran possessing nuclear weapons. Why should Teheran believe him if he reneges on his pledge to punish Assad for using chemical weapons?
Faulty intelligence about the existence of weapons of mass destruction was used to justify the invasion of Iraq. Questionable intelligence is not the issue in the Syrian case. Few, other than Assad himself, would deny that his government possesses massive stockpiles of chemical weapons and has used them against its own people.
The second invasion of Iraq was followed by mission creep that led to a ten-year civil war, the deaths of thousands of American soldiers and many more Iraqis, and a still uncertain final outcome that is sure to be disappointing. President Obama understands the difficulties of creating multi-ethnic democracies in Iraq and Afghanistan and has clearly stated he has no intention of attempting to do so in Syria. The operation calls for targeted strikes of short duration against some poison-gas delivery systems, and a promise of no boots on the ground under any circumstances. It is not, as Thomas Friedman’s article asserts, the “Same War, Different Country” (New York Times, September 7, 2013). Surprisingly little attention has been given to a different analogy that may be more appropriate to the current situation: appeasement. The failure to respond to Hitler’s pre-WWII aggression led him to up the ante.
We can shame Assad into changing his behavior. The historical record overflows with tyrants who changed only when faced with serious consequences for bad behavior. Persuasion, diplomacy, pleas, and shaming don’t work with dictators whose unshakeable objective is to stay in power no matter the costs and consequences.
Instead of a military response, we should lobby the U.N. to set up an International Criminal Tribunal for Syria, as it did earlier for Rwanda and the former Yugoslavia, and to start proceedings against Bashar al-Assad and his thugocracy for crimes against humanity. Let’s do it, but the process would take years.
Arming the rebels is preferable to aerial assaults. But we cannot arm only “the good rebels” and guarantee that weapons will not also fall into the hands of the bad guys, including Islamic extremists and pro-jihadi fighters.
There is little strategic rationale in bombing Syria’s chemical weapons stockpiles. There is no way of limiting their delivery systems either. Obama’s proposal does not call for bombing chemical stockpiles, which would be disastrous. Taking out some delivery systems does limit future usage.
A military attack likely would strengthen the Syrian regime. Does this mean rebels would rally around Assad? Or that Alawites, Shiites, Sunnis, and Kurds would forget the ancient grievances among them and rush to defend “their country?” Come on.
Even the president admits his proposed mission will not change the outcome of the civil war that has already cost more than 100,000 lives and created over two million refugees, so why bother? This is not about whether or not to interfere in a civil war. It’s about demonstrating the consequences of using chemical weapons. Only the United States is in a position to take a stand against what a vast majority of the world’s population regards as reprehensible, immoral, and unethical conduct.
American action may lead to retaliation by Syria and its allies, including direct attacks on Israel, as well as reverberations in other Middle East states, where American interests risk becoming the targets of those who already resent the American presence in the region. But all of these things occurred before the current situation arose. Benghazi-like attacks and terrorist threats, such as those that led to the recent closing of nineteen US embassies and consulates, will continue whatever we do or do not do in Syria. Inaction will not increase our security or our popularity in the Middle East. It is very unlikely Syria will go after Israel if the US strikes: following Israeli bombings of suspected Syrian nuclear sites in 2007 and 2011, the Assad regime did nothing. Assad has enough on his hands without taking on external confrontations.
The president can’t declare war without Congressional approval. There has been no Congressional declaration of war since 1941. Yet plenty of presidents have undertaken military actions without such approval. Recent interventions include Lebanon, Grenada, Haiti, Somalia, and Zaire, to name but a few. The 1999 bombings in the Balkans were ordered by President Clinton without Congressional, UN, or public approval, but they did bring Serbia to the negotiating table.
Only 35% of the public supports US military action in Syria. That’s according to the latest poll as of this writing. Public opinion is fickle. A Washington Post poll on September 20, 2012, found 63% of Americans favored involving the United States military if Syria used chemical weapons. How the operation ultimately fares can lead quickly to another major change in public sentiment.
U.S. attacks against a sovereign nation without provocation or the endorsement of the United Nations are a violation of international law. So is the use of chemical weapons. Threatened vetoes by Russia and China, themselves past violators of the U.N. prohibition of non-defensive wars, have rendered the Security Council helpless in the Syrian situation. Despite the considerable good it does, the U.N. cannot act without the support of its most powerful members, even when they’re dead wrong.
Past deceptions and lies – like those about the North Vietnamese navy firing on the destroyer Maddox, or Iraqi soldiers pulling Kuwaiti babies from their incubators – render suspect everything our government tells us. A healthy dose of skepticism is warranted, but that’s different from a cynicism that rejects all government reports as untruthful, to the point of immobilizing us from acting when we need to.
Like those who oppose military action, President Obama and those who support his stand would prefer not to go into Syria. That may now be possible. Whatever their arguments for or against intervention, no one predicted – or even considered – that Syria might agree to destroy its chemical weapons to ward off targeted attacks against them. As the discussion moves forward, let it not proceed on the basis of false, incomplete, or ideologically driven information.
Reposted from Shea Magazine, http://www.sheamagazine.com.
In his recent address on foreign policy, published in the New York Times on May 23, President Obama announced that fundamental changes were called for in the assumptions that have kept the United States at war for more than a decade, during which time 7,000 soldiers have made the ultimate sacrifice and we have “spent well over a trillion dollars on war, helping to explode our deficits and constraining our ability to nation-build here at home.” The policy he follows in the current Syrian bloodbath, where more than 70,000 people have been killed, mostly by government forces, will test whether the president’s words are matched by his actions.
In his speech, Obama rejected an outlook dating from the Vietnam War, one he more or less followed during his first administration. According to international affairs expert William Pfaff (The Irony of Manifest Destiny, 2010), this outlook featured an “inveterate American policy of direct intervention in the internal affairs of small non-Western countries, usually mistakenly believed to be victims of some global menace aimed at the U.S., countries incapable of looking after their own affairs or forging their own identities.”
In the Vietnam conflict (part civil war and part North Vietnamese invasion), the global enemy was worldwide communism, supposedly bent on limitless expansion and directed by Russia and China acting in collusion through their Viet Cong and North Vietnamese proxies. According to the containment argument, if not confronted in Southeast Asia, communist forces would soon reach our shores.
Fifty-five thousand American deaths, a soaring national debt, the Cambodian genocide, and the communist victory subsequently cautioned the United States, at least for a little while, against large scale military intervention in what were mostly nationalist-xenophobic, ethnic and religious conflicts that posed no direct danger to our national security.
That changed after 9/11, when the Bush administration launched what it called the “war on terror.” “Terror” and “terrorism” do not exist as historical entities, but only as qualities attached to human acts. The labeling matters. Putting the focus on an amorphous, ill-defined quality distracted from the more doable objective of going after the specific groups and individuals that sought to kill Americans, everywhere and anywhere, by all means possible.
The war on terror also lumped together al-Qaida and the Taliban, implying, falsely, that the two were one and the same. Al-Qaida’s original goal was revenge for American support of a detested Saudi Arabian monarchy, which sponsored an unacceptable Wahhabi interpretation of the Koran. It is no coincidence that Osama bin Laden was a Saudi. The Taliban, by contrast, which emerged from the factions the U.S. had supported after the Soviet Union invaded Afghanistan, sought to impose its own ultra-conservative reading of that holy book on its own country, Afghanistan.
Lastly, Bush’s grand vision ignored differentiations among 1.5 billion Muslims spread out over thirty countries, from the Arabian Peninsula to Africa’s Atlantic Coast to the Balkans, Turkey, and Indonesia, of whom only 20% are Arabs. Moreover, most Muslims are not, per se, anti-American; when they are, it’s usually in response to the invasions of Iraq and Afghanistan and the “collateral damage” that has befallen ordinary people.
A very small number of extremists have sought retribution for the deaths and destruction inflicted on their countries by American armed forces and allies through indiscriminate killings of innocent civilians, as in the train bombings in England and Spain. Such tactics came about because these groups were too small to win a battlefield conflict; they rationalized their conduct by pointing to the death of innocents in their own homelands. (So did Timothy McVeigh, who cited the FBI assault on the Branch Davidian complex near Waco, Texas, when he blew up a federal building in Oklahoma City, killing 168 people – a toll from terrorism on American soil second only to that of 9/11.)
The Bush team also assumed American democratic values would be universally welcomed by all who lived under cruel and despotic regimes. In a Foreign Affairs article written in the aftermath of the 2003 Iraq invasion, Secretary of State Condoleezza Rice asserted that “democratic state building is now an urgent component of our national interest,” and that “the democratization of Iraq and the democratization of the Middle East are linked.” This insistence on “nation building” in countries about whose histories and culture we were ignorant and dismissive was considered arrogant and condescending by their people.
In his speech, Obama called for the United States to shift its emphasis from a depleted al-Qaida to more localized threats, so as to prevent attacks like those in Benghazi and on the oil facilities in Algeria, or those carried out by radicalized individuals – some of them American citizens – such as the Fort Hood shooter, who killed thirteen people, and the Boston bombers. Essentially he called for reimplementing the pre-9/11 strategy of “a series of targeted efforts to dismantle networks of violent extremists that target America” (Pfaff), an approach in which counter-terrorism is handled primarily by law enforcement and intelligence agencies. The shift would keep the USA from being drawn into wars it does not need to fight, like those in Vietnam, Iraq, and Afghanistan.
What Obama could not say, because of the likely disastrous political consequences, is that the end games in Iraq and Afghanistan will turn out badly and very badly, respectively: continued sectarian violence in semi-democratic Iraq, and warlord domination outside of Kabul, where an American-supported, pervasively corrupt leadership holds sway. Such naked honesty would instigate a vast outpouring of public anger and lead to an endless series of investigations into what went wrong and who was responsible.
As their situation on the battlefield worsens, Obama is under considerable pressure to arm the rebels in Syria, many of whom belong to unsavory groups and share in common only a desire to topple Bashar al-Assad. Senator John McCain has called for arming them and establishing no-fly zones. Others have urged putting boots on the ground, predicting that a “decisive rebel victory in Syria would constitute a major setback for Iran, since Syria…has always been Iran’s most reliable pathway to its proxy, Hezbollah.” (Ray Takeyh, New York Times op-ed, May 27th.) Most recently, Bill Clinton has come out in support of McCain’s position.
As of this writing, Obama has not given his support for either proposal. Lessons were learned from our Vietnam intervention. And forgotten. Iraq and Afghanistan have provided costly reminders. Do we need yet another wake-up call? No, but the pressure on Obama indicates we may be about to get one.
Reposted from Shea Magazine: http://www.sheamagazine.com.
In a February syndicated column, Pulitzer Prize-winning author Cynthia Tucker called for an end to Black History Month. Separating black history from America history, she said, minimizes “the myriad ways in which black Americans’ accomplishments are part of the national mosaic [by making] the contributions of a few well-known black men and women seem like a historical exception.”
That someone of Tucker’s stature thinks it’s time to end the month-long recognition of black history indicates how far our country has come in weakening its racist heritage. And yet ending Black History Month won’t improve the fit of black history into the national mosaic, because so much of it is exceptional. Only blacks lived under the worst America had to offer, maintained faith in America’s promises, and pushed our country to live up to its noblest ideals as articulated in the Declaration of Independence.
Without Black History Month, the major source of information would be textbooks. But no American history textbook to date has managed to sustain the narrative of the different but related experiences of slavery and segregation, the legal struggle to eliminate them, and the creation of an America where truly all human beings are created equal, and to weave all of that into one comprehensive national story. Among the better attempts, America and Its People: A Mosaic in the Making (many editions) often makes good on its promise “to underscore the pivotal role that ethnicity, race, and religion have played in our nation’s social, cultural, and political development.” Yet the black story vanishes for a hundred-plus pages — more than once! — suggesting how difficult it is to braid the various strands.
Only recently did American history textbooks incorporate black history. Until the 1960s, the narratives were essentially exclusionary. Where blacks did appear, it was usually to describe their inferior social position. The omission was not accidental. Blacks had no place in a grand narrative of an America whose democratic Teutonic origins gave rise to the “village heroes [at Lexington and elsewhere] who were more than of noble blood, proving by their spirit that they were of a race divine.” (George Bancroft, History of the United States from the Discovery of the American Continent, 1858)
Nor did blacks have a place in the frontier experience lauded by historian Frederick Jackson Turner for establishing the quintessential American characteristics: toughness, resourcefulness, individualism, and a predisposition for democracy. (“The Significance of the Frontier in American History,” 1893)
Nor was there any place for blacks in the accounts provided by Progressive historians in the early twentieth century who presented the story of America’s march toward freedom as a whites-only endeavor. One of the best known, Charles Beard, mocked “the Negroes’” part in this splendid endeavor as having been “ludicrous if they had not been pitiable.” (American Government and Politics, 1911)
Only in the late 1940s did historians begin to study slavery from the slave’s point of view and incorporate into their assessments slave narratives written before and during the Civil War, Harriet Jacobs’s Incidents in the Life of a Slave Girl (1861) among them. On the one hand, these narratives emphasized the deprivations of slavery in lurid scenes of horror and violence and provided grist for abolitionists in their fight against the “peculiar institution.” On the other, they revealed a resilient culture in the slave quarters, and a people who managed to build a vibrant society, hidden from whites, with its own music, religious practices, and loving relationships to sustain their dignity and hopes. (John Hope Franklin, From Slavery to Freedom, 1947)
Despite these revelations, and the 2,300 oral slave narratives collected by the Federal Writers’ Project in the 1930s, old stereotypes persisted. In rejecting portrayals of slavery as a system in which fellowship between master and slave was “characterized by propriety, proportion, and cooperation” (Ulrich Phillips, American Negro Slavery, 1918), Stanley Elkins erred in the opposite direction by asserting that slaves came to assume their masters’ view of them as “Sambos.” (Slavery, 1959)
The 1960s saw the study of black/African American history become an academic staple. At its worst, it encouraged reverse separatism, for example, the idea that only blacks were qualified to teach black studies. At its best, it refuted older notions that equated black history with what whites had come to believe was true: that blacks were passive objects to whom things happened. (C. Vann Woodward, Presidential Address to the Organization of American Historians, 1969)
Today, the writing of black history remains a work in progress. To end Black History Month and leave it to textbooks alone to convey the tribulations and the triumphs that make up the historical experiences of this group of Americans would be unwise. Black History Month provides information and affirmation via lectures, workshops, concerts, plays, poetry readings and remembrances, many led by prominent blacks from all walks of American life, an appreciation unmatched by any American history textbook.
Abolishing Black History Month will not lead to a more integrated national mosaic, but will put out of sight, and out of mind, this still incompletely understood American story. In the transition of the United States from a racist to a racialist society, one that tolerates and even celebrates differences, black visibility and acceptance have increased significantly. Despite these changes, ours is still a country where a radio host described the Rutgers women’s basketball team with a racial slur and a university president touted the Three-Fifths Compromise, in which slaves were counted as three-fifths of a person for purposes of representation and taxation, as a shining example of how a polarized people could come together (New York Times, February 24, 2013).
We need more, not fewer, ways to keep the black experience in the public spotlight from which it was excluded for far too long.
Reposted from Shea Magazine, http://www.sheamagazine.com.
The American debate on abortion will not be decided by legal rights or moral wrongs. These kinds of arguments, strident though they may be, have varied at different points in American history. Answers to “the abortion question” in our country have always been influenced by a less audible background chorus of influences: shifting economic, psychological, and political considerations.
Until the mid-nineteenth century, abortion was legal, at first in the colonies, and later, in the states, a choice supported by the general population, politicians, and most churches. Colonial Americans, including seventeenth century Puritans, made little distinction between spontaneous and induced abortions before quickening, defined as the moment when the mother first felt the fetus move, the only sure way to tell if a woman was pregnant. According to an article by James Mohr in Women’s America: Refocusing the Past, after quickening, usually at the midpoint of gestation, “the expulsion and destruction of a fetus without due cause was considered a crime because the fetus itself had manifested some semblance of a separate existence: the ability to move.”
Abortions, performed primarily by midwives, took place covertly not because the termination of pregnancy itself was deemed blameworthy but because it was seen as an extreme action designed to hide a prior sin—sex outside of marriage. Strikingly absent from public opinion was outrage over the destruction of the fetus or denunciations of those who would arrest nature’s course. Adultery, on the other hand, was cause for damnation.
Beginning in the 1820s, state laws made abortion illegal, a process complete by the closing decades of the century. Doctors supported the bans on the grounds that abortions were immoral and dangerous. Less touted was their interest in preserving their exclusive rights to practice medicine, a monopoly that was being increasingly challenged by midwives and homeopaths.
Anti-abortion sentiment also came from the growing fear that higher birthrates of newly arrived immigrant women threatened to overwhelm the Anglo-Saxon population. And yet, almost in defiance of the prohibition, between 1840 and 1880 the number of abortions among married, native-born Protestant wives of middle- and upper-class standing markedly increased. These women turned to abortion—usually with the agreement of the spouse—to postpone family responsibilities. They could afford to pay midwives and doctors willing to perform the now illegal procedure.
Criminalization of abortion and of information about it did not reduce the number of women who sought to end an unwanted pregnancy; it just denied them access to the services readily available to their colonial sisters. On the eve of the Civil War, around 160,000 abortions were performed annually among a population of 30 million (James Mohr, Abortion in America, 1978; Marvin Olasky, Abortion Rites, 1992), a proportionately higher rate than today's, which has leveled off at 1.2 million per year out of a population of 307 million.
The majority of nineteenth-century abortions involved poor women. Many of these abortions were self-induced or performed by back-alley practitioners operating under unsafe and unsanitary conditions. Mortality rates were high. Techniques included knives, knitting needles, and sticks; untested drugs, herbs, and chemicals; and horseback riding. It was also believed abortion could be induced by jumping high enough so that one's heels touched one's buttocks. Moral and legal considerations, and even the possibility of death, were of minor importance to those in desperate circumstances undertaking desperate measures.
In 1873 Congress passed An Act for the Suppression of Trade in and Circulation of Obscene Literature and Articles of Immoral Use, better known as the Comstock Act, named after its chief lobbyist. The legislation, which outlawed the dissemination of information on any form of birth control, reflected growing worries about increases in pre-marital and extra-marital sex and the changing status of women, more and more of whom were entering the workforce and challenging their role as exclusively wives and mothers. Moreover, as mentioned above, middle class, white women were just not producing enough “true American” babies.
For the next hundred years or so, abortion remained illegal despite the efforts of birth control advocates like Margaret Sanger, a tireless crusader for women’s health issues from the 1920s until her death in 1966. The sixth of eleven children born to a mother who went through eighteen pregnancies and died at the age of fifty, Sanger founded the American Birth Control League in 1921 to distribute information (illegally) on contraception to doctors and social workers, but also to mostly poor women who had no choice but to carry through unwanted pregnancies forced on them by their husbands’ sexual demands. Even the courageous Sanger had her shadow motives: she was sympathetic to the eugenics movement’s efforts to prevent the wrong kinds of women – immigrants, blacks, and the poor – from diluting Anglo-Saxon dominance of the population.
Birth control and abortion issues were not the priority of early twentieth-century feminists. Obtaining the right to vote dominated their agenda, while the Great Depression and WWII led them to focus on economic matters, including jobs lost after the “boys” returned home. The big call to legalize abortion came from the feminist movements of the 1960s as one item on a lengthy to-do list. The right of women to control their own bodies is best seen as part of an ongoing, broader agenda of economic and legal equality. Their efforts influenced the landmark 1973 decision in Roe v. Wade, whereby the Supreme Court allowed unrestricted abortions in the first trimester. (Only as recently as 1965 had it struck down state laws prohibiting married couples from practicing birth control, and in 1972, afforded the same freedom to unmarried couples.)
Opponents of Roe v. Wade saw it as the final blow in a series of attacks against the familiar way of life in which women played their roles as wives and mothers, and shunned divorce, promiscuity, and non-traditional family structures. A common response when one’s belief system is threatened is not to rethink the belief system but to defend it more and more vigorously.
In the 2011 Republican presidential debates, the candidates emphasized their “pro-life” credentials, and nearly all of them pledged to cut off funding for Planned Parenthood, the nation’s largest abortion provider (despite the fact that 97% of its budget goes to providing general healthcare services for its often poor clients). One of them went as far as to condemn the practice of contraception as a “license to do things in a sexual realm that is counter to how things are supposed to be.”
The Republicans lost the election of 2012 largely because women voters rejected the hard line they took on women’s rights, especially on abortion, including requiring mandatory invasive ultrasound examinations for those choosing to terminate a pregnancy. Given the tendency of factors other than legal and moral arguments to determine America’s stand on abortion, we can predict that politicians running for office in 2016, including those on the right, will be heeding instead the subtler beat of women marching toward the voting booths.
The following post is reprinted from Shea Magazine, http://www.sheamagazine.com.
The problem with many of the current attempts to interpret the Second Amendment is that nothing in it actually applies to the arguments we’re having today. Can individuals own guns? If so, which individuals, which guns, for what purpose, and how many? Can governments control the buying and selling of guns? If so, should that be done by the States or the Feds? Both? Neither?
The Framers said nothing in the Constitution or in the Bill of Rights that speaks to any of that because they weren’t thinking about any of that. In their time, guns were accepted as a part of life, same as shoes. There was hunting to do if one was to eat, there were Indians to fend off if one was to survive, there was land to grab away from said Indians if — well, that’s another story.
At the time of the Constitutional Convention in May of 1787, the Framers had their own problems to deal with, and they had nothing to do with the buying and selling of guns. The Articles of Confederation were not working. The Federalists saw the need for a stronger central government. They wanted a Constitution and they were willing to compromise to get it. One significant example of the need for a stronger central government still present in their minds was that while Washington begged for an army adequate to the defense of the new nation against the British, state militias showed up or not on whim, with or without sufficient training. And no one had the power to do anything about that. Another, more immediate example was Shays’ Rebellion of 1786. A complex uprising in western Massachusetts, it was seen as a threat to the young country’s existence and could not be put down by the available local militias.
The anti-Federalists feared too strong a central government. They – and the Federalists as well – bore the memory of Britain’s standing army in the colonies, an army that marched on Concord to confiscate colonial arms. Many of the Framers had a hand in the Declaration of Independence’s assertion that citizens must be prepared to overthrow any and all tyrannical governments. Hence, while the Federalists had good reason to want to centralize power, the States had equally good reason to want control over their own militias. (That those militias were sometimes indistinguishable from slave patrols in the Southern states is true, but the perceived need for state militias went well beyond this one use.)
An agreement was worked out. In simple terms: They all accepted the fact that the country needed armed and well-regulated militias available on call to serve in the national interest, “to insure domestic tranquility” and “provide for the common defense.” They all understood that there were States that would not agree to a nationally-controlled militia. To push the Constitution through, the approval of those States was necessary. So the deal was that the States would train, officer, and maintain their own well-regulated militias, but would make them available nationally as needed. The national government, in turn, would have no power to mess with State control of those militias by calling them up for duty and then disbanding or disarming them from on high.
The Framers fussed with the language: “Father of the Constitution” James Madison’s early wording read, “The right of the people to keep and bear arms shall not be infringed; a well armed and well regulated militia being the best security of a free country, but no person religiously scrupulous of bearing arms shall be compelled to render military service in person.” The eventually ratified version stated, “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.”
The right to “keep arms” referred to owning them, while the right to “bear arms” referred to carrying them in the context of serving in a well-regulated militia. Why was the conscientious-objector clause stricken from the final version? We don’t know for sure, but it’s likely there was no perceived need for it in solving the problems of that time. The deal being worked out was between the federal government and the States, not between government and individual citizens.
The placement of the deal was also debated. The Constitution spelled it out in Article I, Section 8, Clauses 15 and 16. Madison thought that was enough, and that any problems that came up with that issue and others would be worked out over time through the ingenious system of checks and balances already inherent in the Constitution itself. But wary anti-Federalists wanted more absolute confirmation of who could and should, who couldn’t and wouldn’t, do what and to whom. Eager for the Constitution to pass, the Framers agreed to add the Bill of Rights, including the Second Amendment.
Our current National Guard is an offshoot of the militias of that era. The States run their own units and provide them for national purposes when called upon to do so. The national government cannot disband or disarm those units.
So what in all that debate about, compromise on, and final wording of the Second Amendment says anything as to where the arms should come from and who should have the right to buy them, sell them, collect them, or trade them – or what types of arms might or might not be borne and for what purposes other than well-regulated militias? Nothing. Not even one word about hunting. Or crime-fighting. Or skeet shooting.
Nothing. Not their problem.
Please note: Here are a couple of short essays adapted from guest posts in Shea Magazine, http://www.sheamagazine.com. “Playing with Fire: Paranoid Politics American Style” is the final “official” post in the America – The Owner’s Manual series. A collection of those posts (minus the August election prediction, “It’s Not Just the Economy, Stupid” and the two essays below, but plus additional sources and resources for each essay) is now available as an ebook for only 99c:
For Kindle — http://amzn.com/B009GPIPA2
For Nook — http://www.barnesandnoble.com
For all other readers, libraries, any computer, to give as a gift, and more — https://www.smashwords.com/books/view/238443 .
If you’ve enjoyed these posts, I would very much appreciate your “like,” “share,” and/or review of the book on your blog, Facebook, Twitter, and/or the above sites.
Many thanks to all of my followers, here in America and abroad. You’ve made this a gratifying and enjoyable experience.
And now, my post-election thoughts: TIP ME OVER AND POUR ME OUT: THE INEVITABLE END OF THE TEA PARTY and THE VOTERS HAVE SPOKEN, BUT WHAT DID THEY SAY?
“TIP ME OVER AND POUR ME OUT”: THE INEVITABLE FALL OF THE TEA PARTY
On November 2, 2010, in the aftermath of the 2010 mid-term elections, an essay in the Christian Science Monitor, similar to others that appeared in the media, concluded that “the emergence of the Tea Party movement is arguably the most dynamic element of the 2010 mid-term elections.” After all, Tea Party-backed candidates like Rand Paul and Jim DeMint won Senate seats, Nikki Haley became governor of South Carolina, and twenty-eight House members benefited from its stamp of approval.
The problem with the Christian Science Monitor analysis was that it inflated the power of the Tea Party because it failed to place its triumph in historical perspective. It ignored the fact that Tea Party appeal, even at its zenith, stopped outside the borders of the most densely-populated states and metropolitan areas.
Nor did it observe parallels with the Gingrich Revolution that took place early in President Clinton’s first term. The House Speaker’s “Contract with America” pledged objectives strikingly similar to those offered by Obama’s Tea Party critics: a balanced budget amendment, cuts in capital gains and personal income taxes, and limitations on corporate liability. In his bellicose partisan rhetoric, the Georgian attacked liberals as “pathetic,” “corrupt,” “left wing elitists.” In the 1994 mid-term elections, Republicans gained fifty-two seats in the House and eight in the Senate.
Subsequently, the news media announced the end of the New Deal and the completion of the Reagan Revolution. Yet two years later Clinton easily won re-election, and Gingrich was forced to give up his position as Speaker of the House. Clearly the Gingrich Revolution and its politics of outrage lacked staying power. It was better at engendering anger than it was at sustaining positive enthusiasm.
The Tea Party (more accurately, local Parties) was largely a white grassroots movement begun by those unhappy with the direction the country was going and lacking confidence in either of the two mainstream parties to change it. Supposedly, it drew inspiration from the events of December 16, 1773, when several dozen laborers, artisans, and apprentices went to Boston harbor and dumped overboard more than three hundred chests of tea stored on British ships. Their action that day was a continuation of colonial challenges against the mother country’s right to impose taxes without representation. However, the Tea Party directed its protests against the right of its own government to raise taxes to reduce the size of the nation’s deficit.
The dire economic landscape following the 2008 crash spurred fears of an even more dismal future, thereby widening (for the moment) the appeal of the Tea Party message beyond its typical narrow outreach. It used its electoral clout and threats of political retaliation to transform the Republican Party into the “party of no,” one that automatically rejected Democratic ideas for righting the economy. It eschewed compromise as, for example, when it forced Speaker John Boehner to pull back on a $4 trillion deficit reduction plan the two sides had agreed on.
In the 2012 elections, it bullied the GOP into not opposing far-out choices whose election prospects were dubious, such as those of Senate candidates Todd Akin (Missouri) and Richard Mourdock (Indiana), who by “misspeaking about rape” lost all but certain seats in these conservative states.
To win its endorsement, during the Republican presidential debates Mitt Romney veered sharply to the right and dug a hole he could not get out of; for example, advocating self-deportation as the best way for handling the problem of illegal immigrants. His selection of Paul Ryan, a darling of the Tea Party, as his running mate, was a bid to shore up his credentials with the group.
Throughout American history, third party movements have had short shelf lives, among them the anti-Catholic, anti-immigrant Know-Nothings of the 1850s and the Populists of the 1890s, who called for reforms such as the direct election of Senators to make government more democratic. Eventually, third party agendas have been modified and absorbed by the mainstream parties – Republican and Democratic – and their progenitors have vanished from the political scene.
The Tea Party’s aversion to centralization meant that it never became a third party with national reach. What it did was temporarily sabotage the traditional Republican Party with an in extremis, anti-government agenda that was overwhelmingly rejected in the 2012 elections. In the wake of that defeat, if the Republican Party wanted to remain viable, it had no choice except to bridle its aging “young guns,” and not try to recruit more of them.
Don’t expect the tea kettle to whistle loudly again anytime soon.
THE VOTERS HAVE SPOKEN, BUT WHAT DID THEY SAY?
In the weeks after the 2012 elections, a deluge of newspaper columnists and talking-head commentators confidently predicted what the outcome means for the future of American politics. Unfortunately, as happened with their pre-election predictions, the pundits are misreading the evidence. They carelessly project trends, if not permanent realignments, that are far from certain.
From the Right, we’ve heard from Rush Limbaugh that “we’ve lost the country,” and from Stanley Kurtz that “the existence of America as we know it is in doubt.” Star Parker announced the triumph of those “not having traditional values on family, sex, and abortion.” These plaintive cries reflect shock, anger, and nostalgia about the passing of an imagined America rather than serious analysis of what just transpired.
On the question of what to do in the wake of defeat, Charles Krauthammer insisted on “no reinvention when none is needed,” and that the party “required only a single policy change: border defense plus amnesty” to get back into the business of winning. Kurt Schlichter’s laughable solution was for Republicans “to do what guerrillas do and infiltrate into the enemy’s turf, slipping conservatives into mainstream media, academia, and the entertainment world.” (Apparently, he missed Clint Eastwood’s conversation with an empty chair.)
Rick Santorum was more in tune with the evidence when he called on Republicans “to build a new box and offer Americans a broader, bolder and more inclusive vision of freedom and opportunity, as well as the tools to use them,” exactly what he was unable to do in his own presidential campaign.
From the Left, we’ve gotten exaggerations about the scope of the Obama consensus.
Liberal activist Robert Creamer interpreted the vote as approval “for a society where everyone gets a fair share and plays by the same rules…. whether you are a man or a woman, a gay or a straight.” Joan Walsh considered Obama’s reelection “a victory for the Democratic ideal of activist government, and a mandate for more of it.” Paul Krugman thought the win showed “a lot of liberal ideas have become perfectly mainstream.” Eugene Robinson referred to a “multihued, multicultural future.” Maureen Dowd rhapsodized how supporters “lifted up Obama” with the hope that he would now be more amenable to dramatic change.
Drowned out in the euphoria and the rhetoric of wish fulfillment was the president’s more modest and realistic post-election response: “I’ve got one mandate. I’ve got a mandate to help middle class families and families that are working hard to get into the middle class.”
Democrats were understandably overjoyed at the results of the 2012 elections. They gained two Senate seats and five new women senators joined their ranks. In the House, Democrats added eight seats, while there were eleven fewer Tea Party-supported candidates from among those who sought re-election.
Understated in the post-election celebrations were several important inconvenient truths:
- On election eve, a majority of Americans did not favor Obamacare, the President’s approval rating stood at only 50%, and a majority of Americans wanted to “drill, baby, drill.”
- Voter turnout for the “critical election” was appreciably lower than it was for those of 2008 and 2004 – 57.5%, versus 62.3%, and 60.4%, respectively.
- Obama took the popular vote by only 3.2%, about half of what he received in 2008, against a weak opponent who was not enthusiastically embraced by his own party.
- In the swing state of Ohio, Obama’s margin of victory was less than 2%; in Virginia around 3%, while in Florida, it came in at under 1%. The close results cannot be blamed on voter suppression, though a number of Republican legislators worked toward that goal under the pretext of preventing voter fraud.
If Romney had won some of these swing states and/or captured the 44% of Hispanics who voted for George W. Bush, and if the Republican Senate nominees from Missouri and Indiana had not self-destructed by making idiotic comments about rape, we could be having a very different conversation about the meaning of the 2012 contest.
As things stand, the election results endorsed President Obama governing from the center-left, not the left-center. The wealthy will pay more, but not on the scale of the massive wealth transfers that characterized the Great Society of the 1960s. The president also received confirmation to use the federal government for job creation, to continue offering programs for the disadvantaged, and to act as a first responder to weather-related catastrophes.
The Republicans got a mandate to modify an agenda based exclusively on competition/individualism without compassion, and to offer more to people of color and women.
The mandate for both parties was to work together to end gridlock in Washington and to find a balanced solution to deficit reduction.
As I write these words, there are signs that elected officials from both parties understand the real mandates they have been given.