"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Wednesday, October 31, 2012

Risks And Why We Take Them

We all take risks, and most of us do so irrationally. We make assumptions and then take action based on factors which have nothing to do with true likelihood. Cass Sunstein, writing in the New Republic (10.31.12), suggests three major reasons why we take risks when we really should not:

The first involves unrealistic optimism. Some of us show an optimism bias; we believe that we are peculiarly immune from serious risks.

Human beings are also subject to availability bias. When people are evaluating risks, they tend to ask whether other, similar events come to mind, thus using what social scientists call “the availability heuristic.” If you immediately think of cases in which officials exaggerated risks, and asked people to take precautions that proved to be unnecessary, you might decide that there’s nothing to fear.

The final problem is motivated reasoning. For many of us, evaluation of risks is not just a matter of calculation. Our motivations, in the sense of our emotions, our desires, and our wishes, count too, and they can much affect our judgments. If you are strongly motivated to believe that capital punishment deters crime, you are likely to credit the studies that support that belief.

The worst possible scenario is when there is a Perfect Storm of all three irrational risk assessments. The likelihood of Superstorm Sandy hitting my house in DC is practically nil, I say with unrealistic optimism; and even if the weatherman says the chances are good, he has been wrong so many times before – half my basement is still filled with canned food from his last false alarms – say I with availability bias; and the liberal media are so biased towards global warming that they deliberately inflate the risk, I conclude with motivated reasoning.

I fall more or less into this Perfect Storm category of total denial of risk.  Falling trees have never hit my car, even when I hoped that some deus ex machina would relieve me of the monthly payments; and global warming is a fiction, because who can forget the Snowmageddon of 2010 when more than three feet of snow fell on DC? I had to dig my car out not once but three times because the new mayor wanted to show he was not like Mayor-for-Life Barry, who had said of his administration’s poor response to heavy snow, “Get over it.  It will melt.”  The new mayor scoured every spare truck from the area and sent them to my neighborhood.  These homeboys and West Virginia crackers didn’t have lawns in their ‘hoods, so humping up the curb and gouging out snow-covered ones they couldn’t even imagine was just fine with them.  No problem.

That was only a few years ago, so if there is such a thing as global warming, did it take a break that year?  Nature’s equivalent of skiing in Aspen with two feet of fresh powder?

The most important element in all this is that I have taken so many risks in my life and come out all right that shit simply cannot happen to me.  I have had condoms break and never once received the panicky “We’ve got to talk” call.  I have been seen in places I never should have been seen, and come away clean as a whistle.  I have driven totally whacked out oblivious behind Bombay Black and crystal meth and made it fine from the Village to Jersey (why was I ever going to Jersey in the first place?).  I lived in India for five years in the Sixties, and that was one constant, continuous risk of disease.  I travelled to the hellholes and shitholes of Africa but missed the bullets, the kidnappings, the horrific traffic accidents, or even getting eaten by lions in Great Zimbabwe.  So what’s the big deal about risk?

A few things Sunstein ignores.  First, the 'I-don’t-give-a-shit’ risk-benefit analysis.  I realize that bum-fucking a girl I met in a bar without a condom is risky business; but who cares?  At my age, any girl who will turn bottoms up for me is better than the best Christmas present ever.  Antony, the Triumvir of Rome, respected general, leader of the Roman people, gave it all up for an Egyptian tart.  He knew exactly the risks he was taking.  Either Cleopatra would turn tail on him and run from battle (which she did); or the fraying alliance with Octavian would finally break (which it did).  So what?  He was at the end of his career and his life, so WTF.

Second, we live in an age where the sheer complexity of life makes reasonable and rational risk assessment impossible.  Your oncologist tells you that if you do X, you will have a Y chance of the cancer returning and a Z chance of side effects; and if you do Y…etc.  Bewildering.  The choice is to close your eyes and pick X or Y, knowing that you are flipping a coin; to force the doctor to choose for you, putting him in an uncomfortable legal position; or to run out of the room knowing that neither choice is a good one.  How many times have I seen medical opinions overturned?  And how many guys are riding the elevators up to their K Street law offices without a prostate and unable to get it up because doctors said “No prostate means no prostate cancer” and they, afraid of risk, bought the argument?

I knew an older man who had begun to reduce risks to his health in late middle age.  He first reduced his consumption of fats, sugar, salt, and alcohol by half, then by two-thirds, then cut them out completely.  Eating dinner at his house was a trial.  The chicken had been boiled so that all fats had been rendered.  This greyish-white lump of meat sat on the plate accompanied by spice-less, half-cooked vegetables, and a baked potato with no fixin’s.  He then became obsessed with indoor air pollution, and he deployed super-sucker air purifiers which mercilessly roared night and day.  He did painstaking and meticulous research to determine which roads between Falls Church and Alexandria, Virginia had the fewest traffic fatalities.  Finally, he never left the house, and the fire department had to break the door down to retrieve him and commit him to the State loony bin.

Once you start becoming concerned with risk, you see that dangers lurk everywhere.  Your fleeting shadow of a life is even more precarious than Hamlet ever imagined; and worse than he, you worry about every combination and permutation of possibility, opportunity, and chance.  Finally, exhausted and depleted, you either say ‘fuck it’, pray to God to guide you in the right decision, or simply let the sands of fate, chance, and fortune cover you up like a scraggly bush in the Sahara.

Our response to risk is really what defines us.  To understand another, it is best to put on the risk-focused glasses and see how he reacts to unknown odds.  Some will resort to fatalism, whether philosophically or religiously determined. Actions are either according to God’s plan or to some random arrangement of cosmic billiard balls.  In either case, you go with the flow or play the cards dealt you, and see what comes up.

Others will resort to beating the odds.  Very American, this approach. We can do anything, solve any problem, beat the most overwhelming odds.  Nothing wimpy, flaccid, neutral, and European about us.  The weight of 2000 or more years of history means nothing when you have resolve, will, determination, and confidence.

I am a firm adherent of the ‘fuck it’ school. I have lived too long and seen too many theories go down the drain to trust received wisdom, or to believe in my own ability to sort out conflicting theories and accurately assess risk.  Life is far too short to worry all the time, or to spend endless energy campaigning for Romney or Obama or fighting for the Preservation of the Bay.  What will happen if one or the other wins?  Not much, and in 100 years neither one will be remembered except for their gaffes and missteps.

Is there a greater risk to the body politic if Romney wins?  I doubt it.  Should I move to the Mojave Desert where, I am told, there are no natural disasters, little air pollution, no terrorist threat, and no spoiled fish?  Not exactly.  I am happy here in Washington, DC, where I am glad that Superstorm Sandy missed us, that my lights never went out, and no tree fell on my car.

Tuesday, October 30, 2012

Why Has Racial Prejudice Increased During The Obama Years?

A widely reported recent AP poll (conducted in collaboration with NORC at the University of Chicago, the University of Michigan, and Stanford University) found that the percentage of white Americans who harbor racial prejudice against blacks has actually increased during the last four years. Jonathan Capehart, in a blog post for the Washington Post (10.30.12), writes:

In 2008, anti-black attitudes were held by 48 percent of Americans surveyed. Today, that number is 51 percent. When implicit racial attitudes are measured, that statistic jumps to 56 percent. The viewpoint is even worse for Hispanics: A poll done last year showed that anti-Latino attitudes were held by 52 percent of non-Hispanic whites. On the implicit racial attitudes test, the negative views of Hispanics goes to 57 percent.

The ‘implicit racial attitudes test’ is an adaptation of a familiar psychological test which matches descriptive words with images.  Most people, for example, associate flowers with good words and insects with bad; and while avoiding the usual racially stereotypical words like ‘lazy’, researchers can elicit subtle positive or negative associations through a range of words. The test is relatively new, and anyone can take it by going to the website (https://implicit.harvard.edu/implicit/demo/).
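To make the mechanics a little more concrete, here is a minimal sketch of how a reaction-time association score of this general kind can be computed. It is an illustration only – not the AP/NORC methodology and not the actual scoring algorithm of the Harvard demo – and the function name, trial data, and simplified ‘D score’ formula are assumptions for demonstration. The idea is that people sort items faster when the paired categories match their implicit associations (flowers with good words, say) than when they do not, and the gap in reaction times, scaled by their spread, serves as the measure.

```python
# Illustrative sketch only: a simplified IAT-style "D score" from reaction times.
# The real test's block structure, trial filtering, and scoring rules are more
# involved; the data and the formula here are simplified assumptions.
from statistics import mean, stdev

def d_score(congruent_ms, incongruent_ms):
    """Return (mean incongruent RT - mean congruent RT) / pooled SD.
    Larger positive values suggest a stronger implicit association between
    the categories that shared a response key in the 'congruent' block."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Hypothetical reaction times in milliseconds
congruent = [612, 598, 655, 640, 601, 630]      # e.g., flower/good and insect/bad share keys
incongruent = [742, 760, 705, 731, 748, 770]    # e.g., flower/bad and insect/good share keys
print(round(d_score(congruent, incongruent), 2))  # roughly 1.8 on this toy data
```

On this toy data the slower responses in the mismatched pairing produce a large positive score, which would be read as evidence of an implicit association.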

From a review of the literature generated in the short time since publication of these data, there are no confirmed reasons why this rise has occurred, although some suggest that it is because of President Obama – either the racial hatred buried in many people emerged after what they regarded as the unheard-of and unconscionable election of a black man as President; or Obama did not do enough to use his ‘post-racial’ victory to consolidate the significant gains made in racial tolerance.

Harris [Director of the Institute for Research in African-American Studies at Columbia University] penned an op-ed for the New York Times on Sunday that argued, “The Obama presidency has already marked the decline, rather than the pinnacle, of a political vision centered on challenging racial inequality.” And he charged that black elites and intellectuals have traded the substance of what might be achieved for the symbolism of having a black man in the Oval Office.

I don’t buy the second argument, for Obama’s silence on the matter should not increase racial prejudice.  It must be due to the former – he is a lightning rod for racial hatred.  I have spent a lot of time outside Washington, DC in the past few years, mostly in small towns far from the Nation’s Capital.  I lived and worked in areas where Obama was not only a president with whom one disagreed, but a man to be detested.  The conspiracy theories to which I had paid scant attention, assuming that they were espoused by a very few, were alive and thriving.  Never in the days of Watergate, when Nixon was hated and vilified by the Liberal Establishment, did I hear such virulent, poisonous attacks.  Nixon was hated because he lied, because he covered up, and because he was ambitious beyond his means.  Obama is hated because he is assumed to be a Muslim, a Radical Black Activist, a Socialist, and a foreign-born traitor to the American cause.

I have pondered this for all the years of Obama’s presidency.  Why did Bill Clinton not provoke the same visceral hatred as Obama?  He was certainly as liberal as Obama, if not more so.  Perhaps it was because he came from Arkansas, spoke with a Southern accent, and was raised poor, just like many whites in the South.  Or perhaps it was because he really did seem to feel the pain of most people who had not attained the American dream.  Or perhaps because he was such a good communicator and connected with his audience. Or perhaps because he was white.  I hate to come to that last conclusion, as many of my Eastern liberal friends have.  I would like to believe otherwise and fight what seems to me a facile conclusion.  These new data have got me thinking again.

There is more to it than Obama, however.  Racial prejudice has increased across the board, says the AP poll, not just in areas of the country, like the South, where one might expect a disproportionate racial bias.  Since prejudice usually increases in difficult times, when it is easy to blame The Other for one’s misery rather than the complexity of the economic system, perhaps people react strongly to what they see as the continuing preference for minorities.  Minority favoritism, a charge often leveled when people’s sense of fairness has been damaged, might be seen as rising because of Obama, himself a minority.

With the high unemployment rate of the past four years, it is understandable if out-of-work people take out their frustration on those who they feel have gotten a break through government programs funded with their taxpayer dollars, when they themselves have not. Harris suggests that the extreme political polarity in the United States in the past four years might be a contributing factor.  Obama and the Democrats are being portrayed as socialist redistributors of income and as poverty, gender, and race merchants, thus exacerbating already sensitive feelings among the white disenfranchised.

Racial discrimination is more than anything a function of income and class.  Few whites, except perhaps in those parts of the country where such discrimination is longstanding, historical, and deeply-rooted, would object to a black doctor, lawyer, or investment banker moving in next to them.  The animus against Obama notwithstanding, if Colin Powell or Morgan Freeman lived down the street, most people would be honored that they chose their neighborhood. Since blacks still have far less social mobility than whites, and proportionately fewer blacks are gaining higher socio-economic ranks, there are still relatively few Colin Powells to move in next door.

A recent report found that 53 percent of blacks born in the bottom income quintile remain there as adults, while only 33 percent of whites do. Research has also found that the children of black middle-class families are more likely to fall out of the middle class.

Despite the increased presence of blacks and women in the work force over the years, women and non-whites hold jobs with less rank, authority, opportunity for advancement and pay than men and whites, a "glass ceiling" being said to prevent them from occupying more than a very small percentage of top managerial positions.

One explanation for this is seen in the networks of genders and ethnic groups. The more managers there are in an employee's immediate work environment, the higher the employee's chances of interacting and spending time with high status/income employees, and the more likely these employees are to be drawn on for promotion. As of the 1990s, the vast majority of all clerical and office workers were women, but women made up less than half of all managers. Less than 15% of all managers were minorities, while roughly a quarter of all clerical and office employees were. The networks of women and minorities are simply not as strong as those of males and whites, putting them at a disadvantage in status/income mobility. (Wikipedia)

In conclusion, these data were indeed a surprise to many.  After all, we thought, after electing a black president and moving into a post-racial era, discrimination would decrease, not increase.  The lessons are clear.  First, the law of unintended consequences is alive and well: it could be that Obama himself has contributed to this phenomenon in ways we could never have predicted.  Second, race is alive and well as an issue in America, and it will not disappear until true economic equality is achieved.

That could take a long, long time.

Is A Fetus A Person? Abortion And The Pain Defense

What if you knew that the fetus growing in your body could feel pain - that it responded to various aggressive stimuli in a reactive way and was trying to move out of harm’s way? Would you reconsider the abortion you had planned?


In an article in the New York Times (10.30.12), William Egginton explores the issue and speculates on whether neurological findings such as the one suggested above could be the final necessary argument to overturn Roe vs. Wade.  If science were to conclude that the fetus in fact does feel pain, then it is a sentient being – a human being – and it then has equal rights under the Fourteenth Amendment.

Not so fast, argues Egginton.  The definition of a human being is not determined simply by neuro-transmitters, synapses, bio-chemical transfers, and involuntary reflexes.  Philosophers have debated the question of Being for centuries, and until now the discussion has focused on consciousness.  “I think, therefore I am”, said Descartes in the 17th Century, suggesting that the best, if not the only, proof of existence was awareness of one’s own consciousness.
While Descartes considered whether a neonate or even young children might have consciousness of this kind, in the end he rejected this hypothesis, insisting on the “reflective” nature of consciousness… “I call the first and simple thoughts of children, that come to them as, for example, they feel the pain caused when some gas enclosed in their intestines distends them, or the pleasure caused by the sweetness of the food that nourishes them…. I call these direct and not reflexive thoughts; but when the young man feels something new and at the same time perceives that he has not felt the same thing before, I call this second perception a reflection, and I relate it only to the understanding, insofar as it is so attached to sensation that the two cannot be distinguished.”

In the ensuing years, philosophers continued to focus on the nature of Being.  Although most assumed that a human soul – a more religious way of describing Being – was given by God, it was up to Man to describe its character and dimension.  This conception became increasingly important in the Age of Enlightenment, an era in which God’s will and purpose were subjected to rational scrutiny.  There was never any question of either God or his intent; but it was Man’s duty to try to understand both.

In her book, Bodies of Thought: Science, Religion, and the Soul in the Early Enlightenment, Ann Thomson provides a detailed review of the debates about the nature of mind and soul from the 17th to the 19th centuries.
In spite of the fact that Willis [a 17th Century Oxford physiologist] subscribed to the doctrine of an immaterial soul to account for the higher intelligence of human beings, Thomson argues that ‘Willis’s work ... attracted the attention of those wanting to elaborate a [purely] material account of the mind divorced from an immaterial soul’. In her view, Willis developed conceptions of active matter which could account for the higher functions of the mind, and dispense with the need for an immaterial soul (Reviews in History, May 2010)
These debates became even more fundamental and focused on the nature of matter and whether or not the soul was material or immaterial:
Like Locke later, Baxter [17th century theologian] argued that one would be limiting the divine power by denying that God could super-add thought and perception to matter…A consensus was gradually built up during this period among most philosophers and scientists that matter, generally conceived as consisting of inert atoms, is in itself inactive. What becomes philosophically interesting and up for debate is the question what further entities one must add to the universe to account not only for thought… (Reviews in History, May 2010)

The Age of Enlightenment was also the age during which the concept of individual and human rights emerged, and the American Declaration of Independence and Constitution were based on the philosophy of the age.  The Founding Fathers, schooled in rationalism and bred on an unshakeable belief that there is a Creator and that He is the source of our ‘inalienable’ rights, had to translate these principles into legal precepts. The new American was a human being, created by God, endowed with a soul which defined his humanity; and society – and government – should facilitate his knowledge of both God and Man.
The brain sciences, like all branches of the natural sciences, are immeasurably useful tools for individuals and communities to deploy (one hopes) for the improvement of our understanding of the natural world and the betterment of society. The basic rights guaranteed by the Constitution, however, have nothing at all to do with the truths of science. They are, to use Karl Popper’s terminology, non-falsifiable, like religious beliefs or statements about the immortality of the soul; to use Thomas Jefferson’s word, they are inalienable. Like the equality of its citizens, in other words, the basic liberties of Americans should not be dependent on the changing opinions of science

In the late 18th Century, when rational science was emerging as a new tool to help understand the universe, Kant dismissed its ability to answer fundamentally philosophical questions:
More than 200 years ago the German philosopher Immanuel Kant argued, in his “Critique of Pure Reason” that, while science can tell us much about the world we live in, it can tell us nothing about the existence of God, the immortality of the soul, or the origin of human freedom; moreover, he demonstrated with exquisite precision that should it try to come to conclusions about these questions, it would necessarily fall into error.
Deliberations today are no different and scholars and scientists agree on the indefinable nature of soul, self, or Being:
Current neuroscience distinguishes a spectrum of degrees of  “consciousness” among organisms, ranging from basic perception of external stimuli to fully developed self-consciousness. Even the idea of self is subject to further differentiation. The neuroscientist Antonio Damasio, for instance, distinguishes degrees of consciousness in terms of the kind of “self” wielding it: while nonhuman animals may exhibit the levels he calls proto-self and core-self, both necessary for conscious experience, he considers the autobiographical self, which provides the foundations of personal identity, to be an attribute largely limited to humans.
Which brings us back to the issue of abortion, fetuses, and the pain reflex. Egginton argues that three centuries of philosophical deliberation have concluded that a human being becomes a person when he becomes actively conscious – aware of his surroundings and, from the first moments of life, trying to make sense of it all.
Consciousness, in other words, presents a much higher, and much harder-to-establish, standard than mere potentiality of life. Therefore, implicit recourse to the concept fails as a basis for replacing viability as the line dividing the permissibility of abortion from its prohibition. For a fetus to be conscious in a sense that would establish it as a fully actualized human life, according both to current neuro-scientific standards and to the philosophical tradition from which the concept stems, it would have to be capable of self-perception as well as simple perception of stimuli.
Science is and always has been one way of illuminating the search for truth, but not the only way; and to assume so is to ascribe to it a dictatorship of pseudo-rationality. Scientific ‘proof’ is an ephemeral and unattainable goal.  Even Einstein’s Theory of Relativity is being re-examined today.
When science becomes the sole or even primary arbiter of such basic notions as personhood, it ceases to be mankind’s most useful servant and threatens, instead, to become its dictator. Science does not and should not have the power to absolve individuals and communities of the responsibility to choose.

Monday, October 29, 2012

Evil II–Augustine and Kierkegaard

In my first post on the subject of evil (http://www.uncleguidosfacts.com/2012/10/does-evil-exist.html), prompted by an excellent series by Clare Carlisle of the Guardian, I discussed Augustine’s explanation that evil is not a thing in itself, but the denial of goodness.  Carlisle (in the second piece of the series) observed that Augustine needed to reconcile the absolute goodness of God with the existence of evil in the world; and he could never believe either that God had a co-equal, the Devil, or that He would create a force, evil, that was contrary to Him.

Augustine came to regard this cosmic dualism as heretical, since it undermined God's sovereignty. Of course, he wanted to hold on to the absolute goodness of God. But if God is the source of all things, where did evil come from? Augustine's radical answer to this question is that evil does not actually come from anywhere. Rejecting the idea that evil is a positive force, he argues that it is merely a "name for nothing other than the absence of good".

This explanation has always seemed too tame for most philosophers and for the rest of us who see people contemplating, plotting, and doing evil – a conscious, deliberate amoral act, not just the absence of an abstract good.  The horrors of Stalin, Mao, Hitler, and Pol Pot are not just ‘not good’ actions, but something far worse.

Augustine, however, had a second theory, one which corresponded more to the real world:

Augustine thinks that our goodness is derived from God and completely dependent on him. "When the will abandons what is above itself and turns to what is lower, it becomes evil – not because that is evil to which it turns, but because the turning itself is wicked," he writes in The City of God.

Because we are free, Augustine argues, we must be able to choose to turn away from God, to disobey him and to try to live independently – maybe as if he didn't exist at all.

The ‘free will defense’ says that evil is a consequence of freedom; freedom is a good thing and therefore we have to accept evil as its unfortunate side-effect.

So far, so good; but given the extent of evil in the world – not just the Hitlers and Stalins but the Bernie Madoffs, Big Tobacco conspiracists, and those like Iago in Shakespeare’s Othello who appear to have no political, social, or economic motivation for their torture and destruction of others – how can we account for the fact that so many use their God-given free will to turn away from Him, from goodness, and to choose evil?

Augustine was more interested in theological philosophy – explaining theoretically the world of God and the world of Man and how the two intersected – than in answering the question of why people choose evil.  It was enough for him to explain the nature of evil and the philosophical construct – free will – which permitted it and to consider its spiritual consequences.  It took a 19th century philosopher, Soren Kierkegaard, to attempt to answer this question.  Writing when he did, he was exposed to the various social, psychological, economic, and cultural factors which might propel men towards evil; and could therefore add to Augustine’s thought without damaging his theory.

Kierkegaard


Kierkegaard accepted Augustine’s conception of good and evil, especially that evil was caused by men turning away from God as a free choice, but said that this was because of a combination of pride and fear:

Kierkegaard thought that our freedom is itself a big nothing. He describes it as a yawning chasm at the heart of human existence, which has to be filled with decisions and actions. But of course this emptiness can never be filled. When we look down into the abyss of our freedom, says Kierkegaard, we feel sick and dizzy. We take a step back. All that nothingness makes us anxious. We are, in fact, afraid of our own freedom.

Kierkegaard agreed with Augustine that human beings are fundamentally proud, always wanting to overreach themselves, transgress any limits imposed on them, and deny their dependence on God. But he also emphasized that we are as fearful as we are proud – that we shrink back from the unlimited dimension of our being, which is freedom. This makes us very contrary creatures: we think we want to be free of all constraint, but at the same time this freedom terrifies us. Human sinfulness, says Kierkegaard, is a result of this unhappy combination of pride and fear.

This in itself still does not explain why some of us choose evil actions over good ones – both would help fill ‘the yawning chasm’. Kierkegaard offers the following:

Our failure to be good, he argues, is due to the way we deal with being both less and more free than we wish to be. Like stroppy, insecure teenagers, we crave independence, resent authority, and are scared to take responsibility for ourselves.

In other words, like teenagers who have not reached a level of maturity sufficient to analyze the ethical and moral consequences of their actions, we act impulsively.  The fear of a chaotic world of choice, unknown risk and consequence, and endless deliberation is such that we act before we think.

Jean-Paul Sartre echoed Kierkegaard in his philosophy of Being and Nothingness:

Sartre contends that human existence is a conundrum whereby each of us exists, for as long as we live, within an overall condition of nothingness (no thing-ness)—that ultimately allows for free consciousness. But simultaneously, within our being (in the physical world), we are constrained to make continuous, conscious choices.

It is this dichotomy that causes anguish, because choice (subjectivity) represents a limit on freedom within an otherwise unbridled range of thoughts. Subsequently, humans seek to flee our anguish through action-oriented constructs such as escapes, visualizations, or visions (such as dreams) designed to lead us toward some meaningful end, such as necessity, destiny, determinism (God), etc. (Wikipedia)

Where Kierkegaard sees this anguish translated into evil actions, Sartre sees these actions as neither good nor evil but as steps toward constructing a being, a personality, a character which gives some meaning, however artificial, to life.

Although the vision of Sartre resonates with me more than that of either Kierkegaard or Augustine, I prefer that of Machiavelli and Nietzsche to any of these (see my blog post, above).  Machiavelli (and his advocate, Shakespeare) believed that there was no such thing as good or evil, just variations of human actions determined by human nature; our social, cultural, and economic ecology; and the precedent of history.  Nietzsche agreed, but added the element of will.  The greatest expression of any human being, confronted as he is by the valuelessness (meaninglessness) of life, is to follow his own primitive, basic, fundamental desires and ambitions to their logical conclusion.  Tamburlaine was a greater man than the grocery clerk.

Sunday, October 28, 2012

Limited Government–A Need To Revisit James Madison

We live in an age of Big Government, and the issue today is only how to reduce its size from the behemoth it is, not to seriously rethink the recent trajectory which has moved the Republic from the vision of 18th Century American political philosophers – one of simple protection of individual freedom and property – to the bewildering array of government intervention in all spheres of American life.  James A. Dorn in The Scope of Government in a Free Society (Cato Journal, Volume 32, Number 3, Fall 2012) has shown just how far we have come:

The Framers of the U.S. Constitution took it as “self-evident that all men are created equal” and have “unalienable rights”—including the rights to “life, liberty, and the pursuit of happiness.” Those rights, as expressed in the Declaration of Independence, were incorporated in the broad rubric of property—understood by Madison as “everything to which a man may attach a value and have a right; and which leaves to everyone else the like advantage”. Madison held that the legitimate and primary function of a just government is “to protect property of every sort; as well that which lies in the various rights of individuals, as that which the term particularly expresses”.

Thus, in the Madisonian vision of government and society, there is no separation between good government, personal liberty, private property, and justice.

Madison’s constitutional republic was to be one of limited powers under a rule of law, rather than an intrusive state aimed at redistributing income and wealth via the democratic process.

Thomas Jefferson echoed Madison’s convictions:

The “sum of good government” is to “restrain men from injuring one another,” to “leave them . . . free to regulate their own pursuits of industry and improvement,” and to “not take from the mouth of labor the bread it has earned.”

The key element in this calculus was the concept of justice.  Since the protection of property – the locus of various rights of the individual – is enshrined in the Constitution, a government system of justice to guarantee and guard these rights should be the principal role of the State.

The basis of the U.S. experiment in designing a system of government to “secure the blessings of liberty” was the principle of consent. Within a regime protecting individual rights to life, liberty, and property, people would be free to pursue their own happiness without interfering with the equal rights of others.

There was no intent in the Framers’ minds to expand the role of government to anything more, especially not to be the guarantor of the General Welfare:

The purpose of the law of the Constitution, as stated in the Preamble, was to “establish Justice, insure domestic tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty.” To do so, the Framers strictly limited the powers of the federal government, enumerating them in Article 1, Section 8. In particular, there is no evidence that the Framers envisioned the General Welfare Clause as a blanket provision for expanding the size and scope of government.

Madison himself, three decades later, reflected on the wisdom of those decisions which were taken to limit the role of government to preserving and protecting individual rights:

“With respect to the words ‘general welfare,’ I have always regarded them as qualified by the detail of powers connected with them. To take them in a literal and unlimited sense would be a metamorphosis of the Constitution into a character which there is a host of proofs was not contemplated by its creators.”

More than anything, government was created to protect the individual from injustice; and the 19th Century political philosopher Frederic Bastiat said it best:

When law and force confine a man within the bounds of justice, they… do not infringe on his personality, or his liberty or his property. They merely safeguard the personality, the liberty, and the property of others. They stand on the defensive; they defend the equal rights of all. They fulfill a mission whose harmlessness is evident, whose utility is palpable, and whose legitimacy is uncontested.

Bastiat foresaw with great prescience the tumult and illogical chaos which would occur if and when government began intervening in the affairs of individuals, society and the market.  Writing in 1850 he said:

If you make of the law an instrument of plunder for the benefit of particular individuals or classes, . . . there will be tumult at the door of the legislative chamber; there will be an implacable struggle within it, intellectual confusion, [and] the end of all morality. . . . Government will be held responsible for everyone’s existence and will bend under the weight of such a responsibility.

Madison explicitly stated his opposition to government expanding its role beyond this fundamental obligation.  These comments are reflected in conservative views today:

A just security to property is not afforded by that government under which unequal taxes oppress one species of property and reward another species; where arbitrary taxes invade the domestic sanctuaries of the rich, and excessive taxes grind the faces of the poor; where the keenness and competitions of want are deemed an insufficient spur to labor, and taxes are again applied by an unfeeling policy, as another spur.

Note that Madison, although insistent that government not be instrumental in redistributing wealth and that all economic configurations be left to private enterprise, did not disavow taxation, just the use of tax policy to implement political objectives. This, of course, is the conservative objection to current tax policy and the insistent demand for reform.  In our tax laws the hand of government is everywhere, from the housing deduction, which government uses as an indirect dirigiste policy to promote the housing industry, to variable capital gains taxes, to tax breaks, credits, and tax holidays, all of which are designed and implemented according to the prevailing political agenda.

Madison, believing as he did in the primacy of personal property as the locus and repository of individual rights, also believed in the freedom to create wealth from its expansion. Above all, government should not stand in the way of such growth – nor should it intervene in the process:

Madison consistently argued that “in a just and free Government . . . the rights both of property and of persons ought to be effectively guarded.” He also recognized that doing so would “encourage industry by securing the enjoyment of its fruits.” In his speech in the First Congress on April 9, 1789, Madison emphasized that “commercial shackles are generally unjust, oppressive, and impolitic.” Moreover, he held that “all are benefited by exchange, and the less this exchange is cramped by Government, the greater are the proportions of benefit to each.”

The presidential campaign this year has focused on the issue of the nature and size of government more than on any other.  Many ‘progressives’ will reject or choose to ignore the Constitutional principles first enunciated by the Founding Fathers, saying that much has changed in the ensuing 225 years, that Madison and Jefferson could never have anticipated the size and complexity of the Republic, and that of course they would have given government more scope if they could possibly have known.  This, of course, is self-serving denial.  If anything, a consideration of how to reconfigure the American political and economic landscape is even more important, for as Bastiat and others predicted in the 19th century, a rampant State is disruptive, erosive, and counter-productive.

Saturday, October 27, 2012

Confusion

We have all been confused at one time or another; and in fact confusion is as much a part of life as order and predictability.  I am confused when I am reading one of Shakespeare’s Histories for the first time, trying again and again to sort out who did what to whom and whose claim to the throne is legitimate.  Richard II was the son of Edward, The Black Prince, and grandson of Edward III, whose five sons and their descendants were all squabbling, demanding their rights, and setting the stage for the brutal Wars of the Roses.  My patience paid off, for the more carefully I read (and consulted the genealogical chart that showed lineage), the more the confusion dissipated, and the more the historical complexity receded into the background that Shakespeare intended.  I could appreciate Richard II for who he was, the Poet King who spoke some of Shakespeare’s most poignant and insightful lines.

I am confused in a new supermarket, wandering like a blind man up and down the unfamiliar aisles trying to apply what I have learned from my Whole Foods in DC to the Kroger in Columbus, MS.

For years I have understood that limiting confusion was important.  The more I could limit the unimportant or non-essential choices in my life, the freer and more uncluttered my mind would be for more serious matters.  I found my philosophical match in India, where according to Hinduism, all basic, practical, and earthly activities should be carried out according to age-old Vedic prescriptions.  When you got up, bathed, made love, married, ate, worked, or prayed were all written down and codified.  Nothing was left to chance.  The goal of life was, after all, not earthly success, but spiritual enlightenment, and the mind must be free to contemplate God.

I was watching The Fly, the modern remake with Jeff Goldblum of the Vincent Price classic of the Fifties.  The later version is funny, with a black, twisted humor, and you really like Goldblum as wiry hair pops out on his back, his teeth begin to fall out, and he has urges to swing from the rafters.

There is one scene in the movie which I remember clearly.  Goldblum’s girlfriend asks him why he never changes his clothes.  He is perplexed.  “I change my clothes every day”, he replies, and opens the armoire to display his rack of suits, shirts, ties, and shoes, all exactly the same.  “It helps me focus”, he says, “If I don’t have to think about what to wear”.  My sentiments exactly.

I am lucky to be married to a woman who indulges me and helps me keep my life clear of practical obstacles, like doing the taxes, balancing the checkbook, and hiring gardeners to keep the wild backyard in check.  Unaccustomed as I am to any kind of practical decision which I know will produce confusion (like buying anything, given the bewildering array of choices there are for everything today), I am paralyzed when buying a new DVD player, clothes, or God forbid, a used car.  My wife says that I take ‘the easy way out’ since I wait until I have holes in my pants before buying a new pair, or invest thousands in my old clunker just to avoid having to buy a new one; but this is not fair.  My ‘keep the rubber on the road’ philosophy helps me keep the confusion out of my life.  The problem is that the older I get, the more my obsession with simplicity gets in my way.  I avoid the unpredictable, the risky, or the unforeseen.

Giles Fraser, writing in the Guardian (10.26.12) provides another take on confusion.  Confusion, and the anxiety it produces, forces us to accept simplistic arguments, thus limiting our universe to the patent, the predictable, and the obvious:

But confusion generates anxiety, and the desire for anxiety reduction is such that one is easily tempted to subscribe to any sort of explanation of things, so long as it regulates confusion and thus diminishes the anxiety.

This is what often deters us from stretching our worldview and imagining things turned upside down. Part of the reason I prefer religion over scientific atheism is that I find atheism to be less tolerant of confusion and disorder. It's easy to subscribe to some general pret-a-porter philosophy that puts everything neatly into its ontological place; but the price you pay is a smaller, diminished world. Or maybe that's the gain: after all, a diminished world is less threatening.

For Fraser, reading Aquinas and Augustine, the tracts of Martin Luther, the Hebrew version of the Old Testament, and the Greek origins of Christian philosophy; experiencing the spare, absolute faith of Islam; or sorting through the multiplicity of Hindu gods to see the perfect order and symmetry of the religion – all in order to come to some conclusion about the existence and nature of God – is infinitely more rewarding than simply saying, as atheists do, ‘I don’t believe’.  There is an innate reward in wandering through complexity until one finds one’s way.

Of course, I'm not really trying to defend confusion as a permanent condition. Rather, I'm trying to defend its more illustrious cousins – puzzlement, wonder and adventurous curiosity. This posher side of the family are often snooty about confusion, wanting to distance themselves from its messy and circuitous ways. Yet Wittgenstein is surely right when he says that "I do not know my way about" is the basic form of a philosophical question.

I am a creature of three worlds – the Hindu one in which confusion reigns in an illusory world, and the only right path is to neutralize it through discipline and order; the Jeff Goldblum one, where non-essential choices must be reduced or even eliminated to allow the mind to use its limitless potential; and the Fraser one in which intellectual challenge and achievement is perhaps the most valid expression of human existence.

I am about to go to the supermarket, where I will head down the same aisle to buy the same tuna fish and condensed milk.  I will take the same route to get there that I have always taken, park in the same spot if it is available, and proceed through the store as I have done for 10 years.  All the while I will be sorting out a particular conundrum – why, really, did Othello murder Desdemona?

Food–The New Art Form?


Many Italian Americans have grown up with good food, and despite the attempts of first generation mothers to expunge any and all telltale signs of Italian heritage, they were good cooks, and their sons and daughters never forgot their all-day meat sauce, baked artichokes with garlic, lasagna, and eggplant Parmesan. The all-day sauce was the hallmark of a good Italian American kitchen - plenty of oil and garlic,  oregano, a dash of hot pepper flakes, and three kinds of meat, and let the sauce simmer from sunup to sundown.  It was thick, sweet, fragrant, and delicious.



Everyone had an Auntie Angie and her lasagna, gooey with mozzarella and tomato sauce and topped with browned mozzarella; her corn fritters, soaked in oil and as sweet as any dessert; her ham pies; and her antipasto with salami, prosciutto, cured olives, sharp cheese, and anchovies.

Christmas Eve dinner was special.  Although many cooks did not cover all the Five Fishes of Neapolitan lore, there was always spaghetti with anchovy sauce, baked eels, and squid in spicy tomato sauce.

Children of these great cooks were happy to see the foodie movement arrive. Alice Waters, the doyenne of the foodie, organic, locavore movement, transformed food into an experience.  Not only were her dishes innovative, very Californian and American, but food in her hands became more than just something delicious.  Presentation and architecture were as important as the ingredients.



Exquisite, thin slices of fish were perched on top of baby arugula covering a quail egg, all of it sprinkled with caviar.  There were coulis, drizzled oils, and creative and unusual mixes of ingredients: fish and pancetta or pork bellies, oysters and bacon, sea urchin and fresh figs. Happy diners took pictures of their meals and posted them online, and the artfully presented dish joined the ranks of the artistic photo.

Recently this foodie craze has gone Baroque.  It is hard to get a simple steak or even to feel satisfied after a five-course meal.  The architecture and the painterly display of bits of candied ginger, gooseberries, and organic Malabar cashews have crowded out the main ingredients.  Perhaps the most extreme example of this Baroque, organic, locavore combination is Rene Redzepi, a Danish chef who cooks only with what he can forage in the wild:
For 15 minutes Redzepi and a companion nibbled on various petals, leaves and shoots, attracting stares from onlookers in a campground nearby, who no doubt wondered at their sanity and zest for roughage.
“So much of what you see here, it’s edible,” said Mr. Redzepi, who regularly dispatches his staff to collect the scurvy grass and sorrel, as well as what he called sea coriander, beach mustard and bellflowers. All of these make their way into his dishes, along with puffin eggs from Iceland and musk-ox meat from Greenland. (Frank Bruni, NYT, 7.6.10)
 

William Deresiewicz wrote an article in the NY Times (A Matter of Taste, 10.27.12) in which he contended that not only has food gone Baroque, it has replaced art as creative expression.  Far from applauding this phenomenon, he criticizes the 30-Somethings who have given up the depth, intellectual challenge, and excitement of the mind and soul that great art has always produced and become content with the architecture and painterly displays of fish, meat, fruit, and vegetables.
Young men once headed to the Ivy League to acquire the patina of high culture that would allow them to move in the circles of power. Now kids at elite schools are inducted, through campus farmlets, the local/organic/sustainable fare in dining halls and osmotic absorption via their classmates from Manhattan or the San Francisco Bay Area, into the ways of food. Food, for young people now, is creativity, commerce, politics, health, almost religion.
Food has become an industry as well as an art form:
It has developed, of late, an elaborate cultural apparatus that parallels the one that exists for art, a whole literature of criticism, journalism, appreciation, memoir and theoretical debate. It has its awards, its maestros, its televised performances. It has become a matter of local and national pride, while maintaining, as culture did in the old days, a sense of deference toward the European centers and traditions — enriched at a later stage, in both cases, by a globally minded eclecticism.
The author, however, overstates the case.  He forgets that we are a nation of faddists  moving from one craze to another, often assembling and conflating them to add cachet.  Hardcore foodies are usually into yoga and biking, hipster clothes and thrift shop fashion.

Most of these San Francisco foodies are also making millions at Google, Facebook, and a thousand other IT start-ups in the Bay Area, and have Harvard, MIT, and Stanford educations.  Yet they, like most of the rest of us, have lost their taste for ‘high culture’.  Classical music stations are closing every year.  The audiences at most opera and symphony performances are well into their Medicare years.  Younger people will take a flip through the latest blockbuster at the National Gallery to see what all the fuss about Vermeer is, but are uninterested in smaller, more unique collections.

"A good risotto is a fine thing", says Deresiewicz, " but it isn’t going to give you insight into other people, allow you to see the world in a new way, or force you to take an inventory of your soul."

Yes and no. Food has become a social marker.  ‘You are what you eat’ has never been more true.  The Huffington Post (4.18.14) did an article on Twenty-Two Hipster Foods, among which are ramps, homemade pickles, PBR, kimchee, Brussels sprouts, fancy donuts, craft beer, cauliflower, and ‘anything foraged’.  There is irony in hipster cooking, so the choice of the common (cauliflower and Pabst Blue Ribbon) is not surprising.  Hipsters are not locavores or Earth-firsters, so they won’t turn down a seared fresh foie gras, and will push the exotic up against endangerment; but all in all they have a good sense of food, are inventive, and the mix of ironic and edible often produces very interesting dishes.


The issue is not with the foods per se but with their ironic nature. Irony has its limits, and while an occasional burger at a White Tower in a sketchy neighborhood is definitely in, a Big Mac is not. The cult demands intent, desire, patience, and time.  There apparently is such a thing as a perfect cup of coffee, and I know hipsters who have spent thousands trying to make it.

One young friend bought a used espresso machine from Padua (good coffee can only be made using seasoned machines), heat and pressure registers from an engineering manufacturing company in Gary, and coffee from Brazil, roasted in Italy.  He tried coffee at every independent coffee shop in San Francisco, tasting for that Italian uniqueness that he had only found in Tuscany.

Eating well is a function of geography, exposure, education, and most of all income. The fresh, local, organic produce at the Dupont Circle Farmers Market in DC is expensive.  The prime, dry-aged NY strips at Whole Foods or Balducci’s are $20 a pound.  Snapper, Spanish mackerel, tile fish, and Portuguese sardines are almost that.  There is nothing like the fresh, never-frozen Gulf jumbo shrimp, but a meal for two is easily $30.


Washingtonians appreciate the quality, have the cultural exposure and culinary experience to know what to do with exotic varieties of fish, innards, and strange, foraged weeds; and have the money to pay for it all.  It is easy to become a foodie if your socio-economic stars are properly aligned.

Some foodie groups carry the flag of health – it is stupid and ignorant to eat what you should know is not healthy.  Others raise the flag of the little man, the small farmer, and protest through their purchase of local food the depredations and greed of Kraft and Monsanto.  Still others insist on terroir and provenance. Animal rights advocates feel that the inhumane treatment of chickens in Frank Perdue’s pens on the Eastern Shore is tantamount to torture, cruelty, and murder.  Finally, hipsters refuse to acknowledge anyone who doesn’t ‘get it’ – the thousands of clueless people who can’t seem to wrap their heads around a meal with foie gras and PBR.

What many critics overlook, however, is the importance of social class distinctions which are alive and well in America.  Class has simply moved out of the established redoubts of the Main Line, Nantucket, and Greenwich to the social cloud. Privileged enclaves are no longer physical neighborhoods, but states of mind. Class is still defined by money, but now, more than anything else, taste.

Appreciating an elegant, unique, and remarkably exciting meal for its flavor, display, and plate architecture is no different from admiring a Vermeer.  Cultural tastes have changed - food is the new art form - and the need for social status is as strong as ever.  

While Deresiewicz may lament the decline of great art, music, and literature and the intellectual and spiritual void their departure leaves behind, America is nothing if not a dynamic, robust, inquisitive, and optimistic place; and to assume a dumbing down based on a fading interest in the Old Masters misses the point. Culture has neither permanence nor inherent value.  Eventually the works of Shakespeare, Aeschylus, Brahms, and Leonardo will be archived and stored away; and creations just as valid will replace them.

Artistic food is neither here nor there.  At best it is not only an expression of the same instincts for line and color that classical artists had, but engages the sense of taste as well – a multi-dimensional installation.  It may be a stop on the way to something more creative and more 'meaningful'; but even if it is just a temporary digression, it still represents both artistic creativity and American ingenuity, and for that we should celebrate it.



Friday, October 26, 2012

A Feminist Halloween

You guessed it.  There are women out there who are not only fretting about what kind of Halloween costume to wear, but also about how to be sure that it does not send the wrong signals.  It used to be that a girl could dress up like a hooker and feel good about it.  It is Halloween, after all, and anything goes.  Mardi Gras is the best example of fantasy dress, where elaborate costumes which display your inner, 364-days-a-year hidden self are encouraged. Rhiannon Lucy Cosslett in the Guardian (10.26.12) writes:

In recent years, Halloween has become an excuse for women to shed the clothing in favor of sexy costumes, most of which are either shop bought role-play ensembles such as sexy nurse, sexy maid, sexy devil, sexy bunny… (ad infinitum) or an outfit comprised simply of knickers, bra and animal ears. Classy.

This was logical and predictable.  Despite feminism’s loopy circle from the bull-dyke outfits of the Sixties to the acceptable frilly things of today, women still have the desire to dress really sexy, provocative, and cheap. We men have never had that problem.  We have always dressed in dark suits or in the multi-layered, pork-pie hat, hipster outfits of San Francisco and expressed our inner sex monster when it counted – i.e. buck naked. We always knew that women wanted one thing from us – that we desired them and could perform.  They gauged this from attitude, body language, and bedroom eyes not clothes.

This year, according to Cosslett, women are becoming a bit more conscious of their more rational, deliberate side; and for those sisters who can’t think of something really, really good, she has some suggestions:

Witch

The witch is always a good choice because it is THE example of female empowerment.  Witches were sorcerers who could conjure up all kinds of horrible troubles for men.  Look at poor Macbeth who should have paid more attention to the Weird Sisters.   According to academics, the witch got transformed from an attractive, seductive woman to the old, green-faced hag of today, and it is up to women to decide which scenario to act out.  If a woman dresses up like a witch she could be playing into the hands of men who think most women are screeching harridans…Or she could adopt the costume to portray the true witch – a powerful woman who has control over men.  The problem, of course, is that in order to dress like a witch, regardless of the back-story, you have to look like one, and most people will take you for a succubus.

Mary Wollstonecraft

Mary Wollstonecraft, the proto-feminist, looks very pre-Raphaelite in her portraits and doesn’t exactly scream “Dress up as me.”  Although you might get some appreciative nods from your in-crowd intellectual classmates from Swarthmore when you explain who exactly Mary Wollstonecraft is, this costume is a bit iffy, and not a whole lot of fun.

Protester

For this costume you need only torn jeans and hairy armpits, but if you really get into the Seventies thing, you can skip bathing for a week and skip washing your hair for three (the greasy, scraggly, nasty look is what you are after), and dress somewhere between West Virginia trailer trash and urban bum.  You will stink, and in these days of pseudo-sexy you will turn most guys off, but there are definitely some retro-studs out there.

Margaret Thatcher

OK, this one is easy.  The Iron Lady.  Forget your tree-hugging, organic, women’s-rights, no-glass-ceiling, abortion-for-all liberalism and just get into this fearsome woman.  Not only did she destroy the miners, but she also sent a flotilla 10,000 miles to blow the Argentines out of the water and defend a few craggy, sheep-dipped, foggy islands.  Some Republican women still dress like this, so if you go to some high-end ‘seconds’ shops, you can surely find a good costume.

Vagina

“The most feministy costume of the lot. There are ready-made vagina costumes out there, but expect to spend a fortune. Much better to invest in 50m of salmon-pink fabric, a sewing machine, and hours and hours of your time. However, the payoff will be so worth it. Bonus points if you add some Freudian teeth.”

Well, none of these costumes appeals to me, for I still like women acting out their sexual fantasies in hooker outfits, thongs, and mini-tank tops; but then, I don’t count.  Men have never counted, if you believe the radical feminist rhetoric still spewed out by the likes of Andrea Dworkin; and I like the idea of a feminist utopia where spontaneous generation produces nothing but women, without the need for a rooster’s contribution. Let’s face it.  Men want women around when they’re horny, so if a techie of the future could create a virtual world populated by fantasy women in scanty clothes, we would take it.

I love Halloween.

Why Was Foreign Policy Ignored At The ‘Foreign Policy’ Debate?

Many observers have wondered why both candidates in the third presidential debate skirted almost all foreign policy issues in what was billed as a foreign policy forum.  No matter how hard the moderator tried to extract ideas, policies, or even comments on the range of issues that currently affect the United States, he got no more than a few glancing remarks on any issue other than Syria or Libya.

This is not surprising, for although those of us who live in the Northeast and on both coasts consider world events crucial and directly related to the well-being of the United States, most other Americans do not. A recent assessment by the National Assessment of Educational Progress (NAEP) found that only 12 percent of American public high school students were ‘proficient’ in American history.  The NAEP doesn’t even bother to test students on world history.

After a long career in International Development during which I worked in over 50 countries of Asia, Africa, Latin America, and Eastern and Central Europe, I never expected my fellow Americans to have the understanding of world history that I did, but I was nevertheless astounded at the profound ignorance of even the simplest elements of geography.  Much has been made of Mali recently, because it was mentioned in the debates, and I paid particular attention because it has an ancient culture, unique and compelling music, fascinating indigenous architecture, moderate Islam, and a regional pride. Few people I asked – and these were not exactly crackers and swamp rats – had any idea where it was, let alone what it was like.  Other than Kenya, recognized because of its game parks and the Robert Redford movie Out of Africa, Angola, Mozambique, Gambia, Malawi, and most other African countries got a dull stare.

There are many reasons for this indifference and ignorance.  First and foremost, perhaps, we have only two international neighbors – pussycat Canada and poor Mexico – unlike our European allies, who fought each other continuously for centuries.  Second, in the 19th century America was more interested in expanding its national territory and taming the vast frontier.  The War of 1812, the Mexican-American War, and the Spanish-American War were not started because of any real threat to the United States; they were by-products of American expansionism, minor distractions in our push westward.  Life in the hinterlands, far from the European-style worldview of the Eastern Seaboard, was newly American – individualistic, isolated, remote, and fiercely independent.  In so vast a territory, world affairs seemed insignificant.

Third, for much of the post-WWII period, we did not need to know anything about the rest of the world.  We were the undisputed greatest power on earth, a superpower whose might and influence could not be challenged.  In the Fifties we became aware of the Soviet Union, our new arch-enemy, but other than duck-and-cover drills our lives were unchanged.  The Cuban Missile Crisis changed that calculus, and foreign affairs came very close to our shores; but the rest of the world remained insignificant. China was a developing country, albeit with an impressive army; most of India was still eating chapattis and rice on palm fronds; Africa was still under colonial rule; and the Arabs, as they had been for over 1,000 years, were quiet.  We pumped their oil, sheiks ruled over a few inhabitants, and the Middle East was just an area on the map until the 1967 War.

During all this pre-globalization period, other than the nuclear standoffs and regional eruptions, there was little to know and be concerned about.  We knew that the thousands of nuclear weapons aimed at us were matched by the destructive power the US aimed at Russia.  Even as the world became more interconnected and dangerous, we could still afford to think simplistically.

Vietnam was the first chink in the armor.  A foreign engagement went badly for the United States, we lost the war, our civil society was rent by violent protest and demonstration, and we realized that foreign affairs could, in fact, encroach upon our lives here at home.

Now, of course, we are intimately linked with the rest of the world and not from a position of power.  The Chinese are an economic threat, the Russians a political one.  Rogue states like North Korea and Iran threaten our very existence.  We are fabulously in debt to China and the rest of the world, limiting our ability to call the shots.  American exceptionalism, that last futile gasp of a muscular political power, is all but dried up. The stock market is no longer concerned with the Dow, GE, and Apple, but what is happening in the EU. 

Ironically, world events have become too complicated to follow, and we conveniently fall back on our traditional posture of ignorance and self-contentment.  If our politicians give us nostrums, saying that we are still the greatest country in the world, we easily believe that after 250 years of dominance and supremacy, after having won WWI, WWII, and the Cold War, we will never be defeated or brought down by world affairs.

If our historical ignorance and our persistent belief in American manifest destiny and exceptionalism were not enough, our powerful religious fundamentalism has further closed our doors to the world.  God has a plan, say many, and America will always be its centerpiece. George W. Bush publicly admitted that he felt God had chosen him to lead America to greatness.

Lastly, and perhaps most generously, most people care only about losing their job down at the plant and getting another one.  Forget the fact that the Euro, China, the Middle East, and international bankers have something to do with this sorry state of affairs; it is seen as a domestic issue to be solved by Obama or Romney alone.  I don’t get exercised by the lack of depth at the third debate because it was perfectly predictable.  Romney especially understands that elections are about image, presentation, a projection of strength and leadership, and an appeal to core American values – and that there is no need whatsoever to get down into the politics of an insignificant, small, poor African country.