"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Thursday, January 31, 2013

The Illusions Of Travel–You Can’t Go Home Again

One of the many benefits and pleasures of travel is being able to maintain ideal images and to ignore what might lie behind them. I was able to cruise through Papa and Baby Doc’s Haiti, dance all night in Carrefour, eat elegant French meals in Pétionville, walk through the markets down by the port, and drive out to the beaches of Jacmel with nary a thought for the Duvaliers’ reign of terror and the murderous Tonton Macoutes. Port-au-Prince was all romance, color, and méringue. Haiti is one of only a few countries out of the many I have visited where not only did nothing go wrong, but everything went right. I was so happy that I spent many midnight hours on the balcony of the Victorian Hotel Splendide, smiling, hardly believing my good luck, as I overlooked the city, listened to the drums of voodoo ceremonies in the hills beyond, and smelled flowers, rain, and wood fires – until Baby Doc was overthrown and exiled.

Port-au-Prince, when I returned for the first time since Duvalier’s departure, was a different place – more suspicious and dangerous. The smell of burning tires filled the city. Many were burned in protest against the regime, the wealthy, or simply in frustrated anger at the lawless, dysfunctional place the city had become. Many more tires burned because of ‘necklacing’, a particularly brutal form of lynching in which a tire is placed around the victim’s neck, set alight, and allowed to burn until the victim catches fire and is roasted to death.

I insisted on staying at the Splendide and found it open but empty. I was the only guest, and a day or two after I arrived, the tanks of another coup rumbled out of their barracks and up to the Presidential palace, where firing broke out. I hunkered down in my room, listening to the BBC. I was sure that angry mobs would break into the Splendide and ransack and pillage it, since the police had gone into hiding, the military was fighting the rebels, and there was no national or local government.

This anarchy, of course, had always been festering, kept down by the repressive forces of the Duvaliers; so the explosive expression of violence and undirected aggression was not surprising.  I never returned to Haiti, and try as I might to remember only the romance, the music, and the languid days on the beaches of Macaya, I could only recall the acrid smell from the burning tires, the mobs storming the palace, and the frantic ride to the airport before it closed.

I have always succeeded in ignoring the bad and seeing only the good.  I think I am lucky in that regard, because most of my memories are good ones.  When friends asked me about the dark side, the dangers, the upheavals, dirt and disease of the places I visited, I always dismissed their questions, preferring to tell them about my civilized lunches on Lake Tanganyika, by the pool at the Teranga in Dakar, on the beach at Copacabana.

I had once stayed in a grand old hotel in Bujumbura, the capital of Burundi. It was run by Italians and had a lively, European atmosphere. It was always filled with visitors – tourists who had come for the wildlife (you could hear the hippos roar from the verandah) and people who had come for work. The food was excellent, and the city was calm, clean, and cool.

When I returned to Burundi a few years after the first and most devastating clashes between Hutus and Tutsis (never as horrific as the Rwandan genocide but frighteningly savage nonetheless) I insisted on staying at the same hotel.  Once again, I was the only guest.  The pool was half-filled with stagnant, scummy rainwater.  There were mosquitos in the dingy rooms.  The European food had been replaced by gritty local fare, and there were only two or three staff.

Paul Theroux’s latest book, The Lower River, is about a former Peace Corps volunteer who returns to his village in Africa, a place where he had spent some of the happiest years of his life, only to find that both he and the village have changed. The village has become poorer and more desperate, many years removed from the heady and optimistic times after independence, and he has grown older, less resilient, and less hopeful about his own life. He returns to Africa not because of Africa but because it might offer him solace and renewed meaning. He barely escapes with his life. The formerly trustworthy, caring, and sociable natives rob, exploit, and manipulate him; take advantage of his good will; and cast him aside unceremoniously. The story is reminiscent of the true tales of the 18th-century British traveller Mungo Park who, at first seeing only the innocence of the primitive, was bought and sold, tied and tethered, robbed and left in the jungle time and time again.

I have always tried to keep this malignant side of culture out of sight.  I ignored the fact that I was played, cadged, and taken by friend and foe alike in poor countries who cared less about the ‘development’ I was peddling and more about the money they could make off its loosely-monitored projects.  I saw this crafty manipulation as part of the Third World, a story to be told to colleagues at the hotel bar, an inconvenience, and only a minor irritation.

I had finally reached the limits of frustration on one trip to Pakistan when I saw my own staff pilfering money, supplies, and equipment in an arrogant, dismissive, and carefree way. I complained to the Secretary of Health who was my official counterpart in the country.  I explained how the money from an American benefactor was being diverted from its original use – to save Pakistani lives (it in fact was blood money paid to the government by a pharmaceutical company whose product had indirectly contributed to deaths in a village because of overdose) – and how he should intervene.

Politely but firmly he told me that I didn’t get it. It was indeed blood money, and neither the pharma company nor the US government cared what happened to it. Of course the money disappeared. Why wouldn’t it? No one wanted the project, everyone was insulted by the stipulations imposed and rejected American paternalism, and everyone was poor.

Whatever romantic notions I had about that corner of the Subcontinent disappeared like vapor when I heard these dismissive, sarcastic words.

“You can’t go home again” has become a cliché, but it is nevertheless true. Time erodes even the fondest memories, especially if they have been built on illusion. Both Theroux’s character and I suffered from that particular travellers’ disease – fantasy; and while his alter ego barely escaped a savage death, the rest of us have learned that it is better to retain, nurture, and water our illusions than to test whether they are real.

Wednesday, January 30, 2013

Travel In Greene-Land

I am an admirer of Graham Greene, and find in his stories of travel, alienation, and especially moral and religious crisis, a writer of subtlety, depth, and insight. As one who has travelled much of his life to places as far-flung as those described by Greene, and one for whom travel, for all its dislocation and difficulty – or perhaps because of them – has been transformative, I have gained insights and perspective from him.

I was always anxious before travelling, less about the pitfalls of post-Civil War Angola, the political anarchy of Ivory Coast, the crime and social dysfunction of Kenya, the endless civil unrest in Bangladesh, than about leaving my familiar, settled, and routine life.  I was not afraid of adventure, for romance was always on the other side.  I simply was anxious about being wrenched from my life of family and predictable comfort.  It was because of that sharp anxiety that the experiences of travel were so pointed and transformative.  Because I came from a settled life which I could navigate like a blind man, the new world of new countries was especially clear and bright.

Samanth Subramanian, in a review of a recent Pico Iyer book (The Man Within My Head) in which the author writes of his kinship with Graham Greene, observes:

In his guise of travel writer, Iyer has really been our most elegant poet of dislocation. Ever since Video Night in Kathmandu, published in 1988, he has not so much travelled as wrenched himself from place to place; he has found kindred disquieted souls in disquieting locations, all out of joint with their space and time. Other travel writers attempt to feel at home in the world; Iyer thrives on alienation, because it is the facets of this alienation that make up his origin and his destination, his means of transport and his ports of transit. His kinship with Greene, whom he calls "the patron saint of the foreigner alone", has already been implicit in his books; Iyer has always been Fowler in Saigon, or Wormold in Havana, or Plarr in Corrientes, or any of Greene's other unsettled Englishmen abroad.

My dislocation was always temporary. I knew I had a home which I never questioned.  Although my travel was always personal and revelatory, I was never Coleridge’s Ancient Mariner, doomed to wander the earth, or a Foreign Legionnaire fleeing from responsibility, debt, and harassing women.  I was not looking for enlightenment like Wordsworth when he crossed the Alps and saw the world from the top of Mt. Blanc with perfect clarity and vision.  I was neither wanderer nor seeker, but I found many answers nonetheless.  In all the more than fifty countries I visited in Africa, Asia, Latin America, and Eastern Europe, I saw a familiar playing out of history and the human nature that underlay it.  All societies whether tribal or imperial, and all individuals whether alone or grouped in families, acted out of the same self-interest, the same imperative to expand territorial perimeters, increase wealth, power, influence, and status.  With all the diversity in culture we are all the same.

Greene’s characters are lonely, dispirited, shipwrecked souls. Querry (A Burnt-Out Case) is a British architect who has lost his spiritual way and moves to Africa to work in a leper colony. Although his outward motive seems clear – a desire to help the unfortunate – he is only seeking the company of others more miserable than he in which to immerse himself and forget his loss of faith, purpose, and meaning. Scobie (The Heart of the Matter), a long-serving policeman on the West Coast of Africa, is also an unhappy but resigned man, living in one of the most remote and untraveled parts of Africa – the White Man’s Grave – in a stultifying marriage to an unbalanced woman. His Catholic faith keeps him faithful, but after his wife finally leaves, he has a long affair which is never happy and always riddled with religious guilt.

Greene was a man riven by doubt, unable to give himself entirely to a person or to a faith – even as he knew that to refrain from such commitment was no way to live. Greene, Iyer writes, spent "his whole life searching for a haven that, were he to find it, he would only exile himself from or spoil, and then begin the search again." Iyer negotiates these ideas – of detachment, of faith, of home and belonging, of love, of displacement – turning them over and over like river pebbles, puzzling over their place in his own life, thinking them through.

Paul Theroux, in The Tao of Travel, has collected the writings of many travelers, from Ibn Battuta and Herodotus to Mungo Park and Paul du Chaillu to contemporaries like Pico Iyer, and has remarked how all of them told of the particular transformation that takes place when one travels alone. I hated the moment of departure – kissing my children goodbye, hugging my wife, and waving from the cab – but soon I could feel the muscles in my back relax and the anxious fatigue disappear. Chatting with the Bangladeshi taxi driver was the first step in my passage from home to alone. I was no longer a husband, a father, a son, a brother, and a friend. I was only me, independent for a month, free to explore my new environment and my reactions to it.

I often had the feeling when I was very far from home that I would somehow never make it back. Tossing and turning in a stifling room in Timbuktu or waiting in the farthest Angolan outpost near the Congolese diamond-smuggling border, I thought I would never leave, that my adventures had this time taken me one place too far. I have spent hours and days in a self-enforced solitary confinement. I spent three weeks in a cement room in a hotel in Puno on the altiplano of Peru during the coldest, rainiest season. The altiplano can be spectacularly brilliant – Lake Titicaca blue and deep, Illimani and other snow-capped peaks inspiring – but in the dreary, wet, slogging winter, all is grey, depressing, and futile. It was at these times that I had glimmerings of the isolation and remove of Greene’s characters. There were no insights here, just painful loneliness.

So, I share and appreciate Greene’s sensitive unhappiness in the loneliness of exile; but I have been just an interloper for whom the security and comfort of family and familiarity was only hours away – a neophyte. 

Greene’s simple, elegant, lambent language is somehow the only voice that can tell the tale of Scobie and Querry. Every few years I pull out my tattered, fading paperbacks which I used to carry with me and read when I was in Africa, and re-read them.  His solitary vision never fades.

Tuesday, January 29, 2013

The Ruling Class

George Monbiot writes in The Guardian (1.29.13) about the insularity of the world’s elites and how their hermetic existence encourages uninformed, self-serving decisions. There is nothing new in this. Kings and their courts were always privileged enclaves, and decisions about taxes, wars, infrastructure, and economy were made to preserve, protect, and defend the monarchy. Of course, some balance between kings and commoners had to exist. Kings could not continue to tax monasteries to finance foreign wars without eventual revolt, and Louis XVI and Marie Antoinette came to a very unceremonious end when their court became a Baroque imitation of itself and they were utterly dismissive of the people; but until 1789 the monarchy, the court, and the Church maintained a symbiotic, although testy, relationship which excluded the masses.

In the Origins of Totalitarianism, Hannah Arendt explains that the nobles of pre-revolutionary France "did not regard themselves as representative of the nation, but as a separate ruling caste which might have much more in common with a foreign people of the same society and condition than with its compatriots".

Each of the power centers maintained its own elite position. The Church was as wealthy and powerful as the kings of Europe, and the Pope wielded influence and commanded allegiance through the dispensation of plenary indulgences and the threat of excommunication. Military allegiances to the king or his usurpers shifted easily, and internal strife and dissidence were not unknown; but the corps retained its integrity and its elitism and kept far from the common man except to commandeer him and send him into battle.

The ‘democratic’ dynamics that we now associate with modern society existed within the courts of Europe, not within the people, and especially not between the court and the ruled. Oliver Cromwell was the exception to the rule and for a while an embryonic democracy was the law in England.  However, it did not take long for the elite to realize that this was definitely not a good thing, and the Restoration of the crown followed soon after.

In every society, from the most primitive tribe in the Amazon to the historical courts of Europe, India, and China to the present-day global powers, there is a natural tendency to want to acquire wealth, position, and power; and an equally natural tendency of elites to defend privilege against all comers. Louis XVI is long interred, and the French aristocracy is a faded version of its former elegant self; but the elites still rule the land. The Élysée is now the redoubt of énarques – graduates of the prestigious, elite Ecole Nationale d’Administration – just as Oxbridge, Harvard, and Yale rule the Anglo-Saxon world. The word ‘meritocracy’ is often used to describe these new elites, suggesting that they have worked hard to earn their rule rather than being born into privilege; but the fact remains that today’s ruling class shares many of the same attributes of education and background.

Monbiot describes his experience in an English ‘public’ school, one of many whose goal was to educate and train the future leaders of the country. According to the author, these schools shared much in common with the military – they were designed to break immature allegiances to family and replace them with more robust and patriotic ties to country.

The role of such schools was clear: they broke boys' attachment to their families and re-attached them to the institutions – the colonial service, the government, the armed forces – through which the British ruling class projected its power. Every year they released into the world a cadre of kamikazes, young men fanatically devoted to their caste and culture.

Ancient Rome was no different. Schools for the elite taught according to the principles of Cato the Elder – a respect for fairness, justice, courage, fortitude, patriotism, eloquence, intellect, and honor – for the Romans knew that with this comprehensive range of attributes, young aristocrats would know how to rule and how to remain in power.

Little has changed since then, and elites are still in power, still insulated from the masses, still making self-serving decisions, still perpetuating themselves and fighting external influence.  Although the political parties in America have become democratized – i.e. candidates are chosen through popular primary voting and not by party regulars – there is little doubt that they remain elite institutions with claques of lobbyists helping them to conclude deals with business.  Insularity is assured.  Politicians ally themselves with moneyed interests (large corporations, Wall Street) who keep them in power, gerrymander electoral districts to assure longevity and incumbency, and easily manipulate the media and their constituents with inflamed rhetoric, inflated promises, and little substance.  People may feel that they are part of a horizontal, equalized, democratic system, but they are not.

Monbiot suggests that elites, who depend only on themselves to define worldviews, create their own versions of reality. “If the world does not fit your worldview, you either shore it up with selectivity and denial, or (if you have power) you try to bend the world to fit the shape it takes in your mind.”  Certainly the Neo-Cons who were the architects of the war in Iraq believed beyond all credibility that not only did the United States have a mission to civilize the Middle East with democracy, but that it could – despite all historical evidence to the contrary - complete the mission.

The hawks who surrounded LBJ during the Vietnam War were equally convinced of America’s moral superiority and its mission to spread its way of life. Only when LBJ was brought down by America’s version of the French Revolution did the mob rule.

Last year the former Republican staffer Mike Lofgren wrote something very similar about the dominant classes of the US: "the rich elites of this country have far more in common with their counterparts in London, Paris, and Tokyo than with their fellow American citizens … the rich disconnect themselves from the civic life of the nation and from any concern about its well being except as a place to extract loot. Our plutocracy now lives like the British in colonial India: in the place and ruling it, but not of it."

Monbiot writes as though this is some kind of celestial revelation; but it is nothing of the sort. It is business as usual. Why should anyone expect society to change patterns and ways it has expressed for 200,000 years? There is no doubt that today’s societies are more democratic. There are far more checks and balances to rein in elite power than there were in the time of Louis XIV or Henry VIII; but elites continue nonetheless, operating according to the same rules of self-interest and self-preservation.

Monbiot laments the depredations of his people – those who were educated, trained, and molded to rule in a benign, caring way.  Although he self-deprecatingly calls his aristocracy only ‘third tier’, there is a lot of exculpatory confession here.

So if you have wondered how the current [British] government can blithely engage in the wholesale transfer of wealth from the poor to the rich, how its frontbench can rock with laughter as it truncates the livelihoods of the poorest people of this country, why it commits troops to ever more pointless post-colonial wars, here, I think, is part of the answer. Many of those who govern us do not in their hearts belong here. They belong to a different culture, a different world, which knows as little of its own acts as it knows of those who suffer them.

Of course, Mr. Monbiot.  Of course.

Monday, January 28, 2013

Deception And Virtual Reality

Gary Younge (The Guardian 1.28.13) has his knickers in a twist about the increasing deception in modern life.  What is to be made, he asks, of Beyoncé lip-synching lyrics at President Obama’s inauguration, or the sneaking of some horsemeat into British hamburgers, or the fake Lance Armstrong?  Before you know it, you won’t know whom to believe.

He misses the point.  Nobody cares.  Our virtual, imagined, and idealistic world will always trump reality. When mind and machine are eventually linked; and when infinite combinations and permutations of individual fantasy are possible in a virtual cyber-world, few if any of us will long for ‘reality’.  Virtuality will replace reality because it will be so much more attractive.  It will be a world without the warts and blemishes, blebs, bulges, grime, and sludge of life and will be Hollywood, Las Vegas, the Chateau de Versailles, and the pristine air of the Himalayas.

Imagine in a not too distant future that your mind has been linked to the computer via electronic patterning, DNA manipulation, and biochemistry. Cyberspace will contain images, sounds, fragrances, and stories of all of history, and you will be able to enter this world, select the period you wish to live in, and choose the people to accompany you. You can write your own scenarios and your own poetry. You can walk through the gardens of Versailles with the love of your life, sit at court with Marie Antoinette, listen to chamber music, and wear powdered wigs and high, buckled stockings. You can make love in the bedroom of the dauphin, look out the window as the summer sun sets and the last breath of lilacs floats into the room.

You will not be aware that this is a virtual reality, for it will be so meticulously created through a combination of your own imagination and the historical record that it will exist in its own space. You will willingly abandon reality, leave the dross and sludge of real life behind, and enter a virtual world of your own creation, travelling in it with the woman of your choice, whether real or imagined.

Would you ever exchange this virtual world for your old, shopworn, hackneyed, predictable real one? Of course not. (Adapted from Virtual Reality, 4.4.11, http://www.uncleguidosfacts.com/2011/04/virtual-reality.html)

Younge does not agree and is worried that our world is headed for the diabolical universe of The Matrix, all illusion and construct with only one sane, grounded, real man to save the world:

In the science fiction film The Matrix, all-powerful machines transform the planet into a huge computer simulation where humans exist only in a dream world. Among the few sentient "free" people left fighting the machines is Cypher, who abandons the struggle following a revelation: he actually prefers the simulation to reality.

This view is the common one – not only has Man finally been dominated by the very machines he has created (see also Terminator I and II), but he has been forced to live in a frightening world of someone else’s fiction. It is pure Hollywood melodrama, full of conspiracy theory, heroism, and special effects. Cypher, however, understands the temptation, allure, and final seductiveness of a virtual world:

"I know this steak doesn't exist," he says. "I know that when I put it in my mouth, the Matrix is telling my brain that it is juicy and delicious. After nine years, you know what I realize?" He chews the steak ostentatiously and sighs. "Ignorance is bliss."

What is left out in the retelling is that Cypher is eating not just a delicious steak, but the most delicious, succulent, tender steak he has ever eaten.  In a virtual world perfection according to individual taste and preference will be the currency of the day.

Americans in particular love fake. Las Vegas is only the most elaborate example. Why go to Paris, Luxor, or Venice when you can wander through simulated gardens, float up and down simulated canals, and stare in wonderment at fake Sphinxes, palaces of the doges, or the elegance of Hyde Park? We willingly suspend our disbelief, our sense that it is all not real, because to us it is real. If you can’t tell the difference, then the distinction between fact and fiction, real and fake, disappears.

Is there any doubt that our love of games, now almost indistinguishable from the action of real war or football, will lead to a fully simulated, 3-D, holographic experience?  And is there any doubt at all that we will want to spend hours in an exciting world insulated from shit, Shinola, and dead-end jobs?

Our houses imitate Tudor palaces, Mediterranean villas, Georgian mansions, and Moghul gardens.  American colonial, white-frame or brick homes are so boring, traditional, and ordinary.  Why not opt for the exotic and the foreign?  Who cares if you build a chateau on one measly acre of land in Potomac?  You have created the environment you have always dreamed of. 

So the Beyoncé flap is nothing.  It was her voice belted out on the Mall.  It was she singing, gyrating, emoting.  No one knew the difference until they were told. Were they gypped? No. Younge disagrees:

Well, it makes a difference. If it was as much of an honor to be performing at the inauguration as Beyoncé claimed, she might have found time to rehearse at least once. Moreover, the essence of a live performance is the understanding that the audience is experiencing the event in real time and anything can happen. It is that combination of synchronicity, spontaneity and frailty that gives live performances their edge – it's the one take that matters.

Glenn Gould, the brilliant, eccentric pianist of a few decades ago, known especially for his recordings of Bach’s Goldberg Variations, quit the concert stage abruptly saying that the audience interfered with his creative process; and that alone in his studio he could remaster, electronically edit, and perfect his work.  Listeners thought they were listening to the same Glenn Gould whom they had seen on stage, but they actually heard a virtual pianist, a more perfect and elegant one.  When they found out, they didn’t care and bought his records in even greater numbers.  Live performance is overrated, said Gould.

Truth is thought by many to be knowable, but most know that it is a matter of perception. 

In 1868, after five years’ work, Robert Browning completed and published the long blank-verse poem The Ring and the Book. Based on a convoluted murder case from 1690s Rome, the poem is composed of twelve books, essentially ten lengthy dramatic monologues narrated by the various characters in the story, showing their individual perspectives on events, bookended by an introduction and conclusion by Browning himself.

The Alexandria Quartet is a tetralogy of novels by British writer Lawrence Durrell, published between 1957 and 1960. A critical and commercial success, the first three books present three perspectives on a single set of events and characters in Alexandria, Egypt, before and during World War II. The fourth book is set six years later, in Corfu.

As Durrell explains in his preface to Balthazar, the four novels are an exploration of relativity and the notions of continuum and subject–object relation, with modern love as the theme. The Quartet’s first three books offer the same sequence of events through several points of view, allowing individual perspectives on a single set of events; the fourth book shows change over time. (Wikipedia)

Even earlier (1710), Bishop Berkeley denied the existence of material reality itself:

Berkeley’s primary achievement was the advancement of a theory he called "immaterialism" (later referred to as "subjective idealism" by others). This theory denies the existence of material substance and instead contends that familiar objects like tables and chairs are only ideas in the minds of perceivers, and as a result cannot exist without being perceived. Thus, as Berkeley famously put it, for physical objects "esse est percipi" ("to be is to be perceived"). Berkeley is also known for his critique of abstraction, an important premise in his argument for immaterialism.

No one questioned the nature of reality as starkly as Berkeley; and in the early 18th century most people believed in this world and the heavenly one, but no more. While observers in the 19th and mid-20th centuries had matured philosophically to appreciate that truth and reality were more elusive, a matter of perception, today’s world is far more challenging. In the Age of the Internet, who can tell what is true or false, real or unreal? It is an age of competing claims, Photoshopping, and imagination. When it is so seductive and easy to believe whatever you read or see, why bother hunting for corroboration?

Gary Younge correctly observes this trend, but draws the wrong conclusion:

These moments of deception go beyond sport and show business. They are emblematic of a culture where marketing trumps substance, cynicism triumphs over sincerity, and what is fake is openly and actively promoted over what is true.

Authenticity and transparency, it turns out, are just two options among many. Worse still, we all too often actively collude in the deception on the grounds that the version of events that has been curated for us is preferable to the truth.

Of course authenticity and transparency are just options in this virtual, cybernetic world; and yes, alternative realities are being curated for us during this transition period before our eventual self-curated imagined, virtual world.  The train has left the station.  The smell of ‘real’ grass, ‘real’ lilacs, and ‘real’ pine trees will soon be things of the past.

Sunday, January 27, 2013

The Catholic Church–Insular And Remote

Frank Bruni, reviewing Garry Wills’ new book Why Priests? A Failed Tradition (New York Times, 1.27.13), refers to “the curse of Catholicism”, an elitist insularity which in its self-protective secrecy has widened the gap between the Church and its believers:

The Roman Catholic Church, specifically its modern incarnation and current leaders, has tucked priests into a cosseted caste above the flock, wrapped them in mysticism and prioritized their protection and reputations over the needs and sometimes even the anguish of the people in the pews. I have a problem, in other words, with the church’s arrogance, a thread that runs through Wills’ book, to be published next month.

Wills lays the blame on priests who, in contradiction to the wishes of Jesus, formed a cabal for purely selfish interests:

Among the Vatican’s issues with Wills was his stated belief in a 2010 article that the priesthood, rather than originating with Jesus and a specially selected group of followers, was selfishly created later by a “privileged group within the community who had abrogated power and authority to themselves.”

Bruni focuses on the sex abuse scandals of recent years and the denials, cover-ups, and mendacity of the Church throughout this period.  This attitude of rectitude and defiance might have protected priests in the early days of the scandal, but eventually alienated even its most faithful followers.

Circling the wagons, however, is nothing new for the Catholic Church. One thinks first and foremost of Henry II and his battles with Thomas Becket over the rights and privileges of the Church. Becket argued that the clergy should not be subject to secular law – i.e., the Church would adjudicate all crimes and misdemeanors committed by its priests – while Henry insisted that the Church, as part of England, should be ruled by secular law. The Church was a powerful political force in Henry’s day and for centuries thereafter, and the king made a relatively minor incident into a cause célèbre to make a strong show of regal force. The battles between church and state continued for nearly 400 years until the conflict finally came to a head during the reign of Henry VIII. The cause célèbre at that time was Henry’s marriages, but the political positions were identical. Both church and king claimed sovereignty.

Five hundred years later, the Church is still fighting to protect its own, to reinforce the bastions and ramparts that shield it from the secular world, and to battle today’s kings and presidents. Not only has the Church acted as it always has in the past by slamming its doors to prying eyes, but it has lashed out, as the Popes did against the Henrys, to assert its authority and dominance. The Catholic Church today has formed alliances with Protestants, no less, to fight what it sees as the secularization of modern society. It has been on the offensive, attacking pro-choice advocates in the United States, threatening excommunication of prominent Catholic political leaders who espouse pro-choice positions, and declaring a holy war on moral dissolution.

Therefore the Church’s actions regarding its pedophile priests should not come as a surprise. While our generation abhors the sexual abuse of minors, priests in the centuries since Becket and Henry II have certainly committed their share of heinous crimes. In other words, nothing much has changed. Priests do dastardly deeds; the Church covers them up, fights to keep out the light of secular justice, turns up the heat of attack on its attackers, and eventually retreats back into its comfortable private enclaves.

The second fact that Bruni and Wills overlook is that priestly castes have been around since our emergence from caves. Shamans, witch doctors, and priests have always been among the most intelligent members of society and have used their powers of observation, analysis, and persuasion to arrogate power to themselves. The absolute rule of priests in Hinduism, for example, is no different from that of priests in the Catholic Church. Hindu priests exert control over the faithful because only they have access to sacred texts and only they can perform the holy rites of birth, death, and marriage. The scenario is the same for every other religion that has established a priestly caste as intermediaries between God and His people.

Bruni goes on to summarize another issue raised by Wills:

At the start, Christianity not only didn’t have priests but opposed them. The priesthood was a subsequent tweak, and the same goes for the all-male, celibate nature of the Roman Catholic clergy and the autocratic hierarchy that this clergy inhabits, an unresponsive government whose subjects — the laity — have limited say.

Once again, this is not surprising, for both secular and religious institutions behave in the same ways. The courts of English kings, with their internal rules, regulations, and routines, were all organized to concentrate power and to dominate and intimidate the citizens they ruled. The Vatican is no different. The all-male clergy has nothing to do with misogyny, and everything to do with the hocus-pocus of Biblical interpretation. If Jesus had meant for there to be women priests, the Vatican says, there would have been women amongst his disciples. And the application of this doctrine goes a long way in perpetuating the ‘mystery’ of the Church. Every time a male priest celebrates Mass, it is as though a disciple of Christ were present on the altar. There is an unbroken line between priest, bishop, cardinal, Pope, and Peter; and of course by extension to Christ himself.

Breaking up the all-male priesthood would not simply be ‘catching up with the times’ but a breach in the wall, a tarnishing of the imperious, unknowable majesty of the Church.  No small thing.

I was brought up Catholic and left the Church largely because of the unctuous, sanctimonious priests who lorded it over us all from the pulpit and the confessional. Theirs was truly a cabal of arrogant, pompous, self-interested, and self-serving men.

Bruni and Wills aver that they have nothing against priests in general, many of whom are good, compassionate, and dedicated men.  It is the Church itself that is responsible for its own decline.  Its secrecy, inwardness, fiery denials, and continued sanctimony in the face of criminal activity are unconscionable and indefensible.

While I agree with that, I cannot exonerate priests who, as recent events have shown, used the cover and protection of the Church to commit unspeakable deeds.  The present diminished state of the Church is the result of a complicity of both clergy and Church.  It takes two to tango.

Saturday, January 26, 2013

Women In The Military–Feminist Objections

Strange as it may seem, American feminists are lining up to oppose the deployment of women on the front lines. This position is strange because for over 40 years, since the beginning of the modern feminist movement, women have been arguing that they are the equals of men and therefore should be denied no opportunity available to their male colleagues. In the early days of feminism, women rejected biological determinism. Yes, they said, we have babies, hot flashes, and we can’t seem to hold back our tears, but these are inconsequential side effects of X chromosomes and have nothing to do with social, physical, intellectual, or economic abilities.

One of the most ridiculously memorable events of that period was ‘The Battle of the Sexes’, when in 1973 Billie Jean King, a top-ranked tennis professional, played Bobby Riggs, a 55-year-old former champion long past his prime. It was Hollywood hype, grand entertainment, and pure American Idol grandiose marketing, but there was still a conviction that this showdown actually meant something. Men wanted this uppity woman to get thrashed; and women wanted to blow the bald asshole off the court.

The conviction of men was such that they overlooked the fact that King was a big, strong woman in her prime, and Riggs was a paunchy, creaky version of what he had once been. He was a man, after all, and there was no way that a muscled, testosterone-rich, aggressive, competitive male, however diminished by age and disrepair, could possibly lose. If he no longer had a devastating serve or a powerful backhand down the line, he could certainly intimidate the girl, force her into errors, and hopefully reduce her to tears. I don’t remember who ‘won’, nor did I care – the event was as staged as a bout of professional wrestling – but a lot of people tuned in.

The Seventies was a heady time for women who got their first whiff of liberation from male dominance, and we men knew they meant business:

In the Seventies we knew exactly what Feminism was – it was a militant movement of women demanding equality in society’s laws, traditions, customs, and attitudes. There was no Right Wing, Left Wing, or Moderate Wing of this particular party – it was all militantly angry, hostile, and incessant in its attacks on patriarchy, male dominance, and maleness itself. Men were the enemy, pure and simple. They had enslaved women ever since the first monkey-man dragged a woman by her hair into the bushes, had his way with her, and then forced her to clean up the cave; and it was time for this to stop. We knew that women meant business. They even had an IRA-style military wing which – it was reported – would stop at nothing short of physical emasculation. (Rebranding Feminism, Uncle Guido’s Facts 12.7.12)

Of course most women subscribed only to the fundamental feminist principle of social equality, lived happily married lives, and left the take-no-prisoners fight to their more aggressive sisters. These wives loved and needed men, but did their bit to demand more male participation at the kitchen sink. Taking out the garbage was simply not enough, they insisted. It was about time that men changed diapers, did the wash, and – mirabile dictu – cleaned the toilet.

I had a close friend who was dubbed The Phantom by her male colleagues because she combed the Washington Post and New York Times for the slightest sexist reference, and fired off letters to the editors.  One or two of her incensed letters were published, but most were tossed in the trash by male editors who, her female colleagues concluded, must have assumed that she was hysterical.

During the Seventies women chucked the trappings of traditional femininity – they stopped shaving their legs and armpits, wore jackboots and flannel, and sported butch hairstyles – and de-gendered the playroom. If there were dolls, strollers, and cribs in the playroom, they were for their sons. They were continually surprised that the boys ripped the heads off the dolls, turned the cribs over and made forts out of them, and had stroller fights. Mothers were equally surprised that their daughters paid no attention whatsoever to the toy bulldozers, tractors, and dump trucks given to them for Christmas. The girls made dolls out of dish towels, put their ‘babies’ to bed in home-made crèches, and nursed and rocked these laundry clumps just as lovingly as if they had been real dolls.

Gradually women came to accept the facts – they weren’t as strong as men and couldn’t bench press 1000 lbs. – but at least they could be proud of the Soviet-bloc ‘women’ pumping big-time iron on the Olympic stage. OK, few American women had so totally given up their feminine upbringing as to want to look like these mustachioed Bulgarians; but still, you had to admire them:

Bulgarian Weightlifters Sweep European Silver, Bronze

Later women came to accept the fact that equal opportunity was the real goal of feminism, and that women could relax and go back to frilly things as long as they had access to the boardroom. A few years ago I sat in a meeting at my consulting firm, and the women were all dressed, as my mother would say, ‘provocatively’. I couldn’t believe the bounty – low cleavage, short skirts, sexy hairdos, and heels. Of course in our de-sexed workplace, you could look but not touch. Even for an appreciative smile you could be hauled in front of the zaftig Soviet-style PC harassment monitor on the third floor.

It was a good time to be a woman, or so it seemed to the male outsider. Women could dress and behave however they wanted, assured that any unwanted advances would be severely punished. They could pick and choose whomever they wanted to flirt and cavort with. They were pushing up against the glass ceiling and finding that it was not so impermeable; and while there were still plenty of misogynists around, they were troglodytes, worms, and throwbacks. Title IX was the hallmark of equal opportunity for women, who long ago had given up the idea of physical matchups. Equivalence was the keyword, and women could compete at elite levels for equal pay and consideration.

The military was one of the last bastions of male dominance. Although women could serve in the military, they were always second-class citizens, serving behind the lines while their male counterparts fought for glory. When would women finally be shown the respect they deserved and the recognition they were owed for their patriotism and courage? That is, when would they be eligible to take a bullet in the chest?

Finally, President Obama, paying political tribute to the women who helped elect him, removed this final barrier.  Women would now be able to serve on the front lines.

You would think that feminists would be rejoicing. They are, however, protesting this liberal, conscientious decision by the President. Why, you might reasonably ask, when feminists have been fighting on their own front lines for gender equality for so long? For one thing, say the feminists, women provide the necessary antidote, balance, and counterpoint to male aggression and depredation. If it were not for women, these feminists argue, the world would be an even more violent, untamed, and brutal place. The words of Virginia Woolf still resonate:

Though many instincts are held more or less in common by both sexes, to fight has always been the man's habit, not the woman's. Law and practice have developed that difference, whether innate or accidental. Scarcely a human being in the course of history has fallen to a woman's rifle; the vast majority of birds and beasts have been killed by you, not by us; and it is difficult to judge what we do not share.

How then are we to understand your problem, and if we cannot, how can we answer your question, how to prevent war? The answer based upon our experience and our psychology—Why fight?—is not an answer of any value. Obviously there is for you some glory, some necessity, some satisfaction in fighting which we have never felt or enjoyed. Complete understanding could only be achieved by blood transfusion and memory transfusion—a miracle still beyond the reach of science (Three Guineas, quoted by Noah Berlatsky, The Atlantic)

Kathleen Parker, writing in the Washington Post (1.26.13), takes another tack – women are simply not the aggressive type:

Unbeknown perhaps to many civilians, combat has a very specific meaning in the military. It has nothing to do with stepping on an IED or suffering the consequences of being in the wrong place at the wrong time. It means aggressively engaging and attacking the enemy with deliberate offensive action, with a high probability of face-to-face contact.

If the enemy is all around you — and you need every available person — that is one set of circumstances. To ask women to engage vicious men and risk capture under any other is beyond understanding. This is not a movie or a game. Every objective study has argued against women in direct combat for reasons that haven’t changed.

This is what makes us men, often amused by but sympathetic to Feminism and the Women’s Movement, wince.  Either women are equal to men or they are not.  Since they have pushed, protested, demanded, complained, and whined for so long, it is time to put up or shut up:

Heather Mac Donald at National Review declares with hyperbolic outrage that "the only reason to pursue [the policy of women in combat] is to placate feminism's insatiable and narcissistic drive for absolute official equality between the sexes." (Noah Berlatsky)

As always, there is a middle ground. What kind of woman, after all, will willingly accept and embrace mortal combat? Not the weak sisters to whom Parker refers above, but some tough motherfuckers. The women who self-select for combat duty in infantry units will more than likely be the first over the hill, the ones who toss grenades into the machine-gun nest, who lay down withering fire as they charge enemy positions. These women were the girls who opted for the guns, trucks, and bulldozers that their gender-sensitive mothers laid out for them at Christmastime; who played video games and stood up to bullies.

The military is currently studying the operational implications of the President’s order, and it is obvious that women will not be routinely sent into combat as men are. They will have the option to serve in forward units, a policy which will preserve the President’s intention of equal opportunity while assuring that the most kill-hungry women get a chance to take out the enemy. These women will be scary, and most of their male brothers-in-arms will be afraid of them.

But, oops, another little problem raised by Parker:

The threat to unit cohesion should require no elaboration. But let’s leave that obvious point to pedants and cross into enemy territory where somebody’s 18-year-old daughter has been captured. No one wants to imagine a son in these circumstances either, obviously, but women face special tortures. And, no, the rape of men has never held comparable appeal.

Yup, women need protection again. Back to the drawing board. Women have special needs – their femininity disqualifies them from service. Please, spare me the anguish. Rape is rape – as humiliatingly brutal, degrading, and dehumanizing for a man as for a woman. And while we’re at it, waterboarding, genital electroshock, psychological torture, and getting prodded, reamed, sliced, and beaten are no fun for either men or women.

My conclusion? If women want to get chewed up by raking enemy fire, blown apart by IEDs, mutilated, maimed, traumatized, and debilitated; raped, abused, and tortured, go right ahead.  Not an option for every woman, certainly, but by all means let there be this golden opportunity for those tough women who can be called soldiers.

Friday, January 25, 2013

Meritocracy vs. Equality

David Brooks writes (NY Times 1.25.13) that the Obama Administration’s focus on income inequality and the redistributive policies to reduce it are misplaced.  There is a difference in American society more fundamental, says Brooks, than income – the growth of a meritocracy which benefits the few and leaves the many behind.  This is not a bad thing, says Brooks.  After all, high achievement, ambition, and motivation are at the very heart of the American psyche and responsible for our remarkable innovation and productivity.

The first problem with the effort [to promote social and economic equality] is that it’s like shooting a water gun into a waterfall. The Obama measures, earned after a great deal of political pain, simply aren’t significant enough to counteract the underlying trends.

The second problem is the focus on income redistribution. Recently, there’s been far more talk about tax increases than any other subject. But the income disparities are a downstream effect of the human capital and geographic disparities. Pumping a few dollars into San Joaquin, Calif., where 2.9 percent of the residents have bachelor’s degrees and 20.6 percent have high school degrees, may ease suffering, but it won’t alter the dynamic.

No matter what government does, those who can escape the small towns and cities of America for the nodes of higher achievement – Washington, Boston, San Francisco, San Jose – do so.  In these nodal cities, over 50 percent of residents have college degrees, and are employed in upwardly mobile, demanding intellectual jobs.  Not only that, but at the top levels of employment in these cities, many executives, managers, and professors have obtained degrees from America’s top universities:

Robert Oprisko of Butler University found that half of the jobs in university political science programs went to graduates of the top 11 schools. That is to say, if you have a Ph.D. from Harvard, Stanford, Princeton and so on, your odds of getting a job are very good. If you earned your degree from one of the other 100 degree-granting universities, your odds are not. These other 100 schools don’t even want to hire the sort of graduates they themselves produce. They want the elite credential.

What Brooks is suggesting is that no injection of federal money, either directly or indirectly, can transform Flint, Michigan; New Britain, Connecticut; or Vineland, New Jersey into dynamic, fluid, intellectually exciting places.  The brain drain will continue, leaving these cities even worse off.

However, Brooks’ observations only confirm what has always been true in America – upward social mobility and economic competition. For generations people have fled narrow, stultifying existences in Small Town, America for the bright lights of the city, and have understood that a premier education is the best and most proven passport for entry into this rarefied world. American capitalism has always been a system of inequality because America is a country devoted to equality of opportunity, not equality of outcomes. Benefits for those left behind are few and far between because we laud and reward those who rise and prosper. Because intelligence, talent, drive and ambition, creativity, and insight will always be concentrated in the very few, it is no surprise that they leave insular lives for companionship with like-minded people.

The only real issue, not raised by Brooks, is whether or not concerted efforts to help individuals rise into the meritocracy can prove beneficial. That is, rather than continuing the generalized, scattershot, and minimal redistribution of income permitted within current American political philosophy, investment should be made in reconfiguring K-12 education to foster, nurture, and encourage the talented (in contrast to today’s policy of favoring the disadvantaged).

Our system of higher education is like a giant vacuum cleaner that sucks up some of the smartest people from across the country and concentrates them in a few privileged places.

However, it is more often than not a failed system, charging exorbitant fees and providing very little.  Those students who do manage to graduate do so in debt and with few prospects of good employment and little of the civic education necessary to make them responsible citizens.

Poor education is but part of the problem which consigns many Americans to socio-economic levels below their potential. Not only must education be reformed, but public funding of social programs which perpetuate dependency, the status quo, and inertia must be discontinued, and families forced into the mainstream of competitive, meritocratic America.

It is not clear how much this investment and radical transformation of local communities would enable more people to rise to the top. It may help them climb a few rungs on the socio-economic ladder, but it is unlikely to swell the ranks of the meritocracy. These super-Americans will always find a way. One need only look at the most prominent Americans – our Presidents – to see how their drive and ambition were unstoppable. Clinton, Reagan, Nixon, and Obama were not born of privilege, but with ability, vision, and strength. Few highly successful Americans these days have been born with a silver spoon in their mouths.

American meritocracy is a good thing, and should continue to be encouraged. There should be no leveling of the playing field which restricts or inhibits rise and accession to it.  There will always be a social bell curve with the talented and unique at one end, the less well-endowed and disadvantaged at the other, and the great middle either fat and satisfied or striving to move up and out.

I once wondered if there were meritocracies in primitive tribes.  If there was such a thing as a human bell curve, then where was it amongst the Jivaro? Where could one see power, influence, intelligence, and cultural leadership?  In the priests, witch doctors, and shamans; and in the bravest, most cunning, and most resourceful hunters and warriors.  Even the most primitive, marginal human populations act according to the same laws of human nature.

Therefore Brooks’ observations are interesting, but nothing new.

Thursday, January 24, 2013

Personal Branding–Who Are You, Really?

Everything, and now everybody, gets branded. Sarah Banet-Weiser, writing in the Wall Street Journal (1.24.13), reports that in the age of social media, just being you is not enough:

Increasingly, students have come [to me] seeking a different sort of advice, asking: how should I build my personal profile, or what do I need for a successful self-brand? Should I have a personal website, or a strong online presence? Questions that used to be framed as “Who am I, and who should I become?” are more and more “How should I sell myself?”

These questions emerge not because students are more superficial, but because they often feel compelled to compete in a marketplace that requires them to manage themselves as commodities.

This trend is not as alarming or dehumanizing as it might seem.  Social media have given us a chance to express ourselves more completely than ever before.  In seconds we can post pictures, opinions, articles, commentary and cute quotes. Thousands of people get to know us in more diverse ways.  The more we are willing to share, the more complete the picture.  It is only natural that we have begun to think more about how to manage that often scattershot image – how to trim, focus, concentrate, and distill it so that when others see our name, an immediate composite image comes to mind.  If you are sharing baby pictures, recipes, and Tea Party screeds, the image is garbled, messy, and imprecise; but if you restrict your posts to one particular aspect of your life and relate all information to it, then a more focused and clearer image emerges. You have branded yourself.

The brand does not have to be static – e.g., Baby Mom or Tea Party Crank – but can be more dynamic.  A commentator such as David Brooks, for example, does not fall into any particular category, but followers know that when they read him they will usually get an insightful look into current social, political, and intellectual trends.  Christopher Hitchens, similarly, was known as a sharp-tongued, brilliant iconoclast; and although you never knew whom or what he was going to pillory, expose, or deride, reading his pieces was always entertaining and challenging.

Brands and branding strategies have expanded beyond selling actual products or companies. We are witnessing the expansion of brand language and logic to our personal selves, where individuals often feel obliged to not only construct, but to understand themselves as brands. Branding, at the end of the day, is about selling and marketing things.  When we create a self-brand, we embark on a process that packages, designs, and markets us—human beings—just like other products and commodities.

The unasked question in this article is “Why do we do Facebook and Twitter?” Within the catch-all category of ‘social motivation’ there are many subheadings.  There are those who simply want to keep up with hundreds of virtual friends instead of a few real ones. The virtual network can expand geometrically and the social opportunities within it are almost limitless.  At any time, you can stop the virtual wheel, and get off into a real person’s bed.

For others, it is a way to connect to an exponential world of ideas.  It is one thing to have an opinion, another to have articulated it, and yet another to have shared it; but the real reward is to get responses to it, attract commentary, review, and criticism and by so doing, learn and appreciate more than ever.

I was delighted when I discovered Twitter.  I had always thought it was a teeny-bopper toy for sharing girly stories, but found it could be a highly sophisticated device which could put me in touch with hundreds of other writers instantaneously. As I scroll down the day’s Tweets, I see many which touch on topics that interest me.  I reply, post articles from my blog, and wait for others to comment.  The hits on my blog have increased tenfold; and if I expanded my Following list beyond the current 50 or so journals and writers, the increase would be even greater.

I am concerned about my ‘brand’ because I want to facilitate communication between me, other writers, and their followers.  The more recognizable I become, the more likely readers will click on my posts and reply; and the more the intellectual circle is energized.

Producing one’s visibility through technological and digital tools, across multiple media platforms, is a primary way to create a saleable image of oneself, one that can circulate in a competitive marketplace.  These new, social media enhanced, visible selves, are no longer just versions of who we are, but they have become the essence of who we are. Increasingly, individuals are producing a self-brand not to promote one’s work or accomplishments, but to promote their very presence as a commodity in a market society.

The author, however, goes on to lament the passing of an older, more ‘real’ version of individuality:

Those various reasons one might garner public attention based on what one does, rather than how one is packaged, now seem almost antiquated, old-fashioned ideas in the contemporary economic context. I worry about what this means for a new generation of students and scholars who have been taught to value domain names and retweet campaigns over books and articles and ideas, perception and hype over content and substance, when what we promote becomes the very act of promotion itself.

This is where I disagree.  Why does the exercise I have described above in any way mean that I have valued perception and hype over content and substance?  Just the opposite.  Since I have been plugged in to social media, I have never been more intellectually active.  The immediate, critical audience that these media provide has been stimulating, energizing, and motivating.  I look forward to writing every day – articles I hope have substance and insight – largely because I like throwing myself into this large, unanticipated, diverse, and largely unknown reader pool.  What I promote most definitely does not become the very act of promotion itself, as Banet-Weiser suggests.

That said, branding can get out of hand.  I used to work for a private consulting firm which provided technical advice to developing countries.  The company did not have a particular area of expertise, but easily found short-term talent on the open market to satisfy contractual requirements.  It was not known for its leadership in health, agriculture, education, or democratic reform, but for coming in on time, under budget, and meeting stated objectives.  It bid on everything that USAID offered, won one-third of all bids, and made a tidy profit. Despite this success, the CEO was not entirely happy.  If we branded ourselves, he said, we would be even more saleable and even more favorably viewed within the contracting agency.

Questionnaires were sent out to us, asking for our opinions on what the company meant to us; and what we thought of its mission, purpose, and overarching goals.  The CEO expected, I am sure, glowing comments about saving lives, mitigating misery, reducing poverty, and making the world a better place. The employees, however, uniformly stated the obvious – the company was in the business of making money, promiscuously went after every contract on the street, and had no mission whatsoever other than enriching the stockholders.

Another company I know – this time a non-profit company working in the same field of international development – spent thousands to hire a consulting firm to rebrand it.  The company name reflected its early origins – a small enterprise working in world education – but given its expansion and diversified portfolio, the CEO wanted to project a new brand image.  He went through the same exercise – a questionnaire soliciting opinions about mission, principles, and goals was circulated, results tabulated and analyzed, and a report issued. Not surprisingly, the employees uniformly said that the company only cared about winning.  It bid on just about everything, hired day-labor to fill technical slots, and instead of doling out shareholder dividends, bought a lot of artwork and built fancy facilities. 

It was a shock for both these companies to realize that they had no distinctive, attractive, sexy brand image.  They were money-makers.  They had no consumer base, only USAID bureaucrats to please; and because government regulations forbade any contact with them, contracts were awarded based only on the quality of proposals submitted.  They were contract mills, and they wasted their money trying to create an image of something they were not and would never be.

A small university in the South where I have been a teacher is currently going through a branding exercise; but in this case it is needed and very worthwhile.  The university has had to compete with two State behemoths which have attracted public attention, alumni contributions, and significant government funding.  As state financial resources have dwindled because of the recession, this small university has been in danger of being marginalized or worse.  It needed to show state bureaucrats and decision-makers that it filled an important niche between the two large universities; but it was not clear to anyone what exactly that niche was.  The branding exercise forced it to focus on all its attributes, to identify which of those were lacking in competitor universities, and to position itself as a unique alternative.  The process has not yet been completed, but it should be productive.

The point is that in an increasingly competitive world, branding is not only essential but rewarding.  It in no way demeans the individual or institution, for they are looking to enhance themselves, not just their image.  While some companies arrogantly and foolishly assume that they can become something they are not and never will be simply by rebranding, many more understand today’s marketplace and see the need to reconfigure, retool, and refocus scarce resources to succeed.

Such rebranding, refocusing, and consolidating pays equal rewards for individuals who want to increase their personal value.  Some, like me, want only to be part of a larger and more demanding intellectual world; others want to present a more complete and complex personality and by so doing enhance their chances for partnership. 

There are few downsides to this natural and logical outgrowth of a competitive, market society, and Ms. Banet-Weiser need not be so concerned.

Wednesday, January 23, 2013

Instrumental Lying

The BBC (1.23.13) reports on ‘instrumental lying’ – tactical untruths to get children to behave:

The study, published in the International Journal of Psychology, examined the use of "instrumental lying" - and found that such tactically-deployed falsehoods were used by an overwhelming majority of parents in both the United States and China - based on interviews with about 200 families.

The most frequent example was parents threatening to leave children alone in public unless they behaved. Persuasion ranged from invoking the support of the tooth fairy to telling children they would go blind unless they ate particular vegetables.

An interesting concept, instrumental lying, and a welcome addition to the many other kinds of lying which are usually used to cover up a misdeed – e.g., Mark Sanford saying that he was hiking the Appalachian Trail when he was actually visiting his Argentine firecracker; or John Edwards denying that he had a love child and then compounding the lie by asking his underling to take the fall.  Since there is something purposeful in instrumental lying – i.e., the education of one’s children – it doesn’t seem wrong at all; and many people would not even consider putting it into the same moral bin as willful deceit.  Besides, young children have not yet matured enough to distinguish fantasy from reality, so the whole concept of a lie has not yet acquired salience or value.  They believe in Santa Claus and the Easter Bunny, after all, so why not invoke Pinocchio and tell them that if they lie, their noses will grow long and pointy just like his?

Many parents use Santa Claus to achieve their behavioral objectives, albeit short-term.  They recite the key lines from the popular Christmas song:

“He knows when you are sleeping.  He knows when you are awake.  He knows when you’ve been bad or good, so be good for goodness sake.”

Parents need all the help they can get, so who can blame them for a little deceit to keep their little ankle-biters in line for a few weeks?  I have never heard of any parent using the Easter Bunny as hall monitor, but few children would want to risk the thought of no brightly-colored Easter eggs left in the hedges for them to find.

In our family, Santa Claus and the Easter Bunny were kept as benign, generous, and miraculous figures.  Why deliberately sully the magic of a jolly, red-suited Santa and his sleigh of reindeer coursing through the night sky, coming down the chimney and leaving presents? That would be the real sin, not perpetuating the fantasy lie.

As a child I remember only one instrumental lie spread by my parents – coffee will stunt your growth – but there were plenty of half-truths said less out of conviction than out of old wives’ tales.  “Don’t go outside with a wet head or you will catch pneumonia”, for example, or “Don’t sit in drafts because they will give you sciatica”. Too much ice cream would give me constipation, and not enough carrots would ruin my eyesight.  There were too many lies spread about masturbation to say who actually told them, my parents especially.  Jerking off gave you warts, made you blind, gave you the palsy, addled the brain, and weakened your resolve.

I heard the best indirect injunction against masturbation in India. Loosely translated, the warning went: “You have only so many shots in your magazine, so use them wisely”.

Apparently parents resort to very twisted lies to control their children:

The most frequent example was parents threatening to leave children alone in public unless they behaved.

There were "untrue statements related to misbehavior", which included: ''If you don't behave, I will call the police," and: "If you don't quiet down and start behaving, the lady over there will be angry with you.''

Under the category of "Untrue statements related to leaving or staying" a parent was recorded as saying: "If you don't follow me, a kidnapper will come to kidnap you while I'm gone." (BBC)

I once took my young son to see the film Pinocchio. There is a scene in which Pinocchio is shanghaied by Stromboli to be a puppet in a circus and is carted off in a wagon.  The animation shows Pinocchio looking out the back, staring into a dark rain, sad and distraught because he was being taken from his beloved father, Geppetto. My son started to cry, and as many times as I had seen and loved this great Disney classic as an adult, I had never realized how painful this scene of loss must be to a child. Only when I saw Pinocchio’s journey through a child’s eyes, and understood how saddening and terrifying his death to save his father must be, did I realize what the fear of separation and loss means to a child.

Lies about kidnapping, police abduction, and parental abandonment are cruel and indefensible.

Under the category of "Untrue statements related to leaving or staying" a parent was recorded as saying: "If you don't follow me, a kidnapper will come to kidnap you while I'm gone."

There were also lies motivated by protecting a child's feelings - labeled as "Untrue statements related to positive feelings."

This included the optimistic: "Your pet went to live on your uncle's farm where he will have more space to run around." (BBC)

One summer, when we vacationed in Italy and rented an old farmhouse in Tuscany, we also rented a rabbit as a pet for our children, an animal raised along with chickens and ducks by a local farmer.  When it was time to go back to the United States and return the rabbit, my son asked what would happen to Buns.  “Mr. Vanni will give him a good home”, I lied.  “He will be very happy with all his bunny friends”.

My daughter who was two years older gave me a quizzical look.  She had seen the skinned rabbits lined up in the butcher shop with only little tufts of hair left around their paws, and knew exactly what was going to happen to Buns.  I changed the subject before she could expose my lie.

I used to tease my children all the time and make up implausible but possible stories.  I told myself that these untruths were educational tools to help them become discerning, analytical adults; but my children saw it differently.  One summer we were on an interminable road trip, and a large RV passed us slowly.  It was enormous, bus-long, with railings on the roof.  “What’s up there, Daddy?”, asked my daughter. 

“A swimming pool”, I said.  “These big campers have heated pools on their roofs so that children can have fun and not be bored on long trips”. 

Many weeks later, when we were on an upper level of an outdoor parking lot, my daughter looked down on a big RV, similar to the one that had passed us on the road.  There was of course no swimming pool on the roof, just junk – extra bicycles, beach chairs, and umbrellas.  “You lied to us”, said my daughter angrily.  I replied that I had not lied, but had just made up the story.  “No”, she insisted.  “You lied!”.

In Lying, Sissela Bok explored the moral and ethical dimensions of the practice and argued that, although common, it is rarely if ever justified.  She quotes Charles Fried:

A good man does not lie. It is this intuition which brings lying so naturally within the domain of things categorically wrong. Yet many lies do little if any harm, and some lies do real good. How are we to account for this stringent judgment on lying, particularly in face of the possible trivial, if not positively beneficial, consequences of lying? (Right and Wrong, 1978)

Robert Fullinwider (www.infed.org) summarizes Bok’s “Theory of Veracity”, a very strong moral presumption against lying:

What, she asks you, would it be like to live in a world in which truth-telling was not the common practice? In such a world, you could never trust anything you were told or anything you read. You would have to find out everything for yourself, first-hand. You would have to invest enormous amounts of your time to find out the simplest matters. In fact, you probably couldn't even find out the simplest matters: in a world without trust, you could never acquire the education you need to find out anything for yourself, since such an education depends upon your taking the word of what you read in your lesson books. A moment's reflection of this sort, says Bok, makes it crystal clear that you benefit enormously by living in a world in which a great deal of trust exists – a world in which the practice of truth-telling is widespread. All the important things you want to do in life are made possible by pervasive trust.

There is but one definition of a lie:

What is a lie? A lie is a statement, believed by the liar to be false, made to another person with the intention that the person be deceived by the statement. This is the definition used by Sissela Bok and it has antecedents as far back as St. Augustine.

However, there are so many shades of lying (white lies, tales of fantasy, half-truths, minor deceptions) and so many compelling cases for benign lies (withholding a diagnosis of terminal cancer, “Mommy will be back soon”, or “The weather will probably clear”) that most people lie without even thinking twice about it.  Yet, as Bok observes, such pervasive lying is corrosive, and ultimately destructive.

Parents should indeed give the question of lying a second thought, or at least distinguish between malicious deception and harmless, teasing fictions.  No one has even tried to show a correlation between ‘instrumental lying’ to children and their eventual veracity; but that misses the point. Lying can hurt children.  It sets a bad example – that lying is an acceptable way of imposing one’s will; and, as Bok suggests, it contributes to a world in which the child’s first instinct is to wonder what is true.

The study raises the longer-term issue of the impact on families of such opportunistic approaches to the truth. It suggests it could influence family relationships as children get older.

Researchers concluded that this raises "important moral questions for parents about when, if ever, parental lying is justified". (BBC)

Tuesday, January 22, 2013

Volunteerism–Myth And Reality

America is a country of volunteers.  Almost 30 percent of us volunteered for a US non-profit agency in 2011 (The Business Journals 3.12). I was one of these legions, albeit years ago.  At prep school I tutored slower students from the town in French and Latin; and at Yale I was a member of Dwight Hall, a social service agency which ran a variety of programs in New Haven.  I told stories to second graders in poor, black neighborhoods and tutored math students in area high schools.  In Bombay, I volunteered to teach basketball to kids in a nearby private school.

In all this I don’t remember having what one corporate interviewer described as a ‘social service motivation’ – i.e., doing good – but got involved in these programs because I enjoyed being around children. The ivory towers of the Ivy League and its feeders were very confining, and spending time with ‘real’ people in ‘real’ communities gave me the contact with the more intimate and personal life that I missed. 

The schools in New Haven were in the ghetto,  and the children had unbelievable stories to tell during Show-and-Tell.   “My Mommy took a bottle upside my Daddy’s head and he didn’t get up for a long time”, said one.  “My Mommy turned on the gas and when my Daddy got close to the stove, she lit it and burnt his butt”, said another. It was sad, but funny.

I was reading French literature of the mid-19th century and had come across la nostalgie de la boue – the longing to return to the dirt, grime, and bare reality of the world after a life of poetry and fine arts – and my sophomoric fantasy was to experience a reality that I would never have to live. Perhaps more to the point, the children were goofy, uninhibited, silly, and giddy – far from the stuffed taxidermy of Imbrie Buffum or Maynard Mack, my college professors.

After Yale I worked at St. _______ School in Westchester County, a home/school for potentially delinquent children who had been referred by the Juvenile Justice Department of New York City.  These children lived at St. _____ and, because they were not socialized enough to attend public school, were taught on campus.  Every summer, college students like me lived at the school and organized games, outings, and events. This was in the Sixties, in the first days of civil rights, so no one in the administration of the school looked askance at five white Ivy League kids shepherding a hundred black ones.  Like Dwight Hall at Yale, St. _____ gave me a chance to see what life was like beyond the wall and to enjoy the silliness of young children.  A lot of the spontaneity and joy had been beaten out of them by a shitty home life in the slums, but they were still children.

In short, my nostalgie de la boue – my desire for a slice of life – was satisfied. My cruise along the surface of ghetto, slum, and dysfunction gave me perspective without pain.  I was neither transformed nor proud.  I simply liked what I did.

The children with whom I worked were, I think, also satisfied.  They could tell that I liked them, and liked being with them; and despite initial suspicions of the white boy from Connecticut, gave in and had a good time.  In short, my volunteerism was an expression of very precise and particular personal interests, none of which had anything to do with ‘helping’ or ‘caring’; but it had the result the organizers wished for.  The kids were happy and learned their French, math, and Latin. It was a perfect balance.

Which brings me to the enforced volunteerism of my own children’s day.  At neither private school which they attended was there any real sense of the purpose of volunteering.  Was it for the benefit of the students, i.e. to teach them empathy, solicitude, and charity? Was it for the benefit of the slum communities in which the students were to work? Was ‘real’ volunteering only working with black, poor communities; or was teaching a drawing class in an after-school program for wealthy white kids OK?  What was the definition of community?  Was it someone else’s community, such as the neighborhoods of Southeast Washington where my children were bussed to pick up someone else’s trash, needles, and used condoms? Or was it their community – the libraries, hospitals, and convalescent homes of well-to-do Washington?

Because of this lack of clarity of purpose and the enforced, obligatory nature of the volunteer program, most students emerged from the experience more hardened than ever in their opinions about those less fortunate.  I offered to go with the students of my son’s school to ‘help the community’ clean up the area around a church located in a poor area of Northwest DC.  However, there were no members of the congregation there to pitch in, so the little white kids picked up cigarette butts, candy wrappers, soda cans, dreck and shit that church-goers had stepped over.  What kind of a message was that?

The next year, my son decided that he would volunteer for something more suited to his interests and abilities and taught advanced drawing in an after-school program at a private art and dance school.  He had found the school, elicited an interest from the director, and proposed the work to the head of the volunteer program at his school.  “No”, said the Director.  “That is not real volunteering”; by which he meant that the only way to fulfill your school and moral obligation was to suffer in the course of doing good.  Needless to say, my son never volunteered again.

Many of the 30 percent who volunteer do so for their church or some other familiar, close-in social group, and relatively few go out of bounds; but in staying close to home they represent the best of volunteerism.  My mother’s favorite adage was “Charity begins at home”, and in these less Depression-straitened times it can be expanded to include one’s own crowd.  My neighbor, for example, always picks up the paper for our next-door neighbor who is old and infirm and is the first to shovel his driveway when it snows.  She always refused payment on behalf of her son for chores he did for us.  “He’s your neighbor”, she said.

For many years my wife volunteered at the Vassar Book Sale in Washington, a grand affair filling the halls of the Building Museum with used books collected by alumnae, women who also sorted, selected, and worked the register.  The book sale always generated revenue for the school, which in turn used it for scholarships.  The ladies volunteered because of allegiance to Vassar, an old WASP redoubt of the ‘female Ivy League’ in the 30s and 40s when they went there.  Vassar didn’t really need the relatively small amount of money generated; the sale was more for the ladies, who did something for their alma mater and got to schmooze with Washingtonians not unlike them who loved books.

The point is, volunteerism need not – should not – be a piece of someone’s social service agenda.  My mother’s other adage was “God helps those who help themselves”; and tough love – the withdrawal of social service programs, well-meaning volunteers, and social reformers – will do more to incite people to change their own lives than any external assistance.

Individualism, if interpreted as the acceptance of personal responsibility, is a good thing; and it is the only way for dysfunctional communities – the ones getting volunteer help alongside public assistance – to give up the culture of entitlement and move on, up, and out.  Volunteerism can prolong the misery of dysfunction and delay the onset of internal, homegrown activism.

Obama’s calls for diluting individualism with a dose of collectivism are somewhat discordant.  The world has changed since the days of his volunteer activism and community organizing in Chicago.  Online petitions clog the web and produce results, as do online contributions to organizations promoting environmentalism, elder care, or education.  Tens of millions of dollars were contributed by American conservatives to defeat Obama in an almost successful attempt to extirpate the enemy – volunteerism at its most active and effective.  Not your grandfather’s volunteerism to be sure, but volunteerism nonetheless.

In fact, this social networking, online activism is paired with market forces.  Marketing experts, armed with big data and sophisticated algorithms, group atomized individuals virtually and create large volunteer groups.  Viral commercial marketing does the same thing, turning consumers into promoters.  When you ‘Like’ a product, place, or service on Facebook, you have volunteered your services to the companies behind them.

So, both Obama’s Inaugural and the now institutionalized MLK Day of Community Service are quaint, old-fashioned, well-meaning, but way off-target in this dynamic, changing, electric world of 2013.

Obama And Collectivism

In his Inaugural address, President Obama set forth a liberal vision for America; and although he punctuated his speech with words of praise for the individualism that has always been at the heart of our democracy, economy, and society, his plea was to shed much of the aggressive self-interest that has characterized America in the last decades and to turn to collective action.  Not only is no man an island, suggested Obama, but only through concerted, cooperative action can men change the world for the better.

The President, freed from the constraints of a first term in which he had to consolidate his power, expand his base, negotiate carefully with Congress, and secure the support of both corporate America and Wall Street, is now who everyone knew he was from the beginning – a Chicago community organizer.

“But we have always understood that when times change,” Obama said, “so must we; that fidelity to our founding principles requires new responses to new challenges; that preserving our individual freedoms ultimately requires collective action. For the American people can no more meet the demands of today's world by acting alone than American soldiers could have met the forces of fascism or communism with muskets and militias. No single person can train all the math and science teachers we'll need to equip our children for the future, or build the roads and networks and research labs that will bring new jobs and businesses to our shores. Now, more than ever, we must do these things together, as one nation, and one people.”

He went on to say that collective action is especially required in the fight against Global Warming and for the full inclusion of gay men and women and their acceptance by the majority.  These issues are too important and the challenges too great for any individual, or even loosely affiliated individuals, to make a difference.  Strong, motivated, ambitious collective action is what is required.

He then went on to focus on the individual per se – on the moral and Christian imperative to help others.  While, in this carefully crafted speech, he did not characterize the American populace as increasingly self-centered, venal, and indifferent, it was not hard to read between the lines.  The safety net is more than just a government construct, a well-designed catch-all for those who did not succeed in America’s competitive society; it represents social cohesion, the goodness of Americans.

To emphasize the fact that this idea is very American, he cited the words of the Founding Fathers – Life, Liberty, and the Pursuit of Happiness – and suggested that we all need to reexamine this phrase and reflect on its continuing relevance.  The Founding Fathers, Obama suggested, believed that our nation was to be built on a good life, one based on liberty and freedom, but one in which individual enterprise was carried out for the sake of the common good. While not specifically citing Thomas Jefferson, he must have been thinking of him.  Jefferson, a child of The Enlightenment influenced by Locke and Rousseau, contended that the moral duty and obligation of the individual was to work for the greater good:

The Founding Fathers were very much concerned with respecting the rights of the individual but also with the fostering of civic community which would offer protection and benefits.  They sought to achieve a balance between the inviolable rights of the individual to pursue his spiritual needs and to work freely and without encumbrance; and the importance of community to support these and other goals.  When Jefferson wrote about ‘the pursuit of happiness’ in the Declaration of Independence, he was influenced by the philosophy of John Locke – one did not pursue happiness for personal pleasure or satisfaction, but for the well-being of society, the aggregation of individuals.

Jean-Jacques Rousseau certainly influenced Jefferson as well.  As Anne Deneys-Tunney noted in the Guardian (7.15.12), writing about Rousseau’s Social Contract, the philosophies of the two thinkers coincided on the precedence of societal values over the more personal interests of the individual:

As a revolutionary thinker, Rousseau understood that the general will, or the will of the people, should be sovereign – and that is the catch. It is here where we regain our freedom inside social organization. Only the general will – general interest as opposed to private interest – guarantees man his autonomy. No society can be free unless individuals understand that the general will or general interest should prevail over their own individual one. (http://www.uncleguidosfacts.com/2012/07/rousseauthe-social-contract-and.html)

As indirectly and carefully as Obama said it, the observation that America had fallen off those sanctified rails of early democracy was evident; and the call to regain the course, while gentle, was urgent.

This is all well and good. America at times does seem self-absorbed, overly individualistic, greedy, and ignorant of any moral principles of right behavior.  Wall Street greed, the depredations of corporate America, the tossing of the weak and infirm into the gutter in a sink-or-swim conservative tide, are all clear examples – or so say ‘progressive’ commentators.  Yet capitalism has always been a system with exaggerated edges, highs and lows.  In our country there are the super-rich and the desperately poor.  They have always existed and they always will.  Experiments with Socialism and Communism – attempts to level the field and to ensure that there were no such ends of the bell curve, only one happy middle – have failed, and we are left with our own imperfect but dynamic system.

Somehow we have managed to survive the crises and dislocations provoked by capitalism.  We survived not only The Great Depression but the serious recessions of ‘37, ‘45, ‘48, ‘53, ‘57 and many more up to and including the most recent one from which we are just now emerging.  Government legislation helped to curb the excesses of the Robber Barons and the worst of laissez-faire capitalism, to protect workers, and to provide jobs for the unemployed.  Because the laws were not so restrictive as to hobble private enterprise, the economy thrived in the interim, and in fact was dynamic.  Obama was right to cite the important regulatory role of government; but he did not provide the proper context for the inevitable business cycles, the enviable business productivity, and the remarkable, persistent, and exceptional enterprise of the individual.

The world is a dramatically different place than it was in the halcyon days of liberalism when government acted almost in loco parentis. American economic exuberance is now almost beyond public constraint and a new paradigm is emerging.  First, market forces are more powerful than ever. As government, urged on by environmentalists, made questionable investments in alternative energy, the oil and gas industry made radical structural changes on its own.  Weary of spending billions on the defense of oil fields in unstable Third World countries and subject to the capriciousness of corrupt dictatorships, it invested in North America.  While the search for and exploitation of new energy sources (oil shale, fracking, oil sands) involved significant capital investment, it has paid off in a big way.  Not only is America on the way to energy self-sufficiency, but we are far less dependent on foreign supply.  As importantly, as cheap gas replaces relatively expensive coal, the environment benefits. Finally, cheap energy is attracting manufacturing jobs back to America.  In short, the depredations of tin-pot dictators in Africa started a long chain which has ended in great benefits to the US, all through the private market.

American small business, especially in the IT sector, is as robust as it has ever been; and garage-educated twenty-somethings are now multi-billionaires.  The law reins in the most rapacious competitors – even the august Apple, Google, and Microsoft have felt the long arm of the law – but the private sector is the engine of American progress.  It is no wonder that the US is emerging from a long recession more quickly than the statist economies of Western Europe.

Obama’s suggestion that collectivism needs renewal is not correct.  Lobby groups are the perfect example of an energized electorate.  They are not just legions of suits promoting big corporate interests, but legions of grey-hairs organized by AARP to demand their ‘right’ to Social Security and Medicare.  Obama suggests that more collective action – more moral commitment to each other – is called for; but in reality the AARP is already one powerful organization.  Medicare and Social Security will outlast Congressional reformers because of the over-50s.

Secondly, even in our increasingly atomized society, individuals are collectively organized through the Internet, especially social media.  When Netflix arbitrarily split streaming and DVD rental, thus collecting more revenues, the public outcry – via the Internet – was deafening and Netflix rolled back its new charges.  The government Consumer Protection Agency is nothing compared to price- and product-sensitive consumers, newly-savvy thanks to their access to mediated information.

Obama voices concerns over jobs and the unemployed, and somehow suggests collective action here as well.  Yet the restructuring of American industry, while temporarily causing dislocations of labor, is revitalizing and upgrading it.  The final elimination of manufacturing unions has enabled businesses to reduce unnecessary costs, allowed for a freer flow of labor and capital, and produced a more internationally competitive industry. Online retail, with its inbuilt consumer feedback, is more responsive to individuals.  Big data and sophisticated algorithms enable marketers to understand consumers far better than they ever have.  Hotel chains like Marriott and Hilton are applying new ‘sentiment’ software to pick up on critical observations made by unhappy lodgers on Facebook and Twitter, and thus this ‘collectivity’ of individual consumers effects change.

Americans, like all people, are well aware that collective action is sometimes required to effect change.  If enough of us in the neighborhood are sick and tired of indifferent street repair by the DC government, then we can easily organize a campaign to deluge our Council Member with complaints.  There is no longer a problem of dog shit because neighbors one-by-one and then collectively changed the prevailing social norm.  We besiege school administrators with demands thanks to collective action in PTAs.  The list is endless.  When we see the need, we act; and in that collective action we are no different from primitive tribes who band together because one bow-and-arrow is not enough to bring down the beast.

The fact is that Obama has selected his own particular liberal agenda and called for more concerted collective action to achieve the ends he has in mind.  Tens of millions of people contributed money to political campaigns in a collective effort to be rid of him.

David Brooks, writing in the New York Times (1.22.13), expresses a similar view and says it well.  America does not need a dose of old liberalism, government sympathy and patronization, but just the opposite:

I also think Obama misunderstands this moment. The Progressive Era, New Deal and Great Society laws were enacted when America was still a young and growing nation. They were enacted in a nation that was vibrant, raw, underinstitutionalized and needed taming.

We are no longer that nation. We are now a mature nation with an aging population. Far from being underinstitutionalized, we are bogged down with a bloated political system, a tangled tax code, a byzantine legal code and a crushing debt.

The task of reinvigorating a mature nation is fundamentally different than the task of civilizing a young and boisterous one. It does require some collective action: investing in human capital. But, in other areas, it also involves stripping away — streamlining the special interest sinecures that have built up over the years and liberating private daring.