"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Saturday, August 31, 2013

Hot Cars

The idea of buying a hot muscle car had always been unimaginable. My parents had big, finny Cadillacs, but all the sconces, parlors, bella figura, and El Dorados of goomba-land had been squeezed out of me by the time I was 12.  Muirland Country Day, a New England boarding school, and Yale had expunged all traces of guinea, so my first car was a banged-up Volkswagen Beetle and my second a ‘63 Valiant with 100K.

Why I bought a ‘65 GTO is still beyond me.  I not only had beaters before the GTO, I have always had them afterwards.  Three oil-leaking, overheating, splashy-steering Buick Skylarks, third-hand rejects from my sister; a 1976 Volare with mildew, mushrooms, and a gat-toothed grille; a 1964 Falcon with no rear seat and a dodgy transmission; and a 1986 Buick Electra wagon, the biggest car on the road in an era before SUVs, a stone lemon with three recalls, bad brakes, and no springs.  It must have been Laura DiMarco, a flamboyant, eccentric, Isadora Duncan exhibitionist, who promised me the sensual delights of the Arabian Nights if I only gave up my Dodge.

So I bought the GTO.  It was not just Detroit stock, but gussied up, tricked out, and given a racing tune by Joe D’Alessio, a no-show favorite of the Newark North Ward and stock car mechanic.  It was a hardtop coupé: 389 cu in. V8, 325 BHP with a single Carter AFB four-barrel carburetor and dual exhaust, chromed valve covers and air cleaner, 7-blade clutch fan, a floor-shifted three-speed manual transmission with Hurst shifter, stiffer springs, larger-diameter front sway bar, wider wheels with 7.50 × 14 redline tires, hood scoops, and GTO badges.

It was so hot that it was stolen within six months of delivery.  By that time I had realized that on my small starter salary the car payments were way beyond my means, and I quickly settled with the insurance company and went back to hoopties, beaters, wrecks, and plain girlfriends.  My last car before leaving the United States was a ‘61 Dart whose tires were so bald that I got flats every other week.

I think I got the taste for muscle cars when I was still in high school.  Richard Peterson had a ‘62 Corvette, and we would race for pink slips on the unopened 4-lane stretch of CT 71.  Richie’s ‘Vette was fast, but if the red ‘61 hadn’t missed a shift, my racing days would have been over.

Even before I could drive, I had a thing for fast cars. Sammy Bridge and I would listen to vinyl LPs of the racing cars of the 50s.  “The Aston Martins”, the sonorous British voice would intone, followed by the throaty rumble of the engine.  “The Porsches”, the voice continued, and then the unmistakably smooth, deep, angry growl of the Carrera filled the room. There was a pause on the record, and after a second or two the announcer said, “And… the Ferraris…”.  The scream of the monster V-12, high-pitched, revving to impossible RPMs, a powerful, inimitable climax of sound followed.  Oh, to drive one of those cars.

One weekend I drove up to Smith College with Sammy for a date. As we walked from the car to the dorm, we could hear the deep-throated, unmistakable throbbing engine exhaust of a big Porsche; and as we approached the quadrangle we could see this monster Carrera doing turns in first gear around the circular Senior Walk.  Pure ego and arrogance from a piss-on-you Piping Rock WASP show-off asshole; but what a car! What a sound!

A few years ago and many decades after college, I was wine-tasting in Sonoma, and as I left the tasting room to return to my car, I saw 25 late-model Ferraris in the parking lot.  The Ferrari Club of Northern California was having its annual wine-fest and lunch.  I was surrounded by Spiders, Barchettas, Testarossas, Scagliettis, and Superamericas.  They were elegant, classy, powerful, aerodynamic, and sweet.


I recently visited the Automobile Museum in Tupelo, Mississippi, a hangar full of 100 cars from 1900 to the present.  What struck me most was how quickly, within ten years (1900–1910), cars moved from simple motorized horse-and-buggies to real cars.

On the next row were the elegant show-palace cars of the 20s, all polished brass, mahogany, leather, and fittings; the Al Capone black sedans of the 30s; the more sedate post-WWII cars of the 40s; the fins and two-tones of the 50s; and the more familiar muscle cars of the 60s.

I recently went to a car show in Columbus, MS. On display were triple-chrome, high-glaze, super-polished dandies with jumping struts and mega-engines.

Not quite the classic and elegant Franklins, Chandlers, and Duesenbergs of the 20s, and more a celebration of brute power (the theme was to put monster engines in little frames); but impressive.

When I trade in my ‘99 Camry (getting dodgy at 200K) it will be for a new Camry.  The least flashy car on the road.  Mr. Dependable, alte kocker ride par excellence. Familiar, trusted, and simple. Like a Holiday Inn, so uniform and predictable that it is the preferred hotel for blind people.  Getting behind the wheel of a 2014 is very little different from my ‘99.

Whenever I hear the growl of a Carrera, or the throaty rumble of a Cobra; or see a yellow Lamborghini Gallardo leave a gated community off River Road in Potomac (MD), I follow them and try to pull up alongside at the next light.  I still want to ease into the contoured leather seats, palm the 7-speed gear shift, tap the accelerator and feel the surge of the V-10.  I could buy one, I suppose; dip into my hedge fund futures and hock the inheritance and get back to my muscle car-cum-St. Tropez fantasy; but I know that the stolid, dependable, Camry has become my car.  Even if I made it to the Lamborghini showroom, I would poke out a side door, embarrassed, and guilty.  “What were you thinking?”, I would ask myself.

On the other hand, I could be throttling the Testarossa up to 90 in first gear, hearing the engine whine just like on the record Sammy Bridge and I listened to in 1957. Why not? You only live once.

Watch this space.

Friday, August 30, 2013

Punishment Or Pampering–Which Works Better?

We live in an age of feel-good self-esteem.  Everybody is great, and everyone is equal. Can’t color within the lines? No problem.  You can jump high.  Having trouble with your multiplication tables? That’s OK.  You have an engaging smile that makes others happy. 

Punishment – or at least harsh disapproval for failing to meet the minimum standards required by society (i.e. how to count and read) – is a thing of the past and smacks of Miss Trunchbull’s whacks on the knuckles with a stiff ruler.  Today, pampering in school is the norm. Still reading at a second grade level when you are in fifth?  “Try harder, sweetheart.  I know you can do it.”

       Quentin Blake, “Matilda” by Roald Dahl

Of course the shit hits the fan once these pampered children hit the streets. They don’t have to be able to read and write to flip burgers, but just about any other job requires some proficiency.  A bus driver, at least when he starts off on the N6, has to be able to read road signs; and even the ward councilman’s niece at the DMV has to be able to read customers’ queue numbers.

Most employers in a competitive economy will fire an employee if he doesn’t flip burgers fast enough, or if he continues to rack clothes wrong, or forgets to show up on time.  Every company I know, whether soft non-profit or hard for-profit, gives no quarter when it comes to fucking up. All the pampering in school and the inflated praises at home have no currency in the marketplace.

I recently worked for a tough, bottom-line non-profit firm that dealt in poverty alleviation.  The mission was to make life better for poor Africans suffering under the yokes of disease, illiteracy, and unemployment.  The company was proud, the President said, to apply the same policies of caring and concern to his employees.  It was all window dressing, of course.  There were far more bright young things anxious to do good works than jobs available, so the company demanded long hours at low pay, suffered no malcontents, and hired and fired with regularity.  Writing proposals and managing field projects was not code-breaking or heart surgery, so management rightly felt that they had no sunk costs in their lower-level employees.  Work, perform, and produce or you are out.

In other words, management saw no need to pamper employees, either to increase their benefits or to be generous and forgiving when it came to their faults.  Because each new, young employee saw a line of highly qualified job applicants at the front door, she worked her ass off to keep her job.

Derek Thompson, writing in The Atlantic (8.30.13), cites recent economic studies which show that during the recent recession, productivity went up.  Employees were scared of losing their jobs and worked harder:

According to economists Edward P. Lazear, Kathryn L. Shaw, and Christopher Stanton, the authors of a new paper, “Making Do With Less: Working Harder During Recessions”, employers made do with less because people worked harder during the recession. Fully 85% of the increase in productivity came from workers' "increased effort."

No Miss Trunchbull had to come around with her ruler.  Employees felt the implicit, more powerful threat of job loss, and that was all they needed. In fact being employed in a high unemployment area increased productivity even for the least productive workers.

In communities where demand and supply are more equally matched, employers have found that surveillance increases productivity.  That is, if an employee knows he is watched at every turn – number of keystrokes tapped on the computer, articles scanned at the checkout counter, number of bathroom breaks taken in a workday, frequency of late punch-ins, etc. – he will work harder, more efficiently, and more productively.

A recent study measured the impact of surveillance cameras on retail theft.  Not surprisingly, theft went down with surveillance; but unexpectedly revenues and worker productivity went up:

The savings from the theft alerts themselves were modest, $108 a week per restaurant. However, after installing the monitoring software, the revenue per restaurant increased by an average of $2,982 a week, or about 7 percent.

What happened here? Yes, theft declined. That's to be expected. But also, revenue spiked. Productivity increased. Turning casual-dining restaurants into casual-dining panopticons made everybody work harder, perhaps by cutting down on procrastination or encouraging waiters to sell more drinks and appetizers to customers.

There are still those unreconstructed idealists who believe in their bones that treating people ‘properly’ is a moral imperative.  Corporations should accept lower productivity as the price for doing the right thing. The Feel Good Boss creates a congenial workplace:

Four-day workweeks mean more meaningful work hours. Shorter work days translate into better focus. More vacation time means heightened creativity. More short breaks means sharper attention to detail. More "toys" means more creative, playful employees who are necessary for out-of-the-box thinking.

Wrong, say the data.  Feel Good bosses will lose money every time.

The lesson here is not that the threat of punishment increases productivity.  Most of us know that intuitively.  It is about the serious disconnect between the feel-good, self-esteem child-rearing culture of today and the increasingly competitive nature of the market.  Not only do our schools turn out students who cannot read or write, but these students have no idea that their ‘special’ talents and abilities will have little value in the real world. It is a double blow.  A student graduates with few saleable skills, no discipline, no fear of failure, and an overly optimistic view of himself and the world.

It is high time for school districts to shape up, re-establish intellectual and social discipline, retreat from idealistic pampering, reward high performance and call out the laggards, and create an educational model which resembles that demanded by employers and society.

Thursday, August 29, 2013

Surfing, Extreme Skiing, Risk, And Love

I grew up in Connecticut, so the biggest waves we ever saw were little blebs petering out on the rocky shores of Long Island Sound. I graduated to Misquamicut Beach in Rhode Island, which was on the Atlantic and a step up, but the waves were still tame – enough for a gravelly pink belly on the way in, but little more.  The ocean off Kennebunkport was broken by breakwaters and frigid, Arctic, blue-lipped, ankle-aching cold; exciting to look at as the surf broke onto the basalt and igneous rocks on the shore (‘Blowing Rock’ was a tourist attraction), but no surfer idyll.

I spent many summers day-tripping on the Jersey Shore - Asbury Park, Ocean Grove, Belmar, Spring Lake, Manasquan, Point Pleasant Beach, and Seaside Heights.  The waves there got big enough for body surfing; but me and the Nicky Nork goombas never went for the waves. We were cruisin’ for poontang, and not about to get blue balls in the early July North Atlantic.

I graduated to the big time in the mid-60s when a work buddy and I went hurricane kayaking off of Pt. Pleasant. A day after the hurricane had passed, the winds died down but the surf was up, ugly big breakers under a cold, grey October sky.  I paddled the kayak out past the surf line, then caught a wave towards shore. I got it just right, and the kayak slid inland on top of the wave.  However, the convection currents of the roiled hurricane-driven waters circled up and around faster and with more power than I had anticipated.  The nose of the flimsy kayak dipped forward and got caught by the wave’s centripetal force. I was tipped, turned, and driven head down into the sand.  The weight of water and kayak forced me into the sea bottom.  The surge of the next wave dislodged me, and pushed me to the surface.  I struggled to gain traction underfoot and purchase in the water.

A few years later I was body-surfing south of Atlantic City.  It was a bright, sunny day, and the beach was crowded, all Coppertone and tan. The surf was regular and strong. I caught a wave at its crest, on its sweet spot, the perfect poise of 3000 miles of Atlantic fetch and shallow, regular beach bottom; but the pitch was not perfect, nor the calculus of the wave; and I was upended, tossed, and thrown into the last thrust of the long, African wave.

I gave up messing with the Atlantic surf, and spent years in the calm tropical waters of the Caribbean and the South China Sea, the island inlets near Colombo, and the waters off Batu Feringhi in Penang.

A few years ago I was in Porto-Novo, Benin; and the Holiday Inn was on the beach – a long, uninterrupted stretch of undeveloped sand and dunes washed by the Atlantic. I saw no one swimming, not even at the shoreline.  No one wading, ankle-bathing.  No women lifting the hems of their panyas. They were removed, elegant, and seductive, but not wet.

 

They were not interested in the ocean or the surf; they were as afraid of the notorious riptides of the Bight of Benin as they were intent on maintaining West African decorum.

Many a seaman, ship, and unwary tourist had been sucked under the waters of the Atlantic to disappear without a trace.

The waters of the Atlantic off the coast of Mauritania were the most inviting.  The local expatriates disparagingly referred to the coastline of the country as ‘The longest beach in the world’ – not because its 1000 miles was longer than any other, but because the Sahara extended for 2000 miles to the east – unremitting sand, desert, and emptiness all the way to the Red Sea.

Some of the most extreme surfing in the world is off the coast of Humboldt County in Northern California – rocky, treacherous, cold, and unpredictable; and it has some of the highest surf in the world. You have to be crazy to surf there – obsessed and death-seeking.  You can die in a minute, dashed on the rocks onto a cold, grey, and misty shore.

I have read of surfers getting their necks broken in the surf off of Maui.  Their last thoughts are of the green, tropical mountains, the warm sun, and the soft sandy shores of the beach.  They surfed in Paradise, and were pulled under by a friendly arm of the sea.   Drowning at Patrick’s Point, on the other hand, was letting go of life alone in freezing waters off a desolate, grey, cold, and isolated coast, never to be found.

I have read of surfers in Australia who have had their arms torn off by white sharks, others who have been paralyzed from the neck down after being tossed into the sand by the crippling blow of a wave, and others who have been sucked out to sea by riptides and ocean currents.

There is nothing more dramatic than the sight of a surfer in the tube of a wave off of Hawaii.

Perfect poise, balance, confidence.

I have known extreme snowboarders who ride the double black diamonds at the top of the most challenging slopes in the Rockies.

A snowboarder catching air off a rock cliff in the backcountry near Golden, British Columbia, Canada.

Speed, air, precipice, rock outcrops, and the risk of getting crushed to a pulp at 12,000 ft.; yet this is nothing compared to the terror of getting sucked out into the North Pacific off the Northern California coast, joining the Humboldt Current, and pulled at 25 mph towards Japan on frigid, dark, and deep waters to another continent.

I once went hiking in the Catskills.  The mountains were nothing compared to the Rockies, but were challenging in spots and were tricky in the late Fall when early snowstorms moved in from the west. We stopped at a gas station, and my friend asked for directions to the trailhead, pointing to the sheer cliffs hanging over the small Greene County town.  The attendant looked at us, shook his head, and said, “You should be in bed fucking your girlfriend, not climbing those cliffs”; and right he was, for as we approached the summit an October storm blanketed us with a foot of snow, leaving us stranded 500 ft. below the shelter.

In my life I have risked disease, abduction, kidnapping, dehydration, and discovery. I have risked censure and moral indignation, and been fined for sexual libertinage. I have narrowly avoided teetering, overloaded Tata trucks on potholed Indian highways; nearly pitched over the railings of 1000 ft. ravines in the Andes; eaten scrofulous and rotten bush meat in Cameroon; been eaten by Anopheles mosquitoes in the Comoros and bitten by tsetse flies in Malawi; but never have I chosen to risk death.  I took my chances in the massage parlors of Bangkok or with a lover on the gurneys of the surgical ward of the Aga Khan Hospital in Rawalpindi, but never on the slippery rocks of the Yellow Mountains or the slopes of the high Appalachians.

I have never understood the nature of extreme risk – why people risk their lives for what seem to me few rewards.  I am told that the exhilaration of skiing down a 30-degree slope high on the highest mountain at Whistler is incomparable and almost spiritual.  Next to God in the thin, crystalline air, under a brilliant blue sky and blindingly white snow; but I cannot get beyond the thought of concussions and compound fractures, plaster casts and immobility.  The pain I feel after leaving an Indonesian lover will dissipate; but I will have indelible memories.

Risk is what makes life bearable.  Without it there would be nothing but lunch pails, cubicles, and stale sex. Extreme risk-takers are part of an exclusive club.  None of us take the same risks, but we all share in the adventure of leaving ordinariness behind.

I have become friends with a double Bronze Star Vietnam veteran, 3-tour helicopter pilot, and much-decorated hero of the war. He has endless stories of landing in hot zones, avoiding withering fire from the bush, negotiating sharp turns up and down the rivers of Pleiku, flying under the radar and behind enemy encampments.  He did this because it was his job and his duty, but his medals not only rewarded his courage under fire; they acknowledged his very American sense of risk, challenge, and reward.

Although I have been a social risk-taker all my life, I place myself only at the bottom of the pyramid.  I have risked reputation, family, and occupation; but my Vietnam buddy, the extreme surfer in Humboldt County, and the off-piste hiker in the Yellow Mountains have risked their lives.

Where we all land, however, is in the land of rewards.  I will still and always take the memories of love and sensuality in the Far East over surviving a fall in the Grand Tetons.

Wednesday, August 28, 2013

So, What Do You Do? Are We Afraid Of Leisure Time?

After my father retired, he wandered about the house all day, always underfoot, staring at the bird feeder, taking out bits of trash, and finding excuses to run down to the corner store.  He never should have quit medicine so early (he was in his mid-70s), but he said he didn’t want to make a mistake – miss an appendicitis, ignore a pre-cancerous growth, or dismiss a serious complaint from yet another hypochondriac whiner.  I never believed that this was the real reason for early retirement; but I am certain that he had no idea that he would be spending his later years puttering around the yard.

He made a few attempts at keeping busy. He tried to teach English as a second language to the many Polish immigrants in our city, but after years of treating them as patients, he concluded that they didn’t have the brains to speak their own language let alone English; and he quit.  Later he offered his services gratis to the hospital where he was on staff, but because of liability risks ended up in Medical Records not far from the boiler room; and he quit that too.

After a while he gave up, resigned himself to doing nothing except watching sports on TV, falling asleep in the Barcalounger, and putzing around the garden.  He never liked this life of leisure.  He just accepted it.

He was consigned to a life of enforced leisure.  Everyone in his immigrant family had either dropped dead on the factory floor or simply wore out.  He was lucky because he never had to work a night shift or sweat in a machine shop.  In fact, many of his patients in our working class town had lost fingers to the lathe or were stone deaf at 45 from the banging of metal presses.  They indeed had something to look forward to in retirement – no more lunch pail days on the assembly line – and happily spent years on lawn chairs in the backyard.

My father was caught in the middle of the American transition from deserved leisure – doing nothing after decades of drilling, sod-busting, and bus driving – to bored leisure. The second career Type A retirement of my generation was a long way off.

There is now a competition among former professionals or corporate managers to see whose retirement can be the most productive, engaging, and impressive.  Putzing in the garden is not an option, but landscaping it is. Reading by the fire is out, but writing a book is in.  Taking an adult education course is fogey-business, only serving to stem the ebbing tide of brain cells; but teaching a course shows vigor, enterprise, and imagination.  Visiting grandchildren is passé unless they live in Paris and you can visit the Louvre.

Turning leisure time into work is a very American thing. It is hard to imagine a retired Frenchman leaving his office and volunteering in the Arab suburbs; or providing tax advice to the poor; or joining an advocacy group to stop global warming. They must certainly be enjoying themselves on the beach at St. Tropez, sailing the Mediterranean, or riding to hounds.

Jenny Diski, writing in the New Statesman (8.28.13) comments on this phenomenon and recalls the panic many retired people feel when asked the question, “What do you do?”:

But what if as you use the phrase “I used to [. . .]” your own heart sinks, or your psyche panics at the idea that you might not be what you think yourself to be? Or that what you think yourself to be crumbles into nameless dread at the thought that you are not being what you are doing? The party questioner is only you (or me) on another day, wondering how on earth we are to get through the rest of our time as conscious beings without the reassurance that we are a writer [or] a teacher…

If you are what you do, what are you when you stop doing it and you still are? There are people who don’t find this a problem, who have not entirely or even at all identified existence with what they do and how they make a living, but they are evidently a great problem to those – the majority – who do.

A friend of mine who worked for a large corporation in Washington told me how he and his colleagues feted a co-worker who had just retired.  They wished him well, said they were sorry to see him go, and hoped that they would see him again soon.

Their wishes were granted, for he returned the very next week to take up residence in his new ‘ex-officio’ quarters set aside for retired executives and managers.  Apparently he came into the office every morning at 8am and stayed until 5, until one day the cleaning lady found him slumped over his keyboard, dead as a doornail.  He had died in the traces, as he had always wished.  He never did anything meaningful or important as an unofficial member of the team.  Although he cranked out white papers, memos, and idea-notes, no one paid any attention to them or to him.  As time went on the younger members of the firm wondered what this old fossil was doing wandering the halls, but in time they too ceased to notice.

Jenny Diski says that a similar fate befell her grandfather:

My father often used to tell me how my immigrant grandfather declined in health and spirit once he gave up the café he ran from dawn to late into the night in Petticoat Lane to retire to a leafy suburb. It was only a matter of time, my father said of the man I never met and knew almost nothing else about, before he died of having stopped work. I think this story is the equivalent of an urban myth of that generation. The decent man who worked all the hours that God sent and more, provided what he could (which was never lavish) for his family, toiled unceasingly in order to make sure his son went to a good school and got a profession, collapsed and died once he stepped off the treadmill.

He in many ways was like my father and others of his generation who had no options of opera, theatre, or summers at the Vineyard to give them the foundations of a varied and productive retirement.  Work, family, and responsibility were the only parameters of their lives.

It is strange to see, therefore, that retirees of my generation who have had all the benefits of wealth, education, and modest privilege still work until they topple over or one day look at the computer screen and have no idea what it is.

Diski writes that primitive settlements were the first leisure societies.  They operated on the margins, had little, and when survival was assured for the moment (killing a wildebeest, gathering nuts, thatching the hut), they could relax.  There was absolutely nothing to distract them from their leisure.

Only when you worship the idea of accumulation and status based on its perceived wealth-giving properties do you have to work hard all the time. Accumulation was hampering; you had to carry it about with you when you moved from camp to camp, or find ways of storing and securing it if you were sedentary. Without the idea of surplus as a value beyond its use value, when you needed/wanted something you got it, and when you had it, you enjoyed it until it was time to get some more.

This doesn’t sound very appealing to me – running hundreds of miles across the veldt to kill your prey, hauling it back to camp, gorging on raw meat, then doing nothing for a week – but I suppose it had its moments.

Diski then turns capitalist-basher.  It is greed, she says, which forces older folks to work longer, a distorted economy which erodes their pensions and savings, and contributes to the misery of the old.  They bought a capitalist dream and were cheated.

This is nonsense, of course.  The same people who saw some of their savings compromised voted in lock-step with their unions for big benefits, supported venal politicians who gave away the store for patronage, and spent unwisely on ‘things’.

People with money will always have an easier time of life and retirement than those without.  My generation and those to follow will have many retirement options – lie in the hammock and read trashy novels, or teach Shakespeare at a local university; while those before were conditioned by their immigrant origins and the Depression to keep on the treadmill. Capitalism cannot be blamed for the travails of the working poor.  The system simply deals the cards in a fair game.  Some people are winners and others are losers.

Diski closes with the dark, pessimistic conclusion:

Leisure, not doing, is so terrifying in our culture that we cut it up into small, manageable chunks throughout our working year in case an excess of it will drive us mad, and leave the greatest amount of it to the very end, in the half-conscious hope that we might be saved from its horrors by an early death.

I don’t think leisure is that terrifying at all.  For those of us with options, we can choose to be idle – simply lying on a beach or toddling about with the grandkids; or engaging in ‘productive’ activities.  If we see a great, gaping, dark chasm ahead of us as we leave the office the last time, that’s our problem, and it is easy to avoid.

For those millions working hard for a living, retirement – however it is defined – is a break from the toils of labor.  I cannot imagine a long-distance trucker, kidneys banged to shit, slipped disks and lumbago, bunions, and rancid stomach, staring vacantly out at the birdfeeder like my father did.  Leisure and retirement mean no work, and that is enough.

There are relatively few of us who are so paralyzed and terrified by the thought of retirement that we hope for an early death.  That is the fate of the diminished, sick, and broken.

Tuesday, August 27, 2013

Does Bombing Syria Constitute A Just War?

Philosophers have debated the issue of the morality of war for centuries, and have concluded that there is, in fact, such a thing as a just war; and a just way of waging war.  In a two-part series for the New York Times (11.12.12) Jeff McMahan has written about the origins of the just war theory and how it is being challenged by the changing nature of war.  The principles of right wars and right conduct were developed and applied when wars took place between nation-states; but now that armed conflicts rarely pit countries against each other and more often set factions in opposition within a country or a region, these principles may no longer be applicable or appropriate.

Enshrined in the principles of the Geneva Convention, such wars must adhere to the following principles:

In most presentations of the theory of the just war there are six principles of jus ad bellum [undertaking just wars], each with its own label: just cause, legitimate authority, right intention, necessity or last resort, proportionality and reasonable hope of success. Jus in bello [conduct in just wars] comprises three principles: discrimination, necessity or minimal force, and, again, proportionality. These principles articulate in a compressed form an understanding of the morality of war that is, in its fundamental structure, much the same as it was 300 years ago.

These principles were rarely adhered to, even in the more innocent age of World War II.  America’s firebombing of Dresden, or the nuclear destruction of Hiroshima and Nagasaki, could hardly be called proportional; but obviously the generals who planned these attacks certainly thought so.  Curtis LeMay, a senior officer in the Air Force who advocated annihilation of the enemy through massive air bombing, said it best.  War is hell, he averred, saving American lives was the only priority, and all calculations and equations of Japanese dead had no relevance whatsoever.  Bombing the Japanese back to the Stone Age was perfectly right and acceptable because it would shorten the war and stop the killing of American soldiers.  His argument, indifferent to the number of Japanese dead, was focused only on the morality of victory and the lives of the victors saved.

For LeMay ‘proportionality’ had no meaning.  The value of one American soldier had no enemy equivalent: one can never put a value on an American life because that life is inestimable.  Similarly, one could never put a value on Japanese lives because they had none.

Israel applies the same theory.  There is no way to calculate the worth of one Israeli soldier because it is inestimable; and if it takes the death of 100 or even 1,000 Palestinians to stop the killing of Israeli personnel, then so be it.  In short, Israel has its own definition of proportionality.

‘Legitimate authority’ has been generally interpreted as a response to an armed attack.  A just war would be one in which the aggrieved country has a legitimate right to declare war on the aggressor.  There was no question, therefore, about the Allies’ armed response to the invasions launched by Nazi Germany and Imperial Japan.  Given this rather simple and straightforward definition, the principle of ‘right intention’ followed logically.  ‘Reasonable hope of success’ seems the least important and relevant, for few countries would embark on a war with little expectation of victory.

I am sure that American politicians and their generals felt that the Mexican-American War, the Spanish-American War, and the War of 1812 were just wars, with clear legitimacy, right intention, and reasonable hope of success.  Yet, a strong case could be made that these were American wars of aggression designed to consolidate territory, remove the British once and for all from American lands, and to extend horizons of empire. 

While the Korean War appears to fit within the just war category, since the consensus is that it was provoked by the North Koreans, the Vietnam War clearly does not.  No one can claim that this war met any of the conditions of just war theory.  What legitimacy was there in a war for which the casus belli was invented (Gulf of Tonkin)?  How was it legitimate to intervene in what was basically a civil conflict fought over nationalism, not regional domination?  Where was the proportionality in the massive, LeMay-style carpet bombing of the North?  Where was the reasonable expectation of victory when the United States completely underestimated the force and resolve of the Viet Cong and the North Vietnamese?

The point is, the argument about just wars is an academic exercise with no real relevance.  From the perspective of the aggressor, all wars are justified.

The academic exercise itself gets more complicated as philosophers try to make sense of the asymmetrical wars of today.  Countries do not fight each other much anymore, but al-Qaeda, the Taliban, Boko Haram, al-Shabaab, and a hundred other armed militias in the world attack ruling regimes or each other with increasing regularity.  These ‘wars’ are, like state conflicts, fought over land, resources, power, and perhaps some principle; and there can be no doubt of their perceived legitimacy.

The moral argument comes when superpowers have to decide whether or not to intervene.  Surely there was a moral case for the United States to intervene in the Rwandan genocide and wage war against the Hutus; or to send expeditionary forces into Sudan to stop the killings in Darfur; or to have intervened far earlier in the war in Bosnia.  Many argue that in failing to fight a just war, America and its allies were immoral.  Justness or rightness has to be defined within the context of sins of omission as well as sins of commission.

Philosophers are trying to decipher the new rules of engagement and place them within a moral context.  Some, like McMahan, want to revert to classical theory, which places the onus of responsibility on individuals rather than states.  In the War Against Terror we are fighting neither states nor armed groups, but individuals who want to do us harm in a general cause.  Destroying or neutralizing them is necessary to keep the United States safe.

[The Theory] returns, for example, to the idea that it is individual persons, not states, who kill and are killed in war, and that they, rather than their state, bear primary responsibility for their participation and action in war.

This, to me, is counting angels on the head of a pin.  We wage war under the briefest and flimsiest of pretenses, and no one really seems to care who is threatening us or how.  It is enough to be threatened and to take whatever action is necessary to remove the threat.  Once we engage in war, the issue of just action arises; and any semblance of Geneva Convention niceties has been thrown out the window.  Who doubts that an enemy of the United States would use chemical or biological weapons without hesitation; or that we would use strategic nuclear strikes if necessary?

I agree with Curtis LeMay and take his philosophy one step further.  If wars were allowed to be more brutal, more horrific, more unthinkable in their savagery, there would certainly be fewer of them.  No school of moral philosophy, nor any attempt to apply its principles on the battlefield, has made any difference.  A nice academic exercise, but little more.

The debate over whether to attack Syria continues, but there appears to be no compelling realpolitik reason to do so.  Bombing Syria would serve no purpose whatsoever.  It would not get rid of Assad.  It would inflame the Middle East even further, alienate Russia and China, do little to help the opposition (which the US fears just as much as Assad), and do nothing to help us shed our Paper Tiger image.  The only reason we might attack is a moral one – chemical weapons are immoral.  Nonsense.  They are instruments of war, just like the thousands of nuclear warheads the US stockpiles.  If pushed to the wall, we would use them.  Nothing we have done in Afghanistan or Iraq has changed the calculus there.  The Taliban are resurgent and powerful, and ethnic and sectarian violence will continue for decades in Iraq.  Democracy is a sham in both countries and has little hope of surviving.

Either the US goes in with a vengeance, rains down the Wrath of God on Assad and his regime, occupies the country, and over a period of decades promotes democracy – which we should have done in both Afghanistan and Iraq – or it stays out of it.  Morality has no place in the equation.

(Title changed, updated from blog post which appeared in November 2012)

Monday, August 26, 2013

Secular Religion–We May Not Need God, But We Still Need To Believe

America is perhaps the most religious country on earth; and the number of avowed believers, those who attend church regularly, and those who believe in divine intervention in everyday life is very high indeed. While the number of non-believers is rising, they are still relatively few.  We are still a God-fearing nation and will be for some time to come.

Andrew Brown, writing in The Guardian (8.26.13), laments the passing of Robert Bellah, a sociologist who believed that a society without religion is impossible and who dismissed entirely the notion, embodied in the Constitution, that the collective will of the people, guided by the benevolent hand of the State, is enough – that there is no need for religion:

Liberalism… has no need of God because it trusts that the self-interest of the citizens will lead them to the best possible outcome: "the state is a purely neutral legal mechanism without purposes or values. Its sole function is to protect the rights of individuals, that is, to protect freedom." Such a state is, he thinks, an absurd impossibility, which could never exist.

The two notions can, of course, co-exist.  Society functions well because of individual self-interest – our fundamental, ineluctable, hardwired human nature.  We all follow the law of the jungle.  However, while clubbing a rival hunter who intruded onto our patch of the veldt was as reasonable as tigers defending their territory with tooth and claw, agriculture and the large communities it engendered demanded cooperation.  Law and civil society began.

Religion developed independently. It satisfied other, more primal needs. Ignorance bred fear of and respect for lightning, thunder, and earthquakes; and eventually the worship of the powers that were behind them.  If we worshipped and prayed, perhaps lightning wouldn’t strike.

No less is true today.  Civil society works just fine without religion, and the scale weighing individual enterprise against government authority usually stays in balance even as human societies grow ever more complex.  ‘Government’ is now thousands of little governments, jurisdictions, departments, courts, and divisions; and ‘civil society’ is far less atomized and more agglomerative – the AARP has clout.  However, regardless of complexity, the system needs no divine intervention.

At the same time, religious worship is no different than what it was in pre-history.  We still worship unknown forces, pray for divine intervention, and solve the riddles of the universe by saying ‘God did that’.  We still have priests who interpret divine will; ceremonies of sacrifice, penance, and repentance; and we all wear the protective mantle of belief.

Since there will never be any answers to the big questions, there will always be a role for religion. And as long as society remains unequal, religion will always provide the last refuge for the poor, the disadvantaged, and the unfortunate.

One important distinction, however, should be made.  It is belief that is universal and unchangeable, not religion.  If the cycle of history turns towards the secular, and God is put on the shelf for a few hundred years, we will create other, secular gods.  Atheism, for example, is not simply a personal rejection of God but a religion of unbelief – a collective movement with its own dogma, rituals, recitations, and cant.  There are now atheists’ conferences, school clubs, websites, and jamborees.  It is simply not enough to wake up one morning, look in the mirror, and conclude that God doesn’t exist.  We need affirmation, collective support.  We need to be in a crowd shouting “God doesn’t exist”, carry copies of the Atheist Catechism, proselytize and evangelize, and pray that others will see the error of their ways.

Given its passion and irrational fervor, Environmentalism has become the religious movement of the day, little different from the millennialism of the past.  The world will end in a fiery Armageddon, say Environmentalists.  We will pay for our sins against the Earth, and our fate will be hot, brutal, and inescapable.  However, we can save the Earth and ourselves through prayer and good works.  There is still time.  How different are these warnings, chastisements, and admonitions from the fire and brimstone that rages from the pulpit every Sunday?  No different at all.

Environmentalism may be the best example of secular religion, but America is awash in causes with believers just as fervent. The social media are crammed with appeals for animal rights, women’s rights, gay rights, and civil rights. The faithful cram auditoriums to listen to secular priests fulminate about doom and disaster – puppies eviscerated, women banging their heads on glass ceilings, gays marginalized and abused. These true believers leave the room feeling as sanctified as those who receive Holy Communion.

We invent God daily.  We cannot do without a belief that supersedes our small lives. This, however, has nothing whatsoever to do with the real world in which we piss on our perimeters, fight like she-bears to protect our children, go to war with our rivals and competitors, scramble up the corporate ladder or game the welfare system, and, bloodied and bruised, pull over when John Law fires up his flashers.

Sunday, August 25, 2013

Can’t Concentrate? Go To A Noisy Cafeteria

I do my best thinking in cafeterias, coffee shops, and on walks in city parks.  I am at my worst when it is quiet except for the occasional noise – the peep of an elevator or the tumbling of an ice-maker far down the corridor – intermittent and infrequent sounds that I listen for.  I know these small but intrusive noises are coming; I wait for them; I begin to hate them; and I cannot think when I know they are coming but know not when.

However, there is nothing wrong with noise per se.  It is its infrequency, intermittency, or suddenness that disturbs concentration.  When I am in a crowded place, I take noise for granted.  One bit of chatter blends with all the others.  The clatter of one coffee cup becomes part of a low, continuous collective clatter – unnoticed and unobtrusive.

I was never bothered by the incidental noise of an office – not the tapping of keyboards, the shuffling of paper out of the copier, cubicle phone conversations, or the sounds of traffic from the street.  They became part of the office ambience, its background, and as such were never intrusive.

I do my best creative thinking on walks in busy places.  I have composed short stories walking the streets of Adams Morgan, organized lectures on Othello along the busy C&O Canal, and prepared addresses before international colloquia wandering the narrow lanes of Trastevere or on the Hudson River Promenade.

Whenever I learned a new language, I would take long city walks and invent conversations to fit the scene.  I practiced the impossible declensions of Russian walking the streets of Kiev; the foggy subjunctives of Portuguese amidst the tourists at the Washington Monument; and the intensifiers of Italian visiting the old industrial town of New Brighton, Connecticut, my home town.  Nothing could interrupt these internal dialogues – not sirens, construction, pedestrian crossing beepers, or yelling children.  I conjugated and mastered the conditional and the peremptory and polite forms of the imperative.  I practiced greetings and salutations, directions, questions and likely answers.

According to George Prochnik writing in the New York Times (8.25.13) the philosopher Schopenhauer considered noise to be the worst enemy of the deep thinker:

His argument against noise was simple: A great mind can have great thoughts only if all its powers of concentration are brought to bear on one subject, in the same way that a concave mirror focuses light on one point. Just as a mighty army becomes useless if its soldiers are scattered helter-skelter, a great mind becomes ordinary the moment its energies are dispersed.

Schopenhauer went on to suggest that it isn’t noise alone which deflects the great mind; it is “brutish jolts of sound” – my banging ice makers and elevator doors – the isolated, intrusive, and unexpected noises penetrating silence.  Schopenhauer’s particular nemesis was the “infernal cracking” of coachmen’s whips.  The philosopher’s problem may not have been too much noise but not enough.  Had he been able to surround himself with the random clatter and voices of cafeterias, coffee shops, busy streets, and busy parks, he might have thought more great thoughts.

Prochnik goes on to cite evolutionary evidence of our innate, programmed response to unusual noise:

Mammalian hearing developed primarily as an animal-detector system — and it was crucial to hear every rustle from afar. The evolved ear is an extraordinary amplifier. By the time the brain registers a sound, our auditory mechanism has jacked the volume several hundredfold from the level at which the sound wave first started washing around the loopy whirls of our ears. This is why, in a reasonably quiet room, we actually can hear a pin drop.

While this may indeed be true, it ignores the real question of concentration raised by Schopenhauer.  It doesn’t matter that we can hear a pin drop in a quiet room, or that I can count the ice cubes tumbling out of the ice maker 100 feet down the corridor.  It matters only how one manages or manipulates sound to enhance concentration.  Concentration improves in a crowded cafeteria because there is a lot of noise, and there are no unexpected drips of water, pins dropping, or buzzers beeping.  It all becomes background.

Prochnik turns to really loud invasive noise – jet planes, jackhammers, and sirens; but once again, these sounds cannot be taken out of context.  My office in New York City was on a major avenue and sirens and jackhammers were frequent, daily occurrences.  Soon they became part of the unintrusive background of the city – they lost their uniqueness and became part of the ambient sound environment.  I no longer heard them.

The same can be true of aircraft.  There is no doubt that if air traffic control changes flight patterns because of wind or storms, and if the jets taking off and landing at National Airport are diverted to my normally empty airspace, I will look up.  The same for helicopters.  Every service has them – DC Police, FBI, Treasury, Homeland Security, Defense, and the White House – but flights are irregular and unexpected.  No one can ignore the thumping of copter rotors a few hundred feet overhead; but if flights settled into a pattern, they would easily be ignored.

Prochnik finally recounts the time he was interviewed on a radio station serving remote parts of Newfoundland:

One caller lived in a village with just a few houses and almost no vehicular traffic. Her family had been sitting in the living room one evening when the power suddenly cut off…All at once, the many electronic devices around them (including the refrigerator, computers, generator, lamps and home entertainment systems and the unnatural ambient hum they generated and to which the family had become oblivious) went silent. The family members didn’t realize until the sound went off how loud it had become. Without knowing it, each family member’s mental energy was constantly diverted by and responsive to the threat posed by that sound.

This story, however, again diverts attention from the central issue posed by Schopenhauer – concentration and how noise affects it.  Most of us living in metropolitan areas, let alone the backwoods of Newfoundland, stop for a moment when the refrigerator stops humming.  We comment that we never realized how much noise it makes.  We notice the silence, but we never noticed the noise; we simply went about our business talking, reading, or relaxing.  The hum of appliances, heating, and cooling was part of our familiar environment.  The noise disappeared once we became used to it.

The author raises issues of hearing damage and loss caused by noise, and there is no doubt that if one is subjected to the pounding decibels of jet engines every day, regardless of how completely they are ignored, harm can result.  This, however, is a separate problem, unrelated to concentration.

Schopenhauer gave in to noise later in his life, and found that once he relaxed and accepted noise and sounds as necessary and interesting elements of life which could be controlled, managed, and manipulated if necessary, he was happier:

Apparently he, too, believed it important to observe as much of life as possible. And when he moved to Frankfurt, he didn’t bring earplugs. He brought along a poodle known to bark on occasion, and the flute he loved to play after writing.

No one knows if he ever ‘embraced’ noise, but leaving his quiet retreat and joining the rest of the world seems like the first step.

Saturday, August 24, 2013

Self-Esteem Blather–We Are All Not That Great

The self-esteem juggernaut keeps on rolling.  All children are special, say PC advocates, and all are equal.  The fat, uncoordinated kid is special because he can color inside the lines.  The dumb kid who picks his nose during spelling is generous and loving.  The dysfunctional, out-of-bounds kid who trips people on the stairs and starts fights on the playground has ‘energy and ebullience’.  They all walk out of school every day thinking that they are special until they get whupped, put down, and turned out.  The cult of self-esteem has done them no good.  Better to be honest with them and prepare them for failure, or at least for modest achievement.

It seems particularly hard for today’s ‘progressive’ educators to give intelligence its due.  Smart kids will more than likely turn out to be smart, successful adults; and most of their classmates with fewer brains but ‘excelling’ in drawing, music, or sports are likely to be left behind.  Yet throwing these intellectually superior children into the same educational mosh pit with slower ones dumbs down the entire academic process.  It is strangely all right to compliment and encourage kids who can do the do-si-do and paint rainbows, but not correct to provide advanced learning challenges to those with above-average abilities.

Harping on self-esteem makes matters even worse.  Not only do smart kids get short shrift, but the dumb ones are pushed along on false notions of happy equality.  Public elementary education could not be any worse.

Apparently this self-esteem craze has taken hold in Hollywood, and more and more children’s movies are about life’s little losers who end up wild successes.  Luke Epplin, writing in The Atlantic (8.24.13), describes this ‘magic feather’ syndrome:

It's probably no coincidence that the supremacy of the magic-feather syndrome in children's movies overlaps with the so-called "cult of self-esteem." The restless protagonists of these films never have to wake up to the reality that crop-dusters simply can't fly faster than sleek racing aircraft. Instead, it's the naysaying authority figures who need to be enlightened about the importance of never giving up on your dreams, no matter how irrational, improbable, or disruptive to the larger community. As Jean Twenge… argues in her bestselling book Generation Me, younger generations "simply take it for granted that we should all feel good about ourselves, we are all special, and we all deserve to follow our dreams."

Turbo, Planes, and Kung Fu Panda are all about how little, ordinary ones with big dreams realize their fantasies.  The message is the same as that given in classrooms – everyone is special, and if only you try, you can realize your dreams.  Self-esteem has become a big industry and a Hollywood phenomenon.

Turbo

None of this is surprising, of course. We Americans – at least in public – hate to admit that privilege, wealth, and success are limited to a very few; and that such achievement is conditioned by brains, family, innate drive and ambition, risk-taking, and confidence – traits of the very few.  Rather than acknowledge and admit vast differences in ability, we tend to trick out the stumblers in pretty dresses, and display them to appreciative audiences.

Epplin, in The Atlantic article, notes that this was not always so.  Charlie Brown was a downright loser.  Trying was never enough.  Self-esteem never entered the picture.  Lucy, Violet, or Patty always pulls the football out before Charlie can kick it.

Charles Schulz, Peanuts

Charlie is no good at anything, and none of his friends try to gloss over the fact that he is a loser.  Better to tell it like it is, says Peanuts creator Charles Schulz.  In the movie A Boy Named Charlie Brown, Charlie is determined to prove himself and enters the school spelling bee; by winning he becomes the area representative to the National Elimination Spelling Bee in Washington, DC.

Before his departure he confides to Linus, "There's a good chance that instead of being a hero I'll make a bigger fool of myself than ever." Somewhat unhelpfully, Linus responds, "Don't be discouraged, Charlie Brown. You have nothing to lose. You'll either be a hero or a goat."

The lesson is clear:

Charlie Brown learns through Linus's tough-love speech that failure, no matter how painful, is not permanent, and that the best means of withstanding it is simply to show up the next day to school with the fortitude to try again. Losing also forces Charlie Brown to come to terms with his own limitations. He can't rely on a miraculous victory to rescue him from his tormented childhood. He followed his dream, it didn't pan out, and he ends up more or less where he started, only a little more experienced and presumably with a little more respect from his peers.

The advice to come to terms with one’s own limitations, however reasonable, is rarely given these days.  The self-esteem mantra is repeated from elementary school on up.  Students who have no place in four-year colleges are encouraged to go there not only by idealistic, feel-good, self-esteem-soaked advisors, but also because matriculation is made easy by lax admissions policies (not to mention affirmative action), liberal loans and grants, and non-existent performance standards.  Unqualified students who leave after two years are discouraged, saddled with debt, and tarred with failure.

Few academic advisors are willing to say, “Billy, you are not college material, but you would make a great car mechanic”.  Community colleges have for far too long been the repositories of mixed-up young people drifting and finding their way; and only a few (such as the East Mississippi Community College) have found, accepted, and flourished in their vocational niche.

Life is all about failure, not success, despite the American Dream.  The best and most innovative companies in Silicon Valley teach that failure is the price of risk-taking, and that without risk there can be no success.  On a more mundane level, many Americans slog away at low-paying jobs, struggle to pay the mortgage, and count a night out as a Family Meal at McDonald’s.  Speaking honestly and realistically to the children of these families about their prospects is not to consign them to penury and toil, but to allow them to prosper and excel within their limitations – just like Charlie Brown.

Europeans always sneer at Americans for our boundless enthusiasm, idealism, and optimism.  Life has never been that way, they say, looking meaningfully at the wars, pestilence, famine, and brutality of their thousand-year history.  We refuse to accept the past, limitations, or obstacles; and we are the most positive country on earth, however much of an illusion that may be.

Nevertheless, a little dose of realism can’t hurt; and the fat kid will get over the fact that he can’t jump once he stops trying and turns to other, more productive pursuits.  Those of us born well before the self-esteem era took our licks on the playground, endured the taunts of the cafeteria, and played with the cards we were dealt.  Not a bad lesson, as Charles Schulz taught us.

Friday, August 23, 2013

Faith-Based Foreign Aid? A Very Bad Idea

I had a friend who had worked for many years in international development, first as a founder and managing partner in a new private, non-governmental enterprise which assisted foreign governments in strategic health planning, and later as a private economic consultant.

“You’re a Jew, aren’t you?”, asked the Director of a well-known faith-based organization here in Washington, hearing my friend’s surname. “So I presume that you have not accepted Jesus Christ as your personal savior”.

My friend thought a moment before answering, at first wanting to challenge this preposterous statement and wanting even more to put this unctuous, pompous asshole in his place. He looked around the room, saw the benevolent pictures of Jesus with the poor, Jesus multiplying the loaves and fishes, and Jesus curing the sick; and knew that he was in enemy territory.  Response would be foolish and, given the fixed, sanctimonious smile on the Director’s face, a total waste of time.

“All our employees are Christians”, said the Director, “and it is Jesus – no one else – who guides our work.  If an employee does not know Him and is not guided by His hand, he cannot do God’s work”.

Unmentioned, however, were the tens of millions of American taxpayer dollars funneled into this charlatan shop. On the pretense of providing aid and technical assistance to the poor, it was simply trolling for souls. No one in Washington ever minded that it never had any impact whatsoever on the economic or social well-being of its intended beneficiaries. Both the Congress and the President were quite happy to trot out the Director, invite him to prayer breakfasts and let him, once again, invoke the good offices of Our Lord in public.

I took a consultancy assignment to a newly independent Eastern European country shortly after the fall of the Soviet Union.  My job was to evaluate an American religious organization affiliated with one of the mega-churches and its well-known televangelist pastor.  The organization had received millions to dig deep wells to replace the polluted and decrepit ones left by the Soviet regime.  The whole operation was a sham.  Half the wells were dug for the organization’s own churches and schools.  The warehouse supposedly built to house drilling equipment was overbuilt and used to store used clothing donated by the faithful back in Atlanta; and the only wells dug outside the perimeter were for Christian churches.  Until my arrival, the mega-church representatives had been given a free ride.  Don’t ask, don’t tell had been the rule.  The famous Bible-thumping evangelist seen weekly on his television show had a huge following and contributed mightily to the Republican Party.  No one wanted to rattle his cage.

Years later I took another assignment, this time to Rwanda, to help the country design its first National HIV/AIDS Prevention Plan.  One morning I sat at the breakfast table with two missionaries.  They were in Kigali volunteering for their church in Little Rock, whose donations helped fund a humanitarian effort managed by a larger religious NGO similar to the one which had rejected my Jewish friend.

After we had gotten to know each other a bit, one of the women said, “Yesterday, we witnessed a miracle”.  Her colleague nodded and smiled.  “We had planned to serve a hot meal for 50 children – you know, those poor little ones living in the slums”.  She stopped, looked meaningfully out across the city to the hills beyond, and continued, “The lilies of the field which have not been allowed to flower”.

“After a half-hour of distributing food, we saw that there wouldn’t be enough.  The number of mothers lined up at our door was at least 100 and many more were arriving.  All we could do was to keep feeding and to pray for a miracle”.  Again she paused, looking knowingly at her friend and smiling.

“Jesus looked down on us and said, ‘No child will go hungry’, and lo and behold, our food never ran out.  No matter how many children opened their hungry mouths to us, the food was rich and plentiful.  It was a miracle”.

In the late 70s I lived in Ecuador and had a chance to travel in the high jungle.  Upon arriving I was told that I should visit an American family living nearby.  I was given directions, and the next day rode an hour in the back of a pickup down a rutted, dusty road to visit. What I saw as I drove up was a mirage – a white, Iowa farmhouse with a porch swing, picket fence, trellis of tropical flowers, and an old-fashioned well.  Two little blonde girls, dressed in petticoats and pinafores were playing on the lawn.

The Hendersons were warm, friendly, and hospitable.  They invited me in for dinner -  a full spread of pot roast, garden vegetables, mashed potatoes, milk, and blueberry pie.  After dinner, the father and I sat on the porch swing, and he told me what they were doing in this remote part of the high Amazon.

“We are bringing the Lord to these people”, he said, pointing to a group of Amazonian Indians who were working the fields nearby. “Before we came, they worshipped trees, rocks, and animals, if you can believe it.  Pagan, heathen beliefs.  They lived in a dark, godless world.  They wore no clothes, copulated like buffalo, and had no shame.”  He paused to sip his tea and reached for his Bible.

“We brought them the Word of God”, he said, tapping the well-worn leather cover of the book.  “First we only read to them, and then, with the help of Jesus, we taught them to read.  We clothed their shame, righted their ways, and they joined the communion of the Lord”.  A young Indian came out to the porch with glasses of lemonade.  “Jose”, he said, “what did the Lord say to the Pharisees in the Temple?”  Jose, still holding the silver tray, recited, “Woe to you, teachers of the law and Pharisees, you hypocrites! You shut the door of the kingdom of heaven in people’s faces. You yourselves do not enter, nor will you let those enter who are trying to.”  He stumbled over the verses a bit.  ‘Pharisees’ came out ‘Frizzies’, and he mumbled the last few words, but both he and the missionary were proud of the performance.

“See”, the missionary said.  “The Lord’s work has been done”.

He went on to tell me that his little jungle enterprise was supported by his church back in the United States and through the generosity of yet another mega-church NGO in Nashville which in turn received funding from the US government.  I asked him how his religious mission satisfied the secular provisions of a US government contract. “Oh that”, he sneered. “We distribute anti-worm medicine and once a year our people in Quito come down to vaccinate the children.  It doesn’t interrupt our real work too much”.

I could go on.  I worked for over 40 years in international development and ran across these crazies in every country I visited.  George W. Bush gave the so-called ‘faith-based’ organizations a big boost, and extra points were awarded for every subcontract given to one of these Christian operations. As a Director in a large Washington firm which did all of its business with the US Government, I was told that I had to include at least one faith-based organization in every proposal bid I prepared.  As soon as word got out that the federal sluice gates had opened, every possible Christian organization quickly registered and lined up for money. I got applications from every crackpot, cockamamie, fly-by-night, street-corner group in the United States.  They would call me, email me, and even show up unannounced at my door.  Their ships had come in, and they were ready to be paid.

Under the US government contracting system, no prime contractor actually has to use the services of subcontractors; but everyone knows that most of the women-owned, small-business, Vietnam-era, and faith-based groups are recruited just for show.  Under Bush, however, more and more money had to flow through them.  It wasn’t enough to have Partners in Christ on the proposal cover.  They had to receive and spend money in the field.

According to Katherine Marshall, writing in The Guardian (8.23.13):

The launch of the new US State Department's Office of Faith-Based Community Initiatives this month by Secretary John Kerry has prompted a slew of questions and something of a stir: is such a new office needed? What will it do? Are there special risks involved? Is it proper, in the light of the fabled Jeffersonian "wall of separation" between religion and state?

Obviously Obama is pandering to the religious right and the more than 60 percent of Americans who hold fundamentalist Christian beliefs.  It is his way of mollifying the anti-government Tea Partiers, holding off his critics who claim that he is a Muslim, and getting nods of approval if not applause from a very religiously conservative Congress.  For Obama and all presidents before him, not only is there no downside to funding religious groups, there is political hay to be made.

Marshall argues in favor of faith-based initiatives because the world’s population is overwhelmingly religious, and because the moral teachings of religions are universal, foundational, and should be the guiding principles of any enterprise.

This is all well and good. Who can argue with the correctness of the Sermon on the Mount or the wisdom of the Ten Commandments?  It’s just that saving souls – the bread and butter of faith-based organizations – gets in the way of everything else.

Many years ago, I worked for a major international relief and development organization in India.  The Director was a big, bluff, ex-longshoreman who had been recruited in the post-WWII years when the organization was in the business of delivering relief packages to the displaced of Europe.  By the time I joined, the organization had expanded its scope and was distributing surplus US farm commodities throughout the developing world. A lot of non-profit organizations were competing for distribution rights because revenues were based in large part on the quantity of food delivered.

One day our Director convened a meeting to discuss the inroads that an upstart religious organization had been making in the country.  It had to be stopped at all costs.  One of my colleagues opined that there certainly was room for all, and that Children Are Beautiful was operating far beyond our perimeter.

The Director reddened and the veins on his thick neck, still strong and massive even though the docks of Hoboken were far behind him, bulged and throbbed.  “Fuck Children Are Beautiful”, he spat.  “Fuck ‘em. Erase them.”

The world of international development, whether peopled by secular or religious organizations, is as territorial, cut-throat, and aggressive as any corporate one.  It was all fine and dandy that Children Are Beautiful subscribed to the teachings of Jesus Christ and answered to a higher calling.  They were still elbowing for influence and money in Washington and on the Gangetic Plain.

Secretary of State Kerry talks of listening, sharing, and respect; and especially bringing more religious actors ‘to the table’.  These platitudes, nostrums, and feel-good slogans are empty, shopworn, and worthless.  Christian evangelists, like those of any religion, are up to no good.  They are motivated by parochial if not venal and self-serving interests and cannot be trusted with secular investments.

This position of the State Department does not surprise me. Just as our invasion of Iraq was based on vague, quasi-religious, idealistic neo-con sentiments of bringing democracy (our secular Jesus) to the Middle East, our promotion of Christian morality, ideals, and principles is just as out of place in a world of realpolitik and international competition.

Leave the preachers to their American flock.  There they can preach to the converted and do far less harm than if they are released into the wild.