"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Friday, November 30, 2012

Is The US Really More Liberal?

In an article in the New York Times (11.30.12), Timothy Egan writes about what he sees as a ‘liberalization’ of America, despite the fact that a significant percentage of Americans call themselves ‘moderates’ or ‘conservatives’. What is his evidence? The referenda on dope and gay marriage, and the fact that a black president is pushing for higher taxes on the rich – all true, but missing the point that the country has become profoundly conservative. Although white evangelicals have lost demographic ground, they still represent 25-30 percent of the population, or between 75-80 million people.  A Rasmussen poll in November 2012 showed that almost 25 percent of Americans wanted only to cut spending to resolve the current fiscal crisis.

A Rasmussen poll in February 2012 showed that 55 percent of Americans oppose affirmative action, with opposition even higher among ‘Millennials’.  The percentage of Americans in labor unions in the private sector has dropped to 7 percent, and Pew (2/11) reports that “The favorability ratings for labor unions remain at nearly their lowest level in a quarter century with 45% expressing a positive view.”  Pew also reports that over 50 percent of Americans favor school vouchers, with the number increasing.  The Friedman Foundation’s recent (2012) polls show that support for vouchers is between 55-70 percent, depending on the state.  The Washington Post poll (8/12) showed that 55 percent of Americans wanted smaller government.

I have selected statistics on just a few indicators, but could have chosen subsidies, social programs, welfare, Political Correctness on campus, and any number of other liberal issues to show that the population is not at all ‘progressive’, but very conservative. Despite this evidence, Egan goes on in his article to praise past liberal achievements, apparently wanting to sanctify Liberalism and promote it as the only way forward.  However, even his choices are suspect.

He starts with Lincoln’s support for the 13th Amendment.  While the amendment was a significant milestone in American history, it owed much to the constant pressure of the Abolitionists who, contrary to Lincoln’s much more pragmatic and constitutional opinion, insisted not only on immediate emancipation but on equal rights for freed slaves, something Lincoln never advocated.  Lincoln’s most compelling desire was to save the Union; and he knew that since the Constitution afforded extensive States’ Rights, he could not run roughshod over them.  He was a conservative Constitutionalist who eventually bowed to Abolitionist pressure. While Lincoln favored the abolition of slavery, he was very careful to try to do so within the framework of the Constitution.

Egan then talks of Teddy Roosevelt’s institution of the income tax as a transformative and historical measure, and implies that somehow it, too, should be sanctified and continued in perpetuity. While the tax system did establish the means for government to raise revenue, that very fact is the reason why it should be suspect. Our deficit and debt problems today are because of an unholy dualism – raise taxes and raise spending in a pernicious cycle until we are pushed to a Fiscal Cliff.

To select such ‘liberal’ achievements as women’s suffrage and the Civil Rights Act is myopic. They were important milestones in the country’s evolution, but should be seen within the context of the economic development that provided the dynamic environment into which women and minorities would be integrated.  It was the ‘conservative’ focus on business enterprise, the move towards liberalizing the regulatory and fiscal systems, and the extension of a muscular international presence which helped to erode the Communist threat and to initiate a truly global marketplace.  In other words, without a perennially accelerating economy into which women and minorities could enter, these ‘liberal’ achievements would have been for naught.

The Progressives of the early 20th century had an amazing run — direct elections of senators, regulation of monopolistic trusts, modernization of public schools, cleaning up the food supply — with only one major blooper: Prohibition.

Once again Egan is myopic.  There is no point in looking at history unless you look at subsequent outcomes.  Direct elections have been so distorted by the endless state primaries, the contentiousness of media-fueled, rancorous political debate, the ubiquity of conspiracy theories on the Internet, and a persistently unenlightened electorate, that many observers look wistfully back to the days of Alexander Hamilton, who had more conservative ideas on elections and backed the Electoral College:

Hamilton argued (Federalist Paper No. 68) that the "sense of the people", through the election of the electors to the college, should have a part of the process. The final say, however, lies with the electors, who Hamilton notes are:

"Men most capable of analyzing the qualities adapted to the station and acting under circumstances favorable to deliberation, and to a judicious combination of all the reasons and inducements which were proper to govern their choice."

Hamilton worried that corrupted individuals could potentially be elected president, particularly those more directly associated with a foreign state, or those without the capacity to run the country.

"Talents for low intrigue, and the little arts of popularity, may alone suffice to elevate a man to the first honors in a single State; but it will require other talents, and a different kind of merit, to establish him in the esteem and confidence of the whole Union, or of so considerable a portion of it as would be necessary to make him a successful candidate for the distinguished office of President of the United States"

The regulation of monopolistic trusts was important in the days of the Robber Barons, when there were no checks on rampant laissez-faire capitalism to create a more equal playing field; but a more conservative approach that allowed mergers and acquisitions, enabling corporations to improve efficiency, lower costs, and maximize shareholder profits, has been a mainstay of the modern Republic.  In other words, capitalism is a dynamic system which has always operated at the margins and which swings from a highly-regulated system to a freer one.  One historical moment does not reify or sanctify it; nor does it serve as a justifying moment for a political theory, ‘liberalism’.

The same argument can be made for “modernizing the public schools”, another moment in history serving to move education from the parochial to the large civic community; but there is nothing sacrosanct about public education, and it is conservatism which is now providing the impetus for needed change. “Cleaning up the food supply” is of course good; but it has been conservatives who have put the brake on overweening government interference in private transactions and who have assured a more realistic balance between needed technical oversight and free enterprise.

It is ironic that Egan ends with this statement – “All political moments are ephemeral. This one could vanish in the blink of a donkey’s eye” – because he is inadvertently supporting the argument I make here: there is no reason to crow about the historical ‘achievements’ of liberalism as epochal, transcendent events signifying an eternal, blessed movement.  American democracy is nothing if not a continuing dynamic interplay between liberal and conservative forces which ebb and flow over time, and which have immediate and unexpected long-term consequences.

Recipes: Curried Cauliflower And Pureed Turnips With Nutmeg

These are two unrelated but delicious dishes I have prepared in the last few days.  The curried cauliflower is authentic – I make it with individual Indian spices, and it tastes just like that made in the Oberoi Hotel in Delhi, the best curried cauliflower I have ever had.  Like other Indian recipes, it calls for a lot of spices, but once you have them, the dish is easy to prepare.

The pureed turnips have only one spice – nutmeg – but if you can buy fresh turnips (with the leaves and tops still attached, usually available at farmers’ markets), the taste is fragrant and sweet, with exactly the right touch of earthiness.

Curried Cauliflower

* 1 med. fresh cauliflower, steamed, cut into small florets and stem pieces

* 4-5 Tbsp. olive oil

* 5 whole cardamoms, crushed lightly

* 2 tsp. fenugreek seeds, crushed (they are very hard, so just give them a few poundings)

* 2 tsp. whole coriander seeds, crushed lightly

* 1 tsp. mustard seeds

* 1 tsp. poppy seeds (optional)

* 3 bay leaves

* 5 lg. cloves garlic, chopped

* 1 Tbsp. grated ginger

* 5 shakes hot pepper flakes

* 2 tsp. cumin powder

* 5 whole cloves

* 10 whole peppercorns

* 1 tsp. turmeric

* 2 Tbsp. fresh cilantro, chopped very lightly (as garnish)

- Steam the cauliflower, reserve

- Sauté the garlic in the olive oil until lightly browned (browned garlic gives this curry its special taste)

- Add the grated ginger and sauté with the garlic for about 5 minutes

- Add the rest of the spices and sauté over low-medium heat for about ten minutes.  You can tell when it is done when you begin to smell the spices, just starting to brown (Be careful not to scorch or blacken)

- Add the cauliflower pieces, mix well, add salt and pepper, sprinkle cilantro leaves over the top and serve

Pureed Turnips

* 5 medium-large fresh turnips (the fresher, the better), quartered

* 1/2 cup whole milk

* 1 Tbsp. unsalted European-style butter

* 1 tsp. freshly ground nutmeg

- Boil the turnips until soft but not mushy.  The fresher the turnips, the shorter the cooking time, so check after ten minutes

- Drain and put the turnips in a blender with the milk and butter with salt and pepper to taste

- Puree; then taste for nutmeg, salt, pepper, butter; and add more milk if the mixture is too thick.  You want the puree thick and not runny.

- Serve


Thursday, November 29, 2012

Non-Profit Failure

Sarika Bansal has written The Power of Failure in the New York Times (11.28.12), an article about failure in the non-profit sector.  It apparently is a very common occurrence:

Seven years ago, the consulting group Bridgespan presented details on the performance of several prestigious nonprofits. Nearly all of them had one thing in common — failure.  These organizations had a point at which they struggled financially, stalled on a project or experienced high rates of attrition.  “Everyone in the room had the same response, which was relief,” said Paul Schmitz, the chief executive of the nonprofit Public Allies.  “It was good to see that I wasn’t the only one struggling with these things.”

The article goes on to talk about how these many non-profits have dealt with failure, have adopted a stance of openness and transparency; and that this willingness to face, analyze, admit, and rectify faults is the key to reformation and further success.

I can attest to the failure of non-profits in the field of International Development, where I worked for over 40 years. Their record of achieving success, however measured, is pitiful.  The amount of taxpayer money that I saw wasted on inefficient, poorly-designed, poorly-managed, and irrelevant projects was staggering. Why did this happen?

First and foremost, American international development projects are funded by USAID, a branch of the State Department.  It is USAID, through a competitive bidding process, which awards project grants to non-profit (and for-profit) organizations.  These organizations in turn manage the projects in the field. The State Department has clear geopolitical objectives for the programming of its money.  The list of the top 20 countries receiving development assistance is not, as one might hope, the poorest countries of the world, but those with which the United States has or wishes to cultivate a positive political or economic relationship.  It is no surprise, then, that Iraq, Afghanistan, Israel, Egypt, West Bank/Gaza, Pakistan, Georgia and Jordan feature high on the list. Most of the remaining countries are on the list because of drug trafficking (Haiti), oil (Nigeria, Indonesia, Mozambique), strategic location (Ethiopia), historical ties (Liberia), or public relations (South Sudan).

Countries that receive this assistance know that it is an entitlement and that project performance has little to do with the flow of funds. In fact, since in the past USAID grants went exclusively to US contractors who in turn hired American employees and purchased American products, local governments saw very little of the money, and they used every ruse in the book to siphon off what they could.  Usually it was small potatoes – cheating on the number of ‘trainees’ in a workshop, cooking the books on travel and per diem, and running kickback schemes on any locally-purchased products, whether imported American or foreign-manufactured.  When the corruption became outrageous, USAID had to sit up and take notice, as it recently did in one of its biggest recipient countries and one of the most corrupt.  A lot of money went missing; but rather than blame country officials and risk their opprobrium, USAID went after the contractor and ruined it.

USAID is so skittish these days about ‘waste, fraud, and misuse’ it has focused its management attention almost exclusively on protecting taxpayer money, not on assuring the execution of successful projects which actually have an impact on the people whom it is supposed to help.

As noteworthy is the fact that USAID must report its success or failure to the US Congress; but Congress is only concerned with numbers.  Targets can be very narrowly defined.  First and foremost: did the contractor spend the money assigned; was the project completed on time; and was the money ‘protected’?  It is very easy for USAID to justify all this because of the entitlement nature of the business.  Congress couldn’t care less what development impact a project has in Pakistan, a country considered one of the most corrupt in the world but an ‘ally’ of the United States; but it would care if its money was diverted, misspent, or disappeared.

US contractors, whether non-profit or for-profit are savvy and hip to this game.  They know that if a project in a politically important country is failing, and they make the obligatory public assurances that they have identified the problem and “will do better in the next phase”, the money keeps flowing.  I have a good friend who worked for a well-known non-profit.  When one of his company’s projects kept reporting failure, all he had to do was to draw up a remedial plan for Phase II, III, ad infinitum, submit it to USAID, and the funding kept coming.

USAID is run by bureaucrats; and although there have been some talented professionals who have joined its ranks, most physicians, agronomists, engineers, and food scientists are working in the private sector or for the private companies which execute government contracts.  Given the sorry history of US diplomatic and strategic military misadventures over the past fifty years, it is no surprise that USAID project designers have really little idea about the realities of the countries to which they are allocating money.  Furthermore, these designers are beholden to interest groups and the resultant Congressional earmarks, so projects are not the choice of USAID but of these public or private lobbyists.  The former push agendas like breastfeeding; the latter, products like American-produced insecticides for treated bed nets.

Once the Request for Proposals (RFP) has been received by a company, a proposal is churned out to fit slavishly within the confines and constraints of the RFP.  While there is a type of project (Cooperative Agreement) which allows non-profits to be more flexible, most prospective contractors are keen on one thing and one thing only – winning.  They are less concerned with inserting their own ideas, even if they know that their proposals are more attuned to local conditions, history, and geography.  A company I worked for had a successful business model – bid on everything, for history showed that using this strategy, and given the wide variations in quality, the company should win one-third of the bids submitted.  Year after year they were proven right.

So project success or failure has always been tied to the original USAID project design, the entitlement nature of the foreign assistance, the focus on accounting and transparency, and on the production of numbers.  Everyone in the business learned very quickly how to gloss over true failures – e.g. few people learned anything in the hundreds of poorly-designed training courses administered – and produce a dizzying assortment of charts showing how many people were trained, where, for how long, etc.

Under these conditions few projects are ever failures.  The only failure for most companies is if they fail to win a bid; or worse, fail to win a follow-on.

In one of the greatest ironies of ‘development’, the World Bank is hosting a look-inward conference to assess its performance:

Some organizations encourage employees to talk about failure in office events that are closed to the public.  The World Bank is holding an internal FailFaire in December, which will be moderated by Jim Yong Kim, the bank’s president. 

Ironic because only one or two countries have ever been removed from the list of worthy recipients; and since the Bank has always behaved like the American mortgage industry – get the money out the door – its projects have performed no better than those of politically-driven unilateral donors like the US. In fact, it has become a joke that every so often the Bank will publish a mea culpa in which it admits failure, but since it knows what the failure is, it can fix it.  The Bank is not alone:

Others publish their failures for the world to see.  Engineers Without Borders Canada, which creates engineering solutions to international development problems, publishes a “failure report” every year alongside its annual report.  “I only let the best failures into the report,” said Ashley Good, its editor. The examples that are published, she said, show people who are “taking risks to be innovative.”

However, since Bank monies are loaned to countries, it has lost the protective intermediary of trusted European or American contractors. The countries, which know that the Bank has to lend and that few countries are ever cut off, are often low-performing.  Things were better in pre-McNamara days, when the Bank was the lender of last resort and loans were developed by countries for projects they really wanted and in whose performance they had a vested interest.

[Some non-profits], instead of trying to constantly dazzle funders, recommend developing long-term relationships that allow for failure and growth.  If a funder is invested in you, he said, he or she will share your sense of vulnerability if a project is not going as planned, and may sometimes help collectively problem solve.

Perhaps in social arenas where private donor funding prevails; but not in government contracting.  Given the above-mentioned concern about waste, fraud, etc., USAID develops and enforces ever more stringent bidding policies to be sure that not even a hint of favoritism or impropriety is suggested.  It would make good sense to develop a collaborative relationship between USAID and its most successful, high-performing contractors; for on this two-way street certainly more intelligent ideas would be available to improve project design, development, and implementation.  This, however, is impossible; and so the cycle of strict bidding on irrelevant projects which are never seriously evaluated continues.

Failure is written into USAID projects from the very beginning.

Wednesday, November 28, 2012

Embracing The Ordinary

Embracing the Ordinary is a book by Michael Foley (reviewed by Stuart Jefferies in the Guardian 8.18.12) about discovering, enjoying, and reveling in everyday life.  Not only waking up to smell the roses, but learning to appreciate real estate, traffic, and dogs.

Thirty years ago, Michael Foley had an epiphany. As he emerged from jury service, the street outside the court became "illuminated, transfigured, a portal to infinite being".

Everything became sublime, especially the menu of the café advertising "egg's, sausage's and tomato's". "Those misplaced apostrophes tore at my heart like orphan children, blessed like the first timid snowdrops of February, sparkled like a dusting of precious stones. I wanted to rush in and embrace the illiterate proprietor."

He then wandered up the street to an estate agent's office, inside which a middle-aged woman in a heavy cardigan was snuffling into a tissue and leaning over a two-bar fire, exposing the white roots at the parting of her dyed hair. "I wanted to go in, tell her that I understood her disappointments and difficulties and then help pay to have her roots done by renting a one-bedroom flat above a launderette."

A lot of writers have been fascinated with the mundane – James Joyce’s Ulysses and Proust’s In Search of Lost Time are perhaps the most famous. Neither writer was satisfied with giving the reader a slice of life, or even the occasional morsel as other writers have; instead they gave page after page of often impenetrable or sloggingly boring prose.  My favorite was Andy Warhol’s a, A Novel:

a, A Novel, Warhol's knowing response to James Joyce's Ulysses, was intended as an uninterrupted twenty-four hours in the life of Ondine, an actor who was famous mostly as a Factory fixture.  A taped conversation between Warhol and Ondine, the book was actually recorded over a few separate days, during a two-year period. The book is a verbatim printing of the typed manuscripts and contains every typo, abbreviation and inconsistency that the typists produced from the twenty-four tapes (each chapter is named for its respective tape and side, from '1/ 1' to '24/ 2'). Ondine's monologues and disjointed conversations are further fragmented by Warhol's insistence on maintaining a purity of the transcriptions.

Talk about thuddingly boring.  Even better (or worse) was Warhol’s movie Sleep where he filmed his friend sleeping for five hours and twenty minutes.  Warhol was spoofing Joyce, who was actually trying to recreate the life of Bloom and Molly with his acres of unconscious, spontaneous, associative musings; but in both Ulysses and a, A Novel, getting through the pages was work.

Proust is in another category altogether.  At least in Ulysses there is some dialogue and real-time exchange; and in all the blather that passes for conversation between Ondine and Warhol there are enough references to the underground life of New York, amphetamines, sex, and painting to make it somewhat interesting.  The only thing that most readers of In Search of Lost Time remember is the madeleines that open the mental sluices to produce floods of memory.

No one claims that Foley is another Proust.  At least Proust left it to readers to figure out what his writing was all about.  Foley has to moralize:

"The crucial thing is to start paying attention now." Like a budget Buddha, Foley counsels mindfulness, attending to the here and now rather than dreaming of the future or fetishizing the past. That way we can sacralize the profane, even empathize with estate agents with tragic hairdos. Or, as Foley puts it, "by the sweaters of Benetton I sat down and wept".

Easier said than done:

But how do we reach the portal to infinite being without the lottery of jury service? [Foley’s epiphany to the mundane happened as he walked out of jury duty] How elude the hellish maw opened up by break-out areas, corporate away-days and co-workers with signs on their desks reading "Situation worsening – please send chocolate!"?

I used to hate the ‘message’ pictures of women around the village well on the walls of my office; the schlocky African carvings, and the god-awful print wall hangings with solidarity slogans in five languages.  Rather than embrace them, I ignored them.  To somehow glorify these uber-bourgeois tchotchkes was never an option.  Why should I? I reserved my mini-moments for nice things – a rocky mountain of Skookum and Wellfleet oysters piled high in the window of Clyde’s, for example.

I think Foley is on to one thing, however – smelling the roses or enjoying a break-out session at a conference does break up the routine and slow time.  Every day that I make my bed, arrange the sheets, pillows, blankets, and bedcover in precisely the same way, even out the rumples and bunches, align the patterns, and tuck the stray bit of fray, I make another big X on the calendar.  If I walk the other way on my familiar Spring Valley walk, I see the houses, gardens, and SUVs from a different direction.  I notice more, remember more.  If my days are filled with lunches, outings, coffees, and movies they pass just as quickly as more routine ones, but I don’t seem to notice the time passing with such deadening regularity.

Foley recommends spiritual exercises, appropriating the French theorist Michel de Certeau who, in The Practice of Everyday Life, railed against those snobby morons who fail to realize that if the everyday is everything that is ignored by official forms of knowledge, its very invisibility gives the potential for freedom and even subversion.

Certeau advocated the idea of the ruse, reconfiguring everyday tasks for transgressive personal purposes. One night, Foley relates, he did just this when he sneaked out to a neighbor's skip with an old ironing board. He "felt not only 30 years younger but as lean, tough, resourceful and clandestine as a Viet Cong tunnel rat".

I am all for these ruses.  I used to stick large Post-Its in the Men’s Room at the World Bank, with the words “Is It True?” written in bold black letters.  A week later I would repeat the question, but add more nonsense: “Is It True? Check below”, where I would draw a Yes and a No box. I added detail and innuendo each successive Monday.   After a few weeks of this, I heard colleagues talking about them, wondering, of course, who did it; but also what did they mean?  Was the anonymous author after some bit of office intrigue? A veiled warning about another reorganization?

This somehow leavened the boredom of the job, made me more able to look at the lifeless photos of village women at the pump, and speed up the day without cancelling it out.

But Joyce and Proust, on whom Foley spends much time meditating in these pages, are not the only high priests of low life. He hails others: Alice Munro, Walker Evans, Edward Hopper, Georges Perec, David Foster Wallace, Jan Vermeer. He cites the advice of the poet Rainer Maria Rilke to a depressed man: "If your daily life seems poor do not blame it; blame yourself, tell yourself that you are not poet enough to call forth its riches."

The point of all this navel-gazing is not self-evident.  I had a good reason for shaking up my day and for creating ruses to slow the passage of time.  Foley wants only to observe objects and events of his mundane life for what? To feel more alive?  Sooner rather than later he will become very sick and tired, nauseated even, looking at wastebaskets and toilet bowls, and on that day he will stop embracing them and complain to Engineering to bloody clean them.

My hero is Nietzsche who had something to say about living in a mundane, repetitious world – follow your will, express it with all the force and power of your being, be a Superman.  Trample on whomever is in your path, clear the way to your goal, accelerate beyond good and evil, beyond pedestrian moral constraints, realize your full potential and validate your life.

Schopenhauer saw the world as meaningless and purposeless… He had failed to see the sense of joy and vitality that is achieved when the superior person faces the meaningless world and clear-sightedly imposes his own values on it. The superior person neither shrinks from the struggle of life, nor struggles blindly, but wills to live deliberately and consciously. Nietzsche calls this sense of joy and vitality accompanying the imposition of values on a meaningless world tragic optimism. It belies the "reality" that the world is not [Schopenhauer’s] Will to Existence, but Will to Power. (www.carroll.edu Friedrich Nietzsche)

For the legions of ordinary people who form part of Nietzsche’s ‘herd’ and who never rise above the perpetual sameness of life, embracing the mundane may be the only way to shake up the Grand Mechanism (Jan Kott on Shakespeare’s Histories) and catch a glimmer of what life is really like.  But thin gruel indeed compared to Nietzsche’s superman, Marlowe’s Tamburlaine, or Richard III.

Like most of us I am caught in the middle.  I aspire to be a Superman, to take great risks to achieve great things and I want to slow time.  Because I am neither a man of Pure Will nor a believer in the transformative power of the mundane, I must resort to ruses, find ways to trick time.  Very unsatisfactory and very unsettling.

Tuesday, November 27, 2012

Bad Behavior

David Brooks writes in the New York Times (11.27.12, How People Change) about bad behavior in children, and something called ‘The Crews Missile’, a letter one disappointed and disgruntled father sent to all his wayward children (published by one of Crews’s daughters):

“Dear All Three,” he wrote. “With last evening’s crop of whinges and tidings of more rotten news for which you seem to treat your mother like a cess-pit, I feel it is time to come off my perch.

“It is obvious that none of you has the faintest notion of the bitter disappointment each of you has in your own way dished out to us. We are seeing the miserable death throes of the fourth of your collective marriages at the same time we see the advent of a fifth.”

Then he turned to his grandchildren. “So we witness the introduction to this life of six beautiful children — soon to be seven — none of whose parents have had the maturity and sound judgment to make a reasonable fist at making essential threshold decisions. ... “

Crews has become famous in Britain for saying what was on most people’s minds – it is time to stop making excuses for the whining little ankle-biters who have become progressively bratty, self-absorbed, and incompetent adults; to stop accepting any of the blame, and put it squarely where it belongs – on the little buggers themselves.

While most of us have, at one time or another, wanted to ship our children to some Bornean penal colony, we have also realized that we share a lot of the responsibility for the way they turn out.  First, there is that niggling little matter of genetics.  Although it is true that little William could have inherited particularly nasty bits of DNA from his ne’er-do-well, layabout great-uncle Fosdick, most of his genetic rigging comes from the parents.  Second, children do spend a lot of time in the home with their mother and father, and this environment, for better or worse, cannot be ignored.  Third, their social environment outside the home – schools, sports groups, ballet classes – is largely determined by parents.

There are those child psychologists who are convinced that parents’ influence is not what it is cracked up to be, and that once children get out into the competitive, demanding world of the schoolyard, they are on their own; but this is debatable. Parents will always have a tremendous influence over their children for what they – the parents – do or don’t do.

So it is Crews who is doing the whingeing and whining, not his children.  When most reasonable people read his account of children gone bad, they look first at him.  After all, there wasn’t just one bad apple in the lot – the one with Great-Uncle Fosdick’s deformed DNA – but all his kids turned out wrong. What does that tell you?

Bringing up children, especially in the teenage years when they are most likely to fall off the rails, is a bit like being a field marshal in a major war.  You need to build an alliance of other parents, teachers, neighbors, and in-laws; choose your battles, losing skirmishes if that is what it takes to win the war; balance force with diplomacy and negotiation; and temper intimidation and threat with reward and satisfaction. 

Of course all this can go wrong and the Law of Unintended Consequences always seems to be in force.  A daughter growing up in a good home with parents in a good marriage might so desperately want what her parents had, that at age 37 she plunges into wedlock with another Great-Uncle Fosdick.

Then, of course, there is accident.  Desdemona might still be alive and Othello Duke of Venice if she hadn’t dropped that damned handkerchief!  An only slightly wobbly son might get sent to jail simply by being with a bad seed during a crime; and those prison years might turn him in all kinds of twisted directions.

Brooks turns to prevention – how to assure that children don’t turn out badly.  However, he offers platitudes and the very obvious:

Human behavior flows from hidden springs and calls for constant and crafty prodding more than blunt hectoring. The way to get someone out of a negative cascade is not with a ferocious e-mail trying to attack their bad behavior. It’s to go on offense and try to maximize some alternative good behavior. There’s a trove of research suggesting that it’s best to tackle negative behaviors obliquely, by redirecting attention toward different, positive ones.

It is pretty darn hard to address negative behaviors ‘obliquely’ when your son has trashed the basement with his doper friends while you were gone; and ‘redirecting attention towards different, positive ones’ is almost impossible even after your fury has died down.  The only way to be a good parent in troubled times is to read Thucydides and his history of the Peloponnesian War.  Have your battle plan ready, but be agile enough to change direction as the enemy adjusts his.  Assemble your field commanders and diplomats, marshal all your resources and reserves, and have at it.

I know that Brooks means to be more general and to offer advice about dealing with any obstinate, dangerous, or difficult behavior; and while the literature (and my 40 years in the field of Behavior Change) confirms his contention that the positive usually trumps the negative, child behavior begins in the distant past (Great Uncle Fosdick), is subjected to the final recombinant twists of parental genes, is conditioned by everything in the home and outside it, and can be affected by ‘outrageous fortune’ at every turn.  Being positive just doesn’t cut it.

Brooks ends his article with this:

It’s foolhardy to try to persuade people to see the profound errors of their ways in the hope that mental change will lead to behavioral change. Instead, try to change superficial behavior first and hope that, if they act differently, they’ll eventually think differently. Lure people toward success with the promise of admiration instead of trying to punish failure with criticism. Positive rewards are more powerful.

Mr. Brooks, I read and enjoy most of your articles, but puleeeeze…..

Monday, November 26, 2012

Our Paranoia–We Freak Out About Everything!

One of my close relatives was a worrier.  He boiled his chicken so that all the fat would be rendered and nothing but the tough, stringy meat was left.  He steamed all his vegetables so that no nutrients would be lost and refused to put anything on them like butter or olive oil.  He didn’t care that undercooked or hardly-cooked broccoli, cauliflower, or squash have no taste and eating them is not much different than grazing in a cow pasture.  He eschewed my grilled squash with caramelized garlic and onions in olive oil; my pureed cauliflower with parmesan cheese, nutmeg, and sour cream; and choked down my pasta with Italian sausage.

Every sink and adjacent counter smelled of disinfectant.  He bought a special de-odorizing, sanitizing vacuum cleaner that he used every other day on the carpets.  Bottles of Purell anti-microbial liquid were by every door handle in the house, both exterior and interior.  He installed air filters for every room, and spent thousands on a special industrial air cleaner for the furnace.

He worried about indoor air pollution and felt gagged by all the PCBs that were being released into the air from all the plastics and synthetic fibers.  He bagged himself up more tightly than a Saudi Arabian woman when he went out into the sun, went to the doctor’s every week to check out a new spot on his skin, a shaky feeling in the gut, or a bit of unsteadiness on the feet.

By today’s standards, writes Brooke Allen in the Wall Street Journal (11.24.12) in a review of Encyclopedia Paranoiaca, he was in the dark ages of worry.  His worries were trifling and pathetic compared to the perceived dangers lurking in the modern world:

[The authors write of the hazards of toilets] When flushing, you’d do well to keep the seat down because toilet water and all its contents are vaporized by the flushing action and settle upon everything in your bathroom—including your toothbrush. A lovely hot bath turns out to be… a foul stew of pathogens, with up to 100,000 bacteria per square inch. But showers are not much better—they distribute the scary Mycobacterium avium.

Dishwashers carry fungi on the rubber band in the door. Kitchen sinks: According to one scientist consulted by the authors, "if an alien came from space and studied bacteria counts in the typical home, he would probably conclude he should wash his hands in the toilet, and pee in your sink." Sponges: Their "damp, porous environment serves as a perfect breeding ground in which the microbes can flourish and multiply until there are literally billions of them." Cutting boards—let's not even go there.

(Photo from the Wall Street Journal article)

If this weren’t bad enough, the cures for these pathogenic horrors are just as bad:

"The ozone spewed out by [room air purifiers] is more hazardous than any substances they may remove." The overuse of antibacterial products is creating a new breed of "superbugs" resistant to the original agents and to antibiotics as well..Hand-drying machines are actually scary: In one study, "people who used a hot-air hand-drying machine to dry their hands had two to three times as many bacteria on their hands as they did before they washed them."

Some people, faced with the terrors of this hazardous, polluted, and threatening world, often turn to yoga and meditation for solace and refuge.  Nothing like a good centering to provide perspective on life and its living.  However, think again:

Even yoga and meditation are not unmixed blessings. The former activity can cause injuries and even debilitating strokes, while the latter has been known to produce a grisly array of symptoms including, according to one expert, "uncomfortable kinesthetic sensation, mild dissociation, feelings of guilt . . . and psychosis-like symptoms, grandiosity, elations, destructive behavior, and suicidal feelings."

Well then, tuck into a good, healthy meal.  Properly prepared it will be just the comfort food that you are looking for. Wrong again:

We hardly need to be told that Chicken McNuggets and Wendy's baconator triple are to be avoided, but there are problems with "healthy" alternatives, too: brown rice (arsenic), leafy green vegetables (food-borne illnesses), sprouts (salmonella) and soy (possibly carcinogenic).

Even pot-sitting – reading a good book while doing Number Two – is bad, for it can give you hemorrhoids. What is one to do? Well, relaxing a little might be a good place to start; and contemplating your mortality might be a good next step.  It really doesn’t matter if you’re 40, 50, or 70: the end is always in sight and coming up fast, so you might as well enjoy the rest of the ride. Or invent a plausible excuse, as I have done.  “How could I get sick from environmental pathogens”, I reason, “when I have lived and worked in the most befouled and pestilential environments on earth and survived?”  I walked through cities and towns of South Asia that had no plumbing, where raw sewage ran in the streets.  I ate meals covered with the very flies that had lighted on this waste.  I walked through narrow hallways of coughing, suppurating patients waiting at a health center.  I breathed dust that contained billions of germs and micro-parasites, and urban air with more toxic pollutants than 100 square miles of Los Angeles.

A friend of mine takes incredible risks with his money.  He takes very few precautions to secure his online accounts, loses his credit cards monthly, leaves vital financial statements lying on top of the trash, is cavalier about his passwords, and couldn’t care less about issues of privacy.  “Why”, I asked him, “are you so careless?”

“I am a smoker”, he replied in the most wonderful contortion of logic I have ever heard.  His chances of dying of lung cancer are so great, he reasoned, that he would certainly die before he ever had to suffer identity theft or depletion of his bank accounts.

This paranoia about germs is very American.  Most Americans who visited France a few years ago were appalled that baguettes were sold with no more than a flimsy little piece of paper wrapped around the middle – no sanitized plastic bag to protect them from the noxious fumes of the outside world.  Iron skillets were encrusted, nasty-looking, caveman-blunt instruments which were never washed (ruins the seasoning).  We are amazed at how cavalier people in other cultures are about germs when they could do something about it! That’s what’s really at stake here.  We Americans truly believe that we can cheat death; that there is a solution for every problem; and that if we are careful like my relative we can live well into our 100s.  It is that constant search for practical solutions which keeps our noses in the microbial soup.  With our heads down we cannot afford to look up, let alone smell the roses.

I know a woman who comes from a family very much like my relative’s.  She is practical, problem-solving, and unremittingly concerned with better bathroom fixtures, remodeling a perfectly good sink, replacing curtains with more resistant fabric, improving drainage and circulation, selecting more efficient traffic routes, saving a few cents on every item, on and on.  She is a good American.  She works at solving the little things, and her consumer behavior contributes far more to the national economy than mine; but her incessant attentiveness to practical detail is just as bad as a paranoiac terror of germs.  It fills up too much space within an ever-narrowing and accelerating trajectory to the end of the line.

I personally will take my chances that the roof won’t leak if I don’t replace it; that I won’t get the squirts if I eat a rare hamburger; and that I can squeeze another few thousand miles out of my car.  “What? Me worry?”. 

Sunday, November 25, 2012

Living Life To The Full

Larry Hagman died last week, and although I never watched Dallas I knew that it was an American cultural icon as popular abroad as it was in the United States.  Enough of the cachet of the program must have filtered through my indifference, because on my first trip to Dallas a few years ago, I was expecting something glitzy, oversized, bigger-than-life, and dramatic.  Of course I got none of that – only a big, hot High Mass wedding and a country club reception – but I have to admit I was somewhat disappointed.  If even for a moment we want things to be as we imagine them.

In a very funny piece in the Guardian (11.15.12), Carole Cadwalladr has written about regrets – we all have them, but Larry Hagman apparently did not, for his family released a statement saying he had “lived life to the full”.  Although the family provided no specifics, Cadwalladr fills in the blanks:

Four bottles of champagne a day, several experiences on LSD in which he said he could read people's minds and felt the living breath of the universe. And a habit of popping down to the shops wearing a gorilla suit.

As one might imagine, all this had its consequences:

True, the four-bottles-a-day routine led to a diagnosis of cirrhosis and a subsequent liver transplant, but even that doesn't seem to have unduly held him back. He told a journalist that after his death, he wanted his remains to be "spread over a field and have marijuana and wheat planted and to harvest it in a couple of years and then have a big marijuana cake". It would be enough, he hoped, for 200-300 people to "eat a little of Larry".

This isn’t quite so wacky.  Ted Williams had himself frozen until a cure for what ailed him could be found; and hundreds of Hollywood stars have had their ashes strewn in unusual places.  Jill Ireland, Charles Bronson’s late wife, had her ashes made into a walking stick for him; and Joan Rivers scattered some of her husband’s ashes on the Late Show with David Letterman.

All of which prompted Cadwalladr to reflect on the subject of human regret.  Most of us, unlike Larry Hagman, regret something:

a survey carried out by the British Heart Foundation into the nation's regrets and unfulfilled desires. Nobody, sadly, said that they regretted not spending enough time wearing a gorilla suit taking LSD.

Instead, the regrets varied from the mundane (not having travelled more) to the delusional: 7% of men believe they could have been professional sportsmen if only they'd trained harder…And the 11% who "regret not sleeping with more women".

Seven percent of the 25 million men in Britain, calculates Cadwalladr, is nearly two million – all of whom opine that they could have been David Beckham if only they had tried harder.  I am surprised that only 11 percent of men regret not sleeping with more women.  Given that men think about sex all day, every day, usually about women other than their wives, one would think the regret levels would be higher.

In any case, further research has found that regrets come at just the stage in life when we can still do something about them:

What emerged [from the primate study] was that apes suffer a similar lull in happiness in their middle-age years as humans. Study after study has shown that we're happy when we're young, happy when we're old and pathetic, moping wrecks when we're middle-aged, convinced that David Beckham is living the life we should have had.

Perhaps middle-aged regrets are an evolutionary mechanism which, if caught early enough, will motivate us to bigger and better things in life. Or, says Cadwalladr:

Regret makes us smarter. More human. More understanding of human frailty. With a greater insight into the role that luck plays in all our lives.

Regret in this nostalgic, reminiscent sense is very different from guilty regret.  “I really wish I had not left that incriminating email on the screen”; or “I should never have had that extra gin fizz”.  It is different from frustrated regret – “I should have spoken up…done him in…humiliated the little wanker….”; and it is nothing like the acid reflux-producing regret at not having made the right career moves and getting passed over; or not disciplining an errant child and having to bail him out of jail.  No, nostalgic regret is remembering for 40 years the girl on a park bench who looked at you through the March mist, alone and interested, in the most romantic of scenes, and then remembering that you did nothing. 

Having no regrets is what we all want.  Living life to its fullest is the only way to face the end with any kind of equanimity.  I have never understood people who, of sound mind and body, kick back and do little in their retirement years.  However many years I might have remaining to me, they will be filled to the brim with new ideas, new activities, new challenges.  I have had few regrets in my life (other than not owning a Ferrari) and do not want to start creating them now.  Maybe I should re-write my will and add the proviso that my ashes should be scattered to the 200 MPH winds whistling past a screaming Testarossa.

Sibling Rivalry

“Show me a sibling who experiences no sibling rivalry, and I’ll show you an only child”, says George Colt in a hilarious account (New York Times 11.25.12) of sibling rivalry in his and more famous households:

Growing up in a crowded apartment on East 93rd Street, Chico, Harpo, Groucho and Gummo Marx shared a bed relatively peaceably, but not a meal. “There was generally some kind of a brawl at the dinner table over who would get what,” said Groucho, who recalled reaching for the last roll on the plate only to see a cleaver, wielded by the normally equable Harpo, slam down within an inch of his hand.

The same drama was played out in the James Joyce and John F. Kennedy households and, researchers inform us, in all societies. Child psychologist David Levy, who did the first scientific studies of sibling rivalry in the United States in 1930, expanded his study to the Kekchi Indians of Guatemala and concluded that regardless of age, gender, birth order, or cultural background, sibling rivalry is a fact of family life.

Levy’s original study in the United States showed how bloody-minded sibling rivalry is:

Levy, who gave his patients celluloid dolls that represented their parents and younger siblings and asked them what they felt when they saw the baby brother or sister doll nursing at its mother’s breast. There ensued scenes of sibling carnage to rival anything in the Old Testament. Among the responses were:  “dropping,” “shooting,” “throwing,” “slapping,” “hitting with stick,” “hammering,” “tearing apart,” “scattering parts,” “biting,” “crushing with fingers,” “crushing with feet,” “crushing with truck” and “piercing (with screw driver).”

My children were born 18 months apart and, despite the age difference, were evenly matched.  My son was always bigger and stronger; my daughter, like most girls, matured more quickly and was more socially savvy.  He could physically attack and pummel her.  She could outwit and trick him.  As they grew older she learned how to wound and punish with a few well-timed, sharp verbal barbs about his hair or clothes. She kept a lot of poisoned arrows in her quiver, and I told her it was to her credit that, in the heat of sibling battle, she didn’t let many fly. “Sometimes he deserves it”, she said.

Even when they were older the physical battles continued.  One day my daughter was so incensed with my son that she punched her hand through a window to get at him.  When she was an adult I mentioned this incident to her and said, “You and your brother didn’t really want to hurt each other did you?”  I was thinking of dogs who usually just pissed on fire hydrants, raised their hackles, bared their teeth, growled, and looked menacing until one backed off.  “Oh yes we did”, she said. “We wanted to kill each other.”

Colt, the author of the Times piece, remembers how his parents did everything they could to make things exactly equal so that he and his brothers would not fight:

Our rivalry played out most nakedly at the dinner table. Who got the largest hamburger? Who finished eating fast enough to get seconds before the food ran out? Who got the biggest slice of pie? Attempting to forestall quarrels, our mother cut portions so nearly identical it would have taken a micrometer to tell them apart. But in vain. Whether lunging for the last hot dog, filching an extra piece of crispy skin from the roast chicken or merely noting who had gotten the most cherries in his fruit cocktail, each of us struggled, constantly, to get our fair share — or, preferably, a lot more.

I remember doing the same thing, pouring the juice slowly and deliberately in two glasses as my children watched, eyeball to juice level, inspecting for any infraction, any favoritism.  I couldn’t resist the teaching moment, however, and once poured the juice in two different glasses, one tall and thin, the other short and fat, but with the same volume.  “Hey”, yelled my son. “She’s getting more than me!”  He had fallen into my trap.  “No, she didn’t”, I replied and proceeded to complete the experiment.  I had one-upped him.

Apparently it is not unusual that sibling rivalry centers around food:

Food’s central role in that rivalry is, in part, a matter of biology. [Sibling food fights are] triggered by the same instinct that drives piglets to fight for position nearest the sow’s head, where the nipples deliver the most milk... Feeding time in the Marx kitchen was tame compared with feeding time in any blue-footed booby nest, where the eldest chick often pecks the youngest to death in order to increase its own chances of survival... Spotted hyenas, for example, routinely attack their younger sibling within minutes of its birth, sinking their teeth into its shoulder blades and shaking, in order to minimize competition for their mother’s milk.

I once asked my children why they fought.  They fought over nothing (“He’s sitting too close to me….She kicked me….He threw a potato chip at me…”) but they fought all the time.  My daughter hesitated for a brief moment and then replied, “You know, I don’t really remember.  I just hated him”.  This didn’t surprise me.  I had read enough to understand that it was not about them, it was about us, the parents.  Yet we prided ourselves on what we thought was our even-handed treatment of our children. We never gave either one reason to feel slighted. It obviously was about something much more deeply-rooted:

The word “rival” is derived from the Latin “rivalis,” meaning “using the same stream as another.” In pre-Christian times, rivals were people or tribes who fought over water from the same river. “In our terms,” the psychoanalyst Peter Neubauer once observed, “the river is the mother who supplies our basic needs, and the children compete for access to her.”

Most parents will do anything to stop the fighting, and there is nothing more annoying than the constant bickering, squabbling, pushing, grabbing, and kicking.  It can drive even the most patient parent bonkers.  One day we all were driving up to Connecticut from DC to see my parents.  We had an old car with no air conditioning.  The ride was hot and noisy, and I am sure the children were just as pissed at all of it as we were; but the difference was that they took their frustration and prickly heat out on each other.  Although we had separated the car seats about as far as they could go, the little animals could still get at each other. 

Because they were strapped in they could only land glancing blows and deflected kicks, so they started spitting and throwing whatever they could find at each other.  Bits of sandwich corners, broken Oreo cookies, and banana peels.  We broke our strict No Junk Food household rules when it came to car travel.  We left home early in the morning, around 4am when the bambini were still groggy and half-asleep.  We knew that we could get a good hour-and-a-half, maybe even two hours of peace and quiet while they slept.

When they woke up, we fed them junk food.  This breach in routine and rules threw them for a minute, but they quickly grabbed and gobbled the chips, cookies, and candies we threw into the cage.  They gorged themselves insensate, and dropped off to sleep.  We had gained another hour and a half.   Three more to go.  I suspect that once all the sugar was digested and got into the bloodstream, it hit their little brains like mainlined coke.  Their eruptions became volcanic, their scrambling attempts to punch, hurt, and maim were Olympic.  They yelled, cried, spat, and flailed.  I became as angry as they and started yelling and screaming at them.  The small car was a circus gone bad.  There was only one thing I could do.  I wrenched the steering wheel and swerved two lanes to the shoulder of the Interstate and screeched to a halt, throwing all the Oreos, banana peels, and Doritos forward in a shower of fast food debris.  Silence.  It worked.

It worked all right, but only for about twenty minutes.  After that, tired and dispirited, with no more energy or will to battle, all we could do was let them have at it.  My mother always wondered why we arrived such wrecks.

Colt reports that some parents actually encourage sibling rivalry to stimulate competition:

In some homes, parents organize the competition; to encourage Darwinian resilience among his elder four sons, Samuel Marx set out three sweet rolls each morning; as soon as a brother had wolfed down his breakfast, he was permitted to grab a roll, leaving the slowest eater empty-handed.

Mr. Marx must have had domestic help or had an obedient and accepting wife who would deal with the aftermath, because no parent in their right mind would ever bring on cyclones of sibling rage and antagonism.

Our niños now are the best of friends, and I like to think that the herculean struggles of childhood were somehow necessary for them to establish a more mature, respectful, and loving relationship. My wife and I still try to be even-handed, not because we don’t want to stir up hornets’ nests or provoke a return to some deeply-buried, residual infantile rivalries, but because it seems the right thing to do for offspring who have come into their own.  From what I can tell, the sibling rivalry has disappeared; but then again I have heard of siblings who lived their whole lives in loving harmony, only to come out of their corners punching to get the biggest piece of the inheritance pie. 

When my older brother, Harry, traveling in India after college, was stricken with a mysterious gastrointestinal disease and lost 30 pounds, he suspects the culprit was a half-eaten Popsicle he found on the street and instinctively gobbled up before someone else got it.

If this story is true, then Mr. Colt may have some brutal fights coming once his parents are six feet under and the will is read to the surviving brothers. 

Recipes–Easy Chicken Biryani

You will find many recipes for Chicken Biryani on the Internet, and this one diverges somewhat from the classics.  What I like about it, however, is the sweetness. I adjusted the spices (adding garam masala, more cinnamon and cloves, and a touch of sweet mango chutney) so that it would move out of the familiar curry taste spectrum to something still very Indian/Pakistani, but a lot more interesting. I deliberately left out the cumin powder usually added to the more traditional recipes; cumin is a very strong spice, and it tends to take over and mask the subtleties of the other, sweeter spices.

The garam masala is the key spice in the dish. I also use coconut milk mixed with water to cook the rice. The recipe below is for four servings as a side dish.  If you would like to make this a main dish, then double everything.

Don’t be dismayed by the number of spices.  They all add to the taste of a really delicious dish.  If you don’t have garam masala, you can easily get it at any Indian store, and while you’re there, you can pick up the more exotic spices like fenugreek.  All the others are available at Whole Foods.

Chicken Biryani (4 servings)

* 1 lg. boneless chicken breast (boiled, cut up into 1” pieces)

* 1 cup Jasmine, Basmati, or other long-grained rice

* 4-5 Tbsp. olive oil

* 5-6 lg. cloves garlic, loosely chopped

* 1 medium onion, coarsely chopped

* 5-6 whole cardamoms, lightly crushed

* 2 tsp. fenugreek seeds, crushed well (pieces of fenugreek can be bitter)

* 2 tsp. coriander seeds, lightly crushed

* 10 whole cloves

* 10-12 whole peppercorns

* 1/2 tsp. powdered cinnamon

* 1/2 tsp. powdered ginger

* 3-4 shakes hot pepper flakes

* 1 tsp. turmeric

* 1 Tbsp. sweet mango chutney

* 1 1/2 Tbsp. garam masala

* 15 raisins (approx.)

* 1 cup coconut milk

- Cook the rice in 1 cup water and 1 cup coconut milk, with a dash of salt

- Boil the chicken breast until done (about 15 minutes). Test after ten minutes or so to be sure that it is done, but not overcooked.  You can also bake the chicken, but I find that boiling it is easy and the spices in the sauce will give all the flavor you need.  When cool, chop.

- LET THE RICE COOL COMPLETELY.  The best thing to do is to cook the rice in the morning for an evening meal or the night before.  Warm or hot rice mixed with the sauce will be gummy and gluey.

- Sauté the garlic and onion in the olive oil for about 5 minutes until done – the garlic should not brown

- Add the rest of the spices (except the chutney), and sauté over low heat for about 15 minutes, stirring frequently.  Add more oil if the mixture appears too dry, but not too much.  For the final product the rice should not be wet or oily, but fluffy.

- Add the chopped chicken and chutney and mix well.

- Taste and adjust the spices.  At this point, you can add anything except more garlic, coriander, cardamom, or fenugreek seeds.

- Add the cool rice and the raisins and mix well.  Again taste, warm the mixture over low heat for about 20 minutes and serve.

Saturday, November 24, 2012

Big Data–The Hottest New Idea of 2012

Billions of bits, bytes, megabytes, terabytes, and godzillabytes are being generated and stored every day.  Think of the millions of surveillance cameras, weather sensors, traffic monitors, Geiger counters, and seismic instruments in the United States alone.  Then add the unimaginable number of emails, tweets, and Facebook posts transmitted and received every day. Then add all the text messages, Internet searches, and GPS systems, and the data collected by Toyota, Honda, and Ford from the onboard computer of every vehicle brought in for service.  Then add the staggering amount of data captured and recorded by Amazon or Google from Internet searchers and online consumers.  The amount of data being collected and stored today is almost beyond imagining.

The problem is how to use it.  Such huge databases allow for more sophisticated analysis of patterns and trends than smaller ones.  Data drawn from one large database can produce more reliable and useful correlations than the same amount of data collected from individual sources. The following example is illustrative:

Tobias Preis et al. used Google Trends data to demonstrate that Internet users from countries with a higher per capita gross domestic product (GDP) are more likely to search for information about the future than information about the past. The findings suggest there may be a link between online behavior and real-world economic indicators.

The results hint that there may potentially be a relationship between the economic success of a country and the information-seeking behavior of its citizens captured in big data. (Wikipedia)

In other words, Preis and his colleagues had access to Google search data from countries around the world and controlled for just one factor – future-oriented inquiries.  These could be anything from economic and financial predictions, to the likelihood of scientific breakthroughs, to the likely impact of global warming.  The data showed either that economic progress encouraged future-oriented searches, or that the tendency toward such searches indicated a greater degree of optimism and entrepreneurial spirit than looking only backward.  In any case, business planners were able to use the data to build investment models – better to invest in countries with future-oriented populations than history-oriented ones.
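For the curious, the measure behind this finding is, at bottom, simple arithmetic: a population's searches mentioning the coming year compared with its searches mentioning the year just past. Here is a toy sketch of that idea in Python – the function name and all the numbers are mine, invented purely for illustration, not taken from the study:

```python
# Toy sketch of a 'future orientation' measure: the ratio of search
# volume mentioning the coming year to volume mentioning the past year.
# All figures below are invented for illustration.

def future_orientation_index(searches_for_next_year, searches_for_last_year):
    """A ratio above 1 suggests a population that looks forward more than back."""
    return searches_for_next_year / searches_for_last_year

# Hypothetical search counts for two countries
print(future_orientation_index(1200, 1000))  # forward-looking country: 1.2
print(future_orientation_index(800, 1000))   # backward-looking country: 0.8
```

Computed per country on real search data, a ratio like this is the sort of single number that can then be lined up against per capita GDP.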

Google’s Gmail reviews and records all email traffic and through application of sophisticated algorithms can discern trends both in individual user preferences (e.g. someone who often refers to Persian carpets might be interested in buying one) and in large population groups (an increase in the words ‘cough’ and ‘fever’ in Midwestern states might offer a clue to how the flu epidemic is progressing).  Both retailers and the CDC can use this information to generate sales or to set up flu treatment centers ahead of the epidemic.

The analysis of big meteorological data can improve both forecasting and the detection of long-term trends.  In the days of the clipper ships, data was collected from thermometers and instruments that measured ocean temperature, currents, and wind direction and velocity, and was used to guide future navigation.  Today similar data is collected from billions of micro-sensors distributed throughout the planet and fed into supercomputers or thousands of parallel-processing computers to discern trends and correlate them with related meteorological events.

On a smaller scale, the US Department of Health collects data on every Medicare patient in the US, and with the advent of electronic medical records and the consolidation of much patient data through Obamacare, trends and correlations can be determined which will help guide the channeling of both public and private resources.  Already much of the received wisdom about breast and prostate cancer screening has been thrown into doubt by the mining of big data.  Analysis has shown that there is little correlation between repeated mammography or PSA screening and cancer survival rates.  The increased number of genetic screening tests to isolate genes for various types of disease can generate data which can be correlated with outcomes – i.e. how many individuals with a suspect gene actually develop the disease with which it is associated?

At the corporate level, industries can track everything about employee behavior, from hours worked to productivity, from sick days to performance.  In other words, a large company with enough data from its workforce can isolate the determinants of productivity and high performance.  A large corporation with thousands of employees working in hundreds of sites can correlate efficiency and productivity with flextime, office configuration, window access, or the number and qualifications of employees. According to McKinsey, there are five broad ways in which using big data can create value:

First, big data can unlock significant value by making information transparent and usable at much higher frequency.

Second, as organizations create and store more transactional data in digital form, they can collect more accurate and detailed performance information on everything from product inventories to sick days, and therefore expose variability and boost performance.

Third, big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.

Fourth, sophisticated analytics can substantially improve decision-making. Finally, big data can be used to improve the development of the next generation of products and services. For instance, manufacturers are using data obtained from sensors embedded in products to create innovative after-sales service offerings such as proactive maintenance (preventive measures that take place before a failure occurs or is even noticed). McKinsey Global Institute: “Big Data”, Manyika et al.

There are other emerging uses of big data.  One of them is crowdsourcing (http://www.uncleguidosfacts.com/2012/11/crowdsourcing-and-predictive-markets.html), an innovative use of large, often unrelated groups of people to make predictions, to offer solutions to complex problems, or to suggest new and innovative ideas.  The simplest example of crowdsourcing is to ask a large number of random people to bet on something – the number of gumballs in a jar or the winner of a presidential election.

When researchers gave people the chance to guess the number of gumballs in the jar, the average estimate was remarkably near the correct answer.  The random guessers did far better than the expert mathematicians.  In the 19th century, when betting on political campaigns was still legal, bettors predicted presidential elections more accurately than the pundits did.  The principle is that people who bet on a particular issue or event have at least some analytical reason for their choice.  Although their reasons may be subjective, astrological, or strictly mathematical, the average always comes out better than considered expert opinion.
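The gumball principle is easy to demonstrate.  In this sketch the true count and the guessers’ error range are made up, but the effect – individual guesses scatter widely while their average lands near the truth – is exactly the wisdom-of-crowds claim:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible
TRUE_COUNT = 850  # hypothetical number of gumballs in the jar

# Each guesser is individually noisy: off by as much as 50 percent either way.
guesses = [TRUE_COUNT * random.uniform(0.5, 1.5) for _ in range(1000)]

# Individual guesses range from roughly 425 to 1275, yet the mean hugs 850.
crowd_estimate = sum(guesses) / len(guesses)
print(round(crowd_estimate))
```

The averaging works because the individual errors are independent and roughly symmetric; they largely cancel, which is why no one guesser needs to be right for the crowd to be.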

Highly complex problems have been broken down into component parts and farmed out – crowdsourced – to individuals or teams who were not chosen but who volunteered to solve the problem in return for a reward if they succeeded.  This is a type of human parallel computing on a big data scale.

Netflix, in a well-known enterprise, offered a million dollars to anyone who could substantially improve its movie-recommendation algorithm.  Although the company maintained its own research teams, it chose to crowdsource the problem.  The effort was successful.

Big data and crowdsourcing are not simply new ways to mine and use data.  They represent seismic shifts in intellectual inquiry.  The individual expert, scientist, pundit, or academician is becoming increasingly irrelevant or peripheral.  If masses of unrelated individuals can arrive at better conclusions than the experts can, every time, why bother with them?

Big data is a truly revolutionary phenomenon.  The amount of data generated and collected increases geometrically every week.  Software programs that organize, exploit, and correlate these data in increasingly sophisticated ways are being developed at almost the same rate.  The two phenomena fit perfectly.  Our world will become increasingly knowable not through reflection and speculation but through hard, objective data.

Ironically, I have also written about the invasions of privacy that accompany this generation, mining, and use of data.  Although big data can be of great use to science, business, and government, the nearly universal and constant surveillance of our every keystroke should give shivers to even the most committed data freak.  I have written extensively about this (http://www.uncleguidosfacts.com/2012/01/invasion-of-privacywere-all-at-fault.html and search words ‘invasion of privacy’) but have given up.  Obviously most of us perceive more benefit than harm in our computer cookies; willingly give up privacy rights in the interests of anti-terrorism and crime-fighting; and are very happy that Amazon can recommend books and movies that we will like.

In any case, the great tsunami of big data has already inundated us, and it can only increase in size and importance.  I think it is a good thing.

Friday, November 23, 2012

Men As Cooks–Do We Take It All Too Seriously?

I have been interested in food as far back as I can remember.  I cared less about seeing my aunts and more about the cheesy, garlicky, hot, bubbling, sausage lasagna Aunty Angie made; or the antipasto palette of capicola, mortadella, salami, and wizened oil cured olives arranged by Aunty Ona or her corn fritters, ham pies and rich, succulent eggplant parmesan.  The only reason that I didn’t kick and scream more over the enforced visits to my father’s sisters in New Haven is because they could cook.  The minute we walked into the old Wooster Square tenement, I could smell the artichokes, stuffed with parmesan, parsley, and garlic, drenched in olive oil, and roasted until the tips were crispy and the hearts covered with melted cheese.

I knew instinctively that my grandmother’s all-day spaghetti sauce was the real thing, and that the watered-down, garlic-free, un-oiled version that my mother cooked was a pitiful imitation.  I learned a few things from my mother, who was a decent cook, but my own cooking had little to do with hers – except for Christmas Eve when, in deference to my much more traditional Italian father, she made a passable try at the sette pesci – grilled eels, squid in fiery hot tomato sauce, spaghetti with anchovies, and baccalà.  I loved those fishy things and made comparable dishes any time of year.

I did not cook during our almost five years in India, but I learned a lot particularly about vegetarian cooking.  Although much of Indian food was rendered almost inedible because of the stale oil and blazing chilies, the best cuisine was remarkable.  I especially liked Gujarati thali, where rice and dal are accompanied by an array of side dishes.  All Gujarati thali had to have foods appealing to all the tastes – sweet, sour, bitter, and salt – and they were all both complementary and contrasting.  I also learned about tandoori, creamy Punjabi curries, wickedly hot South Indian shrimp, and Bengali seafood fried in mustard oil. 

I did a lot of cooking in South America.  I learned how to make black beans with bacon and coriander, fried plantains, and spicy steamed mussels.  I ate the best Brazilian churrasco misto, a mixed grill with every possible cut from every possible meat.  I loved the Argentine parillada, a different type of mixed grill with innards, testicles, and udders.  I learned that everything is good to eat; and I only demurred when it came to insects.  On a trip to Kinshasa during an outing to the neighborhoods to eat, drink, and dance I noticed in a simmering pot of tomato sauce what looked like caterpillars.  Couldn’t possibly be, I thought, but they were.  “Try them”, said my hosts.  “They are a national dish”. 

After ten years of living overseas, I returned to the US and had a family.  I was always the cook.  I am not sure how it happened; but it was one of those felicitous arrangements where I did what I liked – to cook – and my wife did the rest.  Not a good arrangement, some said, with my wife left with the bills and the finances; but she never complained. Every night I cooked something different, unusual, and tasty. I remember how when I told my mother that I liked to cook, she responded “Not if you had to do it every day”; but I was different from her and never found cooking laborious or boring.  It was something I looked forward to and found myself planning meals in the morning and preparing them in the evening.

My children, now adults, ate everything I prepared and never balked at anchovies, squid, brains, liver, kidneys, or sweetbreads like their friends.  I could make the Indian food as spicy as I wanted and never got screams of pain.  No one lurched for the water between bites.  My son became a vegetarian for four or five years, and I and the whole family accommodated the change.  What could be more vegetarian than Neapolitan spaghetti with peas, or spaghettini aglio, olio e peperoncino – a marvelously smooth and sweet dish of caramelized garlic, hot pepper flakes, and extra virgin olive oil – both dishes I had always made?  The same with my Indian vegetarian dishes.  I could make dal ten different ways – different lentils, different spices, different ingredients. I knew how to make black beans, red beans, navy beans, and kidney beans, all in different rich, spicy sauces.

I finally got around to writing a cookbook and putting it online (www.uncleguidosfacts.com search word ‘Recipes’).  I hardly ever refer to it because I still make something different on most days – perhaps not radically different, but slightly different.  A new spice, perhaps, a different kind of fish.

The point of all this is that I love cooking and have never taken it too seriously.  In an article in the Telegraph (11.23.12) Max Davidson wonders if we men chefs are taking ourselves a bit too seriously:

'And what’s your signature dish?” friends ask, with the hint of a sneer, when I tell them that I share the cooking 50-50 with my partner.

“I’ve got three,” I say proudly, pausing from my exertions at the stove to take questions from the audience. “Chicken and tarragon pie with roast Mediterranean vegetables. Canarian garlic soup with paprika and lardons…”

The third is ‘exploding eggs’, a modest mix of eggs, ham, and tomatoes in the microwave, chosen not so much for its uniqueness as for its winking, self-deprecatory proclamation: “I am not food-obsessed.”

That’s what men chefs have become, according to Davidson.  We take ourselves far too seriously.  It is one thing to enjoy cooking:

For me, as for millions of men, cooking creatively, not just in a spirit of henpecked resentment, has become one of life’s great pleasures. Nine in ten men now cook “regularly”, according to a new poll, with the typical man spending more than 11 hours a week in the kitchen, not all of them looking for beer in the fridge. And a thumping 56 per cent of us regard ourselves as “adventurous” cooks, more likely to experiment than our partners.

It is another thing entirely to transform cooking into something more than it is.  For many men it is a signal that they have signed on to the feminist revolution.  See, I cook, and I still have my balls intact. For others it is the focal point of the radiating circles of PC goodness – locally-produced food, small farmers, organic produce, environment-respectful water use, high labor standards, and health and wellness.  For others it is a creative enterprise within easy reach.  While we may not be able to draw, paint, dance, or play the guitar, we can cook. 

In certain hipster communities food has become an important social glue.  By simply subscribing to the good food ethos, you automatically subscribe to all the liberal, PC awareness movements that surround it.  Food has become a code for a lot more.

There is a whole other aspect of foodie culture that I have overlooked.  It is the competitive nature of male cooking.  Many men are not in it for critical acclaim, but to WIN! Davidson writes about TV cooking reality shows, cook-offs, and no-holds-barred food fights:

These shows see men competing with women on equal terms, sometimes better than equal terms. The final of the last series of The Great British Bake Off was an all-male affair, as the women’s soufflés and fondants failed to rise. The competitive element was shamelessly milked by the presenters…

Why are men so obsessed with winning? This is cookery, for heaven’s sake, the heating up of food, not the 100 meters Olympics final. If a husband doing his part in the kitchen is admirable, a husband out to show he is a better cook than his wife is obnoxious, a recipe for an unhappy marriage.

Or men just compete against each other:

In its all-out quest for a winner – someone who can prepare a perfectly presented plate of food against the clock, using a frankly ridiculous number of ingredients – it takes half the fun out of cooking.

Whatever it is, this obsession with cooking illustrates how wealthy and well-off many Americans are.  Despite a severe recession we cooked, created architectural food-palaces of pork belly, foie gras and coulis, and invented the best possible wine pairings.

The foodie craze will pass.  Already San Francisco hipsters are shunning the baroque and the overblown and returning to their Alice Waters roots – simplicity; good, fresh, local ingredients; and a no-nonsense presentation signaling that it is the food that matters. I am sure that just as PBR (Pabst Blue Ribbon), a tasteless workman’s brew, became popular because of its irony, so will hot dogs, hamburgers, and macaroni and cheese.  Let’s see if the hipsters will just eat this good old American chow or will talk endlessly about it.

Fiction Or Non-Fiction?

I am a lover of fiction, and after many years immersed in history, political geography, economics, and political science, I have returned to Shakespeare.  I was disappointed, after travelling in over 50 countries and studying why they were the way they were and what their future trajectories might be, to come to only the most predictable answers.  People, communities, and nations all seemed to be marching to the same drummer, who beat out rhythms of accession, succession, expansion, defense, conquest, and subjugation in repeating cycles.  How could it be that all communities – from aboriginal tribes hidden away from civilization since their particular dawn of history to the most sophisticated and advanced Western and Eastern societies – behave in the same way?

Shakespeare knew that this endless cycle of history was what the critic Jan Kott called The Grand Mechanism – a perpetual motion machine whose gears cranked and meshed the lives of kings and queens, pretenders and usurpers, sons and daughters, wives and lovers again and again.  Lay out the Histories one by one, he said, and you will find the same trajectories and the same results.  Pure will, said Shakespeare and later his devotee Nietzsche, was what enabled great men to rise above the din of the clattering machine.

Shakespeare, of course, did not stop with reflections on human history, but gave us insights into human behavior.  He was Freudian long before Freud, understood sexual politics and sexual dynamics long before Post-modern Feminists deconstructed intimate relationships, provided us with treatises on management, public relations, and strategy; and along the way gave us some of the most memorable characters in literature.

This leads me to the question posed today in the New York Times (11.23.12) by Sara Mosle, What Should Children Read? There is a great debate these days on the relative merits of fiction and non-fiction as teaching tools, and non-fiction is winning out.  Curricula, even in lower grades, are stacked with very practical subjects as school administrators prepare for the Common Core State Standards to go into effect in 2014:

The Common Core dictates that by fourth grade, public school students devote half of their reading time in class to historical documents, scientific tracts, maps and other “informational texts” — like recipes and train schedules. Per the guidelines, 70 percent of the high school English curriculum will consist of nonfiction titles. Alarmed English teachers worry we’re about to toss Shakespeare so students can study, in the words of one former educator, “memos, technical manuals and menus.”

The reason behind this move is clear – public education should be designed to produce civic-minded and economically productive students. 

David Coleman, president of the College Board, who helped design and promote the Common Core, says English classes today focus too much on self-expression. “It is rare in a working environment,” he’s argued, “that someone says, ‘Johnson, I need a market analysis by Friday but before that I need a compelling account of your childhood.’ ”

This is all well and good, especially for public education.  I see no reason why taxpayer dollars should be spent on courses, for example, which teach the deliberately indecipherable and irrelevant dicta of Jacques Derrida, or which equate all ‘texts’ and pile them pêle-mêle, car manuals alongside Hamlet.

Focusing so exclusively on non-fiction in the lower grades, however, seems a bit rough.  Children from the earliest ages are read to by their parents – not economic dissertations, but The Wind in the Willows, Mr. Gumpy’s Motor Car, The Cat in the Hat, and Goodnight Moon – stories that appeal to wonder, imagination, and fantasy.  These are the stories that engage children in reading, teach them that books can be sources of pure joy, and, if told right by aware parents, can provide moral lessons. Why abruptly stop this sojourn in the fanciful and replace it with practical material far from a child’s reality?  What fourth-grader can possibly appreciate even the simplest deliberations of the Founding Fathers or have any clue about what such reflections meant?

The early grades are certainly more about learning how to learn than learning a particular fact.  Regardless of whether a young student reads a history book or a novel, he should know how to read it.  He should learn how to analyze facts as presented in a history, validate and confirm them with other sources, decipher meaning from past events, and suggest likely future outcomes.  He should ask why fictional characters do what they do, and what their actions suggest about character or psychology.  He should consider the social and cultural context in which the characters are presented and understand allegory, metaphor, and allusion.  Either way – fiction or non-fiction – he should learn how to think.

Which brings us to the second reason for reading – learning how to write.  I was once told by a supervisor that I wrote well.  “I am amazed”, he said, “at how fast you turn out documents.  Where did you learn to write?”  He thought that writing concise, well-conceived, logically presented, substantiated arguments must have come from some writing course I had in college, some verbal dexterity, and maybe typing speed.  The dope ignored the fact that I had learned how to think. One of the principal purposes of school, from K-12 and beyond, is to teach students what I learned – how to read quickly, analyze content, formulate conclusions, organize a reply, and write clear, unambiguous, declarative prose.

What schools really need isn’t more nonfiction but better nonfiction, especially that which provides good models for student writing. Most students could use greater familiarity with what newspaper, magazine and book editors call “narrative nonfiction”: writing that tells a factual story, sometimes even a personal one, but also makes an argument and conveys information in vivid, effective ways.

The point is not necessarily better non-fiction; it is better books and writing, period.  If fiction and non-fiction are taught to encourage the same intellectual rigor in students, then it does not matter what they read as long as the books are well-written, challenging, and interesting.  I do not mean to say that all the juice and fun should be wrung out of fiction.  Far from it.  Critical analysis helps the reader understand dynamics of a novel or play that he might not have noticed or appreciated.  Parsing poetry or lyrical prose passages can help the reader appreciate the evocative lines even more.

If nothing else, such a focus on rigor, exegesis, and critical thinking will provide a strong defense against the wobbly intellectual tendencies encouraged by the Internet.  So much information is transmitted and shared so quickly that critical appraisal of it is rare.  If there have always been conspiracy theories, there are now a hundred times as many as there once were, as questionable information whizzes through the web, attracting credulous believers who pass it on until it acquires a life and veracity of its own.  Good teaching can help stem that irrational torrent.

Finally, I think that room should be made in the Core Curriculum for non-traditional subjects which are directly relevant to a productive civic and economic life.  I have lamented before on this blog that students are given a 19th Century education in the 21st Century.  They should be learning about risk, independence, creativity, and enterprise, yet these are routinely discouraged.  Schools favor ‘cooperative’ learning and inclusiveness to the exclusion of the talented, the very bright, and the intellectually curious.

Thursday, November 22, 2012

Was Darwin Wrong?

Most of us believers in evolution have at one time or another wondered why ‘Nature’ or natural selection followed a predictable course – one small incremental change to encourage survival after another – until all of a sudden all bets were off: the 4.5-billion-year history of tiny changes in protoplasm, feather tips, or finger length was itself history, and along came Man, whose intelligence represents a quantum leap from his nearest evolutionary relative.  What happened? we ask.  Did Nature decide that it was working too slowly and had to hurry up and get to the end of the line?  Or did God intervene, waiting until just the right moment?

Currently there are only two theories explaining how we got to where we are – Natural Selection and Intelligent Design.  Natural Selection can be thought of either as a godless enterprise set in motion randomly after the Big Bang, or as a God-driven one.  That is, there is a God, and Natural Selection is His design for creating people who will then, through their intelligence, come to know Him.  Intelligent Design holds that while God did indeed create Man as a special, knowing species, He did not do it through Natural Selection.  How could we be related to the apes or common slugs if an all-knowing God with both hindsight and foresight did the work?  Why would He put us through all that bestiality and hairiness?

Eminent philosopher Thomas Nagel has explored other options; and in a review of his recent book Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature is Almost Certainly False, Alvin Plantinga (New Republic, 11.16.12) describes Nagel’s criticism of Darwinism and his conclusions.

Nagel rejects nearly every contention of materialist naturalism.  First, the claim that life has come to be just by the workings of the laws of physics and chemistry. This is extremely improbable, at least given current evidence: no one has suggested any reasonably plausible process whereby this could have happened. As Nagel remarks, “It is an assumption governing the scientific project rather than a well-confirmed scientific hypothesis.”

Nagel also rejects the idea that once life was established on our planet, all the enormous variety of contemporary life came to be by way of the processes evolutionary science tells us about: natural selection operating on genetic mutation. He thinks it incredible that the fantastic diversity of life, including we human beings, should have come to be in this way: “the more details we learn about the chemical basis of life and the intricacy of the genetic code, the more unbelievable the standard historical account becomes.”

Nagel then turns to the question I raised above – how to account for the quantum leap in intellectual abilities from ape to Man?

Nagel thinks it is especially improbable that consciousness and reason should come to be if materialist naturalism is true. “Consciousness is the most conspicuous obstacle to a comprehensive naturalism that relies only on the resources of physical science.” Why so? Nagel’s point seems to be that the physical sciences—physics, chemistry, biology, neurology—cannot explain or account for the fact that we human beings and presumably some other animals are conscious.

Even more challenging is our ability to reason.  In evolutionary terms, there was no reason for human beings to have the ability to see beyond their natural world, to speculate, to hypothesize, and especially to figure things out without ever seeing them.  Einstein deduced that nothing was faster than the speed of light; and only decades later were scientists able to test and confirm his theories.

Nagel said that “the problem that I want to take up now concerns mental functions such as thought, reasoning, and evaluation that are limited to humans, though their beginnings may be found in a few other species.”  He believes that it is monumentally unlikely that unguided natural selection should have “generated creatures with the capacity to discover by reason the truth about a reality that extends vastly beyond the initial appearances.”

Having got to this point, Nagel has to step into uncharted waters.  As an atheist, he has rejected theism, which offers a perfectly logical explanation for why natural selection cannot be the whole story: there is a God behind creation, and once one accepts that, and the fact that He is an all-powerful and all-knowing being, explaining the elements of human evolution not convincingly accounted for by Darwin is easy.  The quantum leap in intelligence between ape and Man was a deliberate divine intervention for a purpose.

Nagel offers his own vision:

First there is panpsychism, or the idea that there is mind, or proto-mind, or something like mind, all the way down. In this view, mind never emerges in the universe: it is present from the start, in that even the most elementary particles display some kind of mindedness. The thought is not, of course, that elementary particles are able to do mathematical calculations, or that they are self-conscious; but they do enjoy some kind of mentality.

Nagel himself has trouble with this idea, which contains a basic contradiction – if small bits of intelligence, embedded in every molecule since the dawn of time, have contributed to our now enormous mental abilities, how is it possible that we cannot explain this phenomenon?  The reviewer, a Christian professor of philosophy at Notre Dame, agrees.

Nagel’s second theoretical explanation is the following:

At each stage in the development of our universe (perhaps we can think of that development as starting with the big bang), there are several different possibilities as to what will happen next. Some of these possibilities are steps on the way toward the existence of creatures with minds like ours; others are not. According to Nagel’s natural teleology, there is a sort of intrinsic bias in the universe toward those possibilities that lead to minds. Or perhaps there was an intrinsic bias in the universe toward the sorts of initial conditions that would lead to the existence of minds like ours.

This for the reviewer and for me is hard to accept. 

Does it really make sense to suppose that the world in itself, without the presence of God, should be doing something we could sensibly call “aiming at” some states of affairs rather than others—that it has as a goal the actuality of some states of affairs as opposed to others?

After reading this review I am still left with my original conclusions that there is such a thing as natural selection; and that however it was put into motion, it has inexorably facilitated the evolution of the human species.  I am not concerned whether or not God (if there is such a being) set the process in motion billions of years ago, and whether or not He had a design in mind for so doing.  I am persuaded enough by the fossil record to be convinced that we are the result of billions of years of improved feather tips and taste buds. 

I still cannot resolve in my mind the question I posed at the beginning of this post – how can one explain the quantum leap in intelligence, cognitive ability, and sophisticated reasoning when simply having a little more brain power than the previous generation would have sufficed?  In other words, why did we quickly evolve a brain capable of Einsteinian insight when all we needed was to build a better club to bash brains out?

Nagel, always a provocateur, but never dishonest, provided the best answer:

“I am certain that my own attempt to explore alternatives is far too unimaginative. An understanding of the universe as basically prone to generate life and mind will probably require a much more radical departure from the familiar forms of naturalistic explanation than I am at present able to conceive.”