"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Monday, December 31, 2012

Recipes - Curried Coconut Rice

Most of the time I like my rice plain with a little butter.  I have changed from Jasmine or Basmati long grain rice to sushi short grain.  I love the flavor and the texture.  When I do something else with rice, such as make paella or fried rice, I use the long grain.

I recently posted a recipe for biryani – a wonderful rice dish with lots of spices, especially garam masala.  It should have a slightly sweet taste and added cloves, cardamoms, and cinnamon result in a well-balanced dish.

This recipe for Curried Coconut Rice gives a lot of the same flavor as biryani, but takes far less time.  The flavors are less intense, but the result is a fragrant, delicious dish.

Curried Coconut Rice

* 1 cup basmati or jasmine rice

* 1 cup coconut milk

* 1 cup water

* 1 Tbsp. garam masala

* 1 Tbsp. Madras curry powder

* 5-6 whole cloves

* 7-8 whole cardamoms, pounded

* 2 tsp. unsalted European style butter (my favorite, but regular butter OK)

* salt to taste

- Bring the water and the coconut milk to a boil

- Add the spices and butter and stir well, returning the liquid to boil

- Add the rice, stir with a fork once or twice, lower the heat to a low simmer, and cook for 25 minutes.  If the rice still looks wet, cook for five more minutes.

- Let sit for 1 hour (to dry any excess liquid)

- Serve

Child-Rearing In New Guinea–Lessons For Us?

I have travelled in more than 50 countries in the world, many of them in Africa, Asia, Latin America, and the Caribbean, and I have been a professional observer of child-rearing practices.  My job was to identify those traditional but unhealthy practices which should be changed to reduce mortality and morbidity.  Many women give up breastfeeding very soon after birth, thus depriving their infants of valuable nutritional and immunological protection.  Perhaps most importantly, exclusive breastfeeding keeps the child from becoming exposed to the virulent pathogens found in most developing country environments – bad water and unhygienic bottles and vessels.  Child nutrition was often poor, resulting in wasting and stunting.

Vaccinations were requested indifferently, mothers sought the treatment of traditional healers and/or witch doctors, and every possible theory of hot and cold foods, drafts, ill winds, spells and possessions, evil eyes, and the influence of dwarves circulated throughout the community.  Families continued to have more children than they could afford to raise properly, thus contributing to infant and child mortality.

Traditional practices usually have a basis in economics.  African or Asian women, just like their European counterparts, were very happy to remove the albatross of a baby from around their necks, and the bottle represented both economic and social freedom.  In one respect, it was the first small step in liberating women from their imprisoning duties as village wives and mothers.  Vaccinations, especially the second and third in a series, were not carried out because of the distance and expense of travelling to a health clinic; and/or a lack of education which would allow a better understanding of immunization. 

Jared Diamond in his book The World Until Yesterday – What We Can Learn From Traditional Societies disagrees and sees some higher, more intrinsic value in the practices of traditional societies. As Noah Berlatsky notes in his review (Atlantic, 12.28.12):

Among the !Kung of Southern Africa and other hunter-gatherer groups,  nursing typically continues for three years or longer. Part of what makes this type of nursing possible is almost constant contact between mother and child, or at least between some adult and the child.

Nonsense. Breastfeeding is done exclusively in the poorest communities for economic reasons by those women who had no access to formula.  Women tightly strapped their infants to their backs so that they could work in the fields, not because they valued mother-child bonding. 

Mothers pick up their children when they start crying because it is easier to give them the breast (or, common in India not many years ago, to give a goli, a little ball of opium) to shut them up; and shutting them up meant that they could go on with their productive work.

Treatments by herbalists, homeopaths, and quacks were sought out because neither the causative links to disease nor the effectiveness of modern medicine could easily be seen.  It takes years and a large population pool, for example, for uneducated people to see the benefits of vaccination.

Large families in economically marginal societies made economic sense because there were more arms and hands to do field work and collect fuel and water.  The benefits of a small family, like those of vaccinations, could not easily be seen.  In a poor village, those families with one or two children were worse off than those with more.

Therefore, the lionizing of traditional practices by Noah Berlatsky is as out of touch as are the observations of Diamond:

My nine-year-old has been begging me for a while to let him walk alone to his friend's house, half a block and two not-very-busy-street-crossings away. I finally let him do it, inspired in part by an anecdote from Jared Diamond's book:

The anecdote told by Diamond was this:

When I arrived at one particular village [says Diamond], most of the porters from the previous village who had brought me there left, and I sought help from people of any age capable of carrying a pack and wanting to earn money. The youngest person who volunteered was a boy about 10 years old, named Yuro. He joined me expecting to be away from his village for a couple of days. But...Yuro remained with me for a month...It was evidently considered normal that a 10-year-old boy would decide by himself to go away for an indeterminate length of time.

Berlatsky’s conclusion that he should not worry so much about his young son’s independence when traditional New Guinean families let their young children go off for months at a time is disingenuous. Berlatsky should worry if he lives in a city anything like Washington, DC where I make my home.  There are a lot of wackos out there, schools are not safe havens; pedophiles and rapists prowl the streets, hang around playgrounds, and ply children with candy; drunk drivers routinely mow down pedestrians and especially children; road rage takes victims daily.  Maybe there are still tigers in the forests of New Guinea, but other than that, they must be pretty safe.

“If a New Guinean kid could go wandering away from home for weeks at a time”, says Berlatsky, “I figured my son could probably go up the block.”

Berlatsky goes on to limn the praises of families who let their infants and young children sleep with them. Over 90 percent of traditional families respect this practice, he says.  Of course they do since most live in one-room huts with adults, infants, and children sharing beds let alone rooms.  Most of these traditional families would happily opt for some more space.  Lebensraum is not just a Western idea.

As much as Americans might not want to admit it, we believe that separating infants from their parents at an early age is a good thing, for it teaches emotional independence and self-reliance - ‘traditional’ aspects of our culture just as sleeping in crowded but intimate rooms is for Third World villagers.

Sharing is another idea that Berlatsky and Diamond long for and find so lacking in American society.

Diamond expresses admiration for the way in which New Guinea children from some traditional cultures are encouraged to share. He describes one game in which children are each given a banana, which they cut in half; they eat one half and then pass the other on. This happens numerous times, so that each child is dependent on the fairness of the others. Such activities contrast strongly with American games, which emphasize winning and individual victory.

All I can say is that New Guineans, for all their sharing, are still picking bananas, while the rest of the world has gone on to strive for and reach more ambitious goals.  Western history is a legacy of not sharing.  Kingdoms, empires, civilizations were created and achieved greatness because of powerful ambition, insatiable desire for wealth, territory, and influence.   Of course we parents try to get our children to share; but we know they do it reluctantly, and will quickly learn the American lesson that it doesn’t pay dividends.

I have traipsed through villages for 40 years, picked my way through urban slums, trekked into the interior bush to bring American ‘aid’ to the poor and helpless; and emerged unchanged – Marx was right.  Man is an economic animal, and human societies develop as a function of their economic conditions and opportunities. Traditional societies are poor societies, and most people living in them want to come to the United States if they have heard of the United States.

I once asked an Indian villager back in the late 60s if he knew where America was.  “Yes”, he replied. “Just north of Delhi”.  Not too far to travel, he reckoned, to get out of his traditional, fly-specked, and incestuous village.

Why Is There Art?

Most of us take art for granted – it is around us and before us everywhere.  While there may be a difference of opinion on what is art – i.e., are crafts art?  Is popular music art? etc. – there is no doubt that if all creative expression were to cease, the world would be a bare, mechanical, and indifferent place.

Art has existed for millennia and is found in all cultures.  From the cave paintings at Lascaux, to primitive tribes in Borneo, to Western contemporary art, artistic expression is an integral part of human existence. Much art has been religious – or at least a way of explaining the unknowable.  In a remarkable exhibit, The Roads of Arabia at the Sackler Gallery in Washington, DC, stone carvings of almost 10,000 years ago are displayed:

Roads of Arabia

They are haunting, powerful, and emblematic of an art which was far more evolved than that of Lascaux:

Unicorn

Human beings were expressing abstract thoughts through abstract design. The history of art, of course, is one which moves from the abstract to the symbolic to the representational and back to the abstract with fluidity.  Art has illuminated religious history, glorified kings and emperors, represented courtly and pastoral life; and is as varied and surprising as any human enterprise.

What purpose does art serve? asks Adam Kirsch in The New Republic (7.12.12), reviewing the works of well-known artists, philosophers, and evolutionary biologists.  Does it have an evolutionary purpose?

He starts by quoting Thomas Mann from an early story, Tonio Kroger, in which the author expresses his feeling that the artist will always observe human society from the outside, from a perch of loneliness and isolation:

At the end of the story, Tonio has a vision of these two paired off in happy, fruitful partnership—a destiny he can never share: “To be like you! To begin again, to grow up like you, regular like you, simple and normal and cheerful, in conformity and understanding with God and man, beloved of the innocent and happy.” Love and marriage and parenthood are barred to Tonio, because he has an artist’s soul: “For some go of necessity astray, because for them there is no such thing as a right path.”

This vision of the lone and lonely artist on the fringes of society is not new, says Kirsch, but is particularly relevant within the context of Darwinism which Mann near the end of his life began to appreciate.  Does the artist play any important role, Mann wondered, if the course of humanity is determined by such a mechanistic and indifferent process?

In associating art with loneliness, sorrow, and death, Mann was not presenting a new idea but perfecting an old tradition. Everywhere you look in the art and literature and music of the nineteenth century, you find examples of this same figure, the artist banished from life: in Leopardi, the stunted, ugly, miserable poet; in Flaubert, the novelist too fastidious for bourgeois existence; in Nietzsche, the wanderer upon the earth. What is different in Mann is that, writing in 1903, he has fully assimilated the Darwinian revolution, which taught him to think about life in terms of survival and fitness. In his great novel Buddenbrooks, Mann tells the story of a family whose fitness to thrive in modern society declines in tandem with the growth of its interest in ideas and art.

If art has nothing to do with evolutionary progress, then why does it exist at all? and as importantly, why has it existed since the dawn of time?

Nietzsche went further and stated that the artist is anti-heroic, and artistic enterprise stunts the expression of the will, which in turn is the force that forges historical endeavor:

When art seizes an individual powerfully [says Nietzsche], it draws him back to the views of those times when art flowered most vigorously.... The artist comes more and more to revere sudden excitements, believes in gods and demons, imbues nature with a soul, hates science, becomes unchangeable in his moods like the men of antiquity, and desires the overthrow of all conditions that are not favorable to art…

Darwin himself was at some pains to explain art in evolutionary terms.  On a superficial level he saw human artistic expression as little different from the strutting of the male peacock; but on a more profound level he found the human ‘excesses’ of art to be accidental distractions, incidental to human development, and the implausible amount of energy devoted to such avian display perplexing:

The problem plagued Darwin: “The sight of a feather in the peacock’s tail, whenever I gaze at it, makes me sick.”

In other words, human and animal nature are continually producing ‘beauty’ for no apparent or at least discernible reason.

“Many of the faculties, which have been of inestimable service to man for his progressive advancement, such as the powers of the imagination, wonder, curiosity, an undefined sense of beauty, a tendency to imitation, and the love of excitement or novelty, could hardly fail to lead to capricious changes of customs and fashions.” Such changes are “capricious” in the sense that they are unpredictable from first principles.

Denis Dutton in The Art Instinct offered another view of art; he sounds much like Noam Chomsky, who says that our linguistic ability is innate and hardwired. Since art is everywhere, in every culture, and in every period of history, the ability to produce it must be innate:

As Dutton put it: “The universality of art and artistic behaviors, their spontaneous appearance everywhere across the globe ... and the fact that in most cases they can be easily recognized as artistic across cultures suggest that they derive from a natural, innate source: a universal human psychology.”

This may be true, but it still only describes what is, not why it is.  His supposition is that because art is universal, it must be innate; and if it is innate, then it must have an evolutionary purpose even though that purpose is not completely clear. Dutton does evoke Chomsky, however, when he suggests that art is a further expression of language – it is a non-verbal means of communication, as valuable for conveying essential elements of human progress and survival as language.  This conclusion is highly debated.

Stephen Jay Gould and Steven Pinker weighed in on the debate with the contention that art was a by-product of direct evolutionary human traits.  In other words, art wasn’t necessary; it just came as part of the package of big-brain operations:

Stephen Jay Gould suggested that art was not an evolutionary adaptation but what he called a “spandrel”—that is, a showy but accidental by-product of other adaptations that were truly functional. Gould, Dutton writes, “came to regard the whole realm of human cultural conduct and experience as a by-product of a single adaptation: the oversized human brain.” Having a large brain was useful to our ancestors, allowing them to plan and to forecast and to cooperate and to invent; and it just so happens that a large brain also allowed them to make art. Stephen Pinker suggested something similar, if more disparagingly, when he described the brain as a “toolbox” which, in addition to promoting survival and reproduction, “can be used to assemble Sunday afternoon projects of dubious adaptive significance.”

Brian Boyd, a biographer of Vladimir Nabokov, suggested that art served as a kind of mental calisthenics – not exactly related to evolution, but helpful:

Art, then, can be defined as the calisthenics of pattern-finding. “Just as animal physical play refines performance, flexibility, and efficiency in key behaviors,” Boyd writes, “so human art refines our performance in our key perceptual and cognitive modes, in sight (the visual arts), sound (music), and social cognition (story). These three modes of art, I propose, are adaptations ... they show evidence of special design in humans, design that offers survival and especially reproductive advantages.”

Boyd, and I.A. Richards before him, thought that creating and appreciating art was a process of recognizing patterns.  Deciphering Shakespeare’s Sonnets, for example, was a valuable exercise:

If pattern is good for us, and if Shakespeare’s sonnets contain many patterns, then Shakespeare’s sonnets are good for us. Boyd’s concern in his book is to prove the minor premise, which is easy to do, and which he does intelligently and well. Like Helen Vendler in her commentaries on Shakespeare’s sonnets, Boyd emphasizes the verbal texture of the poems, the play with sounds and images, the parallels and the oppositions between different sonnets.

Mark Pagel, in Wired for Culture, suggests another theory – that art is part of a culture which defines societies and gives them a visible and recognizable emblem.  Art is, therefore, evolutionary because without art and culture nationalism would not be possible.

Culture in the first sense—works of art, music, and literature—is therefore able to justify itself as part of culture in the second sense, the sum total of practices and beliefs that define the particular way of being of a group of people. The first kind of culture gives us paintings, the second gives us patriotism; and while paintings are not obviously adaptive, patriotism is.

Finally Kirsch discusses – and dismisses – the work of Eric R. Kandel (The Age of Insight), a neurobiologist who describes how the brain functions to perceive art, but who offers nothing in the way of an explanation of why art is important.

In summary, all of these theories seem reasonable but in the end unconvincing; and perhaps looking for meaning in evolutionary terms is not the only way to understanding.  Many religious philosophers have suggested that art is the highest expression of a God-given soul – it is the spiritual voice rising above the practical, mundane, but necessary noise of survival.  It is easier to accept either one or the other – that is, that art is an important tool for human evolution, or that it is an expression of God within us – than to accept that it is only a ‘spandrel’, in the words of Gould, or the result of Pinker’s Sunday afternoon pastime, an accidental and irrelevant human trifle.

Sunday, December 30, 2012

Diwali, Rosh Hashanah, FY 13, January 1st–My New Years

One of my most lasting memories is that of Benares (Varanasi) during Diwali, an important Hindu festival celebrating many things, one of which is the new year – a time to reflect on the old but look forward to the coming prosperity of the new. The festival starts with Dhanteras, on which most Indian business communities begin their financial year; and the third day of Diwali marks the worship of Lakshmi, the goddess of wealth.

It is also the Festival of Lights, and each year pilgrims to the holy city float thousands of small clay lamps onto the Ganges.  The city receives pilgrims all year ‘round, for Varanasi is one of the seven holiest sites in Hinduism, but during Diwali it is transformed.  The devotion is even more visible, perhaps because of all the lights floating on the river and set on balconies, ledges, and walls, but also because of the rituals, ablutions, prayers, and ceremonies taking place along the ghats from early morning till evening.  I would rent a country dugout at first light and travel slowly up the river watching the pilgrims make their way down the steps to bathe in the river.

 

There are nearly 100 ghats along the Ganges in Varanasi – Ganga Mahal, Scindia, Tulsi, Raj, Mana Mandir to name just a few.  Some are small and modest, others temples to wealth and influence. 

I always think first of Diwali when I think of New Year’s.  The often forced revelry of an American New Year’s has never appealed to me, and our celebrations have always been quiet ones with especially good food and friends.  American New Year’s is more a retrospective of the past year’s most notable events – killings, elections, storms, political upheavals, economic cliffs and downfalls – than a time for serious reflection.  Resolutions are made as easily as popping a party favor, and as quickly discarded.  New Year’s Day is the only really quiet day of the year – nothing is open, few people want to go pub crawling or drinking, and most are just plain tuckered out after too-late hours and too much to drink.

Diwali on the other hand always gave me pause.  I went there every year during my five year residence in India, and each year I happily and willingly became part of the crowds, the alms-givers, the pilgrims, and the lights.  It was a time for spiritual reflection and possibly renewal.  I have never been a religious person, but Varanasi, imbued with devotion and spiritual expression, was as close as I have ever come to letting myself slip into belief.  I even toyed with the idea of studying philosophy at Benares University.

Rosh Hashanah, the Jewish New Year, is close to this idea of reflection and resolve:

In Jewish tradition, Rosh Hashanah marks the anniversary of the creation of the world as described in the Torah. It is also the day on which God inscribes the fate of each person in the "Book of Life" or the "Book of Death," determining both if they will have a good or bad year and whether we will live or die.  Rosh Hashanah also marks the beginning of a ten-day period on the Jewish calendar that focuses on repentance or teshuvah (About.com).

My wife always half-jokingly says that her new year begins on October 1st, the beginning of the new fiscal year.  That marker has, she says, practical meaning – a time to put the affairs of the old year in order and to start the new one with a clean slate.  I was never sure why the chronological and fiscal years did not coincide; but perhaps it is better to keep personal and financial issues separate.  For years as a government contractor, I dealt more in FY’s than January 1sts, and was very aware of project fiscal cliffs.  It was very important in my incessantly bottom-line business to meet spending targets and to come out even.  Fiscal years were a part of business, however, and I never attached as much importance to them as my wife did.

Most people of a certain age don’t even bother to make New Year’s Resolutions – too great a chance that they won’t live till the end of the year.  Young people, still happily in the ‘whatever’ or ‘fuck it’ period, don’t bother either – they just look forward to another year of exploration, adventure, and romance.  Everybody in between makes some kind of resolution.  It is never too late to start dieting again, or to quit smoking, or to get more exercise.  “I will spend more time with my family” is not only the classic Washington excuse for resigning from one job to take a better one; but also the perennial sop to unhappy spouses and children.  There is never even a scintilla of conviction when a Type A workaholic lobbyist says or thinks these words.  Yet, the vast middle – middle-class, middle-age, middle-brow – will continue to feel that life has not yet excluded them totally from the American dream; that anything is possible in this great country of ours; that hope springs eternal; and that the grass is definitely green wherever you look.

I am an amalgamation of all of this.  I still have a ‘fuck it’ attitude, come what may, let life bring it on; but it is looking a bit shopworn these days compared to what it was a few years ago when every new year meant new countries, new adventures, new loves, new and unexpected happenings.  I still have that middle-level optimism, and have taken on new challenges – modest to be sure, but challenges nonetheless.  I hope to finally be able to speak intelligently about all of Shakespeare’s works, direct a Tennessee Williams play in his birthplace, continue to learn and write, teach new and difficult courses.  I also have that alter kocker resignation that this year might be my last, so might as well check the date off the calendar and try not to be “Too soon old and too late schmart.”

One thing of which I am particularly aware in my Golden Years is how time has accelerated. How could that be possible?  How could one day run into another without my noticing it? New Year’s.  What, another already?

I have the usual parent’s modest hopes for my children – that they may be happy, productive, healthy, and fulfilled.  I pay no attention to their still-youthful hopes (“2013 will be my best year yet”).  I am too much a seasoned realist to think that any year will be much different from the one preceding it.  All ups and downs, satisfaction and disappointment, obstacles and free running.

And I am too much of a Stoic to wish anyone a Happy New Year, so Good New Year to all!

Saturday, December 29, 2012

In Praise Of Colonialism

The City Museum in Amsterdam has just opened a new exhibit called ‘The Golden Age’, celebrating the Dutch colonial era of the late 16th and early 17th Century.  In an article in The Guardian (12.28.12) Martin Kettle reflects on this history and considers that of Great Britain:

In a few short years at the end of the 16th and the start of the 17th century, the Dutch republic made itself the hub of the world. State of the art shipping, weapons and science enabled them to capture and dominate the lucrative spice trade with the East Indies. Back in the Netherlands, the wealth and freedom fuelled by this trade brought a glittering age of writing, painting and technological invention. Their freedom of press and religion was a magnet to the rest of Europe. Its primary monument remains Amsterdam itself, so it is easy to feel the connection to this day.

In his modern classic, Vermeer's Hat, Timothy Brook says simply that 17th-century Netherlands raised the curtain on the global world – which is our world. The Dutch bought and sold wherever they could find anything to trade. They wrote the fundamentals of international law to suit their needs. They mapped the globe and the heavens. Their way of life became multicultural. When Vermeer painted a geographer in 1669, he dressed him in a Japanese kimono and gave him a globe depicting the Indian Ocean.

The Dutch opened the world to goods and ideas; and while they certainly took much in return, they are right to celebrate the enlightenment that their trade and investments made possible:

The Amsterdam exhibition tracks all these aspects of globalization's first wave. The Dutch established colonies in modern-day Brazil, South Africa, Sri Lanka and Java – and on Manhattan, too. Theirs was a connected world. In a 1656 picture of the center of Amsterdam, Ottoman merchants are shown negotiating a deal just round the corner from where the picture itself now hangs. A Dutch translation of the Qur'an was printed there in 1696.

The Dutch – like all other European nations of the time – participated in practices that today would be considered wrong and distinctly unenlightened. 

But this was a time of slavery and war too. Slavery was illegal in the Netherlands, but Dutch ships carried and sold slaves in Africa and Surinam, and Dutch fortunes waxed rich from the profits of the trade. The Dutch were renowned in China for their violence, and their arms industry – still the sixth-largest in the world today – was formidable. By modern standards, Dutch justice was anything but enlightened. Two ghoulish Rembrandt drawings of the public strangulation of a female murderer depict one of the many dark sides of the golden age.

There are many ‘progressive’ historians and political scientists who continue to blame colonization for a host of the world’s ills.  The Spanish brought diseases to the New World which decimated indigenous Indian tribes.  They raped and plundered the Americas in their search for silver and gold to finance their European wars.  The British exploited their Empire for its wealth and subjugated local peoples to maintain control over restive populations.  The consignment of colonized populations to an inferior and powerless status enabled corrupt and brutal dictatorships to emerge from the European period, and Cold War political alliances and competition thwarted nationalist movements in Southeast Asia and Africa.

Yet most colonization brought benefits to the world.  India’s development into a modern democracy was at least partly due to the infrastructure, administration, and bureaucracy built and established by the British.  The French brought European ideas, values, art and culture to Africa.  Centuries before, the Romans brought sophisticated concepts of governance, administration, laws, literature and discourse to the lands of their Empire.  The early Persians spread a culture of science, writing, literature, mathematics and architecture throughout the known world.  The Arabs under Mohammed spread monotheism and, like Persia and Rome, ideas, science, and culture.

If one were to create a balance sheet of colonialism’s pluses and minuses, the advantages would far outweigh the disadvantages.  The value of this early colonial globalization cannot be overestimated, for it opened the doors to areas of the world which had been closed and isolated.

Perhaps more importantly, colonialism – or expansionism – is as human an enterprise as any.  From the earliest glimmers of human society, men were driven to protect and expand their perimeters.  The more land and natural resources one controlled, the greater the chance for survival.  As societies developed and became more complex and powerful, this expansion took on a whole new dimension and kingdoms and empires emerged. Those who argue that the Spanish were only after New World riches to fund unnecessary wars at home, and that colonialism’s benefits were only marginal and peripheral to the real, venal reasons for outward expansion, are misguided.  All civilizations act in the same, predictable way.  No sooner had the Puritans and the Cavaliers established colonies in America than they looked to expand their territories.  Within the short space of 250 years, we had driven from east to west, north to south, letting little stand in our way.

The author wonders why the British have not celebrated their colonial history as the Dutch have:

Eventually, however, Britain became richer and more powerful, while the Netherlands dwindled in influence. London became the global city that Amsterdam had once been. The British empire was larger and lasted longer. The English language created a network of soft power that nowadays extends even into every corner of Anglophone Amsterdam itself.

Perhaps the explanation is only that Dutch prowess began to dwindle so long ago compared with Britain's more recent decline. Certainly, modern-day Netherlands is extremely conscious that it is now a small country, dependent on European alliances in a way that is manifestly not mirrored in increasingly Eurosceptic Britain. Perhaps a small country feels permitted to dwell on a distant golden age in a way that a bigger one does not.

In this modern world that ‘celebrates diversity’ but at the same time fears it, praising a colonial past might seem inappropriate or untoward.  Britain is reluctant to talk of the Indian Raj when so many South Asians reside in the UK.  The Musée du Quai Branly in Paris celebrates the indigenous cultures of former French colonies, not the ‘mission civilisatrice’ of French colonialism.  America treads lightly over its period of Manifest Destiny and the opening of the West.  Despite the fact that the Louisiana Purchase was one of the most important events of the early American period, it is not touted as the seminal historical event that it was.

We live in an age of ‘victims’ who are always on the right side of moral opinion.  Whether the predators are colonialists or multi-national corporations, their achievements in spreading or creating wealth, disseminating new ideas and technology, their contributions to culture and society are overlooked or underestimated compared to the havoc they are said to have wrought on innocent victims.

History is and always has been an ebb and flow of power.  Since time immemorial human societies have aggregated wealth and sought to expand territory, power, and influence.  In most cases this expansion has brought enlightenment.  In others, such as the violent depredations of the Mongols who spread more mayhem than civilization, there was very little.  There is no guarantee that satisfying the imperatives of human nature will always turn out well, but there is no stopping the fundamental, hard-wired, human desire for wealth, conquest, and power.

Latrines, Those Smelly Things

For four years I worked for the United Nations Water and Sanitation Decade of the 80s, promoting low-cost sanitation in India.  At the time hundreds of millions of Indians either had primitive sanitation facilities or none at all.  Most rural Indians simply used the fields.  From my early morning train window, I could see them popping up like rabbits or meerkats, carrying ‘ablution tins’ – rusted Dalda cans used for anal cleansing. The scene always had a pastoral, even peaceful feel to it.  The cycle of nature was being repeated, dust to dust, food to waste to fertilizer to renewal.  There was nothing offensive or repulsive about it. All business was done discreetly and demurely.

Urban areas were different.  In the poorest areas, residents simply went in the gutters or in the narrow sluices that collected runoff during the rains.  Children, not yet having learned proper hygiene, defecated everywhere.  In the early days of the monsoon before the heavy rains washed the streets clean, the streets were fetid, gummy, foul pathways of excrement and slime.

People who had a little more money and who lived in apartment blocks used what was euphemistically called a ‘dry latrine’ – a corner of the courtyard designated for defecation. It was a step up from the sluice and the street – more exclusive and private, reserved for tenants of the building – but not much.  The flies, stench, and nastiness of piles of excrement that accumulated for months were impressive.

If you had more money, you used what was called a ‘wet latrine’, an ingenious affair which consisted of a hole in the floor of your flat.  Your excrement dropped down a chute and splatted on a concrete slab at sidewalk level.  A ‘sweeper’ – a low-caste sanitation worker – came by daily to scoop up the mess into a loosely-woven palm thatch basket, and carry it by head load to a dumping ground – another nasty, intolerably fetid mess somewhere on the outskirts of the city.

For countries like India which practiced ablution, the Pour-Flush (PF) latrine was introduced. In all others, the Ventilated Improved Pit latrine (VIP) was promoted.  The VIP was designed to take advantage of heat differentials and to create convection currents which drew smell and flies from the fecal pit up through an external pipe to the open air.

The PF consisted of a fiberglass pan and trap, a concrete plinth, and two pits which were used alternately.  When one filled up, the consumer switched to the second, allowing the first to become non-pathogenic compost which could be sold.  The PF was ideal for urban settings and designed to replace ‘wet’ latrines; the VIP was better suited for rural areas.

I was responsible for looking at the marketing aspects of promoting these latrines – how to convince those people using traditional sanitation methods to adopt new, more hygienic ones.  However, the ‘elegant solutions’ of the PF and the VIP were appreciated only by the engineers who designed them.  They could go on for hours about thermal currents, static heads, decomposition times, and waste efficiency.

Consumers, however, saw things differently.  Installing a PF in an urban residence in India meant that the waste pits would have to be built directly under the verandah, which meant that higher-caste Hindus would have to live in what would be considered the most religiously polluting environment on earth.  There was a point to all this public excretion – it was an economic necessity and it was part of a ritual purification process.  Under this philosophy, one was far less concerned with the consequences of one’s cleansing than with the process of purification itself.  In other words, cleaning up the excrement was definitely someone else’s business – and in particular the outcaste sweepers who had been left out of the cycle of becoming and spiritual renewal.

The VIP was an even harder sell.  When you have acres of open fields surrounding your village, why would you ever want the bother of a latrine?  In principle it had to be cleaned and the building maintained, while open-air defecation had no costs whatsoever.  More importantly, field defecation was part of a daily social routine.  You did your ablutions with your friends and neighbors, while in a latrine you were alone in a dark, cramped, and solitary cell.

This was only part of the public opposition.  Many focus group respondents told us that they were afraid that their little children would fall down into the pit.  They would have to find some sweeper to pull them out, and there was no provision in the social caste hierarchy for cleaning befouled children. So then what?  Other respondents told us that it was so dark in the VIP latrine and the drop down into the pit so far that they couldn’t see their excrement, which was a country-folk means to diagnose illness.  None of the respondents gave the fly- and odor-less environment of the latrine a second thought.  They lived in an environment that was putrid and fly-infested, with cow, goat, chicken, dog, and bat guano as well as human waste, so the VIP was an aberration, not an ideal.

The model PF latrines that we built were dismantled and the ceramic and fiberglass pans – worth more than an entire household’s belongings – were used as flower pots.  The PF latrines and their elegant convection sheds were left to mold, rot, and degrade in the punishing rains and heat of Deccan India.

Rose George, writing in the New York Times (12.29.12) about latrines, notes that the situation is not much improved forty years later:

Many students in India, where around 650 million people still lack toilets, can’t say the same. Most schools I visited had filthy latrines, used only because there was no alternative. Some had none at all. Students and teachers made do with fields and back alleys.

One of our field staff working in Africa came up with an alternative –  a traditional pit latrine with a cover.  To my mind this was indeed an elegant solution.  Dig a hole, defecate in it, cover it up, and when it is full, dig another.  The engineers in my office – the ones who came up with the ‘elegant solutions’ of the VIP and PF – strenuously objected.  They could not deny that the African latrine would achieve the results intended – proper human waste disposal – and would also limit flies and smell; but it was just too damn simple.  It didn’t matter, for in the end just as few people used it as the VIP or PF.  Within days of our trial runs, the covers were discarded as too much trouble, the pit quickly filled up, and no new ones were built.  The dogs and birds returned, hunting for bits of undigested corn or wheat to eat, and flies once again covered the whole mess.

In my career in international development I had worked in the fields of nutrition and family planning (food and sex) and the work was fascinating.  Dietary behavior is a function of a myriad of socio-economic and cultural factors and devising even modest changes in traditional practices was a challenge.  Introducing contraception was equally difficult and stimulating.  These disciplines were dominated by women, and it was extremely pleasant to work and socialize with them.  Entering the world of water and sanitation was a shock.  Toilets, sewer systems, and latrines were not sexy at all.  Sanitation engineering was a male bastion and for some reason these men took it all very seriously.  Whereas my nutrition and family planning field trips were to schools, dairies, and experimental kitchen gardens; and to private clinics for women, my forays for my new employer were to survey the realities of a befouled environment.  I lasted only a few years and went back to my previous life.

During that latrine stage of my professional career my once firmly-held belief in cultural relativism had been sorely tested.  While I could easily accept diverse dietary and reproductive behavior, listen attentively to the most far-fetched theories of disease, and sit patiently through discourses on ill winds, hot and cold foods, spells, bewitching, and astrology, I could never get past the fact that Indians violated the old adage “Don’t shit where you eat.”  How could a whole culture traipse through piles of excrement, walk past bubbling black sludge, and smell the overpowering stench of diarrhea without at least covering it up? I tried every possible explanation – people were too poor, too caste-ridden, too uneducated, too politically powerless to address the problem.  None did the trick.  I reviewed culture after culture and later travelled through one urban slum after another; and I kept coming back to India.  I finally dropped my insistence on cultural relativity.  It was simply wrong to defecate so indifferently and so indiscriminately.

According to the Times article over half the Indian population has no proper sanitation, so things in the rural areas and in urban slums would probably look no different to me now than they did decades ago.  The other half have moved on and up to modern Western living.  The vibrant, exciting, and dynamic Indian economy, finally unrestrained after years of Soviet-style socialism, has enabled middle class families to live well and at or above international standards.  Consumerism is the tide that raises all boats.  If you have money, a new kitchen and a great-looking bathroom are de rigueur in your new flat.  So in a sense my ditching cultural relativity was wrong. The grandparents of these young people manning the phones in call centers in India did their bit to add to the bubbling fecal sludge; but that bygone era is not only distant but remote and insignificant for these bright young things. I simply was impatient.

Friday, December 28, 2012

The Taming Of Shrews

Most theatergoers, after the first ten minutes of Albee’s Who’s Afraid of Virginia Woolf, have no doubt that George and Martha hate each other; that Martha is a harridan, a succubus, and a man-eater; and that George is a weak, put-upon, hen-pecked, and failed academic.  They are totally ill-matched.  Martha is a strong, willful, outrageously forward and blunt woman who cares little who is savaged by her words or actions.  George is at her mercy, hanging on to her and her father.

As the play goes on, more is revealed.  Martha becomes even more shrill and shrewish, devouring her husband and the two innocent guests who join them after a dinner party. She humiliates him, castrates him, and appears to be on the point of destroying him until George asserts himself.  Through his cruel, manipulative games, he is the one who is the destroyer, first of Nick and Honey, the two young guests who have much duplicity and dishonesty to atone for; and then of Martha by stripping her of her most important and all-sustaining fantasy – the existence of an imaginary son.  In the end it is George who tames his shrew, Martha; and if we can believe Albee, this dismantling of the soul of Martha, this cutting away of all pretense and illusion from her, is the only way that either one will survive.  In the end, they are completely in psychological harmony.  They need each other, thrive on each other’s cruelty, and revel in each other’s egotism and theatricality.  It is a good marriage.

The same is true in Shakespeare’s Taming of the Shrew. Kate is the model for Martha.  She is a vixen, a virago, an emasculating, domineering, and impossible woman.  Despite her outrageous behavior in the first two acts, she comes to be ‘tamed’ by Petruchio, a virile, dominant noble.  Just as Martha gives up her shrewish character for George, so does Kate submit herself to Petruchio.  She loves him from the very first, and although she rails against him, puts up bulwarks against his male insistence, she loves his supreme confidence and arrogance. She, like many women, loves bad boys because of their flamboyance, easy masculinity, and indifference. Petruchio is like Stanley in Tennessee Williams’ A Streetcar Named Desire.  Stella, despite her patrician upbringing, cannot resist her swaggering, leering, but infinitely potent Polack.

Petruchio gives Kate a way out of a family with a domineering and unappreciative father and a bland, uninteresting, but calculating sister.  Baptista cannot understand or appreciate why his daughter is the way she is.  He is a distant and indifferent father who cannot sense her frustration and who plays favorites.  Bianca, Kate’s sister, is, on the surface, a sweet, beautiful girl whom every suitor wants to marry; but Kate knows that there is little underneath, and that once married her sister will make life hell for her husband.

At first glance, the play is about men’s subjugation of women; and how women willingly give in to their desires and wishes.  More than simply reflecting the social norms of the day, Shakespeare is saying that women of any era want to be dominated by strong males, taken by them, if not ruled.  He has perceived common traits in women and men who act out female and male aspects of human nature, and has incorporated them in Kate and Petruchio.

Some critics, such as Bloom, say absolutely not.  Kate, although she gives an eloquent paean to male dominance in the final passage of the play, doesn’t really mean it.  She is acceding to the social norms of the time, but because she knows that Petruchio has come to love her – and love her desperately – she is in control.

Other critics, like Nuttall, have said that drawing such conclusions does no justice to Kate.  She is not, he says, like the heroines of Shakespeare’s later Comedies who run circles around their men just to wind their webs around them, catch them, and secure position, wealth, and status.  She means what she says, Nuttall avers, for she is intelligently aware of the world in which she lives, and because she loves Petruchio she is happily willing to give up her frustrated independent ranting for a satisfying marriage. He is not the domineering, misogynist bounty hunter he seems, but wants Kate’s energy and sexual power.

“Kate has the uncommon good fortune to find Petruchio, who is man enough to know what he wants and how to get it. He wants her spirit and her energy because he wants a wife worth keeping” (Germaine Greer quoted in Nuttall, Shakespeare the Thinker)

Greer has also said that Kate’s ‘monogamy’ speech rests “upon the role of the husband as protector and friend, and it is valid because Kate has a man who is capable of being both, for Petruchio is both gentle and strong” (Nuttall).

Nuttall says that Kate and Petruchio are a happy couple and selects this short passage to illustrate:

KATE: Husband, let’s follow; to see the end of this ado

PETRUCHIO: First kiss me, Kate, and we will.

KATE: What, in the middle of the street?

PET: What, art thou asham’d of me?

KATE: No, sir.  God forbid, but asham’d to kiss.

PET: Why, then let’s home again.

KATE: Nay, I will give thee a kiss. Now pray thee, love, stay

PET: Is not this well? Come my sweet Kate.

        Better once than never, for never too late

This is no exchange ‘between a bully and a broken spirit’, says Nuttall. “These are like-minded happy people”.

Shakespeare turns his later Comedies right around – the women always dominate weaker men and must settle for them in the end.  The likes of Rosalind, Beatrice, Portia, and Viola are more complex, interesting, and able than the men they marry; and they are victims of the times, for they have to marry well, regardless of their personal preferences. In this play, it is Petruchio who is the more dynamic, strong, confident, and alluring character.

In the later plays, we are convinced that although the couples are happy on their wedding day, they will soon be at each other like George and Martha.  The wooing games have been duels of wits – again like George and Martha but less destructive – and end in superficial harmony.  The women win their men, the men marry the women who have beguiled them, but there has been none of the deeper and more fulfilling transformation that has occurred in both Kate and Petruchio who will, we feel, have a long and lasting life together.

Kate and Petruchio have played out the games that George and Martha have played – deadly serious charades designed to expose the real core of character and personality hidden beneath superficial posturing.  Petruchio’s ‘game’ is not to destroy Kate or to beat her into a meek, subservient, and docilely uninteresting woman; but to ‘tame’ (exorcise) the frustration and wild excesses of compensatory railing, and to allow the ‘real’ Kate to emerge.

When Petruchio bursts into Kate’s world, she is psychologically in a bad way. Her unremitting fury is a kind of illness. We hear much today of pathological depression and much less of pathological anger, yet such anger exists (Nuttall)

Petruchio is the one man who can relieve this frustration through sexual healing and, more practically, by removing Kate by marriage from her suppressive and repressive family.  Remember that in the opening scenes of the play Kate, humiliated by the favoritism shown her sweet sister Bianca, has tied and trussed Bianca up in her fury.

Both Bloom and Nuttall agree that the reason why playgoers have always liked this play is not because of its George and Martha fireworks; nor necessarily because of its humor (the theatrical excesses of Kate and the comments of family and friends are funny); nor even, for some, its seeming celebration of the dominance of men over women.  It is because of the sexual dynamics between Kate and Petruchio.  Think of The Story of O, The Night Porter, Les Liaisons Dangereuses or a thousand lesser books and films which address issues of sexual dominance and submission, let alone sadomasochism.

Petruchio is, first, an effortlessly dominant male like Richard III and, second, very funny.  It is a winning combination.  He is the hilarious opposite of the humble lover on his knees, deviously seeking a gratification conferred by the lady out of pure pity. It is not hard to find women in the present century who say that they respond more readily to the sweep-you-off-your-feet kind of man than to the wheedling lover of courtly tradition – that is, they find him sexier (Nuttall)

In Act II, Scene 1 Petruchio, meeting Kate for the first time, strides into the room and confidently starts in with his effusive compliments, all ironic because he has heard of her reputation, and all calculated to begin his courtship which he hopes will be a richly rewarded one:

You lie, in faith; for you are call’d plain Kate/ And Bonny Kate, and sometimes Kate the curst;/ But Kate, the prettiest Kate in Christendom;/ Kate of Kate Hall, my super-dainty Kate,/ For dainties are all Kates…

The sexual fireworks then begin, and from this first encounter, one knows that this is a good, equally-paired match:

KATE: I knew you at the first.  You were a movable.

PET: Why, what’s a movable?

KATE: A joint stool.

PET: Thou hast hit it: come sit on me

And later in the same scene:

KATE: If I be waspish, beware my sting.

PET: My remedy is then to pluck it out.

KATE: Ay, if the fool could find it where it lies.

PET: Who knows not where a wasp does wear his sting?

KATE: In his tongue.

PET: Whose tongue?

KATE: Yours, if you talk of tales; and so fare well.

PET: What! with my tongue in your tail? nay, come again.

The passion of the relationship is not derived from this typically Shakespearean comedic interchange, complete with puns and innuendos.  It comes from the sexual tensions set up between the defiant Kate wanting to be tamed, and the strong and knowing Petruchio who wants to be the tamer.

In Act II, Scene ii…Petruchio announces that he must leave at once because of business pressures.  Immediately we sense the special tension of the “delayed consummation” motif that permeates Othello…(Nuttall)

The play is also engaging because it is a love story. Kate and Petruchio grow to love each other.  There is a fondness and caring in many exchanges later in the play; and this unusual couple is one of the rare examples of marital harmony in Shakespeare.  Only the Macbeths before their fall, the Caesars, and the Brutuses come to mind.

Kate’s long last soliloquy in which she sings the praises of female submission has been debated for centuries.  On the surface it is a song in praise of men and marriage – “the greatest defense of Christian monogamy ever written” said Germaine Greer.

Thy husband is thy lord, thy life, thy keeper,/Thy head, thy sovereign…/Place your hands below your husband’s foot:/In token of which duty, if he please/My hand is ready; may it do him ease

Kate goes on to say that women are different from men – their bodies are soft and pliable, and their hearts and souls are no different:

But that our soft conditions and our hearts/Should well agree with our external parts…

Many feminist critics have railed against this passage and the play as a whole because of what they consider to be the outlandish sentiments of a male-dominated world and of a bitter misogynist, Shakespeare.  Petruchio is his instrument of female torture, submission, and confinement, and his celebration of it shows that, despite his finer instincts, Shakespeare is a troglodyte.

Others, like Bloom, say it is nothing of the sort.  This last passage is a cunning one, and Kate knows very well what she is doing:

One would have to be very literal-minded indeed not to hear the delicious irony that is Kate’s undersong, centered on the great line “I am asham’d that women are so simple”. It requires a very good actress to deliver this set piece properly, and a better director than we tend to have now, if the actress is to be given her full chance, for she is advising women how to rule absolutely while feigning obedience [italics mine]  (Harold Bloom, Shakespeare, The Invention of the Human).

Nuttall disagrees, saying that the passage should be played straight, for it is the prevailing philosophy of the times. As he has stated above, Kate’s subservience, although derived in part from prevailing social mores, is really a willing offering to Petruchio, her psychological and sexual savior.

Both Kate and Petruchio and George and Martha drive off into the sunset to live life happily ever after; but we only believe that the former will succeed.  At the end of Who’s Afraid of Virginia Woolf, we are not so sure that George and Martha, having stripped each other to the marrow, will reconstruct their life in a more understanding and loving way.  There is the nagging thought that since they fought, scrapped, and wounded each other for decades, they will continue to do so in the end.  Although we may wonder for a moment whether Petruchio, given his easy command over women (Kate herself makes reference to his ‘thousands of lovers’), will be a one-woman man, we at least have reason to believe that he will.  As far as Kate is concerned, there is no doubt that she has found her social, sexual, intellectual equal and psychological mate.

Wednesday, December 26, 2012

New Year’s Resolutions? Lose Weight? Stop Smoking? Think Again.

I worked in the field of ‘Behavior Change Communications’ – using media to improve individual health outcomes – for over 40 years and had little success.  People’s beliefs, especially about food and sex (nutrition and reproduction, the two areas in which I was most involved) are case-hardened and remarkably resistant to change.  One would think that if a person learned about a change in diet, sexual, or reproductive behavior that could improve his life – say using a condom to prevent a life-threatening illness, or taking the Pill to eliminate unwanted pregnancies – he would jump at the chance.  Nothing could be farther from the truth.  I tried everything in my long career, with little success. 

People in developing countries eventually change behavior in their own good time – i.e. when they have enough of an economic cushion to move in from the margins and make the risk of adopting a new, untried behavior acceptable; and when the market provides the necessary goods and services.

When I first started my career in India back in the late 1960s, I had the idea that advertising and marketing techniques could easily be applied to social change. “If you can sell soap”, I said, “You can sell green leafy vegetables”.  The principle itself was sound, and decades of American advertising successes were the proof.  Advertisements appealing to social aspirations, status, sexuality, envy, and a raft of other psycho-social determinants of consumer behavior could easily be applied to food, condoms, or diarrhea control.

My first project was to promote green leafy vegetables, a rich source of iron, a necessary element in everyone’s diet but especially for pregnant women, who suffer from anemia at significantly high rates in India. I designed a multi-media program in which the good news about green leafy vegetables was presented through ten media, from folk drama to Bollywood films.  I had enough money to assure good reach and frequency, and I was sure that the campaign would be successful.  It was a total failure.  Although we succeeded in raising awareness about the problem and the solution, we were totally unable to change people’s attitudes (i.e. make them favorable towards dietary change), let alone get them to start eating spinach.

It wasn’t that the campaign was bad – our data on public awareness and appreciation of the posters, billboards, films, radio spots, comic books, etc. showed that people liked it – it was that we never considered the many other factors that enable people to change.  As suggested above, families who live on the economic margins are unlikely to take any risk at all, because failure would mean ruin, misery, and death. Second, human nutrition is a complicated matter.  Within the context of a high-carbohydrate diet, common in poor countries, a woman could eat five times her recommended daily allowance of spinach and still be anemic.  Why?  Phytates.  Compounds in some foods block the absorption and assimilation of nutrients, and the phytates in a rice-based diet block iron.

As important, growing vegetables is a near impossibility in the hot, dry Indian climate.  As a result, the price of spinach in the market is beyond the reach of most of the anemic population.  Growing vegetables at home in kitchen gardens was just as unfeasible because of the time (opportunity cost), labor (seeding, planting, weeding, watering), and expense.

So although people heard our message, it was rejected or ignored for good, logical reasons.  The media can only do so much; and if economic, social, or cultural factors are not considered, no change will occur.  The provision of reliable, valid, attractive, and motivational  information is only the first step in the process.

Most anti-smoking campaigns in the United States failed for similar reasons: they did not address these constraints. The first warnings of the link between smoking and lung cancer were published by the Surgeon General in 1964, and it took over 25 years to see any dent in American smoking rates.  When I joined the World Bank in 1984, all the secretaries smoked, and it was a smoke-filled environment, far from the smoke-free era of today.  Cigarettes were cheap, no no-smoking laws were in effect, warnings on cigarette packs went unread, and most people ignored the health issue.

Even now approximately 20 percent of all adults continue to smoke, and even more troubling is the fact that the rate among teenagers remains high (20 percent) and shows no signs of dropping. Even with what should be considered prohibitive pricing – a pack of cigarettes costs $12.00 in New York State – rates remain high.

Only one media-only anti-smoking success story comes to mind. Teen smoking 15 years ago was very high (over 35 percent), but this rate declined significantly before plateauing at its current resistant level.  One reason given for this decline was a Florida advertising campaign based on adolescent idealism.  ‘Why enrich fat-cat capitalist tobacco companies?’ the ads said. ‘Stop buying cigarettes.’  Where millions of dollars had been wasted trying to discourage smoking by appealing to health, image (smoking makes your clothes and breath stink), and social status, here was a campaign that didn’t even tell teenagers to stop smoking.  It said stop buying cigarettes.

Smoking is a particularly resistant behavior to change because of nicotine addiction; and it may be that the 20 percent of Americans who continue to smoke are simply so addicted that, no matter what, they cannot quit.  However, this is not the case with teenagers.  For them it is still acceptable, if not cool, to smoke; and few if any media campaigns have been developed to take up where the Florida effort left off.

Shedding pounds seems even more difficult than stopping smoking.

In an article in the Wall Street Journal (12.26.12) Shirley S. Wang writes:

Some 45 million Americans go on a diet each year, spending $33 billion on weight-loss products, according to the Boston Medical Center, but only a fraction succeed. More than half of the 45 million adult smokers in the U.S. tried to quit in 2010, but less than 10% of them managed to stop, according to the Centers for Disease Control and Prevention.

Some experts, like Dr. David Kessler, former head of the FDA, say that sugar, fat, and salt are addictive (The End of Overeating), and that our brains become conditioned to demand them and feel unsatisfied when deprived of them.  The obesity epidemic is currently at alarming levels, and the trend keeps going up.  It is not hard to see why, and although experts disagree on the reasons, most agree that a sedentary lifestyle, poverty, and advertising are major contributing factors.  Americans don’t get around much any more.  Poor people can only afford cornmeal, fatback, fried foods, and McDonalds; and children and adults are barraged with ads for junk food.

Other critics have noted that the US agricultural subsidy program favors foods like corn, rice, and potatoes, which are high in starch and calories, rather than healthier foods.

The government is complicit in this phenomenon.  There are no direct subsidies for vegetables, but potatoes receive generous federal dollars. Potato subsidies in Maine alone totaled $535,858 from 1995-2010.  Idaho, Washington, North Dakota, Wisconsin, Colorado, Minnesota, California, and Michigan are also recipients.  Cheap potatoes allow McDonalds and other fast-food restaurants to offer huge portions for next to nothing. (http://www.uncleguidosfacts.com/2012/05/obesitywere-in-it-for-long-haul.html)

Finally, there is a psychological dimension to obesity; and many recent studies have concluded that depression and obesity often occur together as co-morbidities.  On a less toxic level, diets high in carbohydrates are especially attractive in poor countries because they at least produce a feeling, although temporary, of satiety and satisfaction.

There are also psychological dimensions which favor weight loss and other salubrious changes in behavior.  Wang reports on recent studies which have shown that realism, as opposed to optimism, is key to behavior change.  In other words, if unrealistic goals (“the false-hope syndrome”) are set for weight reduction, people will be discouraged when they do not meet them and will give up the effort altogether.

More successful programs have set realistic goals for dieters, and although the time necessary to lose the weight desired may be longer, the chance of achieving them is higher.  Removing obstacles to dietary change is also key:

People who prepare plans on how to reach their goals, which psychologists call "implementation intentions," are more effective at reaching goals, by spelling out in their minds what they will do if an obstacle arises, says Peter Gollwitzer, a psychology professor at New York University. If a cookie tempts you every time you walk into a cafe, come up with a plan to reach for an apple.

These rather simple and mechanistic approaches to weight loss may be exactly what some already-motivated people need to become more fit.  However, for most people it is not so easy.  As mentioned above, poverty is perhaps the most important factor in weight, and few people can do much about it.  Better-off Americans can shop for the freshest and tastiest vegetables at Whole Foods, exercise at a private gym, and be surrounded by other trim people (important because research has shown that we tend to unconsciously mimic our peers).  Poorer people have none of these advantages.

So, it is all well and good for the professional living in Northwest Washington to devise and follow a weight-loss program based on the sound behavioral principles presented in Wang’s article, up his frequency at the gym, and try some of the latest designer greens; but most obese people are stuck in a cycle of restricted income which leads to high-calorie, low-nutrition foods; which lead in turn to addiction to fat, salt, and sugar; which, combined with the energy-sapping two-job daily routine, result in obesity.

My article Obesity – We’re In It For The Long Haul details these and other factors relating to obesity and behavior change in general.  As the title implies, I am not very optimistic about reducing obesity levels any time soon.  There are simply too many factors which contribute to it for any private or public advocacy group to have much effect.  Smoking rates should come down because of the radical change in social norms and the high price of cigarettes.  It is now unacceptable by social standards and by law to smoke.  However, because of addiction, smoking rates are likely to plateau at a high but ‘acceptable’ level.

Tuesday, December 25, 2012

Cyber-bullying–Towards A More Rational Response

Playground bullying is and always has been a rite of passage. I wrote about my own childhood experiences (http://www.uncleguidosfacts.com/2012/01/bullies.html) and concluded that I and my classmates learned to deal with bullies in many ways.  Some of us just ignored them, understanding that they were social misfits and of little earthly use to anyone and would eventually go away.  Others went crying to parents and teachers who more often than not told them to grow up and to deal with it.  A few others stood up to the bully, backed him down, and forced him to reveal his cowardice and real lack of character and conviction.  My point was not to canonize those who stand up and fight, but to suggest that the playground is the crucible within which we all learn how to deal with life’s unwanted intrusions and aggressions and each of us does this in different ways.  If we don’t learn lessons of social survival on the playground, then we will have to learn them later and more painfully.  Pretty much like childhood illnesses. 

In another article I wrote about how teachers and parents should mediate bullying incidents.  I felt that there was too much PC control, and that attempts by well-meaning teachers to stop all bullying – attempts which often resulted in unfair and unreasonable dismissals and suspensions on the flimsiest unproven allegations – were wrong and counter-productive.  Teachers were already charged with stopping fights, physical intimidation, and classroom violence, regardless of the cause; and to me that is where their authority should stop.  It is up to parents to counsel children just as they would about any other social hazard, and then it is up to the children themselves to translate that advice into practice (http://www.uncleguidosfacts.com/2012/10/bullyingfinally-reasonable-response.html).

A lot has been made recently of cyber-bullying – using the Internet and social media to harass, intimidate, humiliate, and punish others.  Critics have said that such cyber-bullying takes the playground, in-your-face variety to another level altogether, and that it is the anonymity of cyber-attacks which makes them so much easier to carry out.  A quick review of the literature suggests what most laymen can surmise.  Cyber-bullies, like their non-virtual counterparts, act out adolescent insecurities, expressing unformed, stereotypical notions of power and social status in aggressive ways. In addition, all animals, human beings included, peck the Ugly Duckling out of the group.  Outsiders have always been threats to social integrity and cohesion and always will be.  Social status is and always will be the prevailing feature of any society.  The pecking order is necessary for social order as well as for the survival of the fittest. This combination of the natural human tendencies toward group homogeneity and status, plus adolescent insecurity and the immature development of personal moral and ethical codes, is a heady mix.  It is no wonder that bullies exist.

However, cyber-bullying does add some new and important dimensions.  First, it allows ‘nerds’ to wreak the same havoc as the jocks.  In most American school environments, the athletes form the in-crowd, the cool clique, the most admired; and in most schools they are the ones who, in their immature adolescent desire to consolidate their power, intimidate, humiliate, and marginalize others.  These jock-bullies are no different from authoritarian rulers.  Actual power is never enough. Constructive, positive achievements are unsatisfying.  Only through total dominance and perceived supremacy can they finally enjoy their reign.

In the age of the Internet, the geek and the nerd have achieved parity.  They can exact revenge, demonstrate the power to humiliate and destroy, and rise in social status as well as enjoy overdue psychological satisfaction.  The in-crowd uses cyberspace to diversify its bullying, not necessarily to add to the power and status it already has in reality and in abundance.  Nerds gain power; and at least in the eyes of their equally nerdy friends, they gain in status.

Cyber-bullying also allows for more creative and inventive bullying.  Emails, texting, voice mail, and social media are all dynamic media available to the cyber-bully.  Whereas a bully might confront his prey once a day on the playground, now he can pursue, harass, and stalk constantly and continuously.  The person bullied cannot get away.

Other than these two points, bullying remains the same.  Many bullies suffer from anxiety and depression, have little motivation for or participation in school, and have disengaged parents.  While most adolescents grapple with their insecurities, fears, and image alone, bullies appear more damaged than most and need to bully to cover over wounds already bleeding.

There is one additional factor about cyber-bullying that causes concern – many bullies report that they bully on the Internet because they are bored; i.e., they do it for entertainment.  This is perhaps the most troubling aspect of all.  Children are not bullying out of immaturity, psychological pain, or unformed social constructs.  They do it because they have nothing else to do.

Those advocacy groups especially concerned about bullying have suggested that cyber-bullying is the direct cause of teenage suicides.  While this cry has received a lot of attention and certainly has rallied many previously indifferent parents around the issue, it is not true.

A new study released at the American Academy of Pediatrics (AAP) National Conference and Exhibition this past weekend shows that cyber-bullying is rarely the only reason teens commit suicide. Most suicide cases also involve real-world bullying as well as depression. (Huffington Post, October 2012)

The real issue, then, is what to do about cyber-bullying, and the public outcry may well alert parents, teachers, and advisors to get more involved.  However, as we have seen in the case of  PC overkill in schools, there is a tendency to overreach and to invest in measures that address the symptom, not the cause.

In the field of behavior change, researchers always look first not at those whose practices need modification, but at those whose practices don’t.  In those countries in which I developed social marketing campaigns to positively change nutritional, health, and reproductive behavior, I first identified and isolated the families who were doing things right – despite the same socio-economic constraints as other families, they managed to understand the problem, figure out a way to deal with it, and prosper.   The same is true for bullying.  Every child has been bullied at some time in his or her life, and most have survived. They may have suffered at the time, but in so doing they learned how to react in the future.  Very few children without serious underlying psychological disorders commit suicide.

Why is this? The answer lies in part in strong parental guidance.  A mechanistic approach to bullying (“If you are bullied, report it to your teachers and to your parents”) is clearly not enough.  Most children do not want to divulge their humiliation, especially to teachers and parents; and invasive surveillance of children’s communications is wrong and counter-productive.

Providing a strong moral education, one which focuses on principles rather than individual behavioral issues, enables the child to understand aggression, insult, and power in a more general way.  Respect, honor, fairness are right and proper.  Disrespect, humiliation, injustice are always wrong.   If there is a fundamental, underlying cause to bullying, it is in the lack of firm moral architecture within which the child can grow up. Children who receive this basic moral guidance are less likely to be bullies or to be bullied.

A good example of this is cited in my blog post, referenced above, Bullies – Finally A Reasonable Response:

The “Be More Than a Bystander” campaign, orchestrated by the nonprofit Advertising Council underscores the problem with a series of television, print and online ads and a Web site promoting the idea that if witnesses know what to do, they can take various steps, such as moving the victim away from the situation or reporting the treatment to an adult, to defuse the bullying.

This approach also rejects the intrusive, politically-correct policies that try to control behavior through policing.  It accepts that bullying is part of growing up, but also acknowledges that it can and should be stopped by peers.  The approach encourages moral courage on the part of the observer and provides a face-saving out for the bullied (better to be helped by a friend than rescued by an adult).

A lot of this well-financed ‘Don’t be a bystander’ campaign still focuses on what its designers consider the pernicious (as opposed to normal) aspects of bullying and still wants to engineer PC corrections; but at least it brings the issue down to practical basics. Bullying exists.  Children need to learn how to deal with bullies, who will never go away; and they need to learn how right it is to stand up for the defenseless.

In conclusion, although cyber-bullying has indeed added a new and troubling dimension to the problem, bullying still is and always will be a part of growing up.  A strong moral framework established by parents provides the foundation for children’s actions.  If brought up within these strict moral codes, children are likely to be the heroes of the ‘Don’t be a bystander’ program; and even if they are not, they are at least armed with the knowledge that it is the bullies who are damaged goods. Teachers will continue to have a role in policing the most egregious bullying offenses; and parents, once they find out about a cyber-attack in which their child has participated, must be as stern and unaccommodating as they would be for any other serious offense.   Finally, children will simply have to learn how to deal with bullying on their own, for young people today live in a world of Internet exchanges and social networks.

If there is more bullying today, and if it is of a more insidious and pernicious nature, it is wrong to blame only the Internet and the distance and anonymity that it affords.  If children bully more aggressively and cruelly, it is because the traditional brakes on anti-social behavior are not merely going unapplied – the brakes themselves are non-functional and out of date.  If children live in a new electronic world, then parents, educators, and religious leaders need to expand their moral precepts to cover all the new opportunities for dereliction of social responsibility.

Monday, December 24, 2012

In Praise Of Snow

I hate the cold. I hate the encumbering ritual of sweaters, hats, gloves, scraping the windshield, arguing about the thermostat setting, feeling icy winds sneak down the collar of my parka and up my sleeves, shoveling snow, skidding on unplowed streets, slipping on patchy sidewalks.  I hate all of it.  When others talk of the magic of snowflakes, the pristine beauty and serene silence of the hours after a snowfall, I only see my car under two feet of snow, the boxwoods snapping under the weight of the snow, and the unplowed street.  I wait for power outages, tree limbs coming down and blocking the street, and being trapped for days, cottage-bound by the snow.

I think only of warm weather in winter.  I dream of the palm trees, blue waters, and soft breezes of the Caribbean.  I picture myself on a chaise longue in a grove of coconut palms, doing nothing but being warm.  It is lazy, languorous, sensuous, and a perfect stasis.  My body temperature and the ambient temperature are almost perfectly balanced.  There is no shock of abrupt change. In one of the most pleasant winter vacations ever, I stayed on the Samaná Peninsula, at the time a remote northern piece of the Dominican Republic.  There were only two ways to get there – driving over steep mountainous roads that climbed high above Puerto Plata and then descended to the isolated town of Las Terrenas; or flying in a single-engine Piper Cub from Santo Domingo.  Neither was a good option.  The roads over the mountains were badly maintained, and the last leg down from the highest point to the beach was an unpaved, steep, and rutted track.  The convection currents rising from the hot land, caught between mountain peaks, swirled and gusted, and turned the short flight into a stomach-pitching, wrenching affair.

It was always worth it.  In the days before Samaná became a popular destination and multi-storied condos were built to accommodate vacationers who came over good roads and on comfortable twin-engine Cessnas, the resort at Las Terrenas was a simple affair – wooden cabanas, with slatted, screened windows and ceiling fans; prix-fixe stays with locally-caught lobster, pompano, corvina, and all the rum punches and piña coladas you could drink.  The water was warm, turquoise, and shimmering under the tropical sun. On either side of the white sand beach were stands of mangroves, and I would walk along them in the cool shade looking out at the expanse of the Caribbean.  There were few guests at the resort at the time, and these quiet, solitary walks were peaceful, untroubled, and completely relaxing.

The main thing was the climate.  I luxuriated in the perfect warmth and the humidity which caught and held the scent of tropical flowers and the sea, which smelled fresh and earthy in the morning when a light breeze blew down the mountains just before sunrise.  I was never cold, never even gave a thought to a sweater, and stayed happily in shorts, T-shirt, and sandals for the time we were there.

I grew up in New England in the days when winters were still brutal and snow fell before Thanksgiving.  I built igloos on the front lawn, tracked rabbits in the snowy woods behind our house, had snowball fights, and built elaborate tunnels around the house.  When I was older, I walked at night onto the golf course across the road, stood on the snow-covered green of the Fifth Hole in the bitter sub-zero cold, and looked up at the perfectly clear black sky.  On those impossibly cold nights the sky was impossibly black but somehow luminescent with the wash of the Milky Way and the billion stars around it.

On sunny winter days, the temperatures still in single digits, I would again walk out to the golf course, and lie against the bank of a snowy sand trap.  The sun even in the depth of winter still had some warmth, and the contrast of the intense cold, the brilliant white snow, and the warmth of the sun radiating through my clothes was exciting and satisfying.

I will never forget my family’s first trip to Florida in the depth of winter.  It was the early Fifties, so we took the overnight sleeper train.  All through the night I kept waking up to look out the window to see if the snows had receded and the sandy beaches had begun.  After many hours the train made a stop – not an official station-stop, but an improvised one in the middle of an orange grove.  As soon as the conductor opened the doors of the train car and lowered the steps, I smelled the sweet, unbelievably fragrant scent of orange blossoms. The sun was warm and everything was like summer – but no Northern summer, more a kind of designed tropical climate, more imagination and fiction than fact. We bought glasses of freshly-squeezed orange juice from vendors.  I had never tasted such sweetness, such fresh, pulpy, thick, and fragrant juice.  I was hooked.

I never looked back after that first seminal trip south.  Although I continued to take my freezing midnight walks and sojourns on the sunny fairway snow-dunes, I always wanted to get away from the cold and the layers of heavy clothes and to return to Florida.

For my entire professional career I worked in hot places – some brutally hot like Ouagadougou in the summer (120F), Bihar in May (125F), and Bombay just before the monsoons (95F, 90 percent humidity) – but rarely did I complain.  I was happy to be able to retreat to an air-conditioned room after a day’s field trips to scorched villages, banging over animal tracks and rock-hard mud ruts in an old Jeep; but I still luxuriated in the heat, and always sat outside in the evening when the temperature fell to just under 100F, the mosquitos had not yet come out, and the slightest hint of coolness blew in from the north.

My most fixed memories are those of tropical places – the remote, palm-lined beaches of southern Sri Lanka; Copacabana; the southern shore beaches of Port Salut and Jacmel in Haiti; and the Teranga Hotel in Dakar overlooking the Atlantic from the Corniche.  My favorite was the beach outside Banjul in the Gambia, where the high surf pounded under our cantilevered hotel and onto the long beaches nearby. I spent every weekend on the black sand beaches of El Salvador, eating ceviche prepared fresh by local women for lunch, and corvina, lobster, and grilled snapper at night.

Coming back to Washington in the winter after those tropical idylls made the reentry even more difficult.  My life as a consultant was a footloose and free one.  For four months of the year, under the cover of a well-paid contract, I enjoyed the freedom of travel, the independence, suspension of responsibility, and adventure.  Coming back to a cold, grey, and barren city was depressing.

I have always heard of Eskimos who have 100 words for snow just as the nomadic Arabs of The Empty Quarter have as many for sand and dunes; but I never suspected that they liked snow or saw any beauty in it, or preferred the Arctic cold to anything warmer.  I assumed that they, like many indigenous populations, simply did not have the resources to move to more accommodating climes.  When I lived in Bolivia, I commiserated with the Indians of the altiplano who could never get warm.  Fuel was scarce so high above sea level (14,000 ft.) and burned fitfully in the thin air.  When it rained, the cold was even worse.  Everything was damp and wet, and wool clothing was rank with animal oils and human sweat.  Why, I asked, didn’t these Quechua and Aymara Indians simply go down the mountain?

In an article in The Atlantic (12.24.12, reprinted from the January 1995 edition), In Praise of Snow, author Cullen Murphy writes:

"If you're talking about snow crystals in the atmosphere," said Mark Williams, geographer at the University of Colorado and a specialist in the properties of snow, "well, then, there are scores of terms. There are needles and sheaths and columns. There are pyramids. Cups. Bullets. Plates. Scrolls. Branches. Dendritic crystals. Stellar crystals." And those are just some of the basic forms. Snow crystals also come in combinations. Stellar crystals with plates. Dendritic crystals with branches. Hollow bullets. Bullets with dendrites. Plates with scrolls. Plates with spatial dendrites. Rimed particles. Rimed needle crystals. Lump graupels.

Graupel-like snow with nonrimed extensions. Some of the names of snow crystals (branches, needles, bullets) are appropriately suggestive: in high wind, snow crystals can be as abrasive as sand.

After snow has fallen, the name for it picks up additional qualifiers as it begins to settle or drift, as heat and cold and wind and moisture and the snow's own weight begin to make their influence felt. Freshly fallen snow starts out as what Williams calls an "ice skeleton"—a loose scaffolding of crystals amid an enormous volume of air.

Snow for Murphy and for Williams is a universe in itself with unimaginable diversity, complexity, and beauty.  Although the luxury of such poetic appreciation is affordable only to the warm scientist and writer, both have explored and presented a world most of us have ignored.  Both are interested in snow, however, less for its aesthetics than for its practicality.  Snow is necessary for survival – not just for the Inuit and other Arctic tribes and the animals they hunt, but for ranchers in the West who rely on melting snowpack for the water to irrigate the farms and water their livestock.  On a recent trip to Colorado, I learned from a rancher friend that there was almost no snowpack last winter.  Without water, animals had no feed and had to be sold off at low prices.  The West was literally drying up.  Recent reports are that the Mississippi River is down almost below navigable limits; and while the river system is fed principally by rain, the lack of snow in the north has slowed runoff into the Missouri.

It is snow that powers the great rivers of the West—the Colorado, the Rio Grande, the Columbia, the Missouri—on their long journeys through sometimes parched or semi-arid terrain, ribbons of brown and silver that at times enverdure entire basins, at times support the merest Nilotic fringe of green. How much water does the West's winter snow turn into? The snowmelt that finds its way into the Columbia River alone in an average year comes to 26 trillion gallons, which is 81 million acre-feet—enough to cover all of Kansas in knee-deep water, or to raise Lake Michigan by almost six feet.

I had never appreciated the importance of water – and by extension, snow – to the West.  Land itself has no value except in relation to the water which nourishes it.  Water rights are complex and highly political.  Both government regulation and private community collaboration are necessary to marshal this essential resource.  For those of us living in the East where rainfall is abundant and snow is for skiing or shoveling, it is hard to understand the Western obsession with water.  In much of the West, rainfall is only a small fraction of Eastern levels, and crops, rangeland, and ranches can only survive from mountain snow.  If it disappears, so do they.

As I prepared to leave Santa Fe, the space shuttle Endeavour was high overhead, in the midst of a successful ten-day test of its new radar. I drove north out of town to the banks of the Rio Grande, which flows through a broad plain between the Jemez and the Sangre de Cristo mountains. The snow in the lower elevations had begun to melt, and the river, though it was still shallow and slow-moving, had begun to rise. According to the newspaper that morning, the Sangre de Cristo Water Company's reservoirs in the Santa Fe Canyon, which trap the spring snowmelt and were now nearly full, would be releasing water into the Rio Grande in a matter of weeks.

Looking up, I could see the alpine snowpack—still intact, and, on average, about ten feet deep, according to information I had received from the Soil Conservation Service. Or, as I might have put it at another time, "The snow 120." The cottonwoods along the Rio Grande displayed the haze of fuzzy lime-green they briefly exhibit every spring, reminding me that this was exactly the time of year that Horace had been writing about: "The snows have dispersed, now grass returns to the fields and leaves to the trees."

This was written in 1995, before Western snowpacks began their recent decline; and in this passage the author expresses his appreciation of the beauty of the system that waters the West.  The entire article is a paean to snow – its complexity, its beauty, and its fundamental role in natural ecology.  Even I, who get a sinking feeling every time I see the first snowflake fall, come away impressed.  I doubt I will ever get over my Washingtonian’s fear of snow after so many years travelling in countries that are warm and tropical.  I may regain some of the childhood wonder I had for snow, and may someday venture north in winter; but I doubt it.  My mind, spirit, and ambitions are now all south.

Saturday, December 22, 2012

Why Are White Men Mass Murderers?

There have been two recent articles on the relationship between race, gender, and mass murder in America – one by Christy Wampole (New York Times 12.17.12) and the other by Sarah Jane Glynn (Atlantic 12.20.12). While both writers agree that white men do most of the killing (65 percent of such killings are committed by white men), they disagree on why.  Wampole suggests that it is because white men have become the new disenfranchised: women and minorities have usurped their traditional role as the dominant group in American culture, and they are frustrated and enraged to the point of madness.

I would argue that maleness and whiteness are commodities in decline. And while those of us who are not male or white have enjoyed some benefits from their decline, the sort of violence and murder that took place at Sandy Hook Elementary will continue to occur if we do not find a way to carry them along with us in our successes rather than leaving them behind.

Can [non-white women] imagine being in the shoes of the one who feels his power slipping away? Who can find nothing stable to believe in? Who feels himself becoming unnecessary? That powerlessness and fear ties a dark knot in his stomach. As this knot thickens, a centripetal hatred moves inward toward the self as a centrifugal hatred is cast outward at others: his parents, his girlfriend, his boss, his classmates, society, life.

Glynn responds:

White men are doing just fine. On average they earn the highest wages, and are the most likely to be employed when you control for the fact that they also have the highest labor force participation rates. The idea that our economy is a zero-sum game, and that women and people of color are somehow stealing opportunities from white men is old, insidious, and untrue. So where is the problem? What makes men more likely to be violent if it isn't the advances of women and people of color?

Men have been committing a disproportionate amount of lethal violence for a long time—as long as we have been keeping records. Men are both more likely to be the victims and the perpetrators of lethal violence—except for intimate-partner violence which is much more likely to be committed by men against women.

I think that both writers overstate the case. I believe that it is the combination of a persistent male ethos of aggression and dominance; chromosomal sexual differences; the male gun culture; and mental illness which provides the rocket fuel for male havoc. I would add that a culture of resentment, frustration, anger, and hatred is the ideal medium within which individual killers mature.

There has been a vigorous debate since the emergence of Feminism in the late Sixties concerning male and female ‘nature’.  Are boys genetically programmed to be more aggressive and violent, and girls more conciliatory, social, and caring?  Or are these attributes simply social constructs?  To this day, academics still write about the coming and going of Barbies.  Why is it, asks a recent article (Guys and Dolls No More, Washington Post 12.22.12), that little has changed since the halcyon days of early Feminism, when gender-neutral was the operative term of the day, and that parents still buy frilly, pink, girly things for girls, and trucks, tanks, and macho video games for boys?

While many, such as the author of the Guys and Dolls article, argue that it is the pervasive and insidious influence of the media, their marketers, and the still-homophobic culture of America, others would say that nature always trumps nurture in matters of behavior.  It is a well-known observation among parents that if they remove all toy guns from a boy’s playroom, he will use carrot sticks as pistols and broom handles as rifles in violent, aggressive games with his friends.  And if they give girls trucks, tools, and equipment, the girls will ignore them and find something to cuddle instead.

Regardless of the eventual outcome of the debate, boys are still brought up and nurtured to be boys. They still are the biggest consumers of violent video games, the enthusiastic supporters of violent sports, and the most rabid spectators at NASCAR races, the most male of all sports – speed, risk, fiery crashes, and raw aggression.

The gun culture is also predominantly male.  Men hunt and shoot far more than women, and most of the hunters on Versus are men.  Men like to hunt and kill; and it is hard not to reflect on pre-history, when men did exactly that on the African veldt while women were gathering and cooking. With a few exceptions (a muscular Linda Hamilton in Terminator 2), it is men who are the shooters (heroes or bad guys) in the movies. Guns have always been an extension of maleness.

According to NIMH (the National Institute of Mental Health), over 26 percent of Americans suffer from mental illness, and nearly half of those suffer from two or more mental disorders. A combination of any of them – e.g. Post-Traumatic Stress Disorder, Bipolar Disorder, Schizophrenia – can be lethal.  The rate of Serious Mental Illness (SMI), already high in the national population (5-6 percent), is near 25 percent in prison populations, suggesting that many inmates have some personality disorder when they enter prison and when they leave.  Recent observations in the light of Sandy Hook (Newtown) have suggested that the rates of both mental illness and especially SMI are far higher than reported because of our poor diagnostic tools.  Many mentally ill people escape detection.

This combination – aggressive men whether by nature or nurture; a gun culture which extends maleness and idealizes it; mental illness which leads to irrational, unexplainable anti-social behavior – is enough to cause outbursts of havoc.  However, two more elements have to be added to the mix.

Guns are everywhere, and the number of guns per capita in America beggars that of any other Western country.  The gun culture has been encouraged by a liberal interpretation of the Second Amendment, by nostalgic and idealized memories of the rugged individualism of the Wild West, and by a persistent and unsettling growth of conspiracy theories, most of which speak of insidious national and international threats to ‘Our Way of Life’.  It is easy to get guns, and easy to get guns that kill quickly and efficiently.

Finally, I would suggest that the culture of frustration and resentment felt in many parts of the country must be considered an important crucible of individual violence.  Many parts of the South still harbor a visceral and unnamed hostility toward the Federal Government because of the Civil War, Reconstruction, and Civil Rights – all of which either brought down the Southern economy and social order or enforced radical changes in them.

This generalized hostility is not the preserve of the South.  I have been with Angelenos who become apoplectic at the mention of illegal immigration.  In most cases, Mexican labor in no way threatens their personal livelihood.  The rage is coming from another place, a deeper place – an anger that America itself is being overrun, overtaken, and fundamentally changed.  I recently met a Maryland contractor who felt Communist, Socialist, Anarchist, and Jewish threats all around him.  He was doing fine, he told me.  He just couldn’t abide what was happening to the country.

The only unanswered question concerns race.  Why are mass murderers overwhelmingly white?  One explanation is that for a variety of psycho-sociological reasons, black violence is street violence, mostly confined within black neighborhoods.  Survival is an immediate, daily challenge, more than enough to occupy the lives of poor, minority inner-city residents.  They are less infected by conspiracy theories and political frustration and more corrupted by internecine warfare over drugs, money, and local power.

Some observers have noted that white men still feel a sense of entitlement, and that the rage and frustration noted above are made especially acute by a perceived generalized threat to them.  It is not so much a feeling of individual loss or deprivation – a black woman getting preference over a white male – as a sense that ‘they’, the undefined, socializing government, are taking it all away.

If this is the case, then black people, still affected by vestiges of 19th century racism, have far less to be angry about.  Government has, at least since the 1960s, been there to help, not to take away.

The racial element in mass killings is the least well understood.  Researchers still tip-toe around the subject of race in all areas of inquiry.  It has been exposed to more scrutiny now that it is white males, not blacks, who are the perpetrators; but few convincing theories have emerged, and most ideas are still speculation. It is also important to reiterate that the remaining 35 percent of mass murders are committed by men who are not white – a not insignificant proportion – suggesting that facile conclusions must be avoided.