"Whenever I go into a restaurant, I order both a chicken and an egg to see which comes first"

Sunday, September 30, 2012

Sacrifice–Is That Really What Is Missing Today?

Frank Bruni has written in the New York Times (9.30.12) about the lamentable absence of sacrifice in today’s America.  He notes that scant mention was made of it in the acceptance speeches of Obama and Romney, and that the only president who talked about it – Jimmy Carter – was lambasted for asking Americans to turn down their thermostats and for saying in 1979, “Too many of us now tend to worship self-indulgence and consumption.”

Dwight Eisenhower, ridiculed by the Left while in office as a do-nothing president, is now increasingly revered by those same critics for his perspicacity and wisdom, particularly his warning about the dangers of the ‘military-industrial complex’.  In his farewell address of January 1961, the speech in which he made this reference, he also said something else visionary and wise:

As we peer into society's future, we -- you and I, and our government -- must avoid the impulse to live only for today, plundering, for our own ease and convenience, the precious resources of tomorrow. We cannot mortgage the material assets of our grandchildren without risking the loss also of their political and spiritual heritage. We want democracy to survive for all generations to come, not to become the insolvent phantom of tomorrow.

Sacrifice is at the heart of most religions and is an important element of family and society.  Every family has a story of personal sacrifice – the immigrant grandmother who scrimped and saved to give her promising son a college education; the brother who donated a kidney to his sibling; the mother who worked two shit jobs to make ends meet. In many ways sacrifice is the glue which holds us all together.

Bruni says that lack of sacrifice is what is wrong with America today.  In this hyper-individualistic world we now risk losing our moorings.  Maybe we are willing to sacrifice for our children, Bruni implies, but not for anything larger.  The concept of financial or economic sacrifice for the good of the country is ignored.  Raising taxes is anathema, and if it is mentioned at all, it is in reference to the filthy rich – i.e., no sacrifice required there.

Bruni cites Ronald Reagan’s famous ‘Morning in America’ speech as a watershed moment when we once and for all jettisoned the wimpy, defeatist sentiments of Jimmy Carter and got our mojo back.  Reagan’s constant upbeat message that America is great and all’s right with the world was just what we wanted to hear.  No sacrifice needed here.  Work, enterprise, and a positive attitude would generate wealth and prosperity for all.

In this worldview no one is required to sacrifice.  Everyone is, in fact, entitled to enjoy the bounties that America can provide without pain or strain.  Now that America has temporarily failed in that promise, and most of us are feeling some pain, isn’t it time for sacrifice once again, for each one of us to contribute to the common weal as our Founding Fathers envisaged? Not exactly.

What is seen in this time of constraint is another expression of NIMBY.  Sure, say most Americans, sacrifice is exactly what we need – but from HIM, not me.  I didn’t get us into this mess, goddammit, so it is not my problem.  Every possible excuse is given for not raising taxes: they are unwisely spent; government programs are cesspools of waste, fraud, and inefficiency, or are unnecessary entitlements to people who should be working for a living; etc.

Most Americans are happy with the all-volunteer army.  If young people want to join the military, we say, that’s great.  Get trained, good benefits, pull these kids out of the projects – no sacrifice required because they want to serve.

“As we baby boomers became adults, less than 1 percent of the population served in the military,” wrote Matthew Paull and Steve Krause in an article in The New Republic last year. It was titled, “Why Are the Children of the ‘Greatest Generation’ So Selfish?”

“In World War II, that figure was over 10 percent,” the authors continued, later adding: “With relatively few of us sharing the bonds, lessons and sacrifices of military service, perhaps there is little widespread experiential counterbalance to each of us pursuing only our self-interest.”

Not mentioned is the fact that if there were a national draft – much fairer than the one in force during the Vietnam War, when mainly the poor were called up – then we most certainly would not enter wars of questionable purpose.

Harping on the loss of the moral integrity that sacrifice implies, however, is missing the point.  It is fairness which matters most to Americans, and there would certainly be more willingness to pay taxes or to serve in the military if citizens felt that no one was getting a cushy ride while others were being screwed.  Hyper-individualism might be the result of unfairness rather than the cause.  A fair tax system may be impossible today for many reasons.  The tax code is labyrinthine and impossibly complex, with direct and indirect benefits so buried that any individual would be hard-pressed to discover that his taxes are going for a road-to-nowhere or to protect the Flecked Nuthatch, let alone sort out the pros and cons of the subsidy.  Most of us who live in DC and many other big cities are incensed at the way our taxes are raised, then wasted by corrupt, patronage-driven, and venal politicians.

The same can be said of military service.  If the wars were just, then we would be willing to sacrifice for them.  As it is, all but World War II have been of dubious value.  Who can forget the famous words of Muhammad Ali – “I ain’t got no quarrel with the Viet Cong…(No Viet Cong ever called me nigger).”

Bruni, however, misses perhaps the central issue in the sacrifice debate.  Most people contribute to the common good without thinking about sacrifice; and that is the most important and central concept of American democracy – through individual enterprise, done for itself, the many will benefit.  Most people work hard and through this employment give others employment.  Their wages are spent on food, clothes, housing, entertainment all of which are important cogs in the engine of America’s economy.  Wall Street bankers, no matter how much they are vilified, contribute to the country’s wealth.  By making companies profitable they not only increase shareholders’ revenue, but expand their operations, adding jobs and wealth.

Sacrifice is not required, therefore, if everyone does their job to the best of their ability and within moral and ethical boundaries.

In short, a focus on sacrifice may be misleading.  Fairness and living according to a moral and ethical code may be even more fundamental and important.

Saturday, September 29, 2012

Memories of The Second Grade

My first memory of elementary school was from kindergarten, where this baby-breath fat kid kept poking me in show-and-tell.  My mother said he was probably still sucking the bottle, was infantile, and that I should pay no attention to him.  My father told me to hit him in the nose the next time he did it.  Funny how only pieces of memories stick with you.  I remember the rancid baby breath, my mother’s comment, and my father’s advice more than I do what happened later.  I would like to think I popped the fat kid, but I honestly don’t remember.

I do remember doing ‘a fat kid’ and shoving little girls into the thorn bushes on the walk down Commonwealth Avenue to the Stanley School.  I got a wicked lashing from my father, more because he felt he had to be polite to the fathers who furiously complained “to do something about that kid or else” than for the actual transgression.  Boys will be boys was his motto, and he was just taking out his own frustration on my ass.

My next memory, a happy one, was from second grade.  I was the teacher’s pet because I was such a good student and behaved (no further macho incidents), and on the last day of school she offered me a going-away gift (I was transferring to another school) – anything on one of the bookshelves.  I chose a copy of the Odyssey and a tiny cactus plant in a small, porcelain frog.  I still have the book.

I have fragmented memories of third grade, and for some reason Robert Macciato has stayed with me.  He was a tall, uncoordinated, very dumb kid who kept asking the teacher what she meant.  In those days most subjects were divided by ability-groups, and Robert was always in the ones at the bottom; but in more general classes such as geography, we were all together.  The sessions were bad enough – I had gone through the Rand McNally World Atlas many times at home – and Robert Macciato made them even worse.

I have no recollection of what the classrooms were like, just a collection of desks and chairs, a blackboard, chalk, and big erasers.  I walked the mile from home, sat through the morning, walked home for lunch, then returned to finish the day.  I moved from one class to the next without incident or note.  My real memories only began when I started seventh grade in a small country day school near my house. It wasn’t the school that cemented the memories, but early adolescence.  My most lasting memory – a look up the sleeveless blouse of Nancy Barth – is a good example.

Despite my indifference to my surroundings, educators and administrators have been tinkering with the school environment for at least a hundred years.  It was not only important what you learned but how.  A retrospective of these reforms was written by Alison Lurie a few years ago in The New York Review of Books (12.18.08):

Until the mid-eighteenth century boys and girls were often seen as miniature adults, as uncivilized imps of Satan, or, with John Locke, as blank sheets of paper on which a parent or teacher could inscribe knowledge and morality. The Romantic movement of the late eighteenth century cast the child as a Wordsworthian innocent, naturally good and eager to learn; it also had important and lasting, though far from universal, effects on the physical form of schools.

Louisa May Alcott represented the radical educational theories of Ralph Waldo Emerson in Little Men and Jo’s Boys – that all children are potentially good, and that if they are educated with kindness and according to their individual needs they will grow up to be worthy citizens of a democracy. The fictional school looks like a large, comfortable family house, which it once was, and is surrounded by orchards and woods. The children have their own garden plots, keep pets, and go on educational nature walks.

The Waldorf movement, begun in Europe in 1919, believed in encouraging individuality and creativity, and “its schools today, in many parts of the world, still tend to be rambling, informal-looking buildings that sometimes recall Alpine chalets, with their steep overhanging roofs and peaked gables; a few look rather like the fantasy houses of Oz”.

Montessori schools stressed self-directed learning and physical activity and a special appreciation of nature. “These schools, too, tend to look like large, comfortable houses surrounded by grass and trees. Inside, their classrooms are full of samples of the natural world – ant farms and chickens and white mice – and the walls are papered with the children’s drawings and paintings.”

Closer to our day, equally radical ideas about educational space emerged, themselves products of the social revolution of the 60s and 70s:

The open-classroom movement had a significant though not always lasting effect on school design. One of its central texts was Herbert R. Kohl’s The Open Classroom (1969). Kohl criticized the hierarchical structure of contemporary schools, which, he believed, consciously or unconsciously taught obedience to authority and suppression of ideas and opinions.

Along with the campaign for open-space schools went demands for more comfortable learning areas. The sociologist Robert Sommer, for instance, criticized what he called “hard classrooms,” with tile floors, institutional furniture, dull colors, and overhead fluorescent lighting. He recommended instead a “soft classroom,” furnished with carpets, upholstered benches and hassocks, floor pillows, and spot lighting.

The structure of the classroom changed along with this new overall design. Gone were the old-style fixed one-person desks and chairs to be replaced by collaborative learning units. “This kind of classroom plan suggests that it is natural for teams or groups rather than individuals to compete, and that you will belong to different groups at different times.”

The story gets more interesting when politics gets involved.  Politicians of all stripes, impatient with the pseudo-reforms characterized only by rearranging desks and chairs while student performance languished in the doldrums, lamented the passing of ‘The Little Red Schoolhouse’.

Liberals liked the small classes, the mixing of ages and skills, the informal scheduling, and the individual attention. Conservatives praised the one-room schoolhouse for its basic no-nonsense curriculum of “readin’, ritin’, and ‘rithmetic” and its emphasis on order, discipline, and obedience to the teacher; they also pointed out that in many one-room schools daily prayer and Bible reading were part of the curriculum.

One of the greatest bits of intellectual gymnastics comes from those who say that the cushy environment provided by warm, welcoming, colorful schools with romantic links to nature and the wider world is wrong:

Childhood should be presented as a lesser and more limited and uncomfortable state of being, while adulthood is shown to have far greater rewards and privileges. Otherwise, children will never want to grow up, and college students won’t want to graduate, take jobs, or sometimes even leave home. With the best intentions in the world, we will have created a population of sulky, disappointed adults who will long all their lives for the lost paradises of their youth.

The school environment is constantly changing, and today’s is more affected by the Internet and 9/11 than by any Romantic notion of life.  Schools are becoming less accessible, more security-conscious, and are even moving towards the windowless big-box commercial model.  This model conforms to security concerns for the school, the view that the outside world is a dangerous place, and the conviction that a child’s world can be expanded far more through the Internet than by any hokey nature walks.

Perhaps most importantly, these stripped-down, institutional environments with functional furniture, computer terminals, and no distractions on the walls are thought to be more conducive to basic learning – i.e. to improve test scores.  Sadly, they also make it easier for teachers to manage unruly students who really are not socialized enough to be in school.

I only remember the fat kid with baby breath, stupid Robert Macciato, and The Odyssey; and had no idea that I was being manipulated; but even so, society is all about manipulation and being manipulated.  Change never seems to matter, but as good American optimists we keep on trying.  I really should Google Robert Macciato and see if he ever made it out of the third grade; or better yet, try to find Nancy Barth and see if by any chance she’s available.

Friday, September 28, 2012

The Infamous ‘Anti-Islamic’ Video And Free Speech

I have always been a defender of free speech and cringe every time someone posits a ‘good’ excuse for limiting it.  In these days of Political Correctness free speech has been under assault like never before.  More attention is being paid to protecting the so-called ‘rights’ of those offended by free speech than in protecting the speaker. 

The recent ‘anti-Islamic’ video is a case in point.  Despite the fact that volumes of scurrilous anti-Catholic, anti-Christian, and anti-Semitic material are on the Internet, and the fact that even the most faithful dismiss it immediately as a product of twisted, fringe, and imbalanced people, calls for the suppression of this particular video are increasing simply because the reactions to it have been politically damaging.

This is wrong.  If the video, taken in the context of its equally offensive cyber-mates, is no worse than most, and if we do not drag those mates into court, why are we collapsing so pusillanimously?  In other words, we have defended free speech in all other similar cases, but we are considering cracking down on it now simply because the reaction it caused was politically destabilizing. It is the reaction that is contemptible, not the video.

Free speech is one of the foundational principles of our Republic and was not so enshrined because the Founding Fathers had only the speaker in mind, but because society as a whole would benefit from it.  If free speech were curtailed in any way, concentration of power would surely result.  Such concentration would eventually cause civil unrest and instability, consequences far worse than any potentially offending speech.

Free speech is an important democratic principle not only because it acts as a countervailing force against the arrogation of power, but because it promotes a more honest, transparent, and accountable society.  Political cartoons are good examples of this principle, for in a few brush strokes the artist is able to pillory obtuseness, pomposity, and outrageous positions whether political, social, or religious.  The caricatures are often cruel and unforgiving, but the reader and the person pilloried both get the point.

The Danish cartoons may have cut close to the bone, given the incendiary nature of religion today, but were clearly satirical and on-point.  Religion is a legitimate target especially when it becomes political, and it always does.  In the United States Christian churches have become embroiled in electoral politics.  Jewish history and religion are inextricably linked to events in the modern-day Middle East.  Representatives of these religions will of course cry foul when attacked, but they will get over it, and maybe get the point.

Is there a line across which free speech is not acceptable?  I cannot see one, especially if one considers the larger context in which speech is uttered.  That is, while any one particular speech might be unnecessarily brutal and devised more to damage than to enlighten, when it is taken as part of a range of expression whose targets come and go with time, its content and intent will fade, will be subsumed and lost within the larger mainstream of ideas.

Of course there are limitations on free speech all the time, many of which are legitimate in intent, but illegitimate in application. Protecting national security is in the country’s interest, but it is too often used to muzzle dissent.  Corporate confidentiality agreements have a legitimate purpose in protecting innovative ideas which are in their infancy, but are often used as paranoiac threats to insulate corporations from their critics. Fortunately there are now ‘whistle-blower’ laws which protect employees if they choose to expose illegal or unsavory doings on the part of their employers.

The anti-Islamic video affair has taken a new and nasty turn, for the producer has been taken into custody not for the video but for violating parole, one of the conditions of which was not to use the Internet except under official supervision.  The video producer had been convicted of bank fraud, and obviously the parole restriction was to keep him from dirty financial tricks while he was out of prison.  Officials are now using the production of this video as a flimsy excuse to haul him into court.  This is an obvious political stunt designed to give the Administration a way of showing the Muslim world that we mean business.

It is nothing of the kind.  If exploited as I expect, it will send exactly the wrong message to the world – that our adherence to the principles of free speech, and by extension the free flow of information, is weakening.  According to BBC online (9.28.12):

The Obama administration has requested Google, the company that owns YouTube, to remove the clip. The technology firm refused, saying the film did not violate its rules.

Good for Google, but shame on Obama for even trying to get the company to pull the video.  By making the request, the Administration admitted its retreat from the principle of free speech.

While I understand the Administration’s desire to make this issue go away, it does no one any service by capitulating to temporal political demands.  Not only that, it has not come out and defended the principle to its attackers.  I have heard no eloquent arguments about why free speech is essential to democracy, why its suppression is antithetical to popular movements, and why religion itself is strengthened by free and diverse expression. I don’t believe the ‘Arab Street’ will listen, but that does not absolve Obama of responsibility.

Not only have I not heard a strong defense of free speech, I have seen the Administration duck and run under the cover of other political explanations for the problems in Egypt and Libya.  While it is undoubtedly true that these revolts have been engineered for political reasons and that thousands of gullible people have been manipulated by power-hungry leaders, there is no reason to totally ignore or forget the issue of free speech.  Moreover, it is exactly the time to raise it.  Again and again.

Thursday, September 27, 2012

The Myth (And Reality) Of Income Mobility

Income mobility has become an important issue in this year’s election, and both parties have staked their political reputation on increasing it.  Democrats have long felt that by government intercession in the marketplace through poverty alleviation, job training, and a variety of educational and social programs, talented individuals from lower socio-economic segments of the population can more easily move up and out of poverty.  Republicans have felt that by freeing the individual from the chains of state welfare and demeaning and inefficient government programs, natural pent-up creative energy can be released.

In an article in The New Republic (2.8.12), Timothy Noah traces the historical roots of the myth of Horatio Alger, the now-familiar icon of The American Dream:

Alger wrote dime novels for boys about getting ahead through virtue and hard work….and fully 5 percent of all the books checked out of the Muncie, Indiana, public library between November 1891 and December 1902 were authored by Alger.

[The historian James Truslow Adams’s] influence stems from the fact that one of his books—The Epic of America (1931)—introduced the phrase “the American dream” to our national discourse. Writing at the start of the Great Depression, Adams envisioned not “a dream of motor cars and high wages merely,” but rather “a dream of a social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable, and be recognized by others for what they are, regardless of the fortuitous circumstances of birth or position.”

Noah concludes, however, that the American Dream is a myth.  Although most Americans deeply and passionately believe that through hard work and enterprise alone an individual can be successful, and that America is the only country where this is possible, there is no basis in fact for these beliefs.  Most of us are stuck where we were born.

Noah bases his conclusions on comparative historical studies of fathers and sons from different socio-economic levels.  Based on census data from 1850 to 1920 and between 1950 and 1973, Joseph Ferrie of Northwestern concluded that:

Forty-one percent of farmers’ sons advanced to white-collar jobs between 1880 and 1900, compared with 32 percent between 1950 and 1973. Ferrie’s conclusion held up when he looked at all four job categories (factory-worker, white collar, etc.)  Between the horse-and-buggy days and the interstate-highway era, American society had become significantly less mobile.

This, of course, is a specious argument. Every statistician and economist studying change knows that the lower the starting point, the easier it is to move to a higher level.  It is far easier to improve academic achievement, for example, from a score of 10 to 20 percent (100 percent improvement) than from 90 to 100 percent. In 1880, although most families worked the land, America was just beginning one of the most dynamic economic periods in its history, one in which the demands for labor were high and where any reasonably talented and ambitious worker could move from a demanding and only modestly-rewarding farm job to a higher-paying industrial one.  While there is no doubt that the important labor reforms which improved the lot of industrial workers came only later, a move to the city was still up and out.

The US population between 1950 and 1973 was not only many times greater than in the earlier Alger generation of the nineteenth century (thus increasing competition for jobs), but jobs were also far more skill-dependent than previously. A farmer in 1900 could move up to an industrial job, and a factory worker could move up to a clerical or even management position, because jobs at all levels demanded fewer skills than they did in the mid-twentieth century.  The post-WWII period was one of great economic expansion, few government regulations, and many government incentives (e.g. the G.I. Bill); and yet mobility decreased.  This could only have been due to a structural economic immobility.

Since 2000 there have been many retrospective studies on ‘intergenerational income elasticity’ (income heritability) – that is, how likely a son is to achieve a higher socio-economic status than his father.  Interpretations vary, but the most recent and most convincing estimate is about 60 percent – that is, you have only a 40 percent chance of moving to a higher status than your father, suggesting that the American Dream is only partially true.

Yet this conclusion is also disingenuous and disregards certain formative historical events. The 1960s were years in which many young people rejected the values of their parents and chose lifestyles and careers which were not tracked to continue the moneymaking of the flush 50s.  Although this phenomenon was restricted to middle-class individuals (i.e. not working class), their numbers cannot be ignored. In other words, many young people of the Sixties deliberately chose to move ‘down’ the socio-economic scale.

Many studies have shown that income heritability in Europe is lower than in the United States, suggesting that despite their Old World image, Western European countries have changed their ways:

Meanwhile, mobility in the United States has fallen dramatically behind mobility in other comparably developed democracies. A 2007 study by the Organization for Economic Cooperation and Development (OECD) combined a number of previous estimates and found income heritability to be greater in the United States than in Denmark, Australia, Norway, Finland, Canada, Sweden, Germany, Spain, and France.

This, too, is not surprising, because Europe has had its own ‘Sixties’: EU effects kicked in, governments became less centrally-planned, labor was free to move, and income could be determined by opportunity, not family.  At the same time, US mobility has decreased because of a large immigrant population that has not yet found its economic running shoes.

Yet, there may be an upside to the American tendency to overestimate their chances for mobility:

James Fallows writes that a society in which “people routinely overestimated their chances for success,” in which entrepreneurs “launched ventures that by rational standards were likely to fail,” was a society that, collectively and over the long term, would invent more, innovate more, and succeed more. Society benefits when people don’t know “their place.”

The philosopher Joseph Campbell found myth to be central to all civilizations, and believed it should never be dismissed simply because it is not ‘real’.  Myth can be as powerful a motivating, guiding, and disciplinary force as any more rational or secular one.  The American Dream falls into that category.

Wednesday, September 26, 2012

Why Do The Smartest Kids Feel The Need To Cheat?

Stuyvesant High School in New York City is one of the nation’s top public high schools.  It is a so-called ‘exam school’ where admission is contingent only upon test scores.  Stuyvesant is not the only such exam school in New York – Hunter being another – and in the Washington, DC area, Thomas Jefferson High School in Alexandria is well known.

These schools are academically elite, only the brightest students get in, and admission is free and a ticket to a prestigious college.  Although these schools have come under criticism for being elitist and favoring children whose parents provide the home environment, extracurricular activities, and test preparation that facilitate admission, they are in fact the most democratic of institutions.  Legacies, favoritism, and financial influence are out. These schools’ admission policy is little different from that of prestigious French universities such as the École Nationale d'Administration, open to all based solely on test scores.

If there is any drawback to these free, top-of-the-line high schools, it is that because so many graduating seniors have the SATs, grades, and extra-curricular activities to go to Harvard or MIT, only a few get in.  In a weird twist of logic, it might be better for a talented student to suffer through an ordinary secondary education, graduate at the top of his class, and go wherever he wanted.

Recently it was discovered that a significant number of Stuyvesant students had cheated, and questions were immediately asked why.  Every possible theory surfaced: the students’ sense of ethics had been compromised by the high-octane competitive hyper-individualism of the age; success was linked more than ever to graduation from top universities, which had maintained or elevated their standards while others lowered them to make ends meet; ethical choices were less frequently rewarded or went unnoticed in a society driven by academic success; students from Asian and Jewish families (who predominate at Stuyvesant and other exam schools), always culturally insistent on learning, knowledge, and education, had become even more so.

Whatever the reason, kids who shouldn’t need to cheat cheated.  The reasons they gave were fascinating and much more telling about American culture than the stock explanations suggested above. 

In an article in the New York Times (9.26.12), Vivian Yee writes about one commonly-cited but shocking justification that students gave for cheating. Cheating in a worthless class – worthless because of an inferior teacher or because it satisfied some ‘dumb’ requirement – was OK.  Absolute right and wrong – it is wrong to cheat – was supplanted by a sliding scale.  It was wrong to cheat in some classes, perfectly all right to cheat in others.

A recent alumnus said that by the time he took his French final exam one year, he, along with his classmates, had lost all respect for the teacher. He framed the decision to cheat as a choice between pursuing the computer science and politics projects he loved or studying for a class he believed was a joke.

“When it came to French class, where the teacher had literally taught me nothing all year, and during the final the students around me were openly discussing the answers, should I not listen?” he said.

This reminded me of an experience in Poland shortly after the fall of the Berlin Wall.  The Communist system was being dismantled and liberal democracy and market economics replaced it.  I was part of a team making a television series on this transition, and I had the opportunity to interview the President of the Warsaw Chamber of Commerce who told me of his dilemma.  Under Communism, he said, cheating and lying were considered heroic acts because they were done to undermine a corrupt and corrosive system.  Now that liberal democracy was taking over, these young people had to be re-educated: lying and cheating now were wrong, socially irresponsible, and personally unethical.  This particular element of the transition would not be easy.

By the time they graduate, many [Stuyvesant students] have internalized a moral and academic math: Copying homework is fine, but cheating on a test is less so; cheating to get by in a required class is more acceptable than cheating on an Advanced Placement exam; anything less than a grade of 85 is “failing”; achieve anything more than a grade-point average of 95, and you might be bound for the Massachusetts Institute of Technology or Yale (Yee).

Cheating in the form of plagiarism was made easy by the Internet and devalued as a crime of ethics.  It did not take many ethical gymnastics to appropriate an idea generated by someone else because it was ‘common knowledge’.  Transforming a direct quote into an amended, modified, and slightly changed version also fell within the realm of acceptable acts.  Who is harmed? asked the students.  No royalties are lost; and in fact original ideas are now, thanks to the plagiarizing student, getting even more currency.

How many of us, while writing articles, blog posts, or incidental references on Facebook, have edited an original source to suit our argument – eliminating a caveat, an exception, or a warning?

Surprisingly, cheating is often done collectively, belying the dog-eat-dog myth of the highly competitive school.  Internet chat groups take the place of evening study sessions and raise the ante, for they provide sources, quotes, and references. 

Although Stuyvesant has a reputation for being cutthroat, students say collaboration, not competition, is the norm. Several framed the collaboration as banding together against a system designed to grind them down. Many classes have private Facebook groups that students use to exchange advice or, sometimes, to post full sets of answers for classmates to copy. Take-home exams are seen as an invitation to work together (Yee).

As in the case of Poland, students said that cheating is OK because the system is designed to grind them down.

Perhaps worst of all is the complicity of teachers:

Many teachers were so understanding of the pressure students faced that they would hand out lighter punishments for cheating.  [Teachers endorsed] steps like telling students who were copying homework simply to put it away and allowing those cheats to retake tests, despite policies that prescribe a range of punishments, from giving a zero on the assignment up to suspending the student.

If the best and the brightest cheat, we can only imagine what the rest of American students do. 

Church and School

In an article in the Guardian entitled How Evangelicals Are Making Children Their Missionaries in Public Schools, Katherine Stewart writes about the tactics evangelical groups use to skirt the constitutional prohibition on teaching religion in public schools.  These groups have used children to start sectarian prayer groups and other venues for the promotion of religious concepts and beliefs, because the children are immune from censure or prosecution:

In a tactical sense, religious fundamentalists in America appear to have taken a page from the same book. The constitution and the law prohibits adults from, say, establishing ministries within public schools aimed at proselytizing to the children during school hours. But a growing number of religious activists have come to realize that it's technically legal if they get the kids to do their work for them.

The issue is based on a long-standing argument over what constitutes religious activity.  Should it be completely outlawed, regardless of whether it is held as part of extracurricular groups or as part of formal instruction? Or should it be permitted if it is held as a clearly private, group function?  In a recent case, this issue went before the Supreme Court, which ruled that extracurricular student religious groups are indeed legal and do not countermand the Establishment Clause of the Constitution.

The Supreme Court ruled 8-1 that the Equal Access Act is constitutional and that religious clubs can meet in secondary public schools if they are initiated and led by students. Religious clubs are also allowed only if the school offers other non-curricular clubs.  It is interesting to examine the basis on which the Court decided.  Writing for the majority, Justice O’Connor said:

“[T]here is a crucial difference between government speech endorsing religion, which the Establishment Clause forbids, and private speech endorsing religion, which the Free Speech and Free Exercise Clauses protect. We think that secondary school students are mature enough and are likely to understand that a school does not endorse or support student speech that it merely permits on a nondiscriminatory basis.”

Justice Stevens wrote the dissent:

“Can Congress really have intended to issue an order to every public high school in the Nation stating, in substance, that if you sponsor a chess club, a scuba diving club, or a French club—without having formal classes in those subjects—you must also open your doors to every religious, political, or social organization, no matter how controversial or distasteful its views may be? I think not….The Act, as construed by the majority, comes perilously close to an outright command to allow organized prayer…on school premises.”

I have always felt that a strict prohibition of any kind of religious teaching in public schools is a good thing.  It is easy to see how a teacher with strong religious convictions and a mission to evangelize could introduce her own ideas into the classroom.  Consider a school district like mine, where there are students not only from the major religions of the world but also from those religions’ subgroups: Muslims who are either Sunni or Shiite; Hindus who follow either Krishna or Vishnu; Jews who are Reform, Conservative, or Orthodox. Any formal introduction of religion into the classroom is bound either to offend or to go against the precepts and teachings of someone.  A public school teacher is an influential person in the eyes of still-unformed children, and she should not be allowed to use this trust to promote her own beliefs.

The line between formal sectarian preaching and providing religious context to secular subjects is not entirely clear.  There can be no discussion of American history, for example, without talking about the influence of the Puritans in the early days of the Republic or the role of religious principles behind the abolitionist sentiments of the Methodists and Quakers.  European history would not be history at all without a reflection on the role of the Popes and the Catholic Church and their rivalries with Henry II, King John, and Henry VIII.  It is impossible to understand the great art of the Middle Ages and Renaissance without appreciating the Christian beliefs which motivated the artists.

Yet teachers, because of the Establishment Clause, have to be extremely careful when talking about these events, and as a result tend either to gloss over them or to eliminate them entirely.  Our society has become so litigious that teachers are certain some angry parent who is also a lawyer will come down hard on them.  Students are the worse off for this de jure and de facto prohibition.  If taught objectively, subjects such as Puritanism, the religious expressions of Native America, the importance of Jews as ‘People of the Book’ who believe in the primacy of Scripture and for whom education is sacred, or even a comparison of the world’s major religions could be among the most important subjects taught in school.

What is far less clear is the objection to extra-curricular activities.  How could such events possibly be construed as evangelism or proselytizing? The Guardian article suggests that these religious clubs may not be so innocent: although adults are prohibited from evangelizing in or at school, students themselves may do so with impunity. Once again, the issue is not clear-cut.  What is the problem if faith-driven students try to recruit others who are not?  If a student can talk about Jesus or Mohammed to a classmate on the playground in an attempt to influence him, how is a collection of student-believers any more threatening?

What the author minds most is what she sees as the manipulation of the law – an insidious attempt by adult activists to exploit young people to do their bidding.  Just as importantly, she objects to the very sophisticated marketing efforts of some groups, which use savvy advertising to promote these evangelical clubs and to recruit volunteers.  Yet this is America, after all, and we all have the freedom to pursue our own interests and to operate on the margins of the law. The Every Student Every School (ESES) campaign is no different. Moreover, many recruits come from evangelical families and are by no means duped or forced into religious servitude by ESES. That is a patronizing assumption, no different from the one that insists that all girls who wear the hijab have been forced to do so by male adults.

In any case, such matters, whether they occur on school property or not, are private; and if school-based evangelism is a powerful, well-organized movement, so is evangelism in the United States as a whole.

Monday, September 24, 2012

Epitaph: “He Stayed Late At The Office”

I rarely stayed late at the office, and always arrived as late and left as early as possible.  My work was always a means to an end, the necessary anchor to allow me to travel; and travel I did, to over 40 countries in Africa, Asia, Latin America, and Eastern Europe.  I stayed at the very best colonial hotels like the Raffles in Singapore, the Oriental in Bangkok, the Grand in Calcutta, and the Galle Face in Colombo; ate Nile Perch overlooking Lake Tanganyika and the Niger River, Afghan naan in Rawalpindi, full lamb burra near Kashmiri Gate in Old Delhi; got drunk on cachaça on the beaches of Rio; smoked Cuban puros overlooking the volcano in San Salvador; made love in the Arab port city of Moroni; danced highlife in the quartiers of Kinshasa and merengue in the bars of Carrefour in Port-au-Prince; swam in the Arabian Sea, the Bay of Bengal, and the tropical lagoons near Galle at the southern tip of Sri Lanka.

I had a great and wonderful time.  Each three week trip was a free ticket to another life, far from the pleasant but ordinary life at home in Washington, and a temporary but complete fugue to a completely different world, one of excitement, stimulation, and adventure – my own unique, personal world crafted and determined only by me.

I am not sure when I realized that enjoying life was all that mattered; that the world’s problems would roll on inevitably, inexorably, and predictably; that there were no hell fires to scare me into a tightly-wrapped moral and ethical being; that there was no celestial paradise awaiting me if I followed the rules, no land of milk and honey, cool brooks and succulent melons, or endless fields bathed in divine light. 

Not only was there nothing but this world, this life, this being; but it would always be so. As long as human nature is at the foundation of all activity and enterprise, and as long as our desires are predictably self-serving, self-protective, and designed by evolution to assure our survival, nothing will change.  There will be thousands of millions of me, as there were thousands of millions of me in the past, revolving in an endless, purposeless cycle.

Contemplating this ever-repeating world, turning again and again until the sun goes out; or the hundreds of trillions of other endlessly revolving worlds in this and any other universe, was by no means depressing.  It was liberating, exhilarating, and life-affirming.  If this was all there was and it lasted a scant four-score and twenty years, I had better get busy.

William Pfaff, reviewing Francis Fukuyama’s latest book in the New Yorker, took exception to the author’s optimism and hope for the future.  Pfaff said:

I am not myself aware that human character and conduct today display any general improvement over that recorded in the historical past. The political crimes of the twentieth century had their counterparts in the past. Things comparable to or worse than Asian barbarism, past wars of religion and race, enslavement, or mass extermination waged by men like Genghis Khan, continue to happen in our times. That men and women are morally improved from what they were at the beginning of recorded history has yet to be demonstrated.

Jan Kott, one of the foremost critics of Shakespeare, wrote about what he called The Grand Mechanism – the eternally turning cogs in a world-machine which always turned in the same way to the same end:

Feudal history is like a great staircase on which there treads a constant procession of kings. Every step upwards is marked by murder, perfidy, treachery. Every step brings the throne nearer. Another step and the crown will fall. One will soon be able to snatch it.
From the highest step there is only a leap into the abyss. The monarchs change. But all of them -- good and bad, brave and cowardly, vile and noble, naive and cynical -- tread on the steps that are always the same. . .

Everyone is caught up in the movement in Kott’s ‘Grand Mechanism’ to which the whole kingdom is subjected.  “A mechanism whose cogs are both great lords and hired assassins; a mechanism which forces people to violence, cruelty, and treason; which constantly claims new victims…”

What makes commoners as well as royalty subject to the Grand Mechanism? They – we – are no different in our desire for power, wealth, and supremacy. Kott went on to conclude that the engine of this Grand Mechanism is human nature; and that if we understood this, and accepted that it would never change and would always determine history, we would never be surprised and would perhaps be better prepared.

Many of my friends have been perplexed by my personal philosophy.  How could I live without God, they asked, or without the hope and expectation of a life after death?  Others wondered at my indifference to politics or world affairs.  I have always been fascinated by the drama of history and wondered how history – its own playwright – rewrote the same tragedy over and over again with different characters, different music, different sets, and different costumes; but I have had no interest in changing its course.  Why would I ever spend my valuable store of energy, passion, and determination on events which would eventually and inevitably be folded into a great lava flow, sputter, be submerged and incinerated, and ultimately forgotten?

My attitude was selfish, my friends cried.  Didn’t I care for community, society, the welfare of the human race?  Where was I on matters of environmental catastrophe, devastating wars, mutilation, depredation, and insanity?

Political conservatives were just as perplexed as ‘progressives’, but understood my position a bit better.  They did not really believe in human progress, or a positive evolution of the human race, but saw the individual as the center of the universe.  Progress, if it occurred at all, was a matter between the individual and his Creator – a spiritual road to personal salvation.   As Christian a concept as this is, it is remarkably similar to that of Hinduism.  Life is maya, illusion, and a belief in the tangible ‘real world’ with all its deceptive promises leads nowhere.  The only meaningful, logical path is a spiritual one.

I suppose it might be nice to believe that when you die you simply board a train that takes you to a very nice vacation spot, one with no troubles, and where you can always drink the water, and live happily for ever after; but human beings always put conditions on nice things.  If you want tickets on that train, you’ve got to pay for them.  Uh-uh.  A bad investment, a bad bet. 

This does not mean that I have jettisoned any of my moral and ethical anchors.  Who said that morality and ethics only exist within a religious framework, or within a ‘progressive’ one?  Although most societies have wanted to live in a world with promised celestial rewards, those that have not have done just fine indeed; and in fact those with a purported spiritual basis for right actions have been among the worst killers and looters the world has ever seen.

A few years ago I went to the memorial service for a woman with whom I had been very close.  She died young and unexpectedly.  Speaker after speaker talked about her special talent for writing proposals, managing projects, supervising local health activities.  They all missed the real B by a mile.  She was a passionate, funny, outrageously hungry woman who laughed at everything and floated over the surface of life with an ‘unbearable lightness of being’.  I felt happy in her presence, insanely happy in her ‘unreal’ world.  And here she was being remembered as someone whose best qualities were bureaucratic, socially responsible, and predictable.  I loved her, and was appalled to hear the sodden remembrances of friends and colleagues.  I left the church in tears, not because I missed her – which I did – but for the distortion of her memory, for the erasure of what made her who she was.

I know many of my former colleagues who simply cannot give up their work.  Many of them will one day topple over in the emeritus offices provided them by their employers, splash in a spill of coffee as their heads hit the desk before they know it.  There are others who insist that they have ‘unfinished business’ –  final papers on energy, health, education, or civil society that will make a difference, make a contribution, make the world a better place.  And this with fifteen or maybe twenty years to go before their lives flicker out.

“He stayed late at the office”, their epitaphs will read.

Friday, September 21, 2012

Opting Out Of Vaccinations–Is It Ethical?

Opting out of vaccination programs has become common in many parts of the country.  Parents concerned about the reputed side effects of immunizations, fearing a government-pharmaceutical conspiracy, or simply ‘going organic’ have decided to take their chances in what they perceive to be a benign environment where cases of measles, whooping cough, and other childhood illnesses are extremely rare.  Yet these diseases have not been completely eradicated, the world’s borders have become increasingly porous, and while the perceived threat might be low, the actual possibility of infection, if not epidemic, is high.

Since most childhood vaccines have been around for decades with no credible scientific evidence of any serious side effects, why would a mother not vaccinate her child, eliminating virtually all risk for him and, perhaps as importantly, for her community?  No childhood disease is innocuous, and while most cases are relatively mild, some are dangerous if not life-threatening.  Many of these diseases, such as mumps, can have serious consequences if they are acquired in adulthood.  One would assume that parents would want to avoid them at all costs.

As mentioned above, short-term perception seems to rule.  Fortunately one sees very few childhood diseases anymore, and it is easy to conclude that they have been eliminated when in fact they are alive and well in most populations.  Second, many parents believe that childhood diseases are a simple rite of passage, as they were for their parents before them – unpleasant, unfortunate, but ultimately harmless and more of a temporary bother than a serious health issue.

Third, many Americans persistently believe that vaccines are harmful and cause more health problems than they prevent.  No matter how many reports are issued by the American Medical Association, the Centers for Disease Control, or the Food and Drug Administration, and no matter how far back the longitudinal studies extend, the wariness continues. Fourth, the organic-locavore-natural-environmental movement, in which many marginally related but philosophically resonant theories are conflated, has many adherents in ‘progressive’ communities such as Vashon Island, Washington:

A stronghold of vaccine skeptics is Vashon Island, a short ferry ride from Seattle, where the share of parents who have opted out of having their children vaccinated has been as high as one in four.

Not only that, conspiracy theories abound, not unlike the surprisingly durable one concerning fluoridation of water (a Communist plot). Lastly, some parents feel that they can immunize their children ‘naturally’, by deliberately exposing them to disease.  Chickenpox parties are all the rage in some communities, where children have sleep-overs with blankets and bedding used by others who have already had the disease.

Sabrina Tavernise, writing in the New York Times (9.20.12) has chronicled Washington State’s tightening of the opt-out laws regarding immunizations and referred to these ill-advised popular theories:

A distrust of the medical establishment has also fueled skepticism about vaccines. And while the Internet is a powerful source of information, it has also allowed the rapid spread of false information, such as the theory by Andrew Wakefield, a former British surgeon, that the measles-mumps-rubella vaccine was linked to the onset of autism.

“With the Internet, you can have one cranky corner of Kentucky ending up influencing Indonesia,” said Heidi Larson, a lecturer at the Project to Monitor Public Confidence in Immunization, at the London School of Hygiene and Tropical Medicine.

Other more politically-motivated parents opt out of immunization programs because they see an unholy alliance between states and pharmaceutical companies who collude greedily to push an unsafe product on unwary consumers for profit.

Jonathan Bell is a naturopathic doctor in Washington State who encourages his patients to vaccinate their children. Those who opt out, he said, tend to distrust the public health establishment because of what they see as its unsavory connections with the pharmaceutical industry. “The argument is, ‘Oh no, I’m putting off vaccines,’ ” he said. “ ‘I’m part of a group that’s smart enough to understand the government is a pawn of big pharma.’ ”

The problem of ‘holes’ in the protective coverage provided by vaccines is of growing concern, and the debate has centered around the rights of the individual and the rights of the state representing the public interest.

For despite efforts to educate the public on the risks of forgoing immunization, more parents are choosing not to have their children vaccinated, especially in states that make it easy to opt out, according to a study published on Thursday in The New England Journal of Medicine.

And while the rate of children whose parents claimed exemptions remains low — slightly over 2 percent of all kindergarten students in 2011, up from just over 1 percent in 2006 — the national increase is “concerning,” said Saad Omer, an assistant professor of global health at Emory University who led the study.

Families of unvaccinated children tend to live in close proximity, increasing the risk of a hole in the immunity for an entire area. That can speed the spread of diseases such as measles, which have come back in recent years.
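The risk that clustered opt-outs create can be sketched with the standard herd-immunity threshold, 1 − 1/R0 (the R0 figures below are commonly cited textbook estimates, not numbers from the Times article):

```python
# Minimal sketch of the herd-immunity threshold: the fraction of a
# population that must be immune before sustained transmission stops.
# R0 (basic reproduction number) values are commonly cited estimates.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to block spread."""
    return 1.0 - 1.0 / r0

# Measles is among the most contagious diseases known (R0 roughly 12-18),
# so coverage has to stay very high before the 'holes' close.
print(round(herd_immunity_threshold(15), 3))   # measles (R0 ~15): 0.933
print(round(herd_immunity_threshold(5), 3))    # rubella (R0 ~5): 0.8
```

A two-percent exemption rate spread evenly across a state stays above the measles threshold; the same two percent concentrated in one school or island community easily falls below it, which is exactly the clustering risk the study describes.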

Opting out of state vaccination programs is a good example of what some critics have referred to as the hyper-individualism of today.  People have come to believe that all government is corrupt, inefficient, burdensome, and antithetical to individual liberty, and that individual enterprise and expression are the only true values in American society.  Not only is a sense of shared communal values being lost, but, more importantly, so is the responsibility to behave ethically toward one’s neighbors.

While parents might be cavalier about the health of their own children, putting conspiracy theories, bad science, and New Age belief ahead of reality, it is ethically if not morally wrong to subject others to these convictions.

Once these holes of vaccine opt-outers get large enough, childhood diseases will be back with a vengeance.  The state has a compelling interest in public health, but obviously cannot force people to have vaccinations.  It can make it difficult for a family to opt out as Washington has done; but a more drastic and perhaps more effective solution will come from the private sector. Eventually private insurance companies will drop families whose children have not been vaccinated or refuse to pay for treatment of the disease.

It is a shame that families have taken individualism so far, and this very sentiment is producing a result opposite to the one intended – creating yet another role for government.  It is time for people to begin to think more responsibly and ethically about the communities in which they live.

Thursday, September 20, 2012

Can Boy Toys Make Girls Engineers?

I remember when my kids were little in the early 80s, the mothers at the park in my liberal enclave of Northwest Washington were, a decade removed from hardcore feminism, still encouraging their daughters to play with dump trucks, fire engines, bulldozers, and cranes instead of dolls and frilly things.  What usually happened, of course, was that the boys simply grabbed the machinery, grunted out sounds of motors and exhaust, and made miniature cities in the sandbox.  The girls, although for a moment surprised, simply dug into Mommy’s bag and pulled out Barbie.  It had been hidden under sweaters, diapers, and Kleenex, but it was there ‘just in case’.  It didn’t take long for the mommies to realize that the masculinization of their daughters was a lost cause.  No matter what they did, the girls played with dolls, fussed with each other’s hair, and wore frilly dresses.  Of course the mothers didn’t call their re-education process ‘masculinization’ – they were aspiring to a ‘gender-neutral play environment’ – but the result was the same.  All the dump trucks, camouflage pants, and tool belts thrown at them were ignored by their daughters but played with so enthusiastically by their sons that the wheels and doors came off.

Eventually, these women gave up on the boy toy campaign, and began to realize that their own emerging professions of law and medicine were perfect examples of gender-neutral occupations.  Men could pump up their maleness leading their lieutenants through the arteries and valves of the body, and women could satisfy their care-giving urges.  Men loved the thrill of battle in the courtroom and women could satisfy their sense of community by creating a better social contract.  Who cared what their daughters or sons played with?  Opportunities were numerous.

A lot of research has been done over the last few decades to get to the bottom of the toy issue.  ‘Progressives’ have been convinced that girls did not become engineers, truck drivers, heavy equipment operators, and car mechanics because their socialization made it impossible: it was precisely because they were given girly toys and dresses, and not things with gears and wheels, that they shied away from the real thing.  Conservatives were convinced of what they saw – men and women are different, have different tastes and preferences, and thank God for the difference – so let children play with whatever suits them.

However, no matter how hard the latter-day feminists of the 80s tried, and no matter how well they hid Barbie, their daughters persistently refused to become engineers (forget the working-class heavy equipment operator, a non-starter for everyone in my neighborhood).  Current estimates are that over 90 percent of engineers are men, and theories abound to explain this.  Former Harvard President Larry Summers lost his job for suggesting that, well, maybe there was something to this hardwired gender thing.  Others close to Summers’ assumption about inherent male-female differences said that women were naturally drawn to caring professions (nursing, international health care, social services, medicine) and repelled by the idea of pushing dirt around.   On the other end of the spectrum, liberal observers postulated that gender stereotypes persist, male enforcement remains the rule, and society does little to break down differences.  Look at the Soviet Union, they crowed before 1989, showing off pictures of zaftig female riveters who could do anything.

The debate should be over, since if women want to become engineers there is no reason why they can’t.  There are no oil-smeared glass doors keeping them out.  And besides, just about every other scientific profession has high proportions of women (out of my daughter’s high school class of 60, three girls became neuroscientists and two became biochemists); but people on a mission cannot leave well enough alone.

So, along comes GoldieBlox, the girl engineer:

[Image: a GoldieBlox toy set (photo: Susan Burdick)]

Debbie Sterling, a San Francisco-based entrepreneur, is out to prove that with the right tweaking and direction, you can combine the best female impulses with the nuts and bolts of building things. Her ideas are echoed in the work of Christine Cunningham, a vice president of the Museum of Science in Boston:

When Cunningham set about to redesign an electrical-engineering activity with girls in mind, she and her team embedded it in a story about a girl living on a ranch who needs to keep a trough filled with water for the baby lambs. The character decides to build herself an alarm as a reminder. That gives girls a purpose, and they'll "engage in the same tasks and have the same sort of outcomes, because they're linking it back to the safety of the baby lambs," Cunningham told me. (Satoshi Kanazawa, Psychology Today, April 17, 2008)

We all love compromise, but linking baby lambs with electrical engineering seems a little far-fetched to me; and the above picture showing a budding girl engineer spooling pink ribbon in the project seems silly.

These ‘progressive’ behaviorists, however, have to deal with harder science: researchers working with primates have found that the same gender differences in toy selection seen in children occur in monkeys.  In 2002, two researchers from Texas A&M and City University, London, designed an experiment in which male and female vervet monkeys were offered a choice between classically male toys (trucks) and female toys (dolls), with gender-neutral toys as a control.

Their data demonstrated that male monkeys showed significantly greater interest in the masculine toys, and the female monkeys showed significantly greater interest in the feminine toys. (Kanazawa)

A subsequent study confirmed these findings.  In a soon-to-be-published paper, researchers at Emory University concluded:

When given a choice between stereotypically male “wheeled toys” (such as a wagon, a truck, and a car) and stereotypically female “plush toys” (such as Winnie the Pooh, Raggedy Ann, and a koala bear hand puppet), male rhesus monkeys show strong and significant preference for the masculine toys. Female rhesus monkeys show preference for the feminine toys (Kanazawa).

I am not sure what all this proves; and even if it proves something, it will only confirm what parents all over the world have known for millennia – girls and boys play with gender-specific toys.  Whether or not this has any relationship at all to professional choices, who knows? But women are outpacing men in most professions, so maybe let’s not fuss with engineering.

Wednesday, September 19, 2012

Innovation At No Cost

Most innovations carry a significant cost: R&D budgets at the biggest companies are enormous, and launching new products or services requires heavy marketing and advertising.  Some innovations, however, cost nothing, and they are the holy grail of management.  Most fall into the category of “Why didn’t I think of that?” – that is, they are obvious in retrospect, genius in their simplicity and in their ability to reduce costs, generate profits, or improve performance.  Two examples come immediately to mind.

First is the elimination of two-way tolls on Manhattan bridges and tunnels.  Until relatively recently, travellers had to sit in endless lines in each direction, going into New York City and back out again – until someone asked the question: “Why don’t we eliminate the outbound tolls and charge double on the inbound?”  That would eliminate half the bottlenecks that occur every day with no negative impact on revenue.  Ninety percent of cars that enter Manhattan have to exit at one time or another (it is an island), and most will do so as daily commuters. Brilliant.
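The arithmetic behind the one-way toll is simple enough to sketch; the toll amount and traffic volume below are purely illustrative, not actual Port Authority figures:

```python
# Illustrative sketch: one-way tolling is revenue-neutral whenever
# (almost) every inbound car eventually exits, yet it halves the number
# of toll stops per round trip. All figures are hypothetical.

TOLL = 6.50                 # hypothetical per-crossing toll, in dollars
daily_round_trips = 250_000  # hypothetical number of in-and-out trips

two_way = daily_round_trips * (TOLL + TOLL)   # pay going in AND coming out
one_way = daily_round_trips * (2 * TOLL)      # pay double, inbound only

assert two_way == one_way                     # same revenue...
stops_saved_per_day = daily_round_trips       # ...one fewer stop per trip
print(stops_saved_per_day)                    # 250000 fewer queues daily
```

The only leakage is the small fraction of vehicles that enter without a paired exit (or vice versa), which is why the scheme works so well on an island.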

The second example is the university which decided to eliminate trays from the school cafeteria.  If students had to return to the steam trays for refills, they would likely eat less, helping to keep the famous ‘sophomore spread’ under control and reducing plate waste.  Going trayless would also remove the cost of buying trays, significantly reduce the labor of washing them, and avoid releasing dishwashing chemicals into the environment.  All at no cost.  Brilliant.

Along comes the idea of ‘loss aversion’:

In economics and decision theory, loss aversion refers to people's tendency to strongly prefer avoiding losses to acquiring gains. Some studies suggest that losses are twice as powerful, psychologically, as gains. (Wikipedia).

Loss aversion theory is frequently applied to marketing.  Sellers take advantage of the general tendency of buyers to retain (not lose) what they have purchased and offer rebates and trial periods.

A related theory is called the Endowment Effect: the value of a good increases when it becomes part of a person’s endowment. People demand more to give up an object than they would be willing to pay to acquire it.

In simple terms, you will do more to keep a good you already have than you will invest to acquire it.

Shankar Vedantam of NPR reported on a study to apply loss aversion theory to teacher performance.

Economist John List at the University of Chicago recently conducted an unusual field experiment in [a poorly performing] school district near Chicago.

List and his colleagues divided 150 teachers into three groups. One group got no incentive; they just went about their school year as usual. A second group was promised a bonus if their students did well at math.

The third group is where the psychology came in: The teachers were given a bonus of $4,000 upfront — but it had a catch. If student math performance didn't improve, teachers had to sign a contract promising to return some or all of the money.

The experiment showed that the give-back group’s students posted gains two to three times larger than those of the group that received its incentive at the end of the performance period.

“What we found is strong evidence in favor of loss aversion," he said. "Teachers who were paid in advance and [were] asked to give the money back if their students did not perform — their [students'] test scores were actually out of the roof: two to three times higher than the gains of the teachers in the traditional bonus group."

To change performance incentives from the back-end to the front-end of an educational cycle costs nothing, and the rewards are significant.  Brilliant.

Losing Weight–Does Product Labeling Help?

Martin Bruegel writing in the New York Times (9.19.12) provides a short history of public education on nutrition, an activity which surprisingly began almost 150 years ago.

Nutritional recommendations were born at the end of the 19th century with the discovery that humans need 20 calories per pound of weight each day; 55 to 65 percent of this energy intake ought to come from carbohydrates, a quarter from fats and something over 10 percent from proteins.
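The 19th-century formula quoted above is simple enough to work out on the back of an envelope.  A minimal sketch, with an illustrative body weight and the protein share rounded so the three components sum to 100 percent:

```python
# Back-of-the-envelope sketch of the quoted historical formula:
# 20 calories per pound of body weight per day, with 55-65% of that energy
# from carbohydrates, about a quarter from fats, and the rest from proteins.
# The 150 lb weight and the exact split chosen here are illustrative.

def daily_needs(weight_lb: float) -> dict:
    calories = 20 * weight_lb
    return {
        "calories": calories,
        "carbohydrate_cal": 0.60 * calories,  # midpoint of the 55-65% range
        "fat_cal": 0.25 * calories,
        "protein_cal": 0.15 * calories,       # "something over 10 percent"
    }

needs = daily_needs(150)        # a 150-pound person
print(needs["calories"])        # 3000
print(needs["carbohydrate_cal"])  # 1800.0
```

Whatever its scientific merits by modern standards, the formula's appeal was exactly this mechanical simplicity – which, as the essay goes on to argue, did nothing to change how people actually eat.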

Yet very little has worked, and people’s appetites and the need for that special gratification that only food can give have always trumped logic and science.  We eat what we do for many reasons.  Poor people in India have always been reluctant to give up the mounds of rice that not only provide energy but a feeling of satiety.  In their constrained and often miserable existence, there is still the feeling of a full stomach.  Poor people in America may like to try sushi or confit de canard but are stuck with pork chops, cornpone, and fatback because they cannot afford much more.  A Happy Meal at McDonalds may be loaded with calories, salt, and sugar, but it’s cheap and an entertaining evening out for a family of four.

In an era of stagnant wages, dystopian politics and cultural anomie, eating indulgent if unhealthful food has become a last redoubt of enjoyment for Americans who don’t feel they have much control in their lives.

Shortly after the fall of Ceausescu, the Romanian dictator, I suggested to the Ministry of Health that they apply for a World Bank loan to address lifestyle health problems.  Romanians were smoking like chimneys, getting drunk on vodka, and eating the fattiest imaginable foods, and public education campaigns designed to change these pernicious habits might be a good idea.  The Minister replied that while this might be a good idea in the long run, now was not the time.  “Our people have been deprived for 50 years”, he said.  “Let them enjoy themselves for a while longer”. 

Most people in America agree that weight is a function of calories; yet despite decades of public information about diet, cholesterol, sodium, and trans-fats, they keep getting fatter.  Clearly there is a serious disconnect between science and popular reality.

In 1888, the American chemist Wilbur O. Atwater devised a series of formulas that would help people get the most energy from the least food and pioneered a movement that came to be known as “scientific eating.”

French physicians promoted a program of “rational eating” aimed at instructing the poor to keep food expenses within the limits of their (modest) budgets. They urged the substitution of protein-rich legumes for red meat, pasta for sausages, and sugared beverages for wine.

Nutrition facts were put next to the items on the menu cards in factory canteens and in working-class restaurants. Scales at the entrance to eating places helped customers to monitor their weight. A menu board, listing carefully calibrated culinary options, would allow workers to assemble nutritious meals from a set of limited options.  The program flopped.

Despite these European failures, Americans took up the fight with typical vigor.

In 1914 the New York State Board of Health introduced a “scientific restaurant,” where staff luncheons were made according to “the most modern dietary theories.” Restaurants across the country began to list energy and protein content on their menus.

Childs Restaurants, an ancestor of today’s global fast-food chains, provided “a complete lesson in dietetics, mathematics, food conservation, patience, economy, and patriotism and a meal thrown in for good” to its clientele.

These programs withered and died.  No matter how much they tried, public reformers could not make a dent in Americans’ eating habits, and nothing much has changed in 100 years.

This does not mean we have given up.  There has been one iteration after another of some kind of food pyramid or pie explaining nutritional balance. They have all been confusing, unhelpful, and ultimately ignored.  Simple charts showing recommended daily allowances of particular nutrients are rarely read, for the percentages of Recommended Daily Allowances (RDA) mean little on their own. “Is 15 percent of my sodium RDA a lot or a little?  Wow, only 200 calories per serving and zero grams cholesterol?”.  Even the most committed healthy eater has to know the denominator to make sense of these figures.
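The missing-denominator problem is concrete: a “% of RDA” figure only becomes meaningful once you multiply it by the reference amount.  A minimal sketch, using the U.S. reference value for sodium of roughly 2,300 mg (the guideline figure has shifted between about 2,300 and 2,400 mg over the years, so treat the constant as illustrative):

```python
# Sketch of the missing-denominator problem: converting a label's
# "% Daily Value" back into an absolute amount. The 2,300 mg sodium
# reference is approximate and has varied across guideline revisions.

SODIUM_DV_MG = 2300  # approximate U.S. daily reference value for sodium

def mg_from_percent(percent_dv: float, dv_mg: float = SODIUM_DV_MG) -> float:
    """Convert a label's '% Daily Value' into milligrams."""
    return percent_dv * dv_mg / 100

print(mg_from_percent(15))  # 345.0 mg -- and that's per serving, not per package
```

The shopper quoted above would need to do this multiplication (and count servings, and remember the salt shaker) before “15 percent” tells her anything at all.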

Now, because people have done nothing on their own to curb obesity, the state is stepping in with Mayor Bloomberg’s nanny state legislation to outlaw calorie-rich popular drinks. 

New York City has mandated that chain restaurants post calories since 2008, and the federal health care law adopted in 2010 will eventually require fast-food restaurants across the United States to do so.

Chains like McDonalds are jumping on the bandwagon and beginning to provide calorie information for some of their products.

Why do we assume that this will do any good?  Why should a patron of McDonalds pay attention to the nutritional content of a mega-calorie burger when he can say, reasonably, “Hey, I know it’s high calorie, but what the hell, how often do I eat out?” Marketers will surely find a way around the NYC ban on large-size soft drinks if consumer demand is there.

Unfortunately Bruegel resorts to the old prescriptive saw – more education in the schools; but this, too, has been tried; and in these days of performance-based testing, there is no room in curricula for such fluff.  The common denominator of obesity is not education but income – poor people cannot afford the expensive fresh fruits and vegetables available for the well-off; nor can they afford the gyms, spas, racing bikes, or skiing vacations in Aspen; nor do they have the time to enforce ‘no-junk-food’ regulations for their children or regulate the amount of television (and gloppy food commercials) they watch. 

Looked at cynically, the health care system, for all its sanctimony about obesity, does little to control it.  Drugs to control hypertension, Type II diabetes, and other weight-related problems are available under private insurance and Medicare.  No conditions concerning weight govern insurance eligibility.  There is little or no incentive for individuals to control their weight.

So, tinkering with labeling and food pyramids, adding a calorie line to Happy Meals, or outlawing super-drinks are unlikely to do any good – just as they have done no good in the last 100 years.

Encouraging The Best And The Brightest–A Call For A More Elite Education

Chester Finn, writing in the New York Times (9.19.12) has reiterated an argument which I and others have made – it is time to redress the imbalance between spending on low-performing children and on the gifted, talented, highly intelligent, and most promising.  While this is not a call for ignoring the less fortunate nor for dismantling programs to assist them, it is a call for recognizing the disproportionate contribution talented youngsters make to society.

This disparity in attention results from a misplaced notion that all children are born equal and that if some fall behind, it is through no fault of their own.  Children are not born equal.  Some are born with those attributes which signal success in society – intelligence, perceptiveness, social acuity, drive and ambition, and confidence; and many more are born into families that recognize and nurture these native abilities.  The combination of innate ability and family support is a powerful one.

Unfortunately, once these children enter public school they are thrown into an environment of forced homogeneity.  Because all children are equal according to the ruling ethos, there should be no educational distinction among them – intelligent and slow learners should not only be in the same classes, but the more gifted should help the less well-endowed regardless of the consequences to their own educational development.  There is not one intelligence, say these ‘progressive’ social reformers, but multiple ones, all equal.  In this calculus, athletic ability or artistic expression is the equivalent of traditional cognitive ability.  It does not matter if Jonathan cannot do math or reads below grade level; he can excel on the sports field or on the dance floor.  Yet traditional cognitive abilities – the facility to do math, excel at reading, think logically and inferentially, write clearly and in a well-organized manner – are the keys to personal achievement and social success. 

Public education’s neglect of high-ability students doesn’t just deny individuals opportunities they deserve. It also imperils the country’s future supply of scientists, inventors and entrepreneurs.  

Many school systems have dismantled gifted and talented programs in favor of programs for under-achievers; and those few that have retained them have treated them almost as an extra-curricular activity, not with the rigorous content necessary to challenge students to the limits of their abilities.

Finn and his colleagues did a survey of so-called ‘exam schools’ – public schools which do not consider family, race, ethnicity, or background for admission, only high scores on a rigorous test of traditional cognitive abilities.  These schools provide a top-quality, challenging education to every child who is admitted, an unequalled public sector opportunity.

Yet these schools have come under attack for being elitist and untrue to American democratic principles.  How can you admit students on the basis of test scores alone, these critics argue, when children from disadvantaged communities and families cannot possibly have the preparation required to do well on the test?  How can you compose a student body of children with only one type of intelligence when there are many?

These fallacious arguments have served to push these academically elite schools back from their mandate and mission.  Of course children from intact homes with motivated parents will gain admission more easily than those without; but this is no reason to lower the admissions standard.  It is a reason to obligate parents to take more educational responsibility early in their child’s development. 

Diluting the high-intensity intellectual environment of a school by admitting less qualified students would do a disservice to all.  One of the most important elements of top private schools is this competitive, high-octane environment where students challenge and learn from each other as well as from the teachers.

There has been an equally passionate but retrograde outcry against voucher programs which provide public funding for children to attend private schools.  This will drain the public school system of its best students, these critics say, leaving only the most dysfunctional and hard to educate.  Yes, but voucher programs finally allow families of limited means to have the educational choice that wealthier parents have always had.

It’s time to end the bias against gifted and talented education and quit assuming that every school must be all things to all students, a simplistic formula that ends up neglecting all sorts of girls and boys, many of them poor and minority, who would benefit more from specialized public schools. America should have a thousand or more high schools for able students, not 165, and elementary and middle schools that spot and prepare their future pupils.

The exhortation to ‘quit assuming that every school must be all things to all students’ is easier said than done.  Modifying public education to create more equal opportunity for the talented requires structural reform.  First, school administrators have to reassess the prevailing philosophy that all students are equal, and accept the principle that some are more able than others – and that the duty of public educators is to recognize, promote, and encourage those students who will contribute most to civil society and economic productivity.  Second, these administrators have to be firm in their denial of local politicians who were elected by voters from poor, dysfunctional communities and who want every cent of public funds to address their demands. Third, they have to reject the persistent cries of elitism or reverse racism (favoring whites and Asians); and finally they have to recruit and train teachers to teach and challenge, not just babysit, the gifted and talented.

Given what we see in major city school districts like Chicago and Washington, DC, resistance to change is the rule not the exception; and unless there is some flexibility on the part of teachers and school administrators, public school systems will indeed be depleted of their best and brightest as they flee to private schools or elite ‘exam schools’. 

With their support for school choice, Mr. Romney and Mr. Obama have both edged toward recognizing that kids aren’t all the same and schools shouldn’t be, either. Yet fear of seeming elitist will most likely keep them from proposing more exam schools. Which is ironic and sad, considering where they went to school. Smart kids shouldn’t have to go to private schools or get turned away from Bronx Science or Thomas Jefferson simply because there’s no room for them.

Tuesday, September 18, 2012

Why We Are Unlikely To Change Our Minds

I grew up in a family, educational, and social environment in which debate was encouraged and challenging received wisdom was thought healthy.  I was taught never to make up my mind until all the facts were in, to be sure that the facts were actually facts and not some second or third-hand report of events, and to dismiss soft claims (“I can’t remember where I read it, but I think it was the New York Times”). Polemics were always encouraged.  I never minded it if one of my children defended a position that was not entirely consistent with one previously held.  They were pushing the logical envelope, trying out the other side to see how it felt, fitting it out with new clothes to see if it looked any better than the old. 

Some have said that this liberal upbringing – rigorously debating both sides of an argument and coming to purely logical conclusions – missed the point.  Everyone had to have some religious, philosophical, or social anchor around which arguments were constructed.  If not, then ideas, arguments, and concepts would simply whirl around and eventually get sifted, but with no particular purpose. Eighteenth Century man to the end, I preferred to pursue the truth.

It has been hard for me to accept people whose minds are always made up.  I am a harsh critic of my friends who have never left the safe confines of the Sixties.  Nothing has changed for them in fifty years, even though the world little resembles the one of their mustachioed youth.  It is really hard to dismiss Richard Nixon, Ronald Reagan, and the Bushes entirely, and to cling only to the iconic images of Jimmy Carter’s values-driven foreign policy, LBJ’s War on Poverty, and Bill Clinton’s ‘Looking Like America’ diversity.  But they do. 

I have recently travelled and lived in one of the most conservative states in the Union – conservative in religious views, social issues, and politics.  I knew that I would hear views very different from my own; but I was surprised by the vehemence with which they were held.  Not only had my conservative friends – like my liberal ones from back home – not changed in fifty years, they had developed hardened and aggressive stances.  It wasn’t enough to disagree with Obama; they hated him.   Although I was used to political tenacity and fidelity, I was not prepared for such vehemence and vitriol.

An article in the New York Times (9.18.12) by Cass Sunstein explains the phenomenon of hardened opinion.  Why, he asks, when presented with reasonable opposing positions, do people not only reject them, but use them to harden their own opinions?

It is well known that when like-minded people get together, they tend to end up thinking a more extreme version of what they thought before they started to talk. The same kind of echo-chamber effect can happen as people get news from various media. Liberals viewing MSNBC or reading left-of-center blogs may well end up embracing liberal talking points even more firmly; conservative fans of Fox News may well react in similar fashion on the right.

The result can be a situation in which beliefs do not merely harden but migrate toward the extreme ends of the political spectrum. As current events in the Middle East demonstrate, discussions among like-minded people can ultimately produce violence.

How does this happen?

The answer is called “biased assimilation,” which means that people assimilate new information in a selective fashion. When people get information that supports what they initially thought, they give it considerable weight. When they get information that undermines their initial beliefs, they tend to dismiss it.

This natural human tendency explains why it’s so hard to dislodge false rumors and factual errors. Corrections can even be self-defeating, leading people to stronger commitment to their erroneous beliefs.

This has happened to me over and over again in discussions with friends from the Left and the Right.  I would present a well-researched and logical argument about a particular issue, and rather than enter what I thought would be a rational debate on the merits of the case with one side modifying an original position, I became involved in a contentious argument.  My sources were challenged.  I had been infected by the radical Left (or Tea Party Right).  I had lost my bearings living in the South or equally had lived in the North far, far too long.

What I found again and again was that my friends were reading only information that confirmed their opinions.  They read only MoveOn.org or watched only Fox News, or worse, plumbed the depths of twisted conspiracy theories that raised ugly racist, anti-capitalist, anti-Semitic, or anti-immigrant sentiments.  It made no difference whether or not I countered the arguments reasonably.  One recent exchange with a liberal friend is illustrative.  He posted an article on Facebook which concluded that the Obama stimulus efforts and the QE actions of the Fed were all good.  I replied with an article – by an equally respected source – arguing that not only did these interventions not have the intended consequences, they resulted in very negative unintended consequences.  As Sunstein predicted, not only did my friend dismiss my argument, he hardened his own.

The only factor that social researchers have found that makes people listen to both sides is the credibility of the source. 

If civil rights leaders oppose affirmative action, or if well-known climate change skeptics say that they were wrong, people are more likely to change their views.

This, of course, is unlikely to happen.  It is hard to reject not only an argument but the faithful constituents who have put you in office.  Perhaps more importantly, in our very fragmented, rapidly changing, and unpredictable world, it is not surprising that we define ourselves with immutable, aggressive positions.  “I am against abortion” is much more affirming and identifying than the more wishy-washy “I don’t like the idea of abortion, but the rights of women, blah blah”; or “Wait until the data are all in”.

I still love to argue ideas, and I have a few friends with whom I have a mutually respectful relationship despite our political differences.  I think of our heated discussions as ways to refine or reject my premises, and I suspect my friends are only trying to change them; but we both leave the table satisfied.

Monday, September 17, 2012

Why Is The Muslim World So Easily Offended?

Fouad Ajami has written an article of this title in the New York Times (9.16.12). In it he says that the current violent demonstrations in Egypt and Libya and the all-too-common recurrent anti-American sentiments in the Middle East are due to Muslim culture and history.  The glories of the Muslim world expressed in mathematics, science, and literature in the early Middle Ages are long gone, have never been repeated, and by comparison to Western countries Muslim countries are poor, backward, and politically weak.  It is a feeling of cultural loss, a cultural and geo-political alienation, and a persistent inferiority complex throughout the region that is behind the frustration, anger, and hostility shown against the West, the most prominent and visible being the United States.

There is an Arab pain and a volatility in the face of judgment by outsiders that stem from a deep and enduring sense of humiliation. A vast chasm separates the poor standing of Arabs in the world today from their history of greatness. In this context, their injured pride is easy to understand.

In the narrative of history transmitted to schoolchildren throughout the Arab world and reinforced by the media, religious scholars and laymen alike, Arabs were favored by divine providence. They had come out of the Arabian Peninsula in the 7th century, carrying Islam from Morocco to faraway Indonesia. In the process, they overran the Byzantine and Persian empires, then crossed the Strait of Gibraltar to Iberia, and there they fashioned a brilliant civilization that stood as a rebuke to the intolerance of the European states to the north.  [Everywhere] there was poetry, glamorous courts, and philosophers who debated the great issues of the day.

The reign was short-lived, and beginning in the 13th century, Arab lands were overrun by Mongols and Ottoman Turks, and with them a destruction of the rich cultural life of preceding eras.

The coming of the West to their world brought superior military, administrative and intellectual achievement into their midst — and the outsiders were unsparing in their judgments. They belittled the military prowess of the Arabs, and they were scandalized by the traditional treatment of women and the separation of the sexes that crippled Arab society.

Arabs today “know that more than 300 million Arabs have fallen to economic stagnation and cultural decline. They know that the standing of Arab states along the measures that matter — political freedom, status of women, economic growth — is low.”

According to Ajami, the Western ‘insults’ to Islam, beginning with Salman Rushdie’s Satanic Verses, then followed by the Danish cartoons and Dutch filmmaker Theo van Gogh’s release of a documentary critical of Muslim treatment of women all resulted in an apparently spontaneous, violent, and widespread outpouring of rage and violence.  A fatwa was placed on the head of Rushdie who was forced to live in isolation for decades; and van Gogh was murdered by a Muslim extremist.  The latest violent protests in Egypt and Libya are a continuation of that rage. 

According to Ajami, Western free speech (i.e. Rushdie’s book, van Gogh’s film, the Danish cartoons) was a very visible symbol of Western corruption and indifference to religion and Islam.  Because these productions were graphic and could circulate within minutes through the Internet, they quickly became rallying points for an expression of Muslim frustration. 

Ajami’s explanation, however, is but one of many and is perhaps the most understanding and defensive.  A more compelling factor, say many on America’s ‘progressive’ Left is Israel’s brutal treatment of the Palestinians.  The invasive and demeaning checkpoints, the insistent building of new settlements, the inhumane bombing of Palestinian cities and populations, the recalcitrance and inflexibility on matters of statehood, are wholly responsible for Arab and Muslim expressions of violence against the United States – a much more visible and much bigger iconic target than Israel itself.

This argument has as many holes in it as Ajami’s respectful historical analysis.  Israel was not responsible for the 1979 Iranian revolution, the rise and resurgence of the Taliban, or the increase of Islamic fundamentalism in Europe.  There must be something else going on with its own internal dynamic.  Islam, these neo-culturalists say, is returning to its fundamental Wahhabi roots – a brand of Islam, practiced most noticeably in Saudi Arabia, which is fiercely conservative.  There, the absolute simplicity of the faith (Islam has none of the baroque trappings of Christian dogma or the multiplicity of religious expressions of Hinduism) is rigorously followed.  This is not a movement in rejection of the West; it has existed for centuries, and its absolutism and extreme fundamentalism is a perfect foil for the corrupt, liberal, and undisciplined cultures of Europe and America.

This fundamentalism has expanded to formerly more moderate countries like Bangladesh because of its appeal to the poor.  Radical Islamists give both a religious and political voice to the disenfranchised, and give them both economic and spiritual hope.

Still other political observers say that the real problem is America itself.  For too long have we supported authoritarian regimes throughout the world in the name of political and economic stability.  Our support of Mubarak is only one example of an obvious and persistent policy.  These authoritarian regimes were repressive towards any opposition, but particularly toward Muslim fundamentalists such as the Muslim Brotherhood.  Where the US had no client state relationship, it supported colonial and neo-colonial regimes in Algeria and Tunisia.  America has been seen not only as a friend of the hated Israel, but a friend of many other national oppressors.  

In short, Ajami’s conclusions are reasonable, but only part of the explanation.  What is clear is that these recent developments are not just over the weird indie film, but over many, many other perceived grievances.

Sunday, September 16, 2012

Genetic Modification–A Bad California Proposition to Label Foods

California is set to vote on Proposition 37, which would make it compulsory for food companies to label their products if they have been genetically modified (GM) in any way.  The labeling issue comes up at all, of course, because many environmentalists feel that GM foods are harmful, both to the consumer and to the environment.  The label will not be a cigarette-pack-style warning, e.g. “GM Foods Are Bad For You”, because there is not enough science to back the claim – just an indication that the foods have been altered.  Although this would seem like an innocuous, value-neutral act, it is not at all.  The simple fact of including the information on food packaging would send an unequivocal message – don’t buy this product.

Proponents of the measure disingenuously claim that it is ‘simple’ information, no different from the list of Recommended Daily Allowances (RDA) already displayed on most foods; yet it is not: RDAs are value-neutral and scientifically sound. While there is no doubt some political motivation behind RDAs – particularly to address the growing problems of obesity and hypertension (calories, fat, and salt) and to encourage foods rich in the vitamins and minerals listed – none of the listings is dubious, contentious, or misleading.

The current listing – as factual as it is – already presents as many problems as it purportedly solves, and a new, doubtful claim would set off another round of food panic in already nutritionally-confused customers.  “What exactly are trans-fats anyway? …I have heard that pounding zinc is a good way to fight off a cold, so why isn’t zinc on the label, and does that mean the FDA discounts the claim? …What if I eat a tree full of oranges one day?  Can I ignore Vitamin C for a week?… I thought Vitamin D came from the sun, so why is it on the label?…  Hmmmm… 15 percent sodium doesn’t sound like much”.  Of course this last careful shopper fails to take into consideration that her husband already empties half the salt shaker on his meat every night.  Twenty percent of anything doesn’t sound like a lot.  Who needs Vitamin B1 anyway?

The point is, while some consumers would ignore the labeling, many would be perplexed by it and reject the food out of sheer befuddlement; and still others would pass it up because of the hysterical reports about ‘Frankenfoods’ – that they will turn babies into monsters, forever damage the ‘natural’ world, and only serve to enrich fat-cat corporate executives who live off the backs of the poor.

Even if food companies were required to apply the GM label, what exactly would it cover?  Only modification of the food itself?  Or the products that go into growing it? Sugar is an all-too-common ingredient in processed foods, and it usually comes from corn.  If the corn has been genetically modified, then the corn syrup made from it is genetically modified – so does a can of peas sweetened with that syrup, though the peas themselves have not been modified, have to be labeled GM?

In short, labeling foods as GM is by no means neutral, and is a not-very-subtle attempt to play on consumer ignorance, confusion, and growing bias against GM foods.  Worse, without scientific evidence, it is prejudicial to private corporations which have for years insisted that GM foods not only do no harm to the consumer but are a positive force for good for the environment.  They eliminate the need for pesticides and fertilizer.  New fast-growing varieties reduce the amount of water needed for irrigation.  High-yielding varieties can reduce the acreage needed for a given production target.

What does the science say?  Sarah Zhang in Mother Jones (6.19.12) has compiled the results of many recent international research efforts designed to determine possible negative effects of GM foods, and she provides hyperlinks to all of these studies.  Of particular interest are the conclusions the EU reached in rejecting calls for bans on GM foods.

[A recent study done by the EU] drew its conclusions from the work of more than 130 research projects, covering a period of more than 25 years of research involving more than 500 independent research groups. Its most important conclusion was “that biotechnology, and in particular GMOs, are not per se more risky than e.g. conventional plant breeding technologies” (A Decade of EU-funded GMO Research 2000-2010).

In conclusion, the EFSA GMO Panel considers that, based on the documentation submitted by France, there is no specific scientific evidence, in terms of risk to human and animal health or the environment, that would support the notification of an emergency measure under Article 34 of Regulation (EC) No 1829/2003 and that would invalidate its previous risk assessments of maize MON 810 (European Food Safety Authority, 2012)

The National Academies Press produced Safety of Genetically Engineered Foods – Assessing the Health Risks (2004) and found:

Genetic engineering is one of the newer technologies available to produce desirable traits in plants and animals used for food, but it poses no unique health risks that cannot also arise from conventional breeding and other genetic alteration methods.

Not only is evidence growing to dispel claims of GM harm; evidence is also emerging about unexpected benefits to the environment.  GM foods not only obviate the need for pesticides, fertilizers, etc., but have other benefits as well:

A massive 20-year study just published in the journal Nature found that using GM cotton in China to control cotton bollworms closely tracked with a rebound in natural enemy populations, which in turn keep out secondary pests like aphids that usually proliferate when chemical insecticides kill the bollworms (Zhang).

There are hundreds of similar citations in the Zhang article and in even a cursory search of the Web; and yet despite the evidence that GM foods do no harm and in fact have a positive effect on the environment, the hysteria to remove them (and labeling is one way of putting pressure on GM producers) is increasing.

Some observers, again disingenuously, say that Proposition 37 is not about banning GM foods but is simply a ‘Right to Know’ provision; but because the label is government-mandated and government-identified, and will sit alongside FDA requirements, it is a de facto warning, based on little scientific evidence, and prejudicial to producers.  It should be defeated.