I took four years of high school French, primarily because it wasn’t German. German was mach-uber-scarrrry, with its hundred-character words and its link to some of history’s greatest wrongs, like lederhosen. But French was soft and lyric. In my classes, I loved saying the French word for “garbage,” which is “garbage.” Only better.

By my senior year, only about a half-dozen students had stuck with the language, and as a reward, Madame Virgillo had us read two French classics that we had no hope of understanding, even in English: Voltaire’s Candide and the subject of today’s musings, Le Petit Prince, by Antoine de Saint-Exupéry. She also had us watch a truly bizarre French movie called L’Argent de poche, which included young kids telling dirty jokes and a scene where a toddler falls out of a third-floor window only to jump up and play gleefully. Between the surreal movie and Voltaire’s scathing political satire, it’s surprising that we didn’t switch to the German classes and invade.

The Little Prince is one of the most published and beloved novels in all of human history, having been printed more than 140 million times in over 250 languages, including a Latin version that my son has for his own high school foreign language travails. And it is so French, filled with liberté, fraternité, and snakes that swallow whole elephants. In the story, a pilot crash-lands in the Sahara Desert—something that did in fact happen to the author—where he meets up with a diminutive prince, recently arrived from his asteroid homeland. Through their conversations, the two explore the strangeness of being an adult—at least up until the childlike prince tries to commit suicide. Bread and Jam for Frances never had a suicide scene, even with “France” in the title.

It’s been many years since I read this work, but I still recall the opening words: “Lorsque j’avais six ans j’ai vu, une fois, une magnifique image…” (“Once when I was six years old I saw a magnificent picture….”). It’s a simple opening for a book that is deeply philosophical and at times political. The baobab trees that try to take over the prince’s asteroid, for example, represent Nazi aggression. Naturally. It’s not a straightforward book. But an adult who catches the subtle allegories can get a lot out of the text. And a child will get even more. Especially if they can read French.

This article was posted on October 14, 2014. Related articles: Other Books.


 

It was back in fifth or sixth grade that someone gifted me Merriam-Webster’s Vest Pocket Dictionary, a blue, smallish reference book of very few words, marginal utility, and, if memory serves, much excitement. It was such a desirable book that, even though my mother purged it from my shelves in the Great Garage Sale of 1980, I was able to pick up a used replacement copy about a decade ago.

What drew me to this work wasn’t its depth of content. Most of the definitions top out at five or six words. It certainly wasn’t the number of entries, since even with the elderly-offending font, its 370 pages leave little room for variety. (The X, Y, and Z sections together occupy all of two-and-a-half pages.) And it definitely wasn’t the pedagogical import, since I doubt I looked up more than one hundred words total while it was in my possession.

What made it a significant book can be summed up in one short definition.

grown-up \ˈgrōn-ˌəp\ n: adult —grown-up adj

This small dictionary wasn’t one of those grand library editions, nor was it a childish “beginner’s” dictionary common for the younger set. This was a dictionary that mimicked the look and feel of a big people’s dictionary—each entry even included one of those incomprehensible pronunciation guides—but was still small enough for a juvenile to carry around. This book declared to an elementary school student: “You are grown up, and deserve grown-up things.”

With the advent of dictionary apps and Wikipedia, small reference books like this don’t carry the same emotional charge as they did decades ago. I asked my teenage son what he thought of the book, and his first response was, “It’s bigger than my phone.” Sigh. Kids today, once I chase them off my lawn, have ready electronic access to the wealth of human knowledge. Perhaps that makes them more grown-up than I was at that age. Yet I wonder if putting such power in their hands so early removes opportunities for simple, joyful discovery made possible through things like vest-pocket dictionaries.

This article was posted on October 6, 2014. Related articles: Other Books.


 

I start this review with an obligatory disclaimer: I am not one of those science-hating creationists. It’s a testament to the state of public discourse in this country that I have to include that preface. But admitting openly that I read a book with a title like Darwin’s Doubt carries certain risks. It’s the kind of publication that you have to wrap in a copy of Lady Chatterley’s Lover if you want to read it on the subway.

And yet, it’s not that bad. In fact, Darwin’s Doubt is a surprising book, one of those works that stirs up feelings and ideas you had all along but didn’t know how to articulate, or even whether you should.

The book’s title comes from portions of Charles Darwin’s Origin of Species where he discusses potential difficulties with his overall theory of descent from a common ancestor. Darwin was familiar with the Cambrian Explosion, a ten-million-year block of time approximately 540 million years ago that saw a rapid increase in the number and complexity of biological forms. The geologically sudden appearance of such diversity differed from Darwin’s view of gradual transition. In light of this apparent conflict, Darwin sought to assure his readers: “If my theory be true, it is indisputable that before the lowest Silurian [Cambrian] stratum was deposited, long periods elapsed, as long as, or probably far longer than, the whole interval from the Silurian age to the present day; and that during these vast, yet quite unknown, periods of time, the world swarmed with living creatures.”

Darwin, it turns out, was wrong. While the world did swarm with creatures before the Cambrian era, they weren’t that varied. More than a century of research has shown that the Cambrian Explosion is exactly what it appears to be: a geological blink of an eye that produced new creatures at a rate far exceeding Darwin’s estimates. How did life appear so rapidly? In the absence of a powerful deity, what would be needed to make this happen?

What is needed is information. According to Stephen C. Meyer, the book’s author, advancements in life are in reality advancements in digital information in the form of DNA and epigenetic content. “Building a fundamentally new form of life from a simpler form of life requires an immense amount of new information.” It’s as if a room full of monkeys started banging on typewriters and produced, not works of Shakespeare, but complex computer algorithms. Randomly generating DNA of that caliber strikes Meyer as unrealistic. The heart of Darwin’s Doubt is an exploration of why an increase in functional DNA-based information could not come about through neo-Darwinian evolution alone.
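
To get a feel for the scale Meyer describes, here is a quick back-of-envelope sketch in Python. To be clear, this is my own toy illustration, not a calculation reproduced from the book: the 1-in-10^77 functional-fraction figure is the estimate Meyer cites from protein scientist Douglas Axe, while the 10^40 count of organisms (and thus random “tries”) is a rough ballpark assumption on my part.

    # A toy illustration of the combinatorics involved; my own
    # back-of-envelope arithmetic, not a calculation from the book.

    # All possible chains for a modest protein of 150 amino acids,
    # with 20 possible amino acids at each position:
    sequence_space = 20 ** 150

    # Estimated fraction of those chains that fold into a functional
    # protein (the 1-in-10^77 figure Meyer cites from Douglas Axe):
    functional_fraction = 1e-77

    # A deliberately generous number of random "tries": one per
    # organism that has ever lived, using a ~10^40 ballpark guess.
    trials = 1e40

    expected_hits = trials * functional_fraction

    print(f"sequence space: ~10^{len(str(sequence_space)) - 1} chains")
    print(f"functional chains expected in 1e40 tries: {expected_hits:.0e}")
    # Prints about 1e-37: even with an absurd number of tries, the
    # expected number of successes rounds to zero.

However you tune the assumptions, the mismatch between the size of the search space and the number of available trials is the point Meyer presses.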

Meyer addresses this possible impediment to materialistic evolution by looking at four key issues: (1) the likelihood of random processes generating beneficial and usable genetic material, (2) the time needed to generate this digital genetic information, (3) the window of time in a species’ development when DNA-generating mutations must occur, and (4) whether DNA alone is sufficient to build new life forms.

Darwin himself was not aware of DNA or of how genetics worked at the cellular level. In today’s neo-Darwinian world, it is well understood that genetic mutations must drive speciation. Cellular mutations are common, but generally deleterious; we call some of them “birth defects.” Even if a mutation could be considered neutral or beneficial, Meyer explains that there need to be enough coordinated mutations present to generate a new protein fold. “New protein folds represent the smallest unit of structural innovation that natural selection can select…. Therefore, mutations must generate new protein folds for natural selection to have an opportunity to preserve and accumulate structural innovations,” an event that, by Meyer’s calculations, is exceedingly unlikely.

Meyer’s second point deals with the study of population genetics. Given the size of a population, its reproduction rate, its overall mutation rates, and the tendency for natural selection to weed out members with diminished genetic fitness, how much time is needed to generate new genetic information that would result in the transition to a new type of creature? Quite a lot, it turns out. “For the standard neo-Darwinian mechanism to generate just two coordinated mutations, it typically needed unreasonably long waiting times, times that exceeded the duration of life on earth, or it needed unreasonably large population sizes, populations exceeding the number of multicellular organisms that have ever lived.” Since the majority of genetic advancements would require more than two coordinated mutations, the numbers quickly become more severe.
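
The flavor of that arithmetic is easy to reproduce. The sketch below is my own deliberately naive model, not one of the population-genetics studies Meyer reviews (those allow the mutations to arrive sequentially and drift through the population, which improves the odds, though per the research Meyer cites, not nearly enough). The mutation rate, population size, and generation time are all assumptions chosen for illustration.

    # A deliberately naive waiting-time estimate; my own sketch of
    # the kind of arithmetic involved, not a model from the book.

    mutation_rate = 1e-8      # assumed per-site, per-generation rate
    pop_size = 1e4            # assumed (hominid-scale) population size
    generation_years = 20     # assumed years per generation

    # Naive model: both specific mutations must appear together in a
    # single birth. The chance of that happening in one individual:
    p_both = mutation_rate ** 2               # 1e-16

    # Expected generations until some member of the population is
    # born carrying both mutations at once:
    generations = 1 / (pop_size * p_both)     # 1e12 generations

    wait_years = generations * generation_years
    print(f"expected wait: ~{wait_years:.0e} years")  # ~2e+13 years
    print("duration of life on earth: ~4e9 years")
    # Sequential-arrival models improve on this naive estimate, but
    # per the studies Meyer cites, the waits remain prohibitively long.

However the assumptions are tuned, the gap between the required wait and the available time is what the quoted passage is pointing at.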

Timing also plays a key role in genetic transmission. To have a developmental impact, Meyer notes that mutations must come early in the evolutionary lifetime of a species, when populations are still small and morphological changes have the best chance of taking hold. However, mutations at this stage also have the highest probability of imparting damage, and similar changes in larger populations are more likely to be removed by natural selection.

Finally, Meyer discusses epigenetic information, the instructional content stored in cell structures but not in DNA sequences themselves. This information is essential in driving the overall body plan of an animal as it develops from a single cell to its full adult size. What is not clear is how it could be altered through genetic mutation, since specific epigenetic pressures are not coded directly by the DNA base pairs. There is also the issue of “junk DNA,” those non-coding stretches of the genome that were once thought to be useless castoffs of evolution, but are now understood to drive biological processes in living creatures.

In its more than 500 very accessible pages and its 100-plus pages of footnotes and bibliographical references, Darwin’s Doubt does a good job of expressing Meyer’s own doubts that “wholly blind and undirected” evolution could produce the variety of life we see on earth. As the director of the Discovery Institute’s Center for Science and Culture, Meyer is a strong advocate for Intelligent Design. “Our uniform experience of cause and effect shows that intelligent design is the only known cause of the origin of large amounts of functionally specified digital information. It follows that the great infusion of such information in the Cambrian explosion points decisively to an intelligent cause.” His book makes a strong case, although the appeal to non-materialistic explanations will always turn off some readers. As proof of the struggle, Meyer quotes Scott Todd, a biologist writing in Nature: “Even if all the data point to an intelligent designer, such a hypothesis is excluded from science because it is not naturalistic.”

So here you have the conundrum. Despite what you read in the news, the science surrounding biological evolution is still somewhat fluid, in part due to the issues raised in Darwin’s Doubt. Yet for those who have reasonable struggles with the evolution-is-settled narrative, it’s hard to shake the baggage carried by those on the religious fringe who insist that any non-literal interpretation of the creation story is blasphemy. Another book on my reading list is Mind and Cosmos, by Thomas Nagel, an atheist who rejects the standard neo-Darwinian view for some of the same reasons expressed by Meyer. (Antony Flew, once known as “the world’s most notorious atheist,” later came to believe in a deity because of similar issues surrounding evolution.) A book like Darwin’s Doubt requires vigorous analysis to confirm that it is scientifically sound. But if it is, then the neo-Darwinian evolution model itself might turn out to be little more than an intelligently designed idea.

This article was posted on September 17, 2014. Related articles: Biology, Other Books.

Footnotes for “Review: Darwin’s Doubt”

  1. Have you seen the Ben Stein documentary “Expelled”? Not to start a debate here (I honestly think both theories carry truth), but it’s probably the best example of everything that’s wrong with Creationism…mainly that it’s usually presented all wrong. I’d be curious to read a book that presents creationism with a little more…science…as it seems Meyer has done.

  2. I did see Expelled a while back. If I remember correctly, it’s not so much about the science behind intelligent design as it is about the difficulty intelligent design advocates feel they have in getting their message out. Stein can be both jocular and heavy-handed, perhaps dual tributes to his background in Ferris Bueller’s Day Off and as a speechwriter for Nixon.

    Meyer’s book is pretty much all science, all the time. There are only passing references to religion, about the same as you might find in any general-audience science book that communicates the pro-evolution stance. I’m a layman when it comes to biology and chemistry, but his argument did have the feel of rigor and competence.

    More eye-opening than Meyer’s analysis was his documentation of various experts who openly discuss the scientific complexities and roadblocks present in origins-of-life research. High school textbooks give the basic common-descent narrative with finch-beak examples, and it all feels very settled and boring. But if you access the journal articles referenced by Meyer, there’s a whole world of scientific debate covering many aspects of the evolution story. It was quite fascinating to read peer-reviewed content from researchers who promote evolution and yet still understand that they need to grapple with the problems it raises.

  3. I’d like to see you challenge yourself to think critically about this work.

  4. In addition to his statement above, Carl sent me some links offline (here and here) that attempt to correct misconceptions held by those who reject evolution. He also warned me not to give a pass to books that use myth and outdated research as their core forms of support.

    This is part of the difficulty in talking about books like Darwin’s Doubt. Those like Meyer who propose ideas that challenge the Darwinian model are assumed to be wacko religious knuckle-draggers. Often, that assumption is correct, but not always, and the assumption by itself is not proof that the ideas are faulty. Sites like the ones listed by Carl attempt to correct the religious nuts, but they seldom address those who earnestly and seriously grapple with the incomplete scientific and statistical detail offered by modern evolutionary research.

    Within the scientific community, there is near unanimous acceptance of neo-Darwinian evolution, even among scientists who express a belief in God. However, there is not unanimous agreement about specific components or processes associated with the evolution narrative. This should come as no surprise, since science is a constant battle to understand and close gaps in theoretical knowledge. Yet I am regularly surprised that the public presentation of evolution brings with it the false claim that there are no known issues with the theory. There are real, complex issues, and it is these issues that Darwin’s Doubt addresses. Meyer’s conclusions are a valid target of debate. But anyone who claims that evolution is scientifically problem-free is avoiding their own need to challenge ideas.

    In response to another of Carl’s challenges, Darwin’s Doubt does not depend on myth—it doesn’t discuss religion at all beyond providing common responses to critics—and it generally sticks with research from the most recent two decades, except when it is specifically quoting content from Darwin’s era.

  5. From a subsequent link within your first link in #4:

    “Evolution myths: Evolution violates the second law of thermodynamics

    The second law of thermodynamics states that entropy, a measure of randomness, cannot decrease in an isolated system. Our planet is not an isolated system.”

    http://www.newscientist.com/article/dn13720-evolution-myths-evolution-violates-the-second-law-of-thermodynamics.html#.VDMCJfldVXY

    It is very true that we are not in an isolated system. However, it is a huge leap to say that this in itself is sufficient for evolution to occur.

    Energy from outside a system can be used to increase complexity, true. But only if there is a mechanism to capture and use that energy. For example, a quart of gasoline can power some aspect of a manufacturing process. And the same quart in a Molotov cocktail can destroy the factory. The ways that energy can increase disorganization far outnumber the ways it can increase organization.

    And how can a system that uses energy to increase complexity come into existence in the absence of a system to capture and use energy in that way? Pulling yourself up by your bootstraps just doesn’t work.


 

A recent article in The Independent documented racial angst over a tribute to Robin Williams at this year’s Emmy Awards. During Billy Crystal’s public eulogy for the late comedian, a video showed Williams donning a makeshift hijab and blurting out, “I would like to welcome you to Iran…Help me!” The Twittersphere quickly labeled the shtick as racist, but you shouldn’t believe everything you read on the Internet.

Robin Williams’ poke at the situation in Iran might have offended some people, but it wasn’t racist. Iran is obviously not a race; it’s a country, a geopolitical region defined by sovereign borders, not by DNA. Its religion is monolithic, but that too is immaterial, since it is the nation’s recent political history, and not its primary creed, that makes Williams’ humor so biting. The transformation of Iran from a somewhat open, liberal society to one where a joke about the status of women resonates with outsiders happened just a few decades ago, and was a political change, not a racial one.

“Racist” has become the go-to explanation for many of America’s current woes. From police shootings in the Midwest to conflicts in the Middle East and every gripe about President Obama’s policies in between, if there’s trouble in the news, there’s bound to be an accusation of racism soon after. It turns out that calling someone a racist is a great way to silence an opponent. But in the United States in the twenty-first century, such accusations are rarely accurate.

Racism in America is basically over. The country used to be a lot more racist, back in the Civil War and Jim Crow days, and even as recently as the wartime fiasco of Japanese internment. But it’s not like that anymore, not even close. Sure, the KKK and neo-Nazi groups still exist today, but we call them “fringe radicals” for a reason. Over the last 150 years, and especially since the Civil Rights Era of the mid-1960s, the United States has steadily distanced itself from its racist past.

That national evolution doesn’t mean that race is no longer an issue. As shown by the police shooting of Michael Brown in Ferguson, Missouri, several weeks ago, and by the riots that followed, race and racial sentiments still have an impact. Does the shooting of a black youth by a white officer substantiate a disturbing trend in race relations? The TV footage says “yes,” but the actual numbers related to police shootings say, “not really.” A post-Ferguson New York Times column looked at the conventional wisdom that says “the use of deadly force by police officers unfairly targets blacks. All that is needed are the numbers to prove it. But those numbers do not exist.”

Young black men are arrested and incarcerated at rates quite disproportionate to their overall population. Is that racist? Perhaps in some situations. But crime statistics by race always have titles like “Crimes by race” instead of “Crimes by race, age, sex, region, type of offense, weapon choice, family background, education level, financial situation, work history, music preference, and local political culture.” Such summary numbers always gloss over poverty and education variations, the distribution, frequency, and extenuating circumstances of different types of crimes, cultural differences between urban and rural environments, the variation in mandatory sentencing laws, the recidivism rates for different regions and demographic groups, changes in family makeup over recent decades, and the role of technology, music, and communications as they relate to criminal activities.

In other words, crime is a complicated and messy thing, and race is just one of many components. The temptation to shout “racist” whenever there is conflict between different ethnic groups ignores that messiness. The rush to stick the racist label on newsworthy events reflects America’s modern tendency to reach for simple explanations to complicated problems. Calling someone a racist is easy, and frees the accuser from doing the hard work of ferreting out the complexities of social interactions between the races, and the individual and situational nuances within a nation of over 300 million diverse citizens.

The current President, two Supreme Court justices, and a full fifty percent of all schoolchildren in America are classified as minorities. If the White Man has been trying to keep the other races down, he’s certainly done a lousy job at it. The shooting of Michael Brown in Ferguson is troubling, but not because America is racist. The United States is, instead, one of the most ethnically diverse and non-racist places on earth, and a nation in which crying out for help from behind a veil would be a joke.

[Image Credits: Possibly the Associated Press]

This article was posted on September 3, 2014. Related articles: Commentary.

Footnotes for “American Racism is Basically Over”

  1. I see: don’t ask, don’t prove, just tell. Tell the women that sexism doesn’t exist, tell the old or very young people that ageism doesn’t exist, tell a Muslim or a Jew that religious bias doesn’t exist, and try to tell the racist that Iranians and Iraqis aren’t Arabs (totally true, but racists hate with a wide brush). Tell everyone that there isn’t anything wrong with a 95% white police force in a town that’s 67% black.

  2. Of course there are still racists in America, just as there are racists in Canada and Mexico and Japan and Chile and Russia and Uganda. You can always find incidents of racism or similar -isms if you look. But that doesn’t excuse the modern confusion of perceived racism with non-racist causes of life’s difficulties.

    The police department in Ferguson might be a racist group. But the percentage of white officers when compared to the general population is by itself insufficient evidence to claim racism. Between the 2000 and 2010 Census counts, the white population in Ferguson dropped from about 75% to about 45%. That’s a sizable decrease in ten years, and it’s not surprising that a small police department didn’t change as quickly. Like I said, it may be a racist precinct, but that determination should be based on racist activities, not on statistics alone. If the department were 95% black instead of white, that also would not be an automatic indication of racism.


 

Earlier this month, President Obama authorized targeted airstrikes against the terrorist group ISIS in Iraq. While it appeared to be a reversal of his long-term policy of disengagement from that nation, the president made it clear that the operation was limited in scope, and that America was not returning to Iraq. “I ran for this office in part to end our war in Iraq and welcome our troops home, and that’s what we’ve done…. The only lasting solution is reconciliation among Iraqi communities and stronger Iraqi security forces.”

In other words, the President’s plan is to provide the equivalent of an aspirin for a broken leg, hoping for a complete recovery. Minimum input, maximum output. It’s the policy of wishful thinking.

Wishful thinking has become the action of choice for modern American politicians on both sides of the aisle. It’s not hard to understand why: good news is good! Spinning Middle East drama into its best possible outcome sounds better than ordering up fresh military recruits. Liking and keeping your healthcare policy sells better than correcting decades of regulations and laws that act as enablers for high-charging medical firms.

Americans want the happy ending, the tidy solution that resolves all loose ends. They want wishful thinking. And since politicians are people-pleasers, they give the public what it wants. The declaration of the happy goal is paramount; the details on how to get there are downplayed or rejected completely.

Wishful thinking sounds nice, but it ignores reality. Iraq is a mess, and has been since long before its modern form came about in 1916. Thoughtful people can disagree about whether the United States should once again commit its military resources to that region of the world. But we delude ourselves if we think that Iraq will meander safely toward peace while more stable countries watch from afar. It didn’t work in Cambodia in the 1970s; it hasn’t worked in Sudan over the past decade; and it won’t work in Iraq.

Ethnic and denominational relations inside Iraq and with its neighbors have been tense for a long time. Tribal conflicts that include human rights violations crop up regularly, and in the absence of an external diplomatic or military force, there is little reason to expect much change in the near future. As commander-in-chief, President Obama is free to reposition American troops. But there is no military or historical justification for coupling a complete Iraqi withdrawal with assurances that “it will all work out somehow.” It’s not honest or accurate. It’s just wishing.

Politicians engage in wishful thinking whenever they offer rosy promises of simple solutions to complex problems. Americans partake of wishful thinking every time they clamor for a slick candidate who claims to hold the key to social difficulties, or believe that the next presidential election will right all wrongs. There are times when government may need to act (or step back from prior actions) to address some societal difficulty. In general, if these actions are to have a positive impact, they will be expensive, time-consuming, controversial, partly ineffective, and injurious to the private sector or some other national concern. Any politician who tells you otherwise is trying to get reelected.

In many ways, it’s our own fault. The wealth and safety we enjoy as Americans have allowed us to entertain ourselves rather than pursue civic education, to prefer a comfortable peace instead of troublesome truths. We desire sitcom solutions to the ugly problems of life. “We will eliminate poverty in our lifetimes” sounds so much better than the reality that some people will always be poor, no matter how much time and money we throw at the problem. Helping the poor is good; believing the help will succeed in all cases is delusional.

Wishful thinking is ignorance with a happy face. Thomas Jefferson warned against such political illiteracy: “If a nation expects to be ignorant and free in a state of civilization, it expects what never was and never will be.” Every time we allow a politician to lull us into apathy with wishful thinking, we move one step farther from civilization.

[Image Credits: whitehouse.gov and disneyclips.com]

This article was posted on August 18, 2014. Related articles: Commentary.

Footnotes for “The Era of Wishful Thinking”

  1. I guess I missed the part where you actually propose an alternative approach and put that up for scrutiny.

    Or are you just wishful thinking for an ideal solution that no one has proposed yet?

  2. I’m not calling the current approaches right or wrong. My point is that a proclamation that things will work out is not enough to ensure that they will actually work out. There are no ideal or simple solutions to truly complex problems. Admitting this truth is an essential first step to addressing them.
