When my son signed up to take the SAT, one of my first thoughts was, “I’m glad I don’t have to go through that nightmare again.” And yet here I am on December 7, taking the N3 level of the 日本語能力試験 (nihongo noryoku shiken, the Japanese Language Proficiency Test). N3 is the middle of five levels, inconveniently arranged from 5 to 1 in order of increasing difficulty. This handy list shows you the expectations for each level.

  • N5 — You enjoy the taste of sushi.
  • N4 — You took one semester of college-level Japanese, or nailed that Japanese for Wandering Americans book on the flight east.
  • N3 — You took two or three years of college Japanese.
  • N2 — You spent two or three years running a college in Japan.
  • N1 — You play go with Emperor Akihito.

Clearly, I’m going to stop with N3. I started Japanese studies 25 years ago, but found myself perpetually stuck at a second-year college level. Taking the JLPT, I hoped, would spur me to study with more rigor. It worked in part. I finally made it through my kanji book with its 2,200 glyphs, and I probably added 1,000 to 2,000 new words to my vocabulary. But I skimped on new grammar points, which is of course the only thing covered by the test.

This year, the test takes place on the campus of California State University Los Angeles (“Motto: No, you want UCLA. Let me get you a map.”). I’ve scheduled this post to appear at the apex of my stress level, just as I sit down to begin the test. If you’re reading this right when it comes out, then you missed your chance to take the once-per-year exam. But if you enjoy Japanese food or are looking for a challenge beyond your weekly go games with the emperor, then this might be the test for you.

This article was posted on December 7, 2014. Related articles: Off Topic.



In a free society, there is no such thing as a “fair share.” The freedom that individuals have to create or squander their wealth and resources ensures that no two people will have a completely equal and fair proportion of a nation’s riches. Even for those in Communist regimes, fairness quickly gives way to power and graft. Fairness, though desirable, is elusive.

During a recent campaign and fundraising tour, Vice President Joe Biden attempted to force a false sense of fairness on the American public. “I think we should make them start to pay their fair share,” Biden said of America’s richest citizens, echoing the desire of many to “take the burden off the middle class.” Since he didn’t come right out and state an amount that the rich should pay, we will have to make an educated guess as to what he meant.

Ever since the Sixteenth Amendment made possible a national income tax, the rich have always paid more than their fair share. Just months after ratification of the amendment, Woodrow Wilson signed the Revenue Act of 1913, imposing a 1% tax on households earning $4,000 or more per year (equivalent to $96,000 in 2014 dollars), rising to a top rate of 7% for those making more than $500,000 (about $12 million today). Democrats controlled both houses of Congress in those days, so it seems likely that even a Progressive like Wilson found the tax rate he signed into law “fair.”

Today, the top rate for upper-income Americans sits close to 40%, although after deductions, the richest 10% of taxpayers pay about 19% of their incomes to the IRS. That’s still nearly triple the top rate from a century ago, and significantly more than the typical 3% rate currently paid by the bottom half of all income earners. (All data is from 2011 IRS statistics, and conveniently summarized here.)

The top 10% of taxpayers earn about 45% of all incomes in America, and yet they pay more than 68% of all income taxes. If life were truly fair, they would only pay 45% of all income taxes, in line with their incomes. But things aren’t fair, and in general, everyone is OK with that. Even the rich understand that a progressive tax system that charges them about six times the rate paid by the poor, while not actually fair, is acceptable.
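For the numerically inclined, the ratios above are easy to check. Here is a minimal back-of-the-envelope sketch in Python, using only the 2011 IRS figures already quoted in this post (the variable names are mine, for illustration):

```python
# Figures from the 2011 IRS statistics cited above.
top_income_share = 0.45       # top 10% earn ~45% of all income
top_tax_share = 0.68          # ...but pay ~68% of all income taxes
top_effective_rate = 0.19     # top 10% effective rate after deductions
bottom_effective_rate = 0.03  # typical rate for the bottom half of earners

# How many times the bottom half's rate do the top 10% actually pay?
rate_ratio = top_effective_rate / bottom_effective_rate
print(f"Top effective rate is about {rate_ratio:.1f}x the bottom rate")

# A strictly "fair" (proportional) system would match tax share to income share.
excess_share = top_tax_share - top_income_share
print(f"Top 10% pay {excess_share:.0%} more of the tax total than their income share")
```

The first ratio comes out to roughly six, which is where the “about six times the rate paid by the poor” figure in the paragraph above comes from.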

When Mr. Biden called for the rich to pay their fair share, he certainly wasn’t demanding that their tax rates be lowered to match their incomes. He may have instead wanted marginal tax rates raised, but it’s hard to explain how an extra 5%—or even 20%—in addition to the current 39.6% top rate is inherently more fair. Would 100% be the ultimate in fairness?

The Vice President, as an elected official with a top-secret security clearance, has access to the same Internal Revenue Service statistics that I do. He has no excuse for tossing out the “fair share” misstatement, unless he meant something else by it. If his intent was to drive tax policy, he could have stated a number instead. But it seems more likely that his goal was to stir up emotional animosity in a way that will motivate voters this coming November. “Fair share,” it seems, really means, “Vote for Democrats.” It’s a propagandizing form of politics that is, at its heart, extremely unfair.

[Image Credits: Microsoft Office clip art]

This article was posted on October 23, 2014. Related articles: Commentary.



I took four years of high school French, primarily because it wasn’t German. German was mach-uber-scarrrry, with its hundred-character words and its link to some of history’s greatest wrongs, like lederhosen. But French was soft and lyric. In my classes, I loved saying the French word for “garbage,” which is “garbage.” Only better.

By my senior year, only about a half-dozen students had stuck with the language, and as a reward, Madame Virgillo had us read two French classics that we had no hope of understanding, even in English: Voltaire’s Candide and the subject of today’s musings, Le Petit Prince, by Antoine de Saint-Exupéry. She also had us watch a truly bizarre French movie called L’Argent de poche, which included young kids telling dirty jokes and a scene where a toddler falls out of a third-floor window only to jump up and play gleefully. Between the surreal movies and Voltaire’s scathing political satire, it’s surprising that we didn’t switch to the German classes and invade.

The Little Prince is one of the most published and beloved novels in all of human history, having been printed more than 140 million times in over 250 languages, including a Latin version that my son has for his own high school foreign language travails. And it is so French, filled with liberté, fraternité, and snakes that swallow whole elephants. In the story, a pilot crash-lands in the Sahara Desert—something that did in fact happen to the author—where he meets up with a diminutive prince, recently arrived from his asteroid homeland. Through their conversations, the two explore the strangeness of being an adult—at least up until the childlike prince tries to commit suicide. Bread and Jam for Frances never had a suicide scene, even with “France” in the title.

It’s been many years since I read this work, but I still recall the opening words: “Lorsque j’avais six ans j’ai vu, une fois, une magnifique image…” (“When I was six years old I saw a beautiful picture….”). It’s a simple opening for a book that is deeply philosophical and at times political. The baobab trees that try to take over the prince’s asteroid, for example, represent Nazi aggression. Naturally. It’s not a straightforward book. But an adult who can catch the subtle allegories will get a lot out of the text. And a child will get even more. Especially if they can read French.

This article was posted on October 14, 2014. Related articles: Other Books.



It was back in fifth or sixth grade that someone gifted me Merriam-Webster’s Vest Pocket Dictionary, a blue, smallish reference book of very few words, marginal utility, and if memory serves, much excitement. It was such a desirable book that, even though my mother purged it from my shelves in the Great Garage Sale of 1980, I was able to pick up a replacement used copy about a decade ago.

What drew me to this work wasn’t its depth of content. Most of the definitions top out at five or six words. It certainly wasn’t the number of entries, since even with the elderly-offending font, its 370 pages leave little room for variety. (The X, Y, and Z sections together utilize all of two-and-a-half pages.) And it definitely wasn’t the pedagogical import, since I doubt I looked up more than one hundred words total while it was in my possession.

What made it a significant book can be summed up in one short definition.

grown-up \’grôn‚əp\ n: adult —grown-up adj

This small dictionary wasn’t one of those grand library editions, nor was it a childish “beginner’s” dictionary common for the younger set. This was a dictionary that mimicked the look and feel of a big people’s dictionary—each entry even included one of those incomprehensible pronunciation guides—but was still small enough to travel around with a juvenile. This book declared to an elementary school student: “You are grown up, and deserve grown-up things.”

With the advent of dictionary apps and Wikipedia, small reference books like this don’t carry the same emotional charge as they did decades ago. I asked my teenage son what he thought of the book, and his first response was, “It’s bigger than my phone.” Sigh. Kids today, once I chase them off my lawn, have ready electronic access to the wealth of human knowledge. Perhaps that makes them more grown-up than I was at that age. Yet I wonder if putting such power in their hands so early removes opportunities for simple, joyful discovery made possible through things like vest-pocket dictionaries.

This article was posted on October 6, 2014. Related articles: Other Books.



I start this review with an obligatory disclaimer: I am not one of those science-hating creationists. It’s a testament to the state of public discourse in this country that I have to include that preface. But admitting openly that I read a book with a title like Darwin’s Doubt carries certain risks. It’s the kind of publication that you have to wrap in a copy of Lady Chatterley’s Lover if you want to read it on the subway.

And yet, it’s not that bad. In fact, Darwin’s Doubt is a surprising book, one of those works that stirs in you those feelings and ideas that you had all along, but didn’t know how to articulate or even whether you should.

The book’s title comes from portions of Charles Darwin’s Origin of Species where he discusses possible problems and setbacks with his overall theory of descent from a common ancestor. Darwin was familiar with the Cambrian Explosion, a ten-million-year block of time approximately 540 million years ago that saw a rapid increase in the number and complexity of biological forms. The geologically sudden appearance of such diversity differed from Darwin’s view of gradual transition. In light of this apparent conflict, Darwin sought to assure his readers: “If my theory be true, it is indisputable that before the lowest Silurian [Cambrian] stratum was deposited, long periods elapsed, as long as, or probably far longer than, the whole interval from the Silurian age to the present day; and that during these vast, yet quite unknown, periods of time, the world swarmed with living creatures.”

Darwin, it turns out, was wrong. While the world did swarm with creatures before the Cambrian era, they weren’t that varied. More than a century of research has shown that the Cambrian Explosion is what it is: a blink of an eye that produced creatures at a rate that far exceeds Darwin’s estimates. How did life appear so rapidly? In the absence of a powerful deity, what would be needed to make this happen?

What is needed is information. According to Stephen C. Meyer, the book’s author, advancements in life are in reality advancements in digital information in the form of DNA and epigenetic content. “Building a fundamentally new form of life from a simpler form of life requires an immense amount of new information.” It’s as if a room full of monkeys started banging on typewriters and produced, not works of Shakespeare, but complex computer algorithms. Randomly generating this level of quality DNA seems unrealistic to Meyer. The heart of Darwin’s Doubt is an exploration of why an increase in functional DNA-based information could not come about through neo-Darwinian evolution alone.

Meyer addresses this possible impediment to materialistic evolution by looking at four key issues: (1) the likelihood of random processes generating beneficial and usable genetic material, (2) the time needed to generate this digital genetic information, (3) the window of time in a species’ development when DNA-generating mutations must occur, and (4) whether DNA alone is sufficient to build new life forms.

Darwin himself was not aware of DNA or of how genetics worked at the cellular level. In today’s neo-Darwinian world, it is well understood that genetic mutations must drive speciation. Cellular mutations are common, but generally deleterious; we call some of them “birth defects.” Even if a mutation could be considered neutral or beneficial, Meyer explains that there need to be enough coordinated mutations present to generate a new protein fold. “New protein folds represent the smallest unit of structural innovation that natural selection can select…. Therefore, mutations must generate new protein folds for natural selection to have an opportunity to preserve and accumulate structural innovations,” an action that is unlikely by Meyer’s calculations.

Meyer’s second point deals with the study of population genetics. Given the size of a population, its reproduction rate, its overall mutation rates, and the tendency for natural selection to weed out members with diminished genetic fitness, how much time is needed to generate new genetic information that would result in the transition to a new type of creature? Quite a lot, it turns out. “For the standard neo-Darwinian mechanism to generate just two coordinated mutations, it typically needed unreasonably long waiting times, times that exceeded the duration of life on earth, or it needed unreasonably large population sizes, populations exceeding the number of multicellular organisms that have ever lived.” Since the majority of genetic advancements would require more than two coordinated mutations, the numbers quickly become more severe.

Timing also plays a key role in genetic transmission. To have a developmental impact, Meyer notes that mutations must come early in the evolutionary lifetime of a species, when populations are still small and morphological changes have the best chance of taking hold. However, mutations at this stage also have the biggest probability of imparting damage, and similar changes in larger populations are more likely to be removed by natural selection.

Finally, Meyer discusses epigenetic information, the instructional content stored in cell structures but not in DNA sequences themselves. This information is essential in driving the overall body plan of an animal as it develops from a single cell to its full adult size. What is not clear is how it could be altered through genetic mutation, since specific epigenetic pressures are not coded directly by the DNA base pairs. There is also the issue of “junk DNA,” those coding blocks of the genome that were once thought to be useless castoffs of evolution, but are now understood to drive biological processes in living creatures.

In its more than 500 very accessible pages and its 100-plus pages of footnotes and bibliographical references, Darwin’s Doubt does a good job of expressing Meyer’s own doubts that “wholly blind and undirected” evolution could produce the variety of life we see on earth. As the director of the Discovery Institute’s Center for Science and Culture, Meyer is a strong advocate for Intelligent Design. “Our uniform experience of cause and effect shows that intelligent design is the only known cause of the origin of large amounts of functionally specified digital information. It follows that the great infusion of such information in the Cambrian explosion points decisively to an intelligent cause.” His book makes a strong case, although the appeal to non-materialistic explanations will always turn off some readers. As proof of the struggle, Meyer quotes Scott Todd, a biologist writing in Nature: “Even if all the data point to an intelligent designer, such a hypothesis is excluded from science because it is not naturalistic.”

So here you have the conundrum. Despite what you read in the news, the science surrounding biological evolution is still somewhat fluid, in part due to the issues raised in Darwin’s Doubt. Yet for those who have reasonable struggles with the evolution-is-settled narrative, it’s hard to shake the baggage carried by those on the religious fringe who insist that any non-literal interpretation of the creation story is blasphemy. Another book on my reading list is Mind and Cosmos, by Thomas Nagel, an atheist who rejects the standard neo-Darwinian view for some of the same reasons expressed by Meyer. (Anthony Flew, once known as “the world’s most notorious atheist,” later came to believe in a deity because of similar issues surrounding evolution.) A book like Darwin’s Doubt requires vigorous analysis to confirm that it is scientifically sound. But if it is, then the neo-Darwinian evolution model itself might turn out to be little more than an intelligently designed idea.

This article was posted on September 17, 2014. Related articles: Biology, Other Books.

Footnotes for “Review: Darwin’s Doubt”

  1. Have you seen the Ben Stein documentary “Expelled”? Not to start a debate here (I honestly think both theories carry truth), but it’s probably the best example of everything that’s wrong with Creationism…mainly that it’s usually presented all wrong. I’d be curious to read a book that presents creationism with a little more…science…as it seems Meyer has done.

  2. I did see Expelled a while back. If I remember correctly, it’s not so much about the science behind intelligent design as it is about the difficulty intelligent design advocates feel they have in getting their message out. Stein can be both jocular and heavy-handed, perhaps dual tributes to his background both in Ferris Bueller’s Day Off and as a speechwriter for Nixon.

    Meyer’s book is pretty much all science, all the time. There are only passing references to religion, about the same as you might find in any general-audience science book that communicates the pro-evolution stance. I’m a layman when it comes to biology and chemistry, but his argument did have the feel of rigor and competency.

    More eye-opening than Meyer’s analysis was his content documenting various experts who openly discuss the scientific complexities and roadblocks present in origins-of-life research. High school textbooks give the basic common-descent narrative with finch-beak examples, and it all feels very settled and boring. But if you access the journal articles referenced by Meyer, there’s a whole world of scientific debate covering many aspects of the evolution story. It was quite fascinating to read peer-reviewed content from those who promote evolution, and yet still understand that they need to grapple with the problems evolution brings up.

  3. I’d like to see you challenge yourself to think critically about this work.

  4. In addition to his statement above, Carl sent me some links offline (here and here) that attempt to correct misconceptions held by those who reject evolution. He also warned me to avoid books that use myth and outdated research as their core forms of support.

    This is part of the difficulty in talking about books like Darwin’s Doubt. Those like Meyer who propose ideas that challenge the Darwinian model are assumed to be wacko religious knuckle-draggers. Often, that assumption is correct, but not always, and the assumption by itself is not proof that the ideas are faulty. Sites like the ones listed by Carl attempt to correct the religious nuts, but they seldom address those who earnestly and seriously grapple with the incomplete scientific and statistical detail offered by modern evolutionary research.

    Within the scientific community, there is near unanimous acceptance of neo-Darwinian evolution, even among scientists who express a belief in God. However, there is not unanimous agreement about specific components or processes associated with the evolution narrative. This should come as no surprise, since science is a constant battle to understand and expand gaps in theoretical knowledge. Yet I am regularly surprised that the public presentation of evolution brings with it the false statement that there are no known issues with the theory of evolution. There are real, complex issues, and it is these issues that Darwin’s Doubt addresses. Meyer’s conclusions are a valid target of debate. But anyone who claims that evolution is scientifically problem-free is avoiding their own need to challenge ideas.

    In response to another of Carl’s challenges, Darwin’s Doubt does not depend on myth–it doesn’t discuss religion at all beyond providing common responses to critics–and it generally sticks with research from the most recent two decades, except when it is specifically quoting content from Darwin’s era.

  5. From a subsequent link within your first link in #4:

    “Evolution myths: Evolution violates the second law of thermodynamics

    The second law of thermodynamics states that entropy, a measure of randomness, cannot decrease in an isolated system. Our planet is not an isolated system.”


    It is very true that we are not in an isolated system. However, it is a huge leap to say that this in itself is sufficient for evolution to occur.

    Energy from outside a system can be used to increase complexity, true. But only if there is a mechanism to capture and use that energy. For example, a quart of gasoline can power some aspect of a manufacturing process. And the same quart in a Molotov cocktail can destroy the factory. The ways that energy can increase disorganization far outnumber the ways it can increase organization.

    And how can a system that uses energy to increase complexity come into existence in the absence of a system to capture and use energy in that way? Pulling yourself up by your bootstraps just doesn’t work.
