Still the Best Hope

America is a uniquely blessed nation. In terms of natural resources, cultural diversity, and basic liberty, the United States has experienced a short yet rich life unparalleled by any other country throughout history. Many citizens see these blessings as something to share with other peoples, making real the “City on a Hill” first mentioned in 1630 by Puritan John Winthrop, one of the earliest orators to comment on America’s influential role.

Unfortunately, there are American citizens who have a difficult time seeing these blessings, and others who, though they may recognize them, are unable to articulate why it is that we find ourselves with such benefits. To address these groups, political commentator Dennis Prager penned Still the Best Hope: Why the World Needs American Values to Triumph. Published in 2012, the book documents America's blessings by defining its core values and comparing those values to competing worldviews.

The majority of the book contrasts the American value system with its two modern competitors: Leftism and Islamism. Prager also mentions in passing China's Confucian-influenced Communist system as a possible fourth major worldview, but does not discuss it in detail because it currently lacks worldwide appeal, generally limited as it is to Chinese domestic society.

For Prager, Leftism is not a tired relic of a failed Soviet Union, but a living movement that is voracious in its appetite for the control of people’s lives and hearts. It is old-time Communism, gussied up in a form palatable to American tastes. It is also a religion of sorts, with communal utopia as its heaven, material inequality as its sin, taxes as its offerings, and the state as its god. Leftism believes that all people are basically good, and that external forces—specifically the economic disparities identified by Karl Marx—make them bad and destroy their lives. To restore goodness and fairness, Leftists must—to quote Barack Obama, an exemplar of Leftism from Prager’s perspective—do the work of “fundamentally transforming the United States of America.” (Quoted from an October 2008 campaign speech in Columbia, Missouri.) To accomplish this, Leftism must displace American values with its own values, by influence and legislation if possible, by force if necessary. As a warning against this latter Marx-approved method, Prager chronicles the moral failures of Leftism throughout the Twentieth Century, including its record of nearly 100 million deaths.

Prager moves on to Islamism, defining it as “holding the belief that not only should all mankind be converted to Islam, but that all Muslim societies be governed by Sharia.” He distinguishes this view from more mainstream forms of Islam. He also says that Islamism does not require, by definition, the use of violence or terrorism. Despite these qualifications, he does include a brief overview of Muslim history, with its sword-based expansion during the Middle Ages and its scriptural support for the subjugation of non-Muslims, all as a warning about Islam’s tendency to see its worldview as something to be imposed on all nations. As with Leftism, force has been an option in the spreading of the Islamic worldview, and the author documents such incidents, both long ago and in our era.

In the final section of the book, Prager formally defines the American value system by invoking the three phrases found on all United States coins: “Liberty,” “In God We Trust,” and “E Pluribus Unum.” Liberty encompasses the freedoms of political, religious, and economic activities, assembly, speech, the press, and above all, freedom from an oppressive state. For the author, this requires a smaller government footprint, since larger governments historically have a tendency to abuse the power entrusted to them.

To reduce the temptation to anarchy that comes with great freedom, the author sees “In God We Trust” as a must. This is not a demand for a Christian nation—Prager is a Jew—but for the core of Judeo-Christian values, what he calls “ethical monotheism.” People are not good by nature, and morality must be inculcated, either by God or by the state. Prager recommends God.

Finally, he invokes “E Pluribus Unum” (“out of many, one”) as the third part of the “American Trinity” of values. As a melting pot, America promotes nationalism over racism—despite some missteps—with a rule that anyone of any background can be an equal citizen. People are accepted because they have value as people, and not because they are of a specific bloodline.

Prager is a master at breaking a worldview down into its most minute components, and then comparing those components across the spectrum of competing systems. He does this in a way that speaks to the common man instead of to Ivy League philosophy types. The book uses numerous examples—too many, actually—to bolster its points, and covers all of the modern hot-button topics, including terrorism, homophobia, global warming, education, mass media, and homelessness, as well as less prominent issues such as swine flu, foul language, and anorexia.

In Still the Best Hope, the goal is liberty, a liberty that can only be realized by those who believe in an objective God, and who put shared values before race. Leftists and Islamists hate such American liberty because it acts as a brake on their forward momentum. It offers freedom of thought and choice over dogmatic religious laws and capricious power grabs by self-appointed saviors. Prager calls American values the “Best Hope” for the future, but warns readers that Leftism and Islamism will work hard to keep that hope from being realized.

This article was posted on April 7, 2014. Related articles: Other Books.


MS-DOS

Back in the mid-1990s, I attended an event on the Microsoft campus, and happened to overhear a conversation between two engineers who worked on the company’s popular Excel product. The more experienced programmer was offering advice and guidance to a new recruit, and part of the discussion included a warning not to modify certain portions of the internal Excel source code “because nobody knows how those parts work.” Having spent a lot of quality time on customer projects grappling with some long-gone programmer’s attempt at source code, I didn’t doubt the gist of the conversation. But I had no ability to examine any of the Microsoft Office source code to see for myself if there were in fact portions beyond the comprehension of mere mortals.

Until now, that is. This week, Microsoft announced the public release of the source code for an early version of its Word for Windows application, as well as for two initial releases of its MS-DOS operating system. Microsoft donated the code for Word version 1.1a (from 1990), and for MS-DOS versions 1.1 (from 1982) and 2.0 (from 1983), to the Computer History Museum (press release). The files are posted on the Museum’s web site, and either MS-DOS or Word can be yours for the price of a simple click on a license agreement button.

Be aware: the code is really, really old, at least in terms of geologic software time. Yet those were simpler times. The MS-DOS 1.1 release fit into just 12K of memory—try that with Windows 8.1—a feat accomplished by coding directly in assembly language. It’s compact and terse, a world away from today’s bloatware. By the 2.0 edition (included in the same download), the size of the source had grown more than tenfold.

The Word for Windows code is mostly a C-language endeavor. Programming in managed languages and client- and server-side script has dulled me to the former harsh realities of Windows development. I had forgotten that world of message loops, obsessive-compulsive memory management, and the need to draw almost everything on the screen by yourself, including the blinking cursor.

If assembly language and non-OOP C code aren’t your style, you might also consider Microsoft’s .NET Framework Reference Source web site. For years, Microsoft has made much of its .NET Framework source code available to developers as a debugging aid, but that access was limited to the Visual Studio environment or a brute-force download of the entire library. A few weeks ago, Microsoft simplified access to the source by webifying it. The updated Reference Source web site lets you browse through Framework source files, a mixture of C# and Visual Basic content.

Beyond these Microsoft sources, the Computer History Museum also offers glimpses of computer code for other historic systems, including Apple II DOS, MacPaint, and an early release of Adobe Photoshop. If you have any geek blood in you at all, take a moment to pull back the veil from these programs.

This article was posted on March 29, 2014. Related articles: Technology.


Remembering the Kanji

Any journey through an East Asian language begins with a single brushstroke. That one line begets others, which eventually become a jumble of seemingly indecipherable marks on a page. When I first started my Japanese studies more than two decades ago, I depended on rote memorization for these complex characters, the kanji. I learned to reproduce each symbol by feel rather than thought, a failed strategy, as it turned out.

Then I discovered James Heisig and his landmark book Remembering the Kanji. This nearly forty-year-old work brought order to what had been until then a losing game of pickup sticks. Heisig’s method involves telling stories, using one or two keywords for each character as the starting point for the imagination. For some of the earliest and simplest kanji, there’s not much of a story to tell. The character for “five,” 五, gets little more than the “just remember it already” treatment. But once you get a few glyphs under your belt, Heisig starts to combine them and weave tales that bring the kanji to life. “Bright” (明) is built from “day/sun” (日) and “month/moon” (月) components, and his story speaks of “nature’s bright lights,” the sun and the moon, being the source of not only “bright” light, but “bright” inspiration for poets and thinkers.

After 500 or so characters, Heisig lets you take control of story creation for most of the remaining 1,700 characters. Stories that are out of the ordinary or are tied to personal experiences make for the best memories. My story for “post” (職, made from the character for “ear”/耳 and a “primitive” invented by Heisig that means “kazoo”) references Star Wars: A New Hope, when Luke comes out of the Millennium Falcon wearing a stormtrooper’s uniform and a supervisor asks him through the kazoo-quality transmitter, “Why aren’t you at your post?” There’s no rule in Heisig’s book about not leaning on George Lucas’ imagination for your stories.

I first discovered RTK more than fifteen years ago when a friend gave me his lightly used First Edition. I began in earnest, yet even with the great focus on stories, the need to come up with those narratives for hundreds of characters was daunting, and I aborted the process twice. Last year, I decided I would push through. As incentive, I purchased the latest Sixth Edition, a major update that adds the Japanese government’s newest additions to the official list of joyo kanji that all schoolchildren must learn. At about ten new characters/stories per day, I was able to complete the entire book in nine months.

I wouldn’t have made it without the support of an online community of Japanese-learners who cling to the Heisig method. Reviewing the Kanji is a place for mutual encouragement, and also a repository of user-submitted stories for every kanji character in the book. More than half of the stories I hand-wrote in my textbook sprang from the imaginations of other Heisig readers.

Remembering the Kanji is a fantastic resource, but not everyone finds it useful. The focus of the book is found in its subtitle: “How not to forget the meaning and writing of Japanese Characters.” It’s this focus that turns some language learners off. Heisig will not teach you how to pronounce any of the kanji, nor introduce you to meanings or combinations beyond a single, sometimes misleading keyword for each character. It’s all about writing. And yet, this focus on reproduction-from-keyword does bring reason and order to the Sino-Japanese characters, opening a world of language that includes writing, reading, and speaking Japanese.

This article was posted on January 23, 2014. Related articles: Other Books.


Given the option, government tends to increase its authority and power rather than reduce or distribute that same power. This reality is a major reason why the original American states insisted that the Constitution be ratified only with the addition of the Tenth Amendment: “The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.” This limitation was easier to visualize when each state desired to retain some of its independence from the others. When the states started to merge into one large unified bloc, that desire for local authority started to slip away.

The federal government does its best to incorporate into itself various “powers not delegated to the United States,” the National Security Agency’s PRISM system among the most recent. The founders knew from their European experience that this coalescence of power was always a risk. What’s surprising today is not that government continues to seek more central power, but that the American populace, from urban centers to the most rural community, has forgotten the key reasons why the states insisted on the Tenth Amendment in the first place, and how their regional diversity played a key role.

The states—the people—used to have more authority in how they ran their jurisdictions. For example, nine of the thirteen original colonies had official religions, and at the time of the Constitution’s implementation in 1789, at least three states—New Hampshire, Massachusetts, and Connecticut—still funneled tax revenue to approved churches. Massachusetts was the last state to fully disestablish the church in 1833, although some local governments, especially in Maryland, were still requiring political candidates to submit to religious oaths as late as 1961.

My point isn’t that states should have official religions. Even during the Revolutionary War, the colonies were already starting to shed their religious components, and the Fourteenth Amendment, especially as interpreted in the 1920s via the Incorporation doctrine, finally made that practice little more than an anecdote of American history. But these changes in religious practice are just one example of how state and local jurisdictions used to employ self-governance to distinguish themselves within the national Melting Pot. A century and a half ago, states asserted their sovereign authority to the point of civil war. Today, states are slapped down or ridiculed in the public press for even suggesting that they act in a way that is at variance with the federal standard, and in many areas of legal import, they don’t even bother grappling with the state-federal divide.

Humans and organizations both seem to gravitate naturally toward uniformity. Back in the 1980s and 1990s, Apple promoted the ideas of independence and freethinking through its 1984 and Think Different advertising campaigns. Yet here we are in the twenty-first century, and it’s the iPhone rather than its Android or Windows Phone counterparts that now embodies the concept of uniformity, and Apple actively prevents both its users and partners from straying too far from its standard of what a computing life should be.

In Leviathan, Thomas Hobbes described the process by which the members of a commonwealth voluntarily transferred some of their authority to the sovereign, a personification of either an individual ruler or of a collection of national leaders and government structures. In my earlier review of that work, I summarized both the benefits and the risks of such concentrated authority.

Since the commonwealth—Leviathan—is a creature, it has rights just like man does, although these rights are passed on from those in the commonwealth. These rights culminate in a strong sovereign who acts on behalf of the citizens to define laws, enforce contracts, mediate justice, issue rewards and punishments, and carry out peace and war. Because of the need to enforce the rules of society, the power of the sovereign is absolute; he can do no wrong and he cannot be corrected or arrested. He has the power of life and death over his citizens, although he cannot take away an inalienable right from any citizen.

The American founders understood both the need for and the dangers of this Leviathan, and established limits—including the Tenth Amendment—to restrain this “power of life and death over his citizens.” As a nation, unity is a must, especially when dealing with external threats. But internal variations are equally important. Maintaining this balance is messy, and much of that messiness has always stemmed from each of the states acting in ways that were different and even offensive to the others. But there is nothing wrong with getting a little messy. It’s when we try to keep things unified, when we pay lip service to the idea of checks and balances between the federal and state levels, and when we assume that Leviathan acts only in our best interests, that we begin to lose the civic ideals of independence and variety that the individual colonies fought for.

[Image credits: elitedaily.com]

This article was posted on July 29, 2013. Related articles: Commentary.


Let’s face it: Obama has blown it. Despite his reasonable desire to invoke the power of government to resolve the national economic situation, his domestic policy decisions have had little to no positive impact on the jobs outlook. Some acts, like the changes in the health insurance industry brought about by Obamacare, may have prolonged the pain. He’s being flogged publicly in the wake of an IRS scandal, and an American citizen sitting in an airport in Moscow is showing the world that the current president’s rhetoric about the overreaching powers of the Bush era was just a way to deflect scrutiny from his own overreaching programs.

I’m not surprised at his failures, since Obama is a lot like Bush before him. George W. Bush ruined any post-9/11 goodwill he built up by poorly managing a poorly explained war, and by engaging in spending behaviors far out of line with the sensibilities of most on the Republican side of the political spectrum. It’s kind of like Bill Clinton before him, whose political acumen meant nothing whenever a woman caught his eye. Or like Bush-41, who broke his number one promise on taxes. Or like Reagan, whose adroit handling of the massive Soviet Union was almost forgotten because he made a mess of a much smaller situation in Nicaragua. Or like Carter before him, who thought that symbolic acts with the National Christmas Tree could overcome his wrong decisions on energy and the economy, or atone for his bungling of the hostage situation in Iran. And don’t get me started on Richard Nixon, Warren Harding, or Zachary Taylor.

The president is our Commander in Chief, the head of America’s First Family, and the public face of the United States to a complex world. Each election season, we vote for a candidate not primarily because of the policies he or she espouses—though we should—but because we desire to put a hero in office. Hope and Change, the terms invoked by Obama during his first run for president, may as well have been Truth, Justice, and the American Way.

We like our presidents to be heroes, and in the majority of cases, a shiny new president seems to be the champion we were looking for. Most of those who have risen to the highest elected office in the land did so because they possessed a driving vision, a history of successful executive or legislative management, or obvious powers of persuasion, and were able to demonstrate those superpowers to an entire nation. Television, with its urgency and its red, white, and blue backgrounds, enhances the aura of strength and victory.

But it is all a façade. Whatever skills these presidents have in managing large bureaucratic hierarchies or speaking publicly on topics of national concern, they are fallible, imperfect, broken. Whether you believe in original sin or not, it’s clear that humans can’t maintain the image of perfection for very long. Something always goes wrong, and when it goes wrong with a hero, it can be demoralizing to the hero worshipers.

America’s founders understood the importance of having a president rather than a hero. The framers of the Constitution knew that they must enforce limits and checks on the power of the executive office, even when held by someone as collectively respected and honored as George Washington—especially because it would be held by a hero like Washington.

In our image-centric culture, we tend to extend one heroic act to the entire life of a person. But this is a mistake. People will disappoint, sometimes at the worst possible moment. That’s why we need clearly defined limits and boundaries on the power of the political branches. When we shrug off NSA-initiated privacy concerns or IRS abuses, we do so because we look to the heroism of the Oval Office. We should instead look to the humanity of the person in it.

This article was posted on July 15, 2013. Related articles: Commentary, US Presidents.
