At a small private school in Seattle, a group of mothers scrounged up $3,000 to purchase computer equipment for the students. The school didn’t really have any computers before that, so the gift was a great opportunity for the kids, especially the young geeks who did what geeks tend to do when confronted with technology. It sounds like a heartfelt story you might hear anywhere in America. But this tale takes place at a school named Lakeside in 1968, and the geeks included future Microsoft founders Bill Gates and Paul Allen.

Malcolm Gladwell, in his book Outliers, uses the introduction of computers at Lakeside to demonstrate one of the key features of those who, like Gates and Allen, excel far beyond the bulk of humanity: opportunity. The countless hours that Allen and Gates devoted to their beloved mainframes (an example of the book’s “10,000-Hour Rule”) helped guide their careers. But Gladwell puts the focus instead on the mothers who purchased the computer equipment, a chance opportunity that gave Microsoft’s pioneers a key advantage. For the outliers discussed in the book, “success arises out of the steady accumulation of advantages: where and when you are born, what your parents did for a living, and what the circumstances of your upbringing were.”

Gladwell is a gifted storyteller, and he is adept at finding just the right anecdote or statistic to move his points forward. But as I read the success stories fleshed out in his text, I found myself unsure if he was drawing the correct conclusions. Bill Gates is a perfect example. He was certainly provided with great advantages, growing up in a time when computing was about to make the transition from business to personal, and coming from a family with the intellectual and financial means to put him in places of opportunity. But he was also a genius of sorts, an aspect of outlier success that Gladwell downplays. To make the anti-genius theme clearer, Gladwell relates the woeful tale of a genius named Christopher Langan who experienced one opportunity setback after another, despite his high intelligence.

Outliers looks to the “web of advantages and inheritances” that great people experience. In the book, the forces that birth outliers are external rather than innate. It was this emphasis that I found lacking. Bill Gates wasn’t the only youth in 1968 to give his every waking hour to computing. But he was one among only a tiny handful of these students to become someone on the level of, well, of Bill Gates. There was something more than opportunity, something more than heritage, that made his success possible, and Gladwell discounts it.

Despite this oversight, the book is still a great read. I found the chapter that discusses Korean Air’s transformation from an accident-prone carrier into one with one of the best safety records in the industry simply fascinating. Gladwell ends the book with a story of his own family life, hinting not so subtly that he himself is an outlier, and that you, dear reader, might be as well.

This article was posted on June 25, 2014. Related articles: Other Books.

Leave a Footnote

 

Following a recent exchange of five Guantanamo Bay detainees for one American Army soldier held for five years by the Taliban in Afghanistan, the United States has run dangerously low on enemy combatants available for barter purposes. In the years following the terrorist attacks on the World Trade Center, the United States found itself flush with nearly 800 ready prisoners amid a shortage of hostage situations. But the numbers have fallen in recent years. With last month’s transfer, the total now hovers at just 149 human bargaining chips.

Congressional Republicans took advantage of the ominous news to lambast President Obama, calling him out for his “frivolous attacks and spend policy” of dealing with captured insurgents. “We set up the Guantanamo Bay Prisoner Trust Fund for a reason,” said House Majority Leader Eric Cantor. “You can’t just hand out these terrorists five or six at a time. It’s typical left-wing wastefulness.”

The low prison camp numbers come at an especially dangerous time in American international relations. Just days after the release of Taliban prisoner Bowe Bergdahl, North Korea announced that it had arrested an American tourist for unspecified and likely sucky reasons. “We do not negotiate with terrorists,” said departing White House Press Secretary Jay Carney. “But we can negotiate with the Qataris if Kim Jong-un wants to go through them.”

Evil and formerly evil nations like North Korea and Afghanistan may find it hard to trade in their hostages for increasingly valuable Guantanamo Bay ne’er-do-wells. With terrorist supplies at historic lows, the United States may be forced to dip into its National Domestic Prisoner Reserves for the first time since the 1970s. Fortunately, the flagging economy and a spate of high-profile GM vehicle accidents have reduced the need for license plates, “so the prisoners are available for release anytime,” said Secretary of Wardens Bill Westermont.

However, homegrown murderers and white-collar criminals might not be good enough for America’s enemies. The swap of five Taliban prisoners for one American citizen has set a de facto price, and even with an ample supply of American inmates, foreign terror organizations might not want them. Hamid Kahm-Jones, a Taliban mucky muck living on the run in eastern Afghanistan, explained the problem. “We very much appreciate the United States’ attempt to provide homegrown criminals in exchange for our kidnap victims. But let’s face it: American prisoners are soft, with their three square meals per day, their easy access to cable television—including the Food Network—and washing machines. It’s pathetic. Death to America, to Comcast, and to Maytag!”

[Image Credits: Wikimedia Commons]

Humorality

This article was posted on June 11, 2014. Related articles: Humor.

Footnotes for “Guantanamo Swappable Prisoner Supply Reaches Historic Low”

  1. Good news: we still have 149 detainees at GitMo, and they have NONE of ours.

    Bad news: (according to right-wing talking heads) the terrorists will now be kidnapping American soldiers to ransom for hostages. This assumes that they’ve been nice to American soldiers (or blowing them up with IEDs) and thus missing opportunities to capture and trade them.


 

At its Worldwide Developers Conference (WWDC) earlier this week, Apple announced the latest iOS 8 and Mac OS X “Yosemite” operating systems. Apple fanboys will appreciate the attention to pixels included in this latest revision. And after all the glitz of the OS updates, the keynote speakers followed up with a new product most Apple users will never even notice: Swift. Swift is a new programming language, crafted to meet the needs of iOS and Mac OS app developers. In the vernacular of software developers: It’s about (reader.Age >= 18 ? GetNextCurseWord() : "") time!

Anyone who is serious about creating software for Apple platforms has had to contend with Objective-C. It’s a powerful language, in much the same way that a weapons factory is powerful. You can build something really awesome, but first you have to get access to basic components and crude raw materials, engineer a complete assembly-line process with little room for error, and fill out the requisite government forms. Swift, it seems, is out to change that.

A quick glance at some sample Swift code shows it to be an amalgam of different language sources. It has curly braces and other syntactic trappings common to most C-like languages; a love for data manipulation found in JavaScript and other web-centric scripting tools; generics, inferred strong typing, extension methods, and other modern conveniences from platforms like Microsoft’s .NET Framework; and just a hint of excess baggage from Smalltalk. And the most surprising progenitor is found in the very first language Apple ever took seriously: BASIC, especially its ability to process code logic outside the context of a full application. Showing off the immediacy of the language’s always-doing-something compiler was a major selling point during the keynote demo.
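A few lines of Swift show several of these borrowings at once. The snippet below is a hedged sketch written in current Swift syntax (which has drifted slightly from the 2014 beta shown at WWDC); the names `largest` and `shouted` are purely illustrative, not part of any Apple API:

```swift
// Inferred strong typing: no type annotations, yet everything is checked.
let languages = ["Objective-C", "Swift", "BASIC"]
let count = languages.count          // inferred as Int

// Generics: one function works for any Comparable element type.
func largest<T: Comparable>(_ items: [T]) -> T? {
    return items.max()
}

// Extension methods: bolt new behavior onto an existing type.
extension String {
    func shouted() -> String {
        return self.uppercased() + "!"
    }
}

print(largest([3, 1, 4, 1, 5]) ?? 0)   // 5
print("swift".shouted())               // SWIFT!
```

In an Xcode playground, each of these lines is evaluated as you type it, which is exactly the “always-doing-something” immediacy the keynote demo leaned on.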

Apple isn’t the only company to craft a new language. Late last year, Google put the finishing touches on Dart, its self-proclaimed JavaScript-killer. Microsoft, already the proud owner of two popular coding systems (C# and Visual Basic), continues work on the open-source TypeScript language, a strongly typed superset of JavaScript. And who can forget Mozilla’s own unfortunately named Rust language? Of these software newcomers, TypeScript and Swift will likely gain the most traction, since they both exist to address widespread difficulties in two very popular development languages, but in ways that leave current investments in legacy source code intact.

A prerelease edition of Swift is available right now as part of Apple’s Xcode 6 beta release. You can also download a free ebook edition of the Swift language guide.

This article was posted on June 4, 2014. Related articles: Technology.


 

I’m not really into military history. Whether they’re covering the Vietnam War or the battle-of-the-week in the Middle East, such books can’t help going on and on about how these five battalions defeated these other three divisions. Or is it four squadrons? There are only so many times you can read about a general sending someone off to reconnoiter, or successfully outflanking the bad guys. I doubt I’ve ever met anyone in real life who’s flanked.

So I was a little unsure about reading Operation Mincemeat, Ben MacIntyre’s book about a secret British military plan executed smack dab in the middle of World War II. Fortunately, it’s a great wartime read, in part because it’s not really about war at all. In your typical military book, there’s page after page about how specific operations resulted in enemy deaths. That’s where MacIntyre’s book differs: the main character is dead before the operation even begins.

In January 1943, a thirty-four-year-old impoverished Welshman—in and out of homelessness, in and out of mental illness—accidentally or intentionally ingested rat poison and died. His name was Glyndwr Michael, but Adolf Hitler knew him as Major William Martin of the British Royal Navy. Between the time of Michael’s death and Martin’s impact on the Third Reich, he managed to earn a military rank, a sexy girlfriend named Pam, a clandestine submarine trip to within dead-guy floating distance of the Spanish coast, and a briefcase loaded with forged personal letters and erroneous top secret military plans that proposed an Allied invasion of Greece instead of the true Axis-fortified target of Sicily. The Germans, upon having the documents virtually forced into their hands, fell for the deception, altered their plans accordingly, and turned the tide of the war.

Sun Tzu advocated the use of deceit long ago. “All warfare,” he insisted, “is based on deception.” Here, the deceit is explained in fascinating detail. It’s not surprising that Great Britain included subterfuge in its offensive arsenal. The shocking part—something that comes out clearly in the book’s progressive reveal—is that the Germans were begging to be misled in order to confirm, despite recent military setbacks, that they were destined for a millennial victory. Even when the Allies attacked Sicily in force instead of Greece—even as Mussolini implored the Fuhrer to come to his rescue—the Germans kept ignoring Sun Tzu. Two weeks after the Sicily landing, the Italian dictator was out of power thanks to the ploy, and the Axis was on its way to sure defeat.

MacIntyre relates a tale worthy of a James Bond novel, in part because elements of the plan to use a dead body to trick the Nazis sprang from the mind of Bond author Ian Fleming. Extensively researched using declassified military files and interviews with those involved, Operation Mincemeat is hands down the most enjoyable non-military military-history book I’ve ever read.

This article was posted on May 28, 2014. Related articles: Other Books.

Footnotes for “Review: Operation Mincemeat”

  1. A new book: “The Mystery of William Martin”

    http://www.youtube.com/watch?v=NmQNAnXxMvo

    In English:

    http://www.youtube.com/watch?v=hMUZrStsiv0


 

This month marks the fiftieth anniversary of the BASIC computer programming language. In the early hours of May 1, 1964, Dartmouth professors John Kemeny and Thomas Kurtz activated the new language. Although only a handful of college students had access to that first trial, the vision of Kemeny and Kurtz was nothing short of revolutionary.

BASIC wasn’t the first English-like language. Fortran had been around since 1957. It wasn’t a fun language by any measure, but you didn’t have to be a rocket scientist to use it. You just had to work for one. In this way, BASIC wasn’t too different, despite its promise of being “easy to learn” and “a stepping-stone for students who may later wish to learn one of the standard languages.” You still needed access to a computer, and in the 1960s that typically required engineering smarts. Not much of a coup.

But it was. Kemeny and Kurtz didn’t usher in an era where every kid would write software. But by lowering the entry requirements for programming, they advocated for a world where anyone could control computer resources. Getting access to the first BASIC system required acceptance into the Dartmouth engineering program. Today, you only need to flop down on your sofa, grab your iPad, and start poking at it with your fingers.

To find out more about one of the most popular software development language families used by businesses today, visit the BASIC fiftieth anniversary site at Dartmouth.

[Image Credits: Geology professor Robert Reynolds, at right, and chemistry professor Roger Soderberg develop a computing component for their course work. (Photo by Adrian N. Bouchard/courtesy of Rauner Special Collections Library, Dartmouth College)]

This article was posted on May 14, 2014. Related articles: Technology.
