ASEE PRISM Magazine

By Steve Lohr
Basic Books. 250 pp. $27.50

Exploring the time before geeks began ruling our lives, a recent book examines the early days of the information age.

Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts—The Programmers Who Created the Software Revolution

Alexander Graham Bell. Thomas Edison. Everyone knows the names of the inventors responsible for the myriad innovations—from the telephone to the light bulb—that transformed civilization and propelled humankind giant steps closer to the next century. But beyond Bill Gates and Steve Jobs, even the most gung-ho technophiles don't know the names of the men and women who created programming software, which has revolutionized everything from education, entertainment, and shopping to manufacturing, banking, and health care. “It's a field that is almost uniquely ignorant of its own history,” says Steve Lohr, a technology writer for the New York Times.

That's why Lohr has written a book profiling the people behind the bits, bytes, clicks, and codes that define the information age. With a mouthful of a title, “Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts—The Programmers Who Created the Software Revolution,” the book lays out the evolution of the major programming languages that have taken software from room-sized computers that demanded a cadre of highly trained programmers to run to ever-shrinking PCs that ordinary Joes and Josephines with nary a computer skill can operate with the simple click of a mouse.

History buffs and die-hard geeks aren't the only readers who will appreciate the insights in “Go To.” Learning about software's origins can also help engineering educators—and students. As Lohr explains, advances in software tend to be incremental, so being well-versed in software lore can provide an edge. What's more, in painting portraits of the various and sundry individuals responsible for breakthrough software, Lohr helps pinpoint the attributes of a good programmer in case the undeclared are looking for guidance. “The best programmers have the mental muscles for both conceptual thinking and procedural detail and the uncanny ability to shift back and forth effortlessly, from high-level design to close-to-the-machine implementation,” says Lohr in the book. “And for the people who are good at it, using those mental muscles is not only fun but oddly compulsive.”

Some of the movers and shakers Lohr profiles include Jean Sammet, a Sperry Gyroscope Company employee who studied math at the University of Illinois. Sammet was part of the team that dreamed up COBOL. “To her, math was an elevated intellectual pursuit, while engineering was the grubby work of machinists—a fairly prevalent attitude among those studying math theory,” says Lohr. She soon changed her tune, earning a place in history.

Ken Thompson and Dennis Ritchie, who created the UNIX operating system in 1969, owe the success of their endeavor to the joy of tinkering around to find the right tool without the pressures of commercialism. “We didn't have the mentality,” says Thompson. “We did it for ourselves. We were arrogant czars in that sense.” And the turning point that led to UNIX? When Bell Labs scientist Doug McIlroy suggested connecting programs the way one connects segments of garden hose. In other words, “screw in another segment when it becomes necessary to massage data in another way.”
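McIlroy's garden-hose idea survives in every Unix shell today as the pipe. As a minimal sketch (the word-counting task here is my own illustration, not an example from the book), each small program does one job, and the pipe character screws the segments together whenever the data needs massaging in another way:

```shell
# Hypothetical pipeline: find the most frequent words in some text.
# Each stage is a self-contained "hose segment" joined by | .
printf 'to be or not to be\n' |
  tr ' ' '\n' |   # split into one word per line
  sort |          # group identical words together
  uniq -c |       # count each run of identical words
  sort -rn |      # largest counts first
  head -3         # keep only the top three
```

Adding another transformation later means screwing in one more segment, with no change to the programs already in the chain.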

Perhaps one of the most interesting chapters in “Go To” relays the birth of BASIC. In the early ‘60s, two Dartmouth College professors, Thomas Kurtz and John Kemeny, realized that computers were the future—yet only a quarter of Dartmouth's student body majored in science or engineering, “the group most likely to be interested in computing.” Kurtz and Kemeny felt that didn't jibe with the decision-makers of business and politics, who typically came from the non-techie contingent. In order to teach the liberal arts students computer know-how (remember, they weren't bred on computers like today's youngsters are), the two men designed BASIC. It was a tool for the beginner, easily understood, with no hardware knowledge required. Its commands were simple: LET, READ, DATA, INPUT, GOTO, IF THEN, etc. In the fall of 1964, the two pioneers taught their first semester, which led to some “basic” modifications and streamlining in the second semester. “We had underestimated how badly our students type.”

Lohr devotes a good chunk in his book to Charles Simonyi, the young Hungarian computer whiz who designed Bravo with colleague Butler Lampson at Xerox PARC in the ‘70s. Simonyi eventually teamed up with Bill Gates at Microsoft to spearhead the technical development of Word—one of the world's most popular software programs. Says Lohr: “Simonyi's effort on Bravo alone was a significant contribution to the field—one that would eventually affect how millions of people create documents on computers, from business memos to novels.”

What's so fascinating about Simonyi, other than his penchant for showing up at nocturnal debugging sessions in a “debugging suit” of a black net shirt and translucent, skin-tight black pants, is that he is still perfecting software, still looking for a “Big Bang-style breakthrough in programming productivity.” Specifically, Simonyi wants to devise a system to “free the human intelligence of the developer” from the “walled cities of individual computer language.”

Kinder, Gentler Approach

Important to mention in any book about software is Andy Hertzfeld, who was hired by Apple in 1979 and later programmed the guts of the vaunted Macintosh, “the machine that brought point-and-click computing” to the masses, says Lohr. The secret to the Mac's success was its friendly face, and Lohr credits Hertzfeld and his team for recognizing that PCs would never become household appliances until the touchy-feely aspects of software were mastered. “It's really fun to be in the middle of the technical, precise, and objective computer,” Hertzfeld tells Lohr in the book, “and the fuzzy, emotional, subjective human being. I've always loved art, especially literature and music, and I think the human element is what can elevate engineering to the realm of art.”

Perhaps nothing on the information-age landscape has had as much impact on commerce and industry as the World Wide Web. That is largely the work of Tim Berners-Lee, a British physicist and the father of the URL, HTTP, and HTML. How he got there, he says in the book, is no mystery: He based the Web on a program he wrote for storing his own personal information, perfecting the addressing, linking, and transferring as the years went by.

After the debut of the World Wide Web, it's hard to fathom another software milestone. But it was only a matter of time until Java made headlines. Created by James Gosling, Java, says Lohr, is the FORTRAN of the Internet age. Indeed, Lohr believes Java is the model of incremental development, the essence of any breakthrough software. “All innovation is incremental in that it builds on top of previous knowledge. Yet a creative insight is required to assemble the building blocks of previous knowledge in new ways. Java contains that spark of fresh, organizing insight.” No wonder Gosling is the man behind the magic. He had a knack for repairing things as a young boy, from a decrepit baler in his grandfather's yard to the first computer he built, at age 12, from spare parts found in a dumpster. He, too, is still hard at work, aiming to develop more tools to help programmers as software becomes ever more complex.

Throughout the book, Lohr sprinkles in fascinating details that often get lost in current discussions of the information age. An example: Gosling originally wanted to call Java “Oak,” inspired by a tree outside his office. But a naming committee liked someone's notion that the name Java made them feel excited, as if they had overdosed on coffee.

Beyond getting to know the story of software, readers will be reminded that the past is prologue. For example, the team that created FORTRAN in 1957, the brainchild of IBM's John Backus, “was an eclectic bunch—a crystallographer, a cryptographer, a chess wizard, an employee loaned from United Aircraft, a researcher from MIT, a young woman who joined the project straight out of Vassar.” Today's dot-commers might chuckle to learn they weren't the first to cobble together a motley crew—and invent a product that could change the world.

What's more, many of today's standard operating procedures in Silicon Valley derived from practical considerations. Before personal computers, techies shared time on big mainframe computers. “They often worked at nights because it was the only time they could get valuable time on the machine to test and debug their code,” says Lohr. “The odd hours and close work bred camaraderie.” And relieving pressure by playing silly games wasn't just a ‘90s trend. “For relaxation, there were lunch-time chess matches and, in the winter, impromptu snowball fights.”

Of course, no book on software would be complete without a discussion of the open source movement, which demands that software be distributed freely. Lohr provides a unique perspective on the movement, laying out the views of Linus Torvalds and Richard Stallman, its most prominent proponents.

Perhaps there's a sequel in the works. Lohr's “Afterword” touches on the next step in the evolution of software—grid computing—which is already being researched in various labs and universities across the nation. At software's current pace of development, Lohr might be writing for a long, long time.



Margaret Mannix is a freelance writer based in suburban Washington, D.C. She can be reached by e-mail at mmannix@asee.org.