Posted December 9th, 2009
Attendees of the Lynford Lecture Series on November 19 learned about the most serious problems facing computer science today, but perhaps the greater benefit to guests that afternoon was the history lesson provided by one of the field’s founders, Frances Allen. The first female recipient of the A. M. Turing Award — the equivalent of the Nobel Prize for computing — as well as the first woman to be named an IBM Fellow and the first female guest speaker in the 12-year history of the lecture series, Allen spoke about the early days of computer science and about the field’s future. As Provost Dianne Rekow noted in her introductory remarks to Allen’s talk, a tour of the projects Allen worked on at IBM was akin to a tour of the greatest innovations in computer science.

Allen underscored that observation in her lecture when she explained, “We had to invent everything.” Her comment addressed the STRETCH/Harvest era of supercomputers, when IBM attempted to build a supercomputer 100 times faster than the 704, the fastest computer of that time. Tom Watson, then president of IBM, publicly apologized when STRETCH fell significantly short of IBM’s lofty goal. But STRETCH did “define the limits of the possible,” said Allen, quoting Dag Spicer, curator of the Computer History Museum.
It was a theme Allen returned to as she reviewed her projects at IBM. “We made a lot of mistakes,” she said, “but we didn’t know anything. There were no courses on compilers at the time.”
Yet Allen’s boss supported — even demanded — rapid failures. Quoting him, she reminded attendees: “The fastest way to succeed is to double your failure rate.” Allen’s belief in the mantra was genuine, as she lobbied the audience to enable bold thinkers and high-risk projects.
It’s precisely that kind of innovation that’s needed to overcome the field’s greatest challenge: hot chips. Speaking with the directness for which she’s known, Allen said simply, “Computers have hit a performance limit.” Expanding on that idea, she described how, as transistors continue to shrink, more and more will fit on a chip. In turn, the chips will run faster and faster, but physics caps their performance: the chips will get too hot to operate properly. Quoting luminaries in her field, Allen called it “the biggest problem computer science has ever faced.”
But such challenges are what have always driven Allen — “I love to explore unsolved problems,” she said before the lecture — and she was able to offer suggestions for tackling issues in the field. Software and its users must organize tasks to execute in parallel on multiple on-chip processors, Allen said. Her recommendation touched on the concept of parallelism, an approach to programming more difficult than the familiar sequential style. It’s an approach that’s just beginning to gain ground, Allen admitted. “We don’t know yet how to talk about parallelism,” she said.
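The idea Allen described can be sketched in a few lines. The example below is an illustration of the general technique, not anything presented at the lecture: it splits independent tasks across worker processes using Python's standard multiprocessing module, alongside the equivalent sequential version. The function names are ours, chosen for clarity.

```python
from multiprocessing import Pool

def square(n):
    # An independent unit of work: it shares no state with the other
    # tasks, so it can safely run on any processor at the same time.
    return n * n

def run_sequential(numbers):
    # The familiar sequential style: one task at a time, on one core.
    return [square(n) for n in numbers]

def run_parallel(numbers, workers=4):
    # Distribute the same independent tasks across a pool of worker
    # processes, each of which may run on a separate core.
    with Pool(processes=workers) as pool:
        return pool.map(square, numbers)
```

Both versions produce the same answer; the difficulty Allen pointed to begins when tasks are not independent and must coordinate, which is where knowing "how to talk about parallelism" remains an open problem.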
Allen spoke more easily about other recommendations for the future of computer science, urging attendees to recast compilers, rebuild software stacks, and eliminate C, Java, and other general-purpose computer languages. The last suggestion drew a chuckle from audience members, even as Allen explained that such languages can’t be analyzed well.
The reaction seemed somehow fitting for a woman who had pioneered so much in computer science: Allen was more interested in furthering the field than in maintaining the dominance of C and Java or, more generally, the status quo. Observing that computer science has moved from the bit to the byte, and that users have gone from talking about data to discussing information, Allen described how the progression puts the field at “the edge of knowledge.” It was a final point the inheritors of her legacy and those of her colleagues were happy to applaud.