Information and Communication Technologies
The Frontiers of Knowledge Award goes to John Hennessy and David Patterson for turning computer architecture into a science and designing the processors that power today’s devices
The BBVA Foundation Frontiers of Knowledge Award in Information and Communication Technologies has gone in this thirteenth edition to John Hennessy (Stanford University) and David Patterson (University of California, Berkeley) for taking computer architecture, the discipline behind the central processor or “brain” of every computer system, and establishing it as a new scientific area. The citation of the award was read by the president of the jury, Professor Joos Vandewalle, Honorary President of the Royal Flemish Academy of Belgium for Science and the Arts.
10 February, 2021
“Professors John Hennessy and David Patterson are synonymous with the inception and formalization of this field,” the citation reads. “Before their work, the design of computers – and in particular the measurement of computer performance – was more of an art than a science, and practitioners lacked a set of repeatable principles to conceptualize and evaluate computer designs. Patterson and Hennessy provided, for the first time, a conceptual framework that gave the field a grounded approach towards measuring a computer’s performance, energy efficiency, and complexity.”
The new laureates’ scientific contributions had their didactic parallel in a landmark textbook, Computer Architecture: A Quantitative Approach, which, three decades on from its first release and after six editions of regularly updated content, is still considered “the bible” of the discipline in universities around the world.
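The quantitative approach the book is named for can be illustrated with the processor-performance relation it popularized. As a minimal illustration (a standard textbook formulation, not a quotation from the citation):

\[
\text{CPU time} \;=\; \text{Instruction count} \times \text{CPI} \times \text{Clock cycle time}
\]

where CPI is the average number of clock cycles per instruction, so that a design decision can be judged by how it moves each measurable factor rather than by intuition alone.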
Hennessy and Patterson have not only transformed computer architecture as designers and educators, reaching new generations of computer scientists through their teaching; they have also turned their ideas into technological innovation and industry applications. They are the joint creators of RISC, an architecture that underpins the design of central processors and today “sits at the heart of virtually every data center server, desktop, laptop, smartphone, and embedded computer [in televisions, cars and Internet of Things devices],” said the committee in its citation.
RISC, which stands for reduced instruction set computer, was developed in the 1980s, building on a concept of their own devising, one that Hennessy summed up after hearing of the award as “simpler equals more efficient.” This principle was squarely at odds with the thinking of the computer designers of the time, and it was precisely finding themselves in this minority position that encouraged the two men to collaborate.
A systematic and reproducible method
For Patterson, what their work shares is the determination to bring a systematic and reproducible method to their research. It was this that enabled them to formalize the domain of computer architecture, this that led them to RISC, and this that informed the writing of their book: “We design processors the same way we design books, through experiment and trial,” he said in a video conference after being informed of the decision of the award committee appointed jointly by the BBVA Foundation and the Spanish National Research Council (CSIC).
Proof of its success is that RISC technology, and its efficiency-maximizing principles, are currently found in 99% of all processors, and are behind both the lightweight laptops we know today and the long-lasting batteries of our smartphones. As for the textbook that laid the foundations of the discipline and continues to disseminate its key ideas, Hennessy remarks that “one of the great joys of my life is finding that there are students all over the world who appreciate our work.”
“The work of Hennessy and Patterson has had a deep and enduring impact,” the committee’s citation concludes. “They conceived the scientific field of computer architecture, motivated a systematic and quantitative design approach to system performance, created a style of reduced instruction set processors that has transformed how industry builds computer systems, and have made transformative advancements in computer reliability and in large-scale system coherence.”
From ‘dark art’ to science ‘for all’
In the 1980s, before Hennessy and Patterson made their mark, “each company had its own way of designing processors, based on the intuition of a handful of experts,” recalls Ron Ho, committee secretary and Director of Silicon Engineering at Facebook (United States). “It was like witchcraft, a series of methods that were hard to pass on. Hennessy and Patterson changed all that. They created a framework and defined parameters that let us compare systems for efficiency and speed. Computer architecture moved from being a dark art to being a science, a systematized body of transmissible knowledge. They democratized the knowledge needed to design computers, bringing it within everyone’s reach.”
The two look back at their efforts to set down in writing the discipline’s foundations: “Both John and I were professors, and loved giving class,” says Patterson. “We decided to write a book out of sheer frustration that there was nothing out there to help us teach our students what we knew.”
Hennessy takes up this point: “Computer architecture was being taught in a very descriptive fashion, almost as if you were walking through a museum like El Prado, looking at two different paintings and trying to compare them. We weren’t happy with that approach. We wanted something that was based on important measures like performance and cost. We began to write our book based on those things so that the field would acquire an engineering and scientific approach rather than one based on mere description. That was just over 30 years ago, and today the book is available in more than a dozen languages and has been used by hundreds of thousands of young students around the world.”
Simpler is more efficient
The RISC processor, devised in the early 1980s, is also the product of a conceptual shift. At the time, the prevailing approach to computer architecture held that a processor would run faster if it had fewer instructions to execute, however complex each of them might be. What Hennessy and Patterson showed with their RISC designs was that computing could be made more efficient the other way around: with programs built from more, but far simpler, instructions, each of which could be completed in less time.
“RISC is all about efficiency,” explains Hennessy. “The key insight is that simpler is more efficient. So think of an essay that you are reading. Suppose the essay uses really complex words and difficult sentence structures, so it’s hard to read fast. Now instead imagine an essay that’s written with really simple words that you can read really fast. That is what RISC does. It uses instructions that are very simple and can be executed very fast. That gave us a breakthrough in terms of performance which today has led to major advantages in terms of efficiency and power use.”
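A back-of-the-envelope comparison makes this trade-off concrete; the figures below are invented purely for illustration and are not measurements from the laureates’ work. Suppose a program compiles either to 1,000,000 complex instructions averaging 6 cycles each, or to 1,500,000 simple instructions averaging 1.5 cycles each, on the same 2 ns clock. Applying the running-time relation above:

\[
\begin{aligned}
\text{Complex instruction set:}\quad & 1{,}000{,}000 \times 6 \times 2\,\text{ns} = 12\ \text{ms}\\
\text{Reduced instruction set:}\quad & 1{,}500{,}000 \times 1.5 \times 2\,\text{ns} = 4.5\ \text{ms}
\end{aligned}
\]

The RISC version executes 50% more instructions yet finishes in well under half the time, because each instruction completes in far fewer cycles.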
From initial rejection to near-universal use
But RISC’s good results were not enough to convince the design community. “The technology was too disruptive and counter-intuitive, saying if you do things more simply, the computer will run faster,” Hennessy reflects. “In the beginning RISC was incredibly controversial,” Patterson concurs. “At conferences John and I would be on one side of the debate and everybody else on the other. But after a few years it got kind of switched around. We identified the formulas that explained why RISC would be better and then it led to start-ups. John founded his own company [MIPS Technologies] and I joined an existing start-up [Sun Microsystems] that brought some very successful products to the marketplace, and that changed people’s minds.”
The emergence over the last ten years of small but powerful devices like smartphones and tablets has made the advantages of RISC technology even more evident, with greater energy efficiency translating into longer battery life and lower cost.
The committee finds room in its citation for the laureates’ other innovations beyond their joint work on computer architecture and RISC technology. Patterson “created a field of study around computer reliability,” while Hennessy “worked on the development of distributed shared memory multiprocessor systems.” Together, it says, these ideas form the underpinning of how we build modern data centers, databases and Internet search engines. All these systems and tools require the continuous support of highly reliable large-scale computers, which would not have been possible without the multiprocessors created, in turn, through the visionary work of the two laureates.
A new ‘golden age’ for computer architects
For both men, the race toward ever greater processor miniaturization is approaching its limits, meaning the challenges for computer architecture will only get bigger in the short run.
“Moore’s Law [the number of transistors in a processor doubles approximately every two years], which has driven the technology used to build increasingly efficient computers, is coming to an end,” says Patterson. “Yet people still want much faster computers. What this means is that computer architects are going to have to figure out how to design computers with no better transistors and still deliver performance gains. The field could see a new ‘golden age’.”
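Stated as the rule of thumb the bracketed gloss refers to (an approximation, not an exact physical law), Moore’s Law projects the transistor count of a chip introduced $t$ years after a baseline design with $N_0$ transistors as

\[
N(t) \approx N_0 \cdot 2^{t/2},
\]

and it is the flattening of this curve that pushes future performance gains onto architectural ideas rather than onto better transistors.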
Hennessy, for his part, sees the rise of artificial intelligence as one of the great challenges facing the discipline: “The demand for performance for AI is growing by leaps and bounds, so we are going to have to rethink the way we design computers so that they can carry out those highly intensive tasks, like machine learning, very efficiently. And that is going to lead to lots of new innovation and excitement, and opportunities for young people to make important contributions.”
Information and Communication Technologies committee and evaluation support panel
The committee in this category was chaired by Joos Vandewalle, Honorary President of the Royal Flemish Academy of Belgium for Science and the Arts, with Ron Ho, Director of Silicon Engineering at Facebook (United States) acting as secretary. Remaining members were Regina Barzilay, Delta Electronics Professor in the Department of Electrical Engineering and Computer Science at Massachusetts Institute of Technology (United States), Georg Gottlob, Professor of Informatics at the University of Oxford (United Kingdom) and Vienna University of Technology (Austria), Oussama Khatib, Professor of Computer Science and Director of the Robotics Laboratory at Stanford University (United States), Rudolf Kruse, Emeritus Professor in the Faculty of Computer Science at the University of Magdeburg (Germany), and Mario Piattini, Professor of Computer Languages and Systems at the University of Castilla-La Mancha (Spain).
The evaluation support panel of the Spanish National Research Council (CSIC) was coordinated by M. Victoria Moreno, Deputy Vice President for Scientific and Technical Areas, and formed by: Carmen García García, Deputy Coordinator of the Global Area MATERIA and research professor at the Institute of Corpuscular Physics (IFIC); Gabriela Cembrano Gennari, tenured scientist at the Institute of Robotics and Industrial Informatics (IRI); Josep María Porta Pleite, tenured scientist at the Institute of Robotics and Industrial Informatics (IRI); Carlos Prieto de Castro, Coordinator of the Global Area MATERIA and research professor at the Institute of Materials Science of Madrid (ICMM); and Carles Sierra García, research professor at the Artificial Intelligence Research Institute (IIIA).