Bradley Efron (Minnesota, United States; 1938) is Professor of Statistics and Biomedical Data Science at Stanford University. He studied mathematics at the California Institute of Technology (Caltech) before completing an MS in statistics at Stanford University in 1962. That same year, he began his doctoral studies in statistics under the supervision of Rupert Miller and Herb Solomon, and after earning his PhD (1964) he joined the faculty at Stanford, where he has spent the remainder of his teaching and research career. Since 1988, he has held the Max H. Stein Professorship in the School of Humanities and Sciences.
Bradley Efron has made numerous contributions to statistical science, but his best-known paper – “Bootstrap Methods: Another Look at the Jackknife” – was published in The Annals of Statistics in 1979. This initially controversial method is now considered a triumph of applied mathematics in conjunction with numerical analysis, and has had a profound impact on statistical practice, finding wide application in science and medicine.
Bradley Efron has published extensively in leading journals such as Biometrika, The Annals of Statistics and the Journal of the American Statistical Association. He has been honored with a MacArthur Prize Fellowship, is a member of the U.S. National Academy of Sciences and the American Academy of Arts and Sciences, and is a Fellow of the Institute of Mathematical Statistics and the American Statistical Association. His many awards include the Lester R. Ford Award, the Wilks Medal, the Rao Prize and the National Medal of Science.
“Statistics learns from experience, it's a detective game”
For mathematician Bradley Efron, statistics is the “supporting actor” of science, but no less important for that. He means by this that statistical methods rarely occupy the spotlight, which is reserved, logically enough, for the big discoveries. But discoveries come from data, and data need interpreting… with statistics. Few advances would be possible without the aid of this apparently unglamorous branch of science. And the BBVA Foundation Frontiers of Knowledge Award in Basic Sciences bestowed on Efron, Professor of Statistics at Stanford University (United States), and his colleague David Cox of the University of Oxford (United Kingdom) acknowledges its vital contribution to results obtained in multiple domains, from medicine to astrophysics by way of genomics and particle physics.
In today’s world of Big Data, in which not only science but also technology and economics feed off vast quantities of data, Cox and Efron’s work takes on even more importance. “The problems confronted by modern science involve working with larger and larger data sets, and that creates noise,” remarks Efron. “Now to get to the important information we need to eliminate a lot of noise, and it is statistics that lets us do that.”
Cox and Efron published their most celebrated contributions in the 1970s. Cox’s paper ‘Regression Models and Life-Tables’ (1972) sets out a powerful tool for estimating how the time between two events depends on identifiable factors. The Cox model can tell us, for instance, about the mortality risk of patients under treatment, the school dropout rate in a given population, or the risk of business bankruptcy. It finds use in cancer research, epidemiology and sociology; in testing the durability of industrial products; and even in predicting the likelihood of earthquakes.
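For readers who want the idea in symbols, the standard textbook statement of Cox’s proportional hazards model (the notation below is the conventional one, not drawn from this article) expresses an individual’s risk at time t as a shared baseline hazard scaled by that individual’s characteristics:

```latex
% Cox proportional hazards model (standard textbook form).
% h_0(t) is the unspecified baseline hazard shared by everyone;
% x_1, ..., x_p are an individual's covariates (treatment, age, ...);
% the coefficients beta are estimated from data via partial likelihood,
% without ever having to specify h_0(t).
h(t \mid x) \;=\; h_0(t)\,\exp\!\left(\beta_1 x_1 + \cdots + \beta_p x_p\right)
```

The model’s power lies in that separation: the effect of each factor can be estimated even though the baseline hazard itself is left unspecified.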
At the time of developing his model, David Cox (Birmingham, United Kingdom; 1924) was already a renowned researcher. Trained at the University of Cambridge, he moved into statistics through military imperatives during the Second World War. Before earning his PhD, he was employed at the Royal Aircraft Establishment, and he afterwards worked for the Wool Industries Research Association. His subsequent career took him to the University of Cambridge, the University of London, Imperial College London and, since 1988, the University of Oxford.
The story of the Cox model started with a question posed independently by various friends studying patient survival time: How can we tell how much the treatment applied is influencing survival, compared to other features inherent to each patient? “It took me three or four years to develop a solution,” says Cox, “and finally my work got published in an academic journal. Rather to my surprise, lots of people found it useful, which I am very happy about.” In his view, the most interesting applications of his research have to do with organ transplants and the treatment of life-threatening diseases like cystic fibrosis.
Bradley Efron has also interacted closely with biomedicine – his Stanford professorship is in the Department of Biomedical Data Science at the university’s medical school. His seminal contribution, however, finds application in almost every realm of science. The bootstrap, as it is known, is a statistical technique for determining a variable as vital to science as the margin of error of a given measurement.
Efron, whose interest in statistics was sparked by his father’s fondness for sports rankings, was looking for a way to determine the accuracy of a result without taking repeat measurements – impossible in many cases, such as invasive medical tests. The solution he found was so simple in appearance that it initially met with distrust: “It seemed like cheating and it wasn’t obvious that it would work,” Efron recalls. Not long after, its utility was endorsed in literally thousands of papers. The crux of his idea was to draw data at random, with replacement, from the only available sample and subject them to analysis. The same process is then re-run many times over, and it is this random re-sampling that provides the margin of error. The technique hinges on the use of computers, which owe their preponderance in statistics to Efron’s work: with computers, the re-sampling could be repeated time and time again, fine-tuning the precision of outcomes.
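To make the procedure concrete, here is a minimal sketch in Python; the data and the chosen statistic (the sample mean) are invented for illustration, not taken from any study mentioned in the article:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Invented example data: the single sample actually available.
sample = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4, 5.8, 4.7])

n_resamples = 10_000
replicates = np.empty(n_resamples)

for i in range(n_resamples):
    # Draw a new sample of the same size at random, WITH replacement,
    # from the original data -- the bootstrap re-sampling step.
    resample = rng.choice(sample, size=sample.size, replace=True)
    replicates[i] = resample.mean()  # the statistic of interest

# The spread of the replicates estimates the margin of error
# (the standard error) of the statistic -- no repeat measurements needed.
print(f"sample mean:              {sample.mean():.3f}")
print(f"bootstrap standard error: {replicates.std(ddof=1):.3f}")
```

The same loop works unchanged for statistics far harder to analyze by hand, such as a median or a correlation coefficient, which is what makes the method so widely applicable.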
The term ‘bootstrap’ is itself a clue to Efron’s effervescent talent. Searching for a name at least as catchy as that of the other statistical tool called the jackknife, Efron took his inspiration from the adventures of Baron Munchausen. In one story, the Baron saves himself from drowning by hoisting himself up by the straps of his boots, an apt image for a technique that relies on the data themselves, without external inputs.
Bootstrapping was described in a paper published in 1979. Until then, the margin of error had been worked out by mathematical approximations “which could be very complex and were not always correct,” explains Efron. “With the bootstrap, you delegate the hard part to the machines.”