MIT Expert on Powerful Computers and Innovation


A new working paper attempts to quantify the importance of more powerful computers for improving outcomes across society. In it, researchers analyzed five areas where computing is essential, including weather forecasting, oil exploration, and protein folding (important for drug discovery). Credit: MIT

Q&A: MIT's Neil Thompson on computing power and innovation

Innovation in many industries has been driven by rapid increases in the speed and power of microchips, but the future trajectory of these remarkable advances may be in jeopardy.

Gordon Moore, co-founder of Intel, predicted that the number of transistors on a chip would double every one to two years. This prediction is known as Moore's Law. Since the 1970s it has largely been met or exceeded: processing power roughly doubles every two years, while better, faster microchips become more affordable.
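As a back-of-the-envelope illustration (assuming a clean two-year doubling period, which real chips only approximate), the compounding implied by Moore's Law can be sketched as:

```python
def transistor_multiplier(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor implied by doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Fifty years of two-year doublings, e.g. from the early 1970s onward,
# implies a growth factor of 2**25, roughly 33 million-fold.
growth = transistor_multiplier(50)
print(f"Implied growth over 50 years: {growth:,.0f}x")
```

The exact multiplier is sensitive to the assumed doubling period; a one-year cadence over the same span would imply 2**50 instead.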

For many years, this exponential increase in computing power has spurred innovation. In the early 2000s, however, researchers began to worry that Moore's Law might be slowing down. There are physical constraints on the size and number of transistors that can be packed into an affordable microprocessor using current silicon technology.

To measure the value of more powerful computers for improving outcomes across society, Neil Thompson, a researcher at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Sloan School of Management, and his research group set out to do just that. In a recent working paper, they examined five fields where computing is essential, such as weather forecasting, oil exploration, and protein folding (important for drug discovery). Research assistants Gabriel F. Manso and Shuning Ge are co-authors of the working paper.

They found that computing power accounts for 49 to 94 percent of the improvements in these fields. For instance, in weather forecasting, increasing computing power by a factor of 10 improves three-day-ahead predictions by one-third of a degree.
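The weather-forecasting figure implies a roughly log-linear relationship: a fixed accuracy gain for each tenfold increase in compute. A minimal sketch of that relationship (the one-third-of-a-degree-per-tenfold rate comes from the article; the log-linear functional form is an assumption for illustration, not the paper's fitted model):

```python
import math

def forecast_error_reduction(compute_multiplier: float,
                             gain_per_tenfold: float = 1 / 3) -> float:
    """Accuracy gain (in degrees) for a multiplicative increase in compute,
    assuming a fixed gain per 10x of compute (log-linear)."""
    return gain_per_tenfold * math.log10(compute_multiplier)

print(forecast_error_reduction(10))   # one tenfold step -> ~0.33 degrees
print(forecast_error_reduction(100))  # two tenfold steps -> ~0.67 degrees
```

Under this form, each successive tenfold increase in compute buys the same absolute improvement, which is why sustaining exponential hardware growth matters so much.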

However, progress in computing hardware is slowing, which could have significant effects on the economy and society. Thompson discussed this research and the implications of Moore's Law's demise in an interview with MIT News.

Q: How did you approach this analysis and quantify the impact computing has had on different domains?

A: Quantifying the impact of computing on real outcomes is tricky. The most common way to look at computing power, and IT progress more generally, is to study how much companies are spending on it, and look at how that correlates to outcomes. But spending is a tough measure to use because it only partially reflects the value of the computing power being purchased. For example, today’s computer chip may cost the same amount as last year’s, but it is also much more powerful. Economists do try to adjust for that quality change, but it is hard to get your hands around exactly what that number should be. For our project, we measured the computing power more directly — for instance, by looking at capabilities of the systems used when protein folding was done for the first time using deep learning. By looking directly at capabilities, we are able to get more precise measurements and thus get better estimates of how computing power influences performance.

Q: How are more powerful computers enabling improvements in weather forecasting, oil exploration, and protein folding?

A: The short answer is that increases in computing power have had an enormous effect on these areas. With weather prediction, we found that there has been a trillionfold increase in the amount of computing power used for these models. That puts into perspective how much computing power has increased, and also how we have harnessed it. This is not someone just taking an old program and putting it on a faster computer; instead users must constantly redesign their algorithms to take advantage of 10 or 100 times more computer power. There is still a lot of human ingenuity that has to go into improving performance, but what our results show is that much of that ingenuity is focused on how to harness ever-more-powerful computing engines.

Oil exploration is an interesting case because it gets harder over time as the easy wells are drilled, so what is left is more difficult. Oil companies fight that trend with some of the biggest supercomputers in the world, using them to interpret seismic data and map the subsurface geology. This helps them to do a better job of drilling in exactly the right place.

Using computing to do better protein folding has been a longstanding goal because it is crucial for understanding the three-dimensional shapes of these molecules, which in turn determines how they interact with other molecules. In recent years, the AlphaFold systems have made remarkable breakthroughs in this area. What our analysis shows is that these improvements are well-predicted by the massive increases in computing power they use.

Q: What were some of the biggest challenges of conducting this analysis?

A: When one is looking at two trends that are growing over time, in this case performance and computing power, one of the most important challenges is disentangling how much of the relationship between them is causation and how much is just correlation. We can answer that question, partially, because in the areas we studied companies are investing huge amounts of money, so they are doing a lot of testing. In weather modeling, for instance, they are not just spending tens of millions of dollars on new machines and then hoping they work. They do an evaluation and find that running a model for twice as long does improve performance. Then they buy a system that is powerful enough to do that calculation in a shorter time so they can use it operationally. That gives us a lot of confidence. But there are also other ways that we can see the causality. For example, we see that there were a number of big jumps in the computing power used by NOAA (the National Oceanic and Atmospheric Administration) for weather prediction. And when they purchased a bigger computer and it got installed all at once, performance really jumped.

Q: Would these advancements have been possible without exponential increases in computing power?

A: That is a tricky question because there are a lot of different inputs: human capital, traditional capital, and also computing power. All three are changing over time. One might say, if you have a trillionfold increase in computing power, surely that has the biggest effect. And that’s a good intuition, but you also have to account for diminishing marginal returns. For example, if you go from not having a computer to having one computer, that is a huge change. But if you go from having 100 computers to having 101, that extra one doesn’t provide nearly as much gain. So there are two competing forces — big increases in computing on one side but decreasing marginal benefits on the other side. Our research shows that, even though we already have tons of computing power, it is getting bigger so fast that it explains a lot of the performance improvement in these areas.
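The intuition about diminishing marginal returns in the answer above can be made concrete with a logarithmic utility function (an illustrative assumption for this sketch, not the model used in the working paper):

```python
import math

def marginal_gain(n: int) -> float:
    """Marginal benefit of adding one computer to an existing stock of n,
    under an illustrative log-utility assumption."""
    return math.log(n + 1) - math.log(n)

print(marginal_gain(1))    # going from 1 to 2 computers: large gain (~0.693)
print(marginal_gain(100))  # going from 100 to 101: tiny gain (~0.00995)
```

Under this assumption, each additional machine helps less than the last, so only multiplicative growth in compute (doubling, then doubling again) delivers steady gains, which is the tension the answer describes.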

Q: What are the implications that come from Moore’s Law slowing down?

A: The implications are quite worrisome. As computing improves, it powers better weather prediction and the other areas we studied, but it also improves countless other areas we didn’t measure but that are nevertheless critical parts of our economy and society. If that engine of improvement slows down, it means that all those follow-on effects also slow down.

Some might disagree, arguing that there are lots of ways of innovating — if one pathway slows down, other ones will compensate. At some level that is true. For example, we are already seeing increased interest in designing specialized computer chips as a way to compensate for the end of Moore’s Law. But the problem is the magnitude of these effects. The gains from Moore’s Law were so large that, in many application areas, other sources of innovation will not be able to compensate.

Reference: "The Importance of (Exponentially More) Computing Power" by Neil C. Thompson, Shuning Ge and Gabriel F. Manso, 28 June 2022, arXiv (Computer Science > Hardware Architecture).
DOI: 10.48550/arXiv.2206.14007
