How Supercomputers Are Getting Us Closer to a Covid-19 Vaccine

Matt Pene

The global scientific community has joined forces in an unprecedented effort to understand, track, forecast, test for, and find a cure for the current coronavirus pandemic. But in a crisis where every second lost means more lives lost, solidarity alone isn’t enough. Supercomputers are vastly accelerating the pace at which scientists can conduct research and collect and analyze data. Never have they proven their value to society more than during this COVID-19 pandemic.

Supercomputers provide scientists with unique capabilities: they can explore the structure and behavior of the virus at the molecular level, forecast the spread of the disease, and design drugs far faster than would otherwise be possible.

The Texas Advanced Computing Center (TACC) began fielding requests for compute time to assist in the fight against COVID-19 in February 2020. In March, the White House enlisted some of the world’s most powerful supercomputers in the battle against COVID-19 through the COVID-19 High Performance Computing (HPC) Consortium, a public-private partnership providing researchers worldwide with massive computing resources.

As part of this effort, we are working closely with teams to provide priority access to supercomputing resources here and across the world. In the U.S. alone, there are more than 100 projects, involving thousands of researchers, using HPC systems to predict the effects of interventions like stay-at-home orders and school closings; to simulate the molecular behavior of the proteins that make the virus virulent; to understand the genetics of the virus and its mutability; to screen potential drugs and vaccines for efficacy; and to visualize and interactively share data with decision-makers.

At TACC, nearly a third of all computing time has been dedicated to accelerating these efforts — the equivalent of 40,000 desktop computers churning non-stop. None of this would be possible without federal funding for high performance computing by the National Science Foundation (NSF) and Department of Energy (DOE), which have made open access to supercomputers part of their mission for more than four decades.

Beyond big machines, supercomputing centers employ some of the brightest minds in computational science, and these individuals are collaborating with teams across the nation to fast-track research.

Among these partnerships is the University of Texas at Austin COVID-19 Modeling Consortium, led by Dr. Lauren Ancel Meyers, which developed one of the leading epidemiological models of how the disease spreads based on virus transmission and real-time cell phone data. The White House and CDC, as well as the national media and public, have used the model to inform their understanding and decision-making.

A team from DOE’s Argonne and Brookhaven National Laboratories applied several of the most powerful supercomputers in the world to accelerate an AI-based approach to drug docking. Their effort narrowed 6 billion possible small molecules to the 30 with the best chance of binding to one of the virus’ proteins and disrupting its function. These are now being tested in labs at the University of Chicago.

The TACC-powered COVID-19 Drug Discovery Consortium is collaborating with Enamine, the world’s largest provider of screening compounds, and with Boston University, Texas A&M, and the University of Texas Medical Branch, to identify the 600 most promising, readily available drug-like molecules (out of 2.6 million) and test them in high-containment laboratories — finding potential drugs in months rather than years.

New projects are launching daily.

In many of these cases, long-term research collaborations helped speed the projects out of the gate. The UT Austin Modeling Consortium’s projections built on a decade of federally funded R&D on flu pandemic modeling by Meyers’ team. The DOE researchers adapted AI-based cancer drug discovery methods for SARS-CoV-2. The Drug Discovery Consortium leveraged tools and methods developed over many years to fight bioterrorism. Our ongoing relationships with these teams have made it possible for them to shift their research focus, expand their scope, and reduce limitations as they work towards a common good.

Academic research is frequently the first step in a long process that requires efforts by government agencies, philanthropic organizations, and industry. Basic science helps decision-makers protect the populace, and informs the creation of vaccines and treatments.

Under normal circumstances, this process takes years or decades. However, time is a luxury we simply do not have. The urgency of the challenge we face makes the application of research accelerators like supercomputers even more critical to help flatten the curve and ultimately solve the greatest crisis we as a society have ever faced.

Dan Stanzione is the director of TACC at The University of Texas at Austin.

A version of this op-ed appeared in The Hill.

