In May of 2011, a sequence of tornadoes roared across the midwestern United States. The incident became a focal point for scientists eager to learn what it is about supercell storms that allows them to form such devastating tornadoes. It’s an important field of study, but a challenging one — these storms are so enormous there’s simply too much data for typical methods to work through. So, what’s an atmospheric scientist to do? Use a supercomputer, of course.
Leigh Orf at the University of Wisconsin-Madison had the 2011 storm simulated by the University of Illinois’ Blue Waters machine — tasking the supercomputer with breaking the enormous supercell into almost two billion small chunks spread over a 75-square-mile area. The wind speed, temperature, pressure, humidity and precipitation of each of those smaller sections were individually calculated before the bits were reassembled into one large recreation of the entire storm. The task took three days and 20,000 of Blue Waters’ processing cores, but it was worth it.
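The idea of splitting a storm into cells and tracking a few atmospheric variables in each one can be sketched in miniature. The grid size, variable set, and smoothing update below are invented for illustration only — the real simulation solved actual atmospheric physics over nearly two billion cells:

```python
import numpy as np

# A tiny stand-in grid; the Blue Waters run used almost two billion cells.
NX, NY, NZ = 20, 20, 10

# Each cell carries its own copy of the storm variables named in the article.
fields = {
    "wind_speed":    np.zeros((NX, NY, NZ)),          # m/s
    "temperature":   np.full((NX, NY, NZ), 288.0),    # kelvin
    "pressure":      np.full((NX, NY, NZ), 101325.0), # pascals
    "humidity":      np.full((NX, NY, NZ), 0.5),      # relative, 0..1
    "precipitation": np.zeros((NX, NY, NZ)),          # kg/m^2
}

def step(fields, dt=1.0):
    """One toy timestep: nudge each cell toward the mean of its six
    face neighbors -- a crude placeholder for the real per-cell physics."""
    out = {}
    for name, f in fields.items():
        neighbors = sum(np.roll(f, shift, axis)
                        for axis in range(3) for shift in (-1, 1)) / 6.0
        out[name] = f + dt * 0.1 * (neighbors - f)
    return out

fields = step(fields)
print(fields["temperature"].shape)  # (20, 20, 10)
```

The payoff of this decomposition is parallelism: because each cell's update depends only on its neighbors, the grid can be divided among many processors — which is how a run like this spreads across 20,000 cores.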
“For the first time, we’ve been able to peer into the inner workings of a supercell that produces a tornado,” Orf says. “We have the full storm, and we can see everything going on inside of it.” This lets his team directly study how these deadly twisters are formed from the inside out. It also gives us a hauntingly beautiful video of the storm’s formation to watch.
It’s also a research problem that couldn’t have been solved any other way — not so much because the weather is complex, Orf says, but because there’s just too much data to be handled any other way. “This type of work needs the world’s strongest computers just because the problem demands it,” he told Popular Science. “There’s no way around it.”