Cray Supercomputers Power Weather Forecasts Globally, and Now in the U.S.

Seattle supercomputer maker Cray, a quiet stalwart of the region’s technology industry, has been on a growth spurt lately, thanks in part to a spate of sales to government weather forecasting centers.

Cray has long provided machines—made in Chippewa Falls, WI—that help scientists model the climate and predict storms. Recent deals with forecasters in the U.K., Europe, Korea, and the U.S. (more on that below) highlight how weather forecasting—a quintessential big data problem—is a bellwether for many other fields with a growing appetite for supercomputers.

“The weather is just a leading indicator of many other areas,” said Barry Bolding, who heads business development at Cray (NASDAQ: [[ticker:CRAY]]).

Energy companies are buying more systems from Cray as they struggle to process high-resolution sensor and imaging data pointing to potential oil and gas deposits. (The company recently announced a contract with Petroleum Geo-Services for a powerful XC40 system to perform just this kind of work.) Financial services firms are using Cray supercomputers to calculate investment and insurance risks in real time. Even sports teams are purchasing Cray systems—albeit ones costing orders of magnitude less than the $40 million-plus systems government weather centers buy—to crunch stats, predict favorable player matchups, and find signs of potential injuries.

Weather and climate prediction require computational models spanning physics, chemistry, hydrodynamics, and other demanding mathematics—work that Bolding called “the sweet spot” of Cray systems.

Moreover, weather forecasting relies not just on theory, but on observation, too—lots of it. The U.K. Met Office, which is paying Cray $128 million over three years for new XC40 systems, digests some 10 million observations daily.

“It comes from gathering lots of raw data from sensors that are scattered around the globe, and then using that data to process and solve the weather prediction problem,” Bolding said.

For national weather forecasters, getting it right is a challenge with high stakes. Their end products—accurate weather forecasts, climate models, and other information services—are important to a wide range of industries.

“Those services range from daily forecasts or very short-term emergency storm forecasts, all the way out to simulations and services to help farmers predict what crops to plant, or help to mitigate climate change and provide climate vulnerability analysis for the longer term,” Bolding said.

Consistent forecasting accuracy depends on several factors, including the mathematical models used to parse the observational inputs and the computing hardware available to churn through those models. More computing power means higher-resolution forecasts.

“There was a lot of press around Hurricane Sandy and the fact that the European models were getting that correct, and the U.S. models were not,” Bolding said. “Part of that may well be due to not having enough computational resources. That certainly can cause you to decrease the resolution of your model.”

That’s why news early this year of a Cray deal to supply the U.S. National Oceanic and Atmospheric Administration (NOAA)—of which the National Weather Service is part—with new supercomputers was something to get excited about. But not too excited, as U.S. weather prediction supercomputing capacity will remain behind that of smaller nations.

The $44.5 million contract (more than half of which comes from the 2013 Disaster Relief Appropriations Act in the wake of Hurricane Sandy) will, by October, increase NOAA’s supercomputing capacity to 5 petaflops (two machines each capable of 2.5 petaflops—that’s 2.5 quadrillion floating point operations per second).

When all is said and done, NOAA forecasters will have increased their computing capacity by nearly a factor of 10—from roughly half a petaflop to 5 petaflops—which highlights how under-powered U.S. weather forecasting computers had been. Even at 5 petaflops, the U.S. remains far behind much smaller nations, as University of Washington atmospheric sciences professor Cliff Mass explained in a blog post last fall.

Mass noted that the U.K.’s new machines, capable of 15 petaflops, sit in a country with a landmass that’s roughly

Author: Benjamin Romano

Benjamin is the former Editor of Xconomy Seattle. He has covered the intersections of business, technology and the environment in the Pacific Northwest and beyond for more than a decade. At The Seattle Times he was the lead beat reporter covering Microsoft during Bill Gates’ transition from business to philanthropy. He also covered Seattle venture capital and biotech. Most recently, Benjamin followed the technology, finance and policies driving renewable energy development in the Western US for Recharge, a global trade publication. He has a bachelor’s degree from the University of Oregon School of Journalism and Communication.