NW Advanced Computing Partnership Looks to Tackle Big Challenges

Initial research directions include graph analytics, social media, parallel languages, systems biology, modeling of eco-hazards, smart grid simulation, and data visualization.

NIAC-affiliated researchers are going after an NIH funding opportunity to create Centers of Excellence for Big Data Computing in the Biomedical Sciences. Other funding possibilities include Department of Defense university research programs.

Thom Dunning Jr., who officially becomes the NIAC's co-director on the PNNL side in January, has spent his career in both university and national laboratory settings. He surveyed other university-national lab partnerships around the country for models that could be applied here.

Lawrence Berkeley National Laboratory and the University of California, Berkeley, have perhaps the best relationship, thanks to their shared history and proximity: You can walk from one to the other in about 15 minutes. The NIAC will not enjoy the benefits of such proximity, with PNNL located more than three hours from UW by car.

Another model is the Joint Institute for Computational Sciences, which links the University of Tennessee and Oak Ridge National Laboratory.

Dunning says the best model for the NIAC is the Computation Institute, established in 2000 by the University of Chicago and Argonne National Laboratory, which are separated by only 30 miles, but an hour of frustrating Chicago traffic. Dunning, who most recently led the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, says the Computation Institute was established for many of the same reasons as the NIAC.

In 13 years, it has grown to 120 researchers and faculty divided between the two institutions, plus 50 supporting staff, and 30 graduate students (a number Dunning says is surprisingly low).

The Computation Institute has had some successes in tapping new sources of funding for its partnering institutions to pursue a handful of very large strategic computing projects, while also fostering a smaller number of collaborations among principal investigators. It has attracted about $150 million in the last five years, he says.

The largest project was TeraGrid, a major network of high-performance computing centers and other resources for e-science, backed by the National Science Foundation (NSF).

“It’s very difficult for a national laboratory to get funding from the National Science Foundation,” Dunning says.

The 17 Department of Energy national laboratories get most of their funding from the Energy and Defense departments, while the NSF channels its money (about $7 billion in the 2012 fiscal year) toward university researchers.

Argonne, Dunning says, attracted NSF funding “by basically masking themselves as University of Chicago,” even though it was Argonne expertise driving the project.

Another major Computation Institute project, Beagle, was driven by University of Chicago medical faculty who needed a supercomputer for life sciences research and Argonne expertise to operate it, Dunning says. It was funded by the National Institutes of Health, among others.

Dunning says Computation Institute staffers told him that their failures include insufficient strategic focus—something they are trying to correct through a new Urban Center for Computation and Data, meant to corral resources around areas of expertise at both institutions.

He acknowledges that the NIAC’s initial research directions are relatively broad, setting up the potential for a similar lack of focus. Dunning and other UW and PNNL leaders say that to be successful, the NIAC needs to identify roughly three significant strategic projects that draw serious participation from as many researchers as is reasonable at both institutions.

Partnerships with industry will also play a significant role at the NIAC, though Jandhyala acknowledges that this is one of the more challenging elements to be negotiated between the two institutions.

It seems to have good support from at least one key player locally. Cray CEO Peter Ungaro laid out the supercomputer maker’s big data strategy at the event Wednesday and says he is excited to form a partnership with the new institute.

Author: Benjamin Romano

Benjamin is the former Editor of Xconomy Seattle. He has covered the intersections of business, technology and the environment in the Pacific Northwest and beyond for more than a decade. At The Seattle Times he was the lead beat reporter covering Microsoft during Bill Gates’ transition from business to philanthropy. He also covered Seattle venture capital and biotech. Most recently, Benjamin followed the technology, finance and policies driving renewable energy development in the Western US for Recharge, a global trade publication. He has a bachelor’s degree from the University of Oregon School of Journalism and Communication.