After a Year on the Market, Qumulo Counts 50+ Customers, 40 Petabytes

In the year since it began selling its enterprise data storage software, Seattle-based Qumulo has amassed more than 50 customers storing upwards of 40 petabytes of data in aggregate.

Qumulo distinguishes itself in a crowded field in several ways, but one is its emphasis on continuous delivery of software updates, rolling out new features and bug fixes every two weeks—a pattern that hews closer to a modern web service than a piece of data center infrastructure. So, it was curious to learn that Qumulo is delivering a new version—dubbed Qumulo Core 2.0—in what feels like a more typical enterprise software release.

Jeff Cobb, Qumulo’s vice president of product management, says that the regular cadence of biweekly releases tends to deliver “incremental” improvements—fixing a rough edge in the user interface, making slight performance enhancements, and the like.

He says the company is “anointing” this release 2.0—even though it still falls into the regular sequence of two-week releases—because it carries two new features that Qumulo leaders believe will be particularly important to customers and potential customers. One is called erasure-coded data protection, which Cobb says is a more efficient, cost-effective way to keep data safe when a physical disk drive fails. The other is an analytics tool to track the flow of data into and out of a data warehouse over time.

Marketing clearly plays an important role here.

“Really, it’s all about signaling the importance and the value of that release into our customer base and into the marketplace,” Cobb says.

Qumulo was founded in 2012 by veterans of Isilon Systems, a data-storage company that ranks as one of Seattle’s top startup successes (it was acquired by EMC in 2010 for $2.25 billion). Qumulo has raised some $67 million in venture capital and scaled up to 160 employees, double the number from a year ago, with an emphasis on growing its engineering and sales teams, Cobb says.


He says Qumulo’s data storage software—which runs on an expanding range of high-end storage appliances built by Qumulo, the newest of which use 10-terabyte hard disk drives supplied by HGST—allows customers to scale up quickly. (Qumulo plans to support other vendors’ hardware in the future, Cobb adds.)

But with petabyte-scale data warehouses comes a harder challenge: managing the data. Qumulo was built from the outset with this challenge in mind.

“Understanding what you have, how did things get there, who put them there, what’s hot and what’s cold, what’s working and what’s not working—being able to answer questions about your data turns out to really be the central problem that’s created by scale,” Cobb says.

The new analytics feature helps with that. It’s like a smart fuel gauge, showing how much of a storage system’s capacity is in use over time, with the ability to examine which files were added and deleted during a given period.

Qumulo can answer questions about capacity changes—say, why an organization’s storage system went from 70 percent full to 99.5 percent full overnight—“on the fly,” Cobb says. That’s because it was designed for just such a scenario. It’s constantly gathering operational intelligence about the stored data. “It works all the time, so that when you have a question, we know the answer,” Cobb says.

The alternative, he says, is sending out a frantic e-mail asking who just uploaded something huge—it probably wasn’t a human who did it, as we’ll learn below—or initiating a brute-force query across an organization’s tens of billions of individual files to determine what just filled up the tanks.

Monitoring data storage capacity in Qumulo Core 2.0.

“The bigger these systems get, the more files there are, the more capacity there is, the worse and worse brute force gets as a solution,” Cobb says.
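
To make the contrast concrete, here is a minimal sketch, in Python, of the general idea Cobb describes: update running usage totals as data is written, so a question like “who just filled up the system?” is answered from numbers that already exist rather than a scan across billions of files. The class and method names (CapacityTracker, record_write, usage_between) are illustrative assumptions, not Qumulo’s actual software.

```python
# Hypothetical sketch: maintain running usage totals inline with each write,
# so capacity questions are answered from precomputed aggregates instead of
# a brute-force scan of the file tree.
from collections import defaultdict
from datetime import datetime, timezone


class CapacityTracker:
    def __init__(self):
        self.total_bytes = 0
        self.by_owner = defaultdict(int)   # running total per user or service
        self.history = []                  # (timestamp, owner, delta_bytes)

    def record_write(self, owner: str, delta_bytes: int) -> None:
        """Update aggregates as data is written (use a negative delta for deletes)."""
        now = datetime.now(timezone.utc)
        self.total_bytes += delta_bytes
        self.by_owner[owner] += delta_bytes
        self.history.append((now, owner, delta_bytes))

    def usage_between(self, start: datetime, end: datetime) -> dict:
        """Report who added or removed how much in a window, from the change log."""
        changes = defaultdict(int)
        for ts, owner, delta in self.history:
            if start <= ts <= end:
                changes[owner] += delta
        return dict(changes)
```

The point is that the accounting happens at write time; answering a question later only requires walking a compact change log, not the billions of files themselves.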

Erasure-coded data protection, the other new feature, guards against hardware failures while using significantly less disk space than the status quo technique of mirroring, which keeps an extra copy of each disk drive somewhere else. That’s particularly costly when individual drives expand to, say, 10 terabytes.
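
A rough back-of-the-envelope comparison shows why that matters. The sketch below contrasts plain mirroring with a generic “6 data + 2 parity” erasure-coding layout; the 6+2 split is an assumption chosen for illustration, not Qumulo’s published configuration.

```python
# Raw disk capacity needed to store a given amount of user data under
# two protection schemes. The 6+2 erasure-coding layout is illustrative.
def raw_capacity_needed(user_data_tb: float, scheme: str) -> float:
    if scheme == "mirroring":
        # Two full copies of everything: 2x overhead, tolerates one drive loss.
        return user_data_tb * 2.0
    if scheme == "erasure_6_2":
        # 6 data blocks + 2 parity blocks: 8/6 ≈ 1.33x overhead,
        # tolerates two simultaneous drive losses.
        return user_data_tb * (6 + 2) / 6
    raise ValueError(f"unknown scheme: {scheme}")


user_data = 1000.0  # 1 petabyte, expressed in terabytes
print(raw_capacity_needed(user_data, "mirroring"))    # 2000.0 TB
print(raw_capacity_needed(user_data, "erasure_6_2"))  # ~1333.3 TB
```

Under those assumptions, protecting a petabyte of data takes roughly two petabytes of raw disk with mirroring but only about 1.33 petabytes with erasure coding, while tolerating two simultaneous drive failures instead of one.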

Cobb says Qumulo is gaining customers in several industries, but particularly those in which companies are dealing with large volumes of machine-generated data.

“It’s in areas like media and entertainment with animation and rendering farms, with transcoding; in life sciences with genomic analysis; it’s in telcos and cable companies,” Cobb says. “You can only imagine how much networking gear those guys have in all of their various data centers all over the place.”

Author: Benjamin Romano

Benjamin is the former Editor of Xconomy Seattle. He has covered the intersections of business, technology and the environment in the Pacific Northwest and beyond for more than a decade. At The Seattle Times he was the lead beat reporter covering Microsoft during Bill Gates’ transition from business to philanthropy. He also covered Seattle venture capital and biotech. Most recently, Benjamin followed the technology, finance and policies driving renewable energy development in the Western US for Recharge, a global trade publication. He has a bachelor’s degree from the University of Oregon School of Journalism and Communication.