Algorithmia Lands In-Q-Tel Deal, Adds Deep Learning Capabilities

Algorithmia, which runs a public marketplace for algorithms, has just landed a deal to provide a private algorithm-sharing platform for the U.S. intelligence community.

The deal with In-Q-Tel, which invests in and procures new technologies for intelligence agencies, comes on the heels of a significant upgrade in capabilities for Algorithmia’s primary business of brokering access to algorithms, the step-by-step computational procedures that underpin modern apps, through a marketplace open to anyone. The company implemented upgrades to handle the data- and compute-heavy requirements of trained deep learning models, which have largely been the domain of tech giants and academic researchers.

In-Q-Tel will have access to Algorithmia’s Codex platform—essentially a private version of the broader Algorithmia marketplace, which is based on public cloud computing infrastructure.

“We knew from day one that not all businesses or organizations were going to be able to take advantage of that, either because of security concerns or because it’s just not part of their policies,” says Diego Oppenheimer, CEO of the Seattle-based startup.

But these organizations still have a lot of algorithmic intellectual property that needs to be shared among developers internally. Leading tech companies have their own methods for doing this. They set up central code repositories where developers across the company can tap into proprietary algorithms through APIs. For other large enterprises, Algorithmia offers a central algorithm repository-as-a-service. In-Q-Tel is not the company’s first enterprise customer, but it is by far its largest and highest-profile, Oppenheimer says, calling it a “huge validation of the technology behind our platform.”

Algorithmia took a major step forward with its platform last week when it began hosting and distributing trained deep learning models.

Deep Learning Models

Think of these as vastly more complicated versions of a financial model you’d build in a spreadsheet. Deep learning models take inputs, such as a batch of black-and-white photos, and produce an output such as smart colorization of those photos. Deep learning models must be trained on vast amounts of data.
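To make that concrete, here is a minimal sketch of calling a hosted, pre-trained colorization model through Algorithmia’s Python client. The algorithm path and input format shown are illustrative assumptions, not confirmed details of any particular marketplace listing.

    import Algorithmia

    # An API key is issued per Algorithmia account; the value here is a placeholder.
    client = Algorithmia.client("YOUR_API_KEY")

    # Hypothetical path for a colorization model; real names and versions
    # are listed in the Algorithmia marketplace.
    algo = client.algo("deeplearning/ColorfulImageColorization")

    # The trained model runs on Algorithmia's cloud infrastructure; the caller
    # supplies an input (a black-and-white photo) and reads back the output.
    response = algo.pipe({"image": "https://example.com/old-photo-bw.jpg"})
    print(response.result)  # e.g., a reference to the colorized image

None of the training burden falls on the caller: the vast-data, heavy-compute work has already been done by whoever built and trained the model.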

A branch of machine learning, deep learning is not a new concept. Researchers have been working on it, though not always under today’s buzzwords, for decades. Some deep learning techniques attempt to replicate in machines the way the human brain responds to stimuli and learns new things. The approach has long been possible in theory; only recently have advances in data availability and computing power made it practical.

Capabilities such as facial recognition, Google DeepMind’s AlphaGo, and chatbots that can respond to natural language input all use elements of deep learning.

Microsoft Azure, Amazon Web Services, Google Cloud and other public cloud providers rent out computing power at massive scale, lately including graphics processing units (GPUs), chips originally designed for gaming and video that handle deep learning workloads much faster than standard processors. And, of course, the world is accumulating digital data of all kinds at rates never seen before.

One ingredient in the deep learning recipe remains scarce: the talent to build, train, and implement the models themselves.

“As exciting and amazing as it is, [deep learning] has been essentially monopolized by these extremely deep-pocketed technical companies that have the ability to hire the very specialized people that are needed to build these models and deploy them at massive scale so anybody can use them,” Oppenheimer says.

Algorithmia co-founders Kenny Daniel, left, and Diego Oppenheimer.

Academic researchers, at least those who haven’t been enticed into the arms of one of the tech giants, are pushing the state of the art in deep learning. But the traditional channels available to them for distributing their work have drawbacks, Oppenheimer says: academic journals reach a relatively limited audience, and while open-source code repositories such as GitHub are available, code published there demands much more work from a developer interested in implementing a deep learning model.

Algorithmia is designed to streamline the process of finding, testing, and implementing algorithms of all kinds. It’s essentially an app store for algorithms, paired with cloud infrastructure on which to run them, with multiple models for compensating algorithm creators, depending on their individual motivations.
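As a rough illustration of that app-store model, a few lines against Algorithmia’s Python client show the uniform calling convention. The “demo/Hello” path is the sample algorithm used in the company’s own tutorials; treat the details as illustrative.

    import Algorithmia

    # Each account gets an API key; the value here is a placeholder.
    client = Algorithmia.client("YOUR_API_KEY")

    # Every algorithm in the marketplace is addressed the same way:
    # author/name, optionally pinned to a version.
    hello = client.algo("demo/Hello")
    response = hello.pipe("world")

    print(response.result)    # "Hello world"
    print(response.metadata)  # includes compute duration, the basis for usage-based billing

That uniform interface is what lets a caller swap one algorithm for another, or chain several together, without standing up any infrastructure of their own.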

Academic Impact

Oppenheimer believes Algorithmia’s new capabilities will give academic researchers a broader channel for distributing their deep learning models, one that reaches beyond journals and raw code repositories and puts trained models directly into developers’ hands.

Author: Benjamin Romano

Benjamin is the former Editor of Xconomy Seattle. He has covered the intersections of business, technology and the environment in the Pacific Northwest and beyond for more than a decade. At The Seattle Times he was the lead beat reporter covering Microsoft during Bill Gates’ transition from business to philanthropy. He also covered Seattle venture capital and biotech. Most recently, Benjamin followed the technology, finance and policies driving renewable energy development in the Western US for Recharge, a global trade publication. He has a bachelor’s degree from the University of Oregon School of Journalism and Communication.