How Metatomix is Bringing the Semantic Web to Life in Law Enforcement

In 2005, nine-year-old Jessica Lunsford was raped and murdered by a known sex offender who did not inform the police that he had moved to her neighborhood in Homosassa, FL, and who had been released earlier by a state court that did not have access to parts of his criminal record. Florida lawmakers responded to the tragedy with a new law, since copied by many other states, that requires permanent electronic monitoring for many sex offenders and stiffens penalties for those who don’t report their locations. The law also created a task force to recommend better ways for courts and law enforcement agencies to share data on criminal suspects, convicts, and parolees. And now new information-integration technologies being adopted by the Florida court system to meet that task force’s mandate are putting the spotlight on a Dedham, MA, company called Metatomix.

The company’s system, which helps judges and law enforcement officers piece together records drawn from disparate databases managed by competing agencies, is one of the first successful implementations of so-called “semantic Web” technology that World Wide Web inventor Tim Berners-Lee and other computer scientists have been advocating for years. And it’s working, say engineers at Metatomix, because it doesn’t try to transform the entire Web, but is narrowly focused on the specific problem of managing criminal records.

The big idea behind the semantic Web is to tag raw data with detailed descriptions or “metadata” that explain what the data is about and how it should be used; in theory, automated software can then recognize the data and reuse it in more intelligent ways. The project is widely considered laudable but impractical and unlikely to be realized in the near term, in part because it’s so hard to come up with a single, standard framework of descriptions (called an “ontology”) that could be applied to the entire Web.
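To make the idea concrete, here is a minimal sketch of what "tagging raw data with metadata" looks like in practice, using Python's rdflib library and the semantic Web's RDF data model. The namespace and property names below are invented for illustration; they are not Metatomix's actual vocabulary.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Hypothetical justice-domain vocabulary (illustrative only)
JUS = Namespace("http://example.org/justice#")

g = Graph()
offender = URIRef("http://example.org/records/offender/12345")

# Each statement pairs a raw value with a machine-readable description
# of what that value means, so software can recognize and reuse it.
g.add((offender, RDF.type, JUS.RegisteredOffender))
g.add((offender, JUS.homeAddress, Literal("123 Main St, Homosassa, FL")))
g.add((offender, JUS.supervisingAgency, Literal("Citrus County Sheriff")))

print(g.serialize(format="turtle"))
```

The "ontology" is simply the agreed-upon set of terms (RegisteredOffender, homeAddress, and so on) that every participating system uses the same way.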

But programmers are achieving greater success by implementing semantic Web concepts in restricted fields where ontologies are easier to build. And it turns out that tracking criminals is one of those fields.

As criminal defendants, prisoners, and parolees move through the justice system, they leave an electronic trail in the databases of dozens of agencies. What became clear after the Jessica Lunsford murder, and what turned into the biggest challenge for Metatomix, was that these databases often store the data in different ways. Something as simple as a home address, for example, might be represented using three different varieties of metadata (or no metadata at all) by the information systems at agencies such as county sheriff’s departments, drug rehabilitation centers, and state corrections offices.
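Here is a hedged illustration of that mismatch. The agencies, field names, and record layouts below are invented for the sake of the example, but they show how the same home address can look entirely different from one legacy system to the next.

```python
# Three hypothetical agency records for the same person.

sheriff_record = {
    "SUBJ_NAME": "DOE, JOHN",
    "ADDR": "123 MAIN ST HOMOSASSA FL 34446",   # one free-text field
}

rehab_record = {
    "client": "John Doe",
    "residence": {                               # fully structured, nested
        "street": "123 Main St",
        "city": "Homosassa",
        "state": "FL",
        "zip": "34446",
    },
}

corrections_record = {
    "name_last": "Doe",
    "name_first": "John",
    "home_street": "123 Main Street",
    "home_city_state_zip": "Homosassa, FL 34446",  # partially structured
}
```

Nothing in these records tells a computer that ADDR, residence, and home_street all mean the same thing; that is the gap an ontology is meant to close.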

Metatomix’s trick is to study the legacy databases operated by all the various agencies that hold pieces of the puzzle and create an ontology that matches up the data contained in each according to its meaning. When an officer of the court needs to see the data all in one place, Metatomix’s system can pull out the individual pieces and assemble them in a temporary database. As soon as the case is finished, the data is purged; if it’s needed again, it is reassembled from scratch.
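A rough sketch of that assemble-and-purge pattern in plain Python appears below. The mapping tables, field names, and helper functions are hypothetical stand-ins, not Metatomix's actual design, and a real system would reconcile conflicting values by meaning rather than by last-write-wins as this toy does.

```python
# Hypothetical per-agency mappings from legacy field names to shared ontology terms
MAPPINGS = {
    "sheriff":     {"SUBJ_NAME": "subject_name", "ADDR": "home_address"},
    "corrections": {"name_last": "subject_name", "home_street": "home_address"},
}

def normalize(agency, record):
    """Translate one legacy record into the shared ontology's terms."""
    mapping = MAPPINGS[agency]
    return {mapping[field]: value for field, value in record.items() if field in mapping}

def assemble_case_view(sources):
    """Pull the matching pieces from each source into one temporary view."""
    view = {}
    for agency, record in sources:
        view.update(normalize(agency, record))   # later sources overwrite earlier ones
    return view

sources = [
    ("sheriff", {"SUBJ_NAME": "DOE, JOHN", "ADDR": "123 MAIN ST HOMOSASSA FL"}),
    ("corrections", {"name_last": "Doe", "home_street": "123 Main Street"}),
]

case_view = assemble_case_view(sources)
print(case_view)

# When the case is closed, the temporary view is simply discarded;
# if it is needed again, it is rebuilt from the source systems.
del case_view
```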

It’s an elegant solution to an ugly reality—government bureaucracies often have an active disincentive to make their information systems communicate. “At the state and local government level, there is very little information-sharing, because owning information means you have a budget, and with a budget comes power,” explains John Pilkington, Metatomix’s vice president of marketing. “In Florida, they had this guy in custody the day before, and they had all this information but they couldn’t put it together. So they let him go, and he ended up

Author: Wade Roush
