Occipital, the San Francisco-based spatial computing startup, first planted its flag in computer vision tech five years ago when it released a plug-in sensor, billed as the first 3D sensor created for mobile devices such as the iPad. Called Structure Sensor, it allows users—mostly software developers and gamers—to capture 3D models or images in order to map the surrounding area.
Occipital announced this week that it has released a second sensor, called Structure Core. According to Adam Rodnitzky, Occipital’s vice president of marketing, Structure Core differs from the original product in that it is more self-contained, doesn’t need to be attached to an iPad, and has a “very sophisticated optical package—cameras and sensors that allow it to work in a number of different ways.”
Structure Sensor was designed primarily for users with fairly standard 3D scanning needs, but Structure Core is aimed at drone, virtual reality/augmented reality, and robotic applications.
Occipital’s Structure Sensor technology hinges on an image-capture technique called structured light, which involves projecting a known pattern of dots onto the scene being scanned. The dot pattern is calibrated to minimize distortion and create geometrically dense 3D images. Structure Core incorporates structured light projection along with infrared and wide-vision cameras and onboard signal processing, all packed into a pocket-sized piece of hardware. Structure Core can also be used without the laser projector, instead relying on triangulation between its cameras to create an image, and has a built-in inertial measurement unit (IMU) for motion capture.
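The triangulation underlying both approaches comes down to similar triangles: a projector (or second camera) sits a known baseline apart from the camera, and the pixel offset at which a feature appears determines its depth. A minimal sketch, with hypothetical numbers rather than Structure Core's actual optics:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (in meters) of a point by triangulation.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between camera and projector (or second camera)
    disparity_px -- pixel offset between expected and observed dot position
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Similar triangles: depth / baseline = focal / disparity
    return focal_px * baseline_m / disparity_px

# Illustrative values only: 600 px focal length, 8 cm baseline, 12 px disparity
depth = depth_from_disparity(600.0, 0.08, 12.0)  # 4.0 meters
```

The formula also makes the 15-foot range limit mentioned below intuitive: as depth grows, disparity shrinks toward sub-pixel values, and depth resolution degrades quickly.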
“We can take inputs from the IMU, which is movement-based, and cameras, which are vision-based, for AR/VR motion tracking,” Rodnitzky says. Combining these inputs improves the quality of the AR/VR motion tracking, he adds.
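One common way to combine the two signals (a generic illustration, not Occipital's proprietary tracker) is a complementary filter: the IMU's integrated motion is smooth but drifts over time, while the vision-based estimate is drift-free but noisy, so each step blends the two. A sketch for a single rotation axis:

```python
def complementary_filter(angle: float, gyro_rate: float, vision_angle: float,
                         dt: float, alpha: float = 0.98) -> float:
    """One fusion step for a single rotation axis (radians).

    angle        -- previous fused estimate
    gyro_rate    -- IMU angular velocity (rad/s), smooth but drift-prone
    vision_angle -- camera-derived absolute angle, noisy but drift-free
    dt           -- time step in seconds
    alpha        -- trust placed in the IMU prediction (0..1)
    """
    predicted = angle + gyro_rate * dt                      # dead-reckon from the IMU
    return alpha * predicted + (1.0 - alpha) * vision_angle  # correct with vision

# Starting at 0 rad: gyro reports 1 rad/s over 0.1 s, vision reports 0.12 rad
fused = complementary_filter(0.0, 1.0, 0.12, 0.1)  # 0.1004 rad
```

Production trackers typically use an extended Kalman filter over full 6-DoF pose rather than this per-axis blend, but the principle is the same: fast inertial prediction corrected by slower visual observations.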
These days, many of the 3D sensing innovations we hear about are related to the development of autonomous vehicle (AV) technologies. Rodnitzky says that while the company isn’t ruling out future AV uses, the new sensors have a range restriction of about 15 feet—not very practical for vehicles that need to see hundreds of feet around them. He says Occipital’s products might have a role one day in powering the interior functions of AVs, such as gesture control.
The 65-employee company was co-founded by two University of Michigan alumni and has additional offices in Boulder, CO, and Gainesville, FL. Occipital has a number of investors, including Intel Capital and Foundry Group, and sees the release of Structure Core as a turning point.
“Five years after releasing Structure Sensor, the world knows us as the maker of sensors for iPads, so people building robots and drones might not know about us,” Rodnitzky says. “When we think historically about how we’re perceived as a company, there are a few points where it has changed radically. This is one of those points.”
What excites the company is being able to “play a bit more as an OEM [original equipment manufacturer] instead of making sensors for consumers,” he adds. To that end, Structure Core’s first product integration has been with partner Misty Robotics. Other companies Occipital works with include headset maker Kura AR; Fit3D, which sells fitness trackers; and AutoModality, which makes drones for use in industrial inspections.
In 2019, Occipital plans to advance its perception software for device OEMs, improving positional tracking for VR/AR headsets, drones, and robots. It also plans to expand its Canvas division, which is doing “a lot of work with interior designers” in an effort to bring the product to consumers.