Welcome to Xconomy Seattle’s new Week in Review. Today, we’ll look back at significant happenings from the Seattle area and beyond, in space, virtual reality, and machine intelligence.
The idea is to give you a quick, visual way to digest some of the week’s news. As we all know, dealing with information overload is an ongoing challenge (our erstwhile columnist Wade Roush was back on Xconomy’s pages this week with his excellent method for browsing and sorting through the endless stream in a thoughtful way). We hope this weekly review will be a place you can come for essential and fun innovation news, tailored to our Pacific Northwest readers. Your feedback is always welcome!
—I watched the 2014 movie Interstellar last weekend, so I was primed (sort of) for some spacetime news. But this week brought us a potentially Nobel-worthy blockbuster: LIGO—the Laser Interferometer Gravitational-Wave Observatory, which has one of its twin detector arrays at Hanford, WA—confirmed what Albert Einstein predicted a century ago in his general theory of relativity: there are gravitational waves, distortions in spacetime, rippling across the universe. The specific phenomenon that has popular and scientific press buzzing this week is the result of a collision of two black holes (depicted above), a long time ago (1.3 billion years) in a galaxy far, far away. Their collision emitted an unfathomably large amount of energy, in the form of gravitational waves, which humanity observed on Sep. 14, 2015, for the first time. This is important because it marks a major milestone in a new field of study of the universe, gravitational wave astronomy, which promises possibilities of discovery “as rich and boundless as they have been with light-based astronomy,” LIGO leaders say. So, will gravitational wave astronomy someday inform humanity’s creation of wormholes through which we can travel to a distant universe, as depicted in Interstellar? I have no idea.
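For a sense of just how "unfathomably large" that energy release was, here is a back-of-the-envelope check using the published LIGO estimates for the event (roughly three solar masses of the merging black holes were converted to gravitational-wave energy) and Einstein's E = mc²; the constants below are standard textbook values:

```python
# Back-of-the-envelope: energy radiated by the GW150914 black hole merger.
# LIGO's published estimate: two black holes of roughly 36 and 29 solar
# masses merged, radiating about 3 solar masses' worth of energy.

SOLAR_MASS_KG = 1.989e30   # mass of the sun, in kilograms
C = 2.998e8                # speed of light, in meters per second

radiated_mass_kg = 3 * SOLAR_MASS_KG
energy_joules = radiated_mass_kg * C**2   # E = mc^2

print(f"{energy_joules:.1e} J")   # roughly 5.4e47 joules
```

That works out to about 5 × 10⁴⁷ joules, briefly outshining (in gravitational waves) the combined light output of every star in the observable universe.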
—A little closer to home, the NASA Jet Propulsion Laboratory commissioned a series of travel posters, advertising trips to Mars, the moons of Jupiter, a cloud observatory on Venus, as well as asteroids and exoplanets. Three of these retro-futuristic posters were created by brothers Don and Ryan Clark, designers and illustrators at Seattle-based studio Invisible Creature. The fantastic twist to this story is that the Clarks’ grandfather, Al Paulsen, spent more than 30 years as an illustrator and graphic designer for NASA. “This project was obviously a special one,” Don Clark writes in a blog post.
—In another bit of local space news, Spaceflight, a Seattle-based company providing satellite launch services, said it entered a contract with the U.S. General Services Administration, recognizing it as “a preferred launch services vendor fully authorized to conduct business directly with federal government agencies,” the company said in a news release. It can now provide federal agencies with pre-negotiated, fixed-price services, which should result in quicker, easier small satellite launches, and reduced administrative costs for the agencies, Spaceflight says. The company last fall announced the purchase of a SpaceX Falcon 9 rocket to launch its own “rideshare” mission in 2017, sending a payload of satellites into a sun-synchronous Earth orbit.
Switching gears, the buzz around virtual reality is growing to a roar. Here’s some of our recent coverage:
—Nearpod, which makes course-creation software for teachers, is pushing forward with virtual reality in education, leveraging a low-end, Google Cardboard-like virtual reality viewer. Backed by donors including the John S. and James L. Knight Foundation, Salesforce CEO Marc Benioff, and Krillion Ventures, Nearpod is giving the technology away to schools that can’t afford it. Applications are open through March 31.
—Education is an exciting application of virtual reality technology, and there’s interesting work being done by Foundry10, based in Seattle. The education nonprofit is running a pilot study of virtual reality in the classroom. It conducted a “pre-pilot” survey of some 300 students to gauge their interests, aspirations, and background in technology and virtual reality specifically.
—A Google team in Seattle has been instrumental in creating new technologies for capturing immersive virtual reality video and panoramic images, building up an ecosystem around Google Cardboard, and, perhaps, a more sophisticated head-mounted device. The team is led by Steve Seitz, Google’s “teleportation lead” and a University of Washington computer science professor. Some of his team members added some special effects to the photo of him that ran with our story this week.
—Xconomy also talked to virtual reality filmmakers, including Eric Darnell of Baobab Studios, on the questions this new platform raises for the conventions of cinema.
—Microsoft rolled out a nifty application of computer vision and machine learning that can determine a dog’s breed from a picture. Fetch! popped up repeatedly in my Twitter timeline this week, but not to identify dog breeds. Instead, people were using it to analyze pictures of themselves, answering the age-old question: What dog am I? Your loyal correspondent was ID’d as a Dutch Shepherd: “Smart, athletic, independent-minded.” I’ll take it.
The technology under the hood is from Project Oxford, a collection of machine intelligence services Microsoft released last spring. There are now some 15 services available to developers for use in their apps. They include face recognition, tracking, and verification; emotion recognition; and speech-to-text and back again.
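As a rough illustration of how a developer might wire one of these services into an app, here is a minimal Python sketch of a Project Oxford-style REST call. The endpoint path, header name, and request shape below are assumptions for illustration, not lifted from Microsoft's documentation, so check the official API reference before relying on any of them:

```python
# Illustrative sketch of preparing a Project Oxford-style REST request.
# The endpoint URL and request/response shapes here are assumptions for
# demonstration purposes only -- consult the official API docs for the
# real paths, parameters, and authentication details.
import json

def build_emotion_request(image_url, subscription_key,
                          endpoint="https://api.projectoxford.ai/emotion/v1.0/recognize"):
    """Prepare (but do not send) an HTTP request for emotion recognition."""
    return {
        "url": endpoint,
        "headers": {
            "Content-Type": "application/json",
            # Callers authenticate with a per-account subscription key header.
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        # The image to analyze is passed by URL in the JSON body.
        "body": json.dumps({"url": image_url}),
    }

req = build_emotion_request("https://example.com/dog.jpg", "YOUR_KEY_HERE")
print(req["url"])
```

Actually sending the request would be one more line with a library such as `requests` (e.g. `requests.post(req["url"], headers=req["headers"], data=req["body"])`), with the service returning its analysis as JSON.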
And the machine learning / artificial intelligence news continues unabated. This week, Xconomy covered:
—Diffbot’s $10 million Series A funding round, which will help the Palo Alto, CA-based startup continue cataloging the Web, collecting facts to power a variety of services such as monitoring customer feedback, price comparisons, competitive intelligence, and more.
—DataRobot’s $33 million Series B round, which brings total investment in the Boston-area company, a maker of machine learning and data-science tools for businesses, to more than $57 million.
—And Vast’s $14 million funding round. The Austin, TX-based company is using machine learning to power a search engine for cars that makes recommendations based on what a buyer wants to use a vehicle for, rather than typical search parameters like make and model.