In my latest novel, Silver Rivers, high-tech archeologists use a robotic airship to explore an underground city that has already claimed the lives of three workers who tried to enter it. The problem is that the atmosphere in the enclosed space has been depleted of oxygen and poisoned with mercury. It's the archetypal 3-Ds (dull, dirty, and dangerous) scenario, and here the task at hand is both dirty and dangerous.
Each day, the scientists assign the robot to fly over a certain section of the city, recording video from six cameras arranged in an array mounted under the airship's hull. A supercomputer then analyzes the data and computes a three-dimensional "point cloud" providing a pointillist picture of that part of the city. The scientists then use a virtual-reality system to "fly through" the point cloud and see what they want to see.
In the real world, researchers at the Jacobs School of Engineering at the University of California, San Diego have built a similar system for an even more practical purpose. In a June 5 announcement, Y. Chen et al. described FFR, a firefighting scout robot intended to autonomously explore burning buildings. Using stereo vision, infrared imaging, and other sensors, the system is designed to characterize the state of the fire, including temperatures, volatile gases, and the building's structural integrity, while looking for survivors (presumably in need of rescue).
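Stereo vision builds those 3-D points by triangulation: the same feature appears at slightly different horizontal positions in the left and right images, and that disparity, together with the camera's focal length and the baseline between lenses, yields depth. A minimal sketch follows; the function name and all the camera parameters and pixel coordinates are hypothetical illustrations, not values from the FFR system.

```python
# Minimal sketch of stereo triangulation, the basic step behind turning
# paired camera images into 3-D points. All numbers are hypothetical.

def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Convert one matched pixel pair into a 3-D point in the camera frame.

    u_left, u_right: horizontal pixel coordinates of the same feature
    in the left and right images; v: shared vertical coordinate;
    cx, cy: the principal point (image center) in pixels.
    """
    disparity = u_left - u_right      # shift between the two views, in pixels
    if disparity <= 0:
        raise ValueError("nearer features must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth along the optical axis
    x = (u_left - cx) * z / focal_px        # horizontal offset in meters
    y = (v - cy) * z / focal_px             # vertical offset in meters
    return (x, y, z)

# Example: 700 px focal length, 10 cm baseline, principal point at (320, 240).
point = triangulate(u_left=350, u_right=340, v=240,
                    focal_px=700.0, baseline_m=0.10, cx=320.0, cy=240.0)
# A 10-pixel disparity places the feature about 7 m from the cameras.
```

Repeating this for every matched feature across all camera pairs is what populates the point cloud; the expensive part is finding the matches, not the arithmetic.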
Unlike Bertha, the fictional cave-exploring airship of my novel, FFR builds a sparse point cloud from a reduced data set that is much faster to calculate. The city Bertha was exploring has been undisturbed for two millennia. It's not going anywhere, so taking 24 hours to calculate the point cloud isn't a problem. In a burning building, a delay of that magnitude is far too slow. Hence, capturing just enough points to paint a rough picture and reporting them in as close to real time as possible is the top priority.
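One common way to get such a sparse cloud, sketched below, is voxel-grid downsampling: divide space into cubic cells and keep a single representative point (here, the centroid) per occupied cell. This is a generic illustration of the idea, not the article's method; the cell size and sample points are made up.

```python
# Minimal sketch of voxel-grid downsampling: collapse every point inside
# a cubic cell to that cell's centroid. Cell size and data are hypothetical.
from collections import defaultdict

def voxel_downsample(points, cell=0.5):
    """Return one centroid per occupied grid cell of side `cell` meters."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))  # cell index
        buckets[key].append((x, y, z))
    # Average each cell's points coordinate-by-coordinate.
    return [tuple(sum(coord) / len(pts) for coord in zip(*pts))
            for pts in buckets.values()]

dense = [(0.1, 0.1, 0.1), (0.2, 0.3, 0.1), (2.0, 2.0, 2.0)]
sparse = voxel_downsample(dense, cell=0.5)
# The first two points share a cell and merge; the third stands alone,
# so three input points become two output points.
```

The trade-off is exactly the one described above: a coarser grid means fewer points to compute and transmit, at the cost of a rougher picture of the scene.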
In the June 5 article, the authors present a rather neat video showing FFR at work. The beast looks and moves like a miniature Segway, but includes an elevator feature that allows it to climb and descend stairs. The cameras and sensors mount on a board extending upward between the drive wheels, like the Segway's steering handle. We look forward to watching further progress in this really neat project.
C.G. Masi has been blogging about technology and society since 2006. In a career spanning more than a quarter century, he has written more than 400 articles for scholarly and technical journals, and six novels dealing with automation’s place in technically advanced society. For more information, visit www.cgmasi.com.