Robotics

From robot navigation to multimedia information acquisition by remote robots

Development of a Networked Robotic System for Disaster Mitigation
The Horizontal fixed viewpoints Biconical Paraboloidal (HBP) mirror is an anisotropic convex mirror whose angular resolution varies with the azimuth angle. In this paper, we propose a remote surveillance system with a specially designed HBP mirror for collecting information about devastated areas. We used a virtual remote surveillance environment to investigate the effectiveness of the HBP mirror system compared with conventional omnidirectional imaging systems. The results of object-searching experiments confirmed that objects can be detected earlier and more reliably with the HBP mirror system. We also developed a real remote surveillance system for actual surveillance experiments. Through these two types of experiments, we confirmed that the HBP mirror system works effectively for remote surveillance.
  • Publications
    1. Keiji Nagatani, Kazuya Yoshida, Kiyoshi Kiyokawa, Yasushi Yagi, Tadashi Adachi, Hiroaki Saitoh, Toshiya Suzuki, Osamu Takizawa, "Development of a Networked Robotic System for Disaster Mitigation - System Description of Multi-robot System and Report of Performance Tests -", In Proceedings of the 6th International Conference on Field and Service Robotics, pp.333-342, 2007.
    2. Kazuaki Kondo, Yasuhiro Mukaigawa, Toshiya Suzuki, Yasushi Yagi, "Evaluation of a HBP mirror system for remote surveillance", In Proc. of IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 11-13, 2006.
    3. Kazuya Yoshida, Keiji Nagatani, Kiyoshi Kiyokawa, Yasushi Yagi, Tadashi Adachi, Hiroaki Saitoh, Hiroyuki Tanaka, Hiroyuki Ohno, "Development of a Networked Robotic System for Disaster Mitigation - Test Bed Experiments for Remote Operation Over Rough Terrain and High Resolution 3D Geometry Acquisition -", In Proc. of the 5th International Conference on Field and Service Robotics, North Queensland, Australia, July 29-31, 2005.
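
For context, the sketch below shows how a catadioptric omnidirectional image is commonly unwarped into a cylindrical panorama for a remote operator. It is only a minimal illustration under simplifying assumptions: a circular mirror image centered at `center` and a plain linear radius-to-row mapping. The actual HBP projection model, with its azimuth-dependent resolution, is the one defined in the publications above, and every name and parameter here is illustrative.

    import numpy as np

    def unwarp_to_panorama(omni_img, center, r_min, r_max, out_w=720, out_h=180):
        # Map each panorama pixel (row, col) back to a pixel on the circular
        # mirror image: columns correspond to azimuth, rows to image radius.
        cx, cy = center
        azimuth = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
        radius = np.linspace(r_max, r_min, out_h)   # outer ring maps to the top row
        rr, aa = np.meshgrid(radius, azimuth, indexing="ij")
        src_x = np.clip(np.round(cx + rr * np.cos(aa)).astype(int), 0, omni_img.shape[1] - 1)
        src_y = np.clip(np.round(cy + rr * np.sin(aa)).astype(int), 0, omni_img.shape[0] - 1)
        return omni_img[src_y, src_x]               # nearest-neighbor sampling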



Non-isotropic Omnidirectional Imaging System for an Autonomous Mobile Robot
A real-time omnidirectional imaging system that can acquire an omnidirectional field of view at video rate using a convex mirror has been applied under a variety of conditions. The imaging system consists of an isotropic convex mirror and a camera pointing vertically toward the mirror, with its optical axis aligned with the mirror's optical axis. With these optics, the angular resolution is independent of the azimuth angle. However, it is important for a mobile robot to find and avoid obstacles in its path, so the angular resolution in the robot's direction of motion needs to be higher than that in its lateral view. In this paper, we propose a non-isotropic omnidirectional imaging system for navigating a mobile robot.
  • Publications
    1. Kazuaki Kondo, Yasushi Yagi, Masahiko Yachida, "Non-isotropic Omnidirectional Imaging System for an Autonomous Mobile Robot", In Proc. 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, April 18-22, 2005.
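
The central idea, an angular resolution that depends on azimuth, can be illustrated with the toy profile below. The real profile is fixed by the mirror shape derived in the publication above; the cosine weighting, the parameter values, and the function name are assumptions chosen only to show that an obstacle ahead of the robot occupies more image pixels than the same obstacle to the side.

    import numpy as np

    def angular_resolution(azimuth, base=0.5, forward_gain=1.5):
        # Illustrative pixels-per-degree profile: peaks at azimuth = 0
        # (the robot's direction of motion) and falls off to the sides.
        return base * (1.0 + forward_gain * np.maximum(np.cos(azimuth), 0.0))

    # A 10-degree-wide obstacle subtends more image pixels ahead than sideways.
    for deg in (0, 45, 90, 180):
        phi = np.radians(deg)
        print(f"azimuth {deg:3d} deg -> {10.0 * angular_resolution(phi):.1f} px")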



Real Time 3D Environment Modeling for a Mobile Robot by Aligning Range Image Sequences
This paper describes real-time 3D modeling of the environment for a mobile robot. A real-time laser range finder is mounted on the robot and obtains a range image sequence of the environment while the robot moves around. In this paper, we detail our method, which accomplishes simultaneous localization and 3D modeling by aligning the acquired range images. The method incrementally aligns range images in real time using a variant of the iterative closest point (ICP) method. By estimating the uncertainty of range measurements, we introduce a new weighting scheme into the alignment framework. In our experiments, we first evaluate the accuracy of the localization results obtained by aligning range images. Second, we show the results of modeling and localization when the robot moves along a meandering path. Finally, we summarize the conditions and limitations on the robot's motion and the environment required for our method to work well.
  • Publications
    1. Ryusuke Sagawa, Nanaho Osawa, Tomio Echigo, Yasushi Yagi, "Real Time 3D Environment Modeling for a Mobile Robot by Aligning Range Image Sequences", In Proc. British Machine Vision Conference 2005, vol.1, pp.330-339, Oxford, UK, September, 2005.
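
A minimal sketch of the core alignment step is given below. It implements a generic weighted point-to-point ICP update with an SVD solution for the rigid motion, using inverse-variance weights as one plausible choice; the paper's actual ICP variant, uncertainty model, and real-time optimizations are described in the publication above, so this should be read as an illustration rather than the authors' implementation.

    import numpy as np
    from scipy.spatial import cKDTree

    def weighted_icp_step(src, dst, weights):
        # One iteration of weighted point-to-point ICP: find nearest-neighbor
        # correspondences, then solve for the rigid motion that minimizes the
        # weighted squared distances (SVD / Kabsch solution).
        _, idx = cKDTree(dst).query(src)
        matched = dst[idx]

        # Weighted centroids of the two matched sets.
        w = weights / weights.sum()
        mu_s = (w[:, None] * src).sum(axis=0)
        mu_d = (w[:, None] * matched).sum(axis=0)

        # Weighted cross-covariance and optimal rotation.
        H = ((src - mu_s) * w[:, None]).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        return R, t

    # Example use: weight each point by the inverse variance of its range reading,
    # then iterate to register a new range image (src) to the model (dst).
    # src: (N, 3), dst: (M, 3), sigma: (N,) range standard deviations.
    # for _ in range(20):
    #     R, t = weighted_icp_step(src, dst, 1.0 / sigma**2)
    #     src = src @ R.T + t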