Rescue Robotics

Rescue Robotics employs advanced technology for disaster response, mitigating risks to human responders. Robots such as Quince and Kenaf traverse difficult terrain and damaged structures, gathering crucial data for recovery efforts. Initiatives like RoboCup Rescue promote advancements in the field, while specialized robots such as the Active Scope Camera and SAR Robo-dogs provide unique capabilities. The use of Micro Aerial Vehicles further expands the scope of exploration and damage assessment, affirming the invaluable role of robotics in disaster response.

Mobile Robots

Mobile Robots are integral to next-generation vehicle technologies, requiring expertise in motion planning, 3D measurement, environment recognition, and stochastic position estimation. The NEDO Intelligent RT Software Project develops adaptable autonomous robots using RT-Middleware, enabling easy reconfiguration of sensors and motors. The Tsukuba Challenge provides a real-world testbed for autonomous navigation with these robots. Additionally, the development of autonomous electric vehicles for public roadways showcases progress in the field. Research is also underway to improve the maneuverability and safety of cycling wheelchairs for enhanced rehabilitation.
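Stochastic position estimation is commonly realized with a particle filter (Monte Carlo localization). The following is a minimal one-dimensional sketch of one predict-update-resample cycle, not the project's actual implementation; the function name, noise levels, and the single-landmark range sensor are illustrative assumptions.

```python
import math
import random

def monte_carlo_localization(particles, control, measurement, landmark, noise_std=0.5):
    """One predict-update-resample cycle of a 1-D particle filter (illustrative).

    particles   : list of candidate positions (floats)
    control     : commanded displacement since the last step
    measurement : observed range to a known landmark
    landmark    : landmark position on the same axis
    """
    # Predict: move every particle by the control input plus motion noise.
    moved = [p + control + random.gauss(0.0, 0.1) for p in particles]

    # Update: weight each particle by how well it explains the measurement.
    def likelihood(p):
        err = measurement - abs(landmark - p)
        return math.exp(-err * err / (2 * noise_std ** 2))

    weights = [likelihood(p) for p in moved]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]

    # Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
# Robot near position 2.0 moves +1.0; landmark at 8.0, so it measures a range of ~5.0.
particles = monte_carlo_localization(particles, control=1.0, measurement=5.0, landmark=8.0)
estimate = sum(particles) / len(particles)
```

After a single cycle the particle cloud concentrates near the only position consistent with both the motion and the range reading (about 3.0 here); repeating the cycle as the robot moves keeps the estimate locked on.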

Real World Recognition

Real World Recognition in robotics facilitates human support in everyday environments and disaster sites. This involves sensing and recognizing surroundings to guide decision-making. The SAKIGAKE project focuses on creating a database for autonomous robots by gathering detailed information about unknown objects based on visual and motion cues. Using a 3-D laser scanner mounted on a tracked vehicle, we have developed a 3-D mapping method for dense shape measurement. This method, further improved through collaboration with Japanese firefighters, supports effective navigation and decision-making in diverse real-world scenarios.
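At the core of any such mapping pipeline is registering each scan into a common map frame using the vehicle's pose. Below is a simplified 2-D sketch of that registration step under assumed conventions (counter-clockwise heading, metres); the real method works with full 3-D scans and is more involved.

```python
import math

def scan_to_world(points, pose):
    """Transform 2-D scan points from the sensor frame into the map frame.

    points : list of (x, y) tuples in the sensor frame
    pose   : (tx, ty, theta) -- vehicle position and heading in the map frame
    """
    tx, ty, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    # Rotate each point by the vehicle heading, then translate by its position.
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# A point 1 m ahead of a vehicle at (2, 3) facing +90 degrees lands at (2, 4).
world = scan_to_world([(1.0, 0.0)], (2.0, 3.0, math.pi / 2))
```

Accumulating transformed scans from successive poses yields the dense map; in practice the pose itself is refined by aligning overlapping scans.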


The research focuses on representing artificial tactile feeling as a means to enhance virtual reality technologies and other remote interactive applications. This entails investigating methods for presenting qualitative surface textures, such as those of clothes, through vibratory stimuli applied to the fingers, and understanding human tactile information processing. The researchers have developed an ICPF tactile display that replicates multiple tactile feelings, along with methods to synchronize tactile displays with hand motions. Their work includes a system that lets users virtually interact with a 3D model of a pig while experiencing various tactile sensations. The ultimate objective of this research is to develop technologies that enhance sensory perception and support motor functions, with potential applications in safe and easy-to-use walking support systems, prosthetic limbs, and efficient rehabilitation.
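A basic principle behind vibrotactile texture presentation is that sliding a finger over a periodic surface produces a vibration whose frequency is the sliding speed divided by the texture's spatial period. The sketch below illustrates only that frequency relationship; the parameters and the pure-sine waveform are illustrative assumptions, not the lab's display hardware or signal design.

```python
import math

def texture_vibration(speed, spatial_period, duration, sample_rate=1000):
    """Synthesize a vibrotactile waveform for a periodic surface texture.

    Sliding over ridges spaced `spatial_period` metres apart at `speed` m/s
    excites a vibration at frequency speed / spatial_period (Hz).
    """
    freq = speed / spatial_period
    n = int(duration * sample_rate)
    # A pure sinusoid at the texture frequency, sampled at `sample_rate` Hz.
    return [math.sin(2 * math.pi * freq * t / sample_rate) for t in range(n)]

# 0.05 m/s over 1 mm ridges -> 50 Hz, well within vibrotactile sensitivity.
wave = texture_vibration(speed=0.05, spatial_period=0.001, duration=0.1)
```

Coupling the sliding speed in this formula to measured hand motion is one way a display can stay synchronized with the user's exploratory movements.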


The mechanism research team moved to Osaka University in April 2024.

Osaka University Tadakuma Laboratory: