Full Length Research Paper
Abstract
This paper presents a mobile humanoid robot platform that can understand human speech commands in teleoperation environments. To provide service in unstructured environments, the robot must operate efficiently with an active auditory perception system that ensures a coordinated human-robot system. First, the speech-based teleoperation control is introduced: cameras mounted on the robot transmit video to the user over a wireless network, while the user sends speech commands through the same communication channel to drive the robot to complete the desired task. To overcome the effect of time delay in the communication channel, the authors incorporate event-based motion control into the robot controller, which drives the system to achieve the best possible motion. Finally, the event-based speech teleoperation is experimentally implemented and verified on a human-like mobile robot.
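The core idea of event-based teleoperation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: all class and method names (`EventBasedTeleop`, `receive`, `step`) and the one-dimensional pose model are assumptions for the sketch. The key point is that the robot's motion is indexed by an event counter (one consumed command per event) rather than by wall-clock time, so variable network delay changes only *when* a motion step occurs, not *what* motion results.

```python
from collections import deque

class EventBasedTeleop:
    """Hypothetical sketch of an event-based teleoperation loop.

    The robot advances one motion step per *event* (a received command),
    so the executed trajectory is a function of the event index, not of
    the (delay-corrupted) arrival times of the commands."""

    def __init__(self):
        self.event_index = 0     # event counter replacing time as the reference
        self.pose = 0.0          # 1-D robot position, for illustration only
        self.pending = deque()   # commands arriving over the delayed channel

    def receive(self, speech_command):
        # A parsed speech command, e.g. "forward" or "backward",
        # is queued whenever it arrives, however late.
        self.pending.append(speech_command)

    def step(self):
        # One event: consume exactly one command. If delay has left the
        # queue empty, the robot simply holds its pose until the next
        # command arrives, instead of acting on stale timing.
        if not self.pending:
            return False
        cmd = self.pending.popleft()
        delta = {"forward": 0.1, "backward": -0.1}.get(cmd, 0.0)
        self.pose += delta
        self.event_index += 1
        return True
```

Under this scheme the same command sequence always produces the same motion, regardless of how the wireless channel delays individual commands, which is the sense in which event-based control achieves the "best possible motion" despite delay.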
Key words: Human-like mobile robot, speech recognition, teleoperation, event-based control.
Copyright © 2024 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0