On control of robots in remote workspaces using lasers

Title: On control of robots in remote workspaces using lasers
Summary: Supervised autonomy for controlling robots at remote locations. Lasers and ultrasonic ranging to be used.
Keywords: Mechatronics, laser pointers, ultrasonic ranging, supervised autonomy, “man-in-the-feedback-loop”, IMU.
TimeFrame: October 2017 to June 2018, with possible extension to September 2018
References: In the video clip http://mobile-robotics.com/ragvald.php, a rate gyro is used to stabilize the heading of a mini-UGV prototype.
Prerequisites:
Author:
Supervisor: Björn Åstrand, Jennifer David
Level: Master
Status: Open


Objective and research questions: The goal is to build a mobile robot and use it to test how robots can be controlled relative to objects in a remote workspace. Sensors aboard the robot are lasers, rate gyros/accelerometers, and an ultrasonic ranger. The driver's commands to the robot should be given remotely using a navigation camera and laser pointers. The long-term goal of the project is to “understand fundamental limits” of tele-robotics, supported by models and tests.

Background: To illustrate the technology, consider the video sequence at http://www.mobile-robotics.com/ugv/. It shows a rate-gyro-stabilized 4-wheel mini-ATV driving on a surface of gravel and stones. The driver determines the direction of the vehicle by pointing the navigation camera. Gyro-switched 90° turns and U-turns are also shown. A second video illustrates an instability of the “phase-windup” type.
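To make the gyro-stabilized heading concrete, here is a minimal heading-hold sketch for a differential-drive base: the yaw-rate gyro is integrated into a heading estimate and the heading error is fed back as a differential wheel-speed command. The helpers read_gyro_z and set_wheel_speeds are assumed interfaces, not part of the project hardware described here, and the gains are placeholders.

```python
import time

def heading_hold(read_gyro_z, set_wheel_speeds, target_heading=0.0,
                 base_speed=0.3, kp=1.5, dt=0.02):
    """Hold a fixed heading by integrating the yaw-rate gyro and feeding
    the heading error back as a differential wheel-speed command.
    read_gyro_z() -> yaw rate in rad/s; set_wheel_speeds(left, right)
    are hypothetical hardware interfaces."""
    heading = 0.0
    while True:
        heading += read_gyro_z() * dt            # integrate yaw rate into a heading estimate
        correction = kp * (target_heading - heading)
        set_wheel_speeds(base_speed - correction,   # slow one wheel,
                         base_speed + correction)   # speed up the other
        time.sleep(dt)
```

In this picture, a gyro-switched 90° turn or U-turn amounts to stepping target_heading by ±π/2 or π while the same loop keeps running.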

The robot is to be controlled with three laser pointers on board the robot:
- a laser fixed on the robot that indicates the heading of the robot,
- a laser pointer with which the robot indicates where it will drive under the current control command; this acts as a virtual steering wheel,
- a laser pointer that the driver uses to specify where the robot should move.
Feedback to the driver is via a forward-looking navigation camera.
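As an illustration of how the driver's laser spot might be found in the navigation-camera image and turned into a steering command, here is a minimal OpenCV sketch. It assumes a red laser pointer and a plain colour threshold; the HSV ranges, the gain, and the function names are assumptions that would have to be adapted to the actual camera and pointers.

```python
import cv2

def find_laser_spot(frame_bgr):
    """Return the (x, y) image position of the laser spot, or None.
    Assumes a red laser pointer and a simple HSV colour threshold;
    the ranges below are guesses that would need tuning."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges.
    mask = cv2.bitwise_or(cv2.inRange(hsv, (0, 100, 200), (10, 255, 255)),
                          cv2.inRange(hsv, (170, 100, 200), (180, 255, 255)))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                      # no spot visible in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def steering_from_spot(spot, frame_width, gain=1.0):
    """Map the spot's horizontal offset from the image centre to a
    normalized steering command in [-1, 1]."""
    if spot is None:
        return 0.0
    offset = (spot[0] - frame_width / 2.0) / (frame_width / 2.0)
    return max(-1.0, min(1.0, gain * offset))
```

The same detection could in principle serve all three pointers, provided they are distinguishable (for example by colour or by blinking pattern).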

The robot's hardware is two servo motors with wheel encoders. This motor package will be the "driving front axle" of the mobile robot. The motors with wheel encoders and their drive electronics are ready. A Raspberry Pi with a sensor board (Sense HAT) and the camera is mounted on the robot.
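A minimal sketch of how the on-board sensing could be read and combined into a pose estimate, assuming the Python sense_hat library on the Raspberry Pi; the wheel radius, wheel base, and encoder resolution below are placeholders, not values from the project description.

```python
import math
from sense_hat import SenseHat   # Raspberry Pi Sense HAT library

# Placeholder geometry; measure these on the actual robot.
WHEEL_RADIUS = 0.05       # m
WHEEL_BASE = 0.30         # m, distance between the two driven wheels
TICKS_PER_REV = 1024      # encoder resolution (assumption)

sense = SenseHat()

def yaw_rate():
    """Yaw rate in rad/s from the Sense HAT gyroscope."""
    return sense.get_gyroscope_raw()["z"]

def odometry_step(pose, d_ticks_left, d_ticks_right):
    """Update (x, y, theta) from the encoder tick increments of each wheel."""
    x, y, theta = pose
    dl = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    ds = (dl + dr) / 2.0                  # distance travelled by the robot centre
    dtheta = (dr - dl) / WHEEL_BASE       # change in heading
    return (x + ds * math.cos(theta + dtheta / 2.0),
            y + ds * math.sin(theta + dtheta / 2.0),
            theta + dtheta)
```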

Project steps

WP1: Literature review and basic motion – getting started. The two motors form a “wheelchair” robot. The first test is to drive it along a figure-eight trajectory. A laser pointer on an RC servo is to be used as the virtual steering wheel. A caster wheel provides balance.
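One way to drive the figure-eight test is to command two circles of opposite turning direction back to back, using the standard differential-drive relation between forward speed, turn rate, and left/right wheel speeds. The sketch below assumes a hypothetical set_wheel_speeds(left, right) motor interface and placeholder geometry.

```python
import math
import time

def drive_figure_eight(set_wheel_speeds, v=0.2, radius=0.5,
                       wheel_base=0.30, dt=0.05):
    """Drive one left-turning and one right-turning circle of the given
    radius, which together trace a figure-eight."""
    omega = v / radius                      # turn rate for the chosen circle
    t_circle = 2.0 * math.pi / omega        # time for one full circle
    for sign in (+1, -1):                   # left circle, then right circle
        t_end = time.time() + t_circle
        while time.time() < t_end:
            w = sign * omega
            # differential-drive kinematics: v_left/right = v -/+ w*b/2
            set_wheel_speeds(v - w * wheel_base / 2.0,
                             v + w * wheel_base / 2.0)
            time.sleep(dt)
    set_wheel_speeds(0.0, 0.0)              # stop at the end
```

Driving this open loop and comparing against the gyro/encoder estimates gives an early check of the sensing before the laser-pointer control is added.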

WP2: Car-type robot. The pair of motors above will form the front axle of a car-type robot.
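For the car-type configuration, a common starting point is the kinematic bicycle model. A minimal integration step, under the assumption that the speed is referenced to the rear axle and steer_angle is the front-wheel angle:

```python
import math

def bicycle_step(x, y, theta, v, steer_angle, wheelbase, dt):
    """One Euler step of the kinematic bicycle model for a car-like robot:
    (x, y) is the rear-axle position, theta the heading, v the speed at
    the rear axle, steer_angle the front-wheel angle."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(steer_angle) * dt
    return x, y, theta
```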

WP3: Final tests – reversing with a trailer behind the car robot. The driver is to use a laser pointer for control. Is this faster than manual control? The trailer geometry is to be learned by driving forward, and then used for reversing. Applications include vehicles in mines, in forests, at construction sites, etc.
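To see why reversing with a trailer is the hard case, here is a minimal kinematic sketch of the car-plus-trailer system, assuming the trailer is hitched at the car's rear axle; all names and geometry are placeholders. When v < 0 the hitch angle dynamics are unstable, which is what the laser-pointer-guided control in WP3 would have to stabilize.

```python
import math

def car_trailer_step(theta_car, theta_trailer, v, steer_angle,
                     wheelbase, trailer_length, dt):
    """One Euler step of the headings of a car and a trailer hitched at
    the car's rear axle. Reversing (v < 0) makes the hitch angle
    theta_car - theta_trailer unstable without feedback control."""
    dtheta_car = (v / wheelbase) * math.tan(steer_angle) * dt
    dtheta_trailer = (v / trailer_length) * math.sin(theta_car - theta_trailer) * dt
    return theta_car + dtheta_car, theta_trailer + dtheta_trailer
```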