Autonomous Navigation of Mobile Robots: Marker-based Localization System and On-line Path
Traditional wheelchairs are controlled mainly by a joystick, which is not a suitable solution for users with major disabilities. This thesis aims to create a human-machine interface and software that perform indoor autonomous navigation of the commercial wheelchair RoboEye, developed at the Measurements Instrumentations Robotic Laboratory of the University of Trento in collaboration with Robosense and Xtrensa. RoboEye is an intelligent wheelchair that aims to provide independence and autonomy of movement to people affected by serious mobility problems caused by impairing pathologies (for example ALS, amyotrophic lateral sclerosis). The thesis is divided into two main parts: the creation of a human-machine interface, including the integration of existing services into the developed solution, and a proposed approach for navigating the wheelchair using eye-tracking technologies, TOF cameras, odometric localization, and ArUco markers. The developed interface supports manual, semi-autonomous, and autonomous navigation, and follows user-experience guidelines specific to eye-tracking devices and users with major disabilities. The application was developed in Unity 3D using C# scripts, following a state-machine approach with multiple scenes and components.

The suggested solution satisfies the user's need to navigate hands-free and with as little fatigue as possible. The user can choose a destination from a set of predefined points of interest and reach it with no further input needed. The user interface is intuitive and clear for both experienced and inexperienced users, who can customize the icons, scale, and font size of the UI. The software runs as a state-machine module, which was tested with users through test cases. The path-planning routine uses Dijkstra's algorithm and proved to be efficient.
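The path planning mentioned above is described only at a high level here. A minimal sketch of Dijkstra's algorithm over a graph of predefined points of interest could look like the following; note that the graph, room names, and edge weights are hypothetical, and Python is used for illustration in place of the thesis's C# implementation:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a weighted graph given as {node: [(neighbor, weight), ...]}."""
    dist = {start: 0.0}   # best known cost to each node
    prev = {}             # predecessor map for path reconstruction
    pq = [(0.0, start)]   # min-heap of (cost, node)
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if goal not in dist:
        return None, float("inf")  # goal unreachable
    # Walk predecessors back from goal to start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Hypothetical indoor map: points of interest with distances in meters.
graph = {
    "bedroom":     [("hallway", 3.0)],
    "hallway":     [("bedroom", 3.0), ("kitchen", 4.0), ("living_room", 2.0)],
    "kitchen":     [("hallway", 4.0), ("living_room", 5.0)],
    "living_room": [("hallway", 2.0), ("kitchen", 5.0)],
}
path, cost = dijkstra(graph, "bedroom", "kitchen")
# path == ["bedroom", "hallway", "kitchen"], cost == 7.0
```

Representing the environment as a sparse graph of points of interest keeps the search cheap enough to run on-line each time the user selects a new destination.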