Smart Wheelchairs: Semantic mapping and correct selection of goals within unconstrained environments

Student thesis: Doctoral Thesis (Doctor of Philosophy)

Abstract

People with a wide range of physical disabilities rely on wheelchairs to aid mobility and improve their independence. Existing smart wheelchair prototypes provide a subset of autonomous tasks for the user, such as docking and terrain adaptation. However, the biggest constraint is navigation within dynamic environments, such as the home. This thesis develops a prototype smart wheelchair and introduces a novel visual perception pipeline that generates a semantic map to aid goal selection for semi-autonomous navigation. Semantic mapping is a method whereby a robot records real-world data about object types and locations, derived from the collection of contextual scene data and object detections. For example, when the verbal command “take me to the fridge” is received, the wheelchair understands that the fridge is located within the kitchen, and therefore selects the kitchen as a goal and then the fridge. As the smart wheelchair navigates the environment, it identifies and records the locations of all detected objects within an occupancy grid. In doing so, false positive object detections are generated by the deep neural network (DNN). This results in a semantic map populated with incorrectly labelled objects that the smart wheelchair may select. Several methods are proposed to reduce the influence of false positive object instances within the semantic map and to correctly select the goal instructed by the user. The proposed methods utilise the DNN object confidence, the uniqueness of the object within the environment, and the number of times the object has been re-detected or deemed missing by the visual perception pipeline. Results show that utilising context data for semantic mapping reduces the likelihood of false positive object detections being selected by the smart wheelchair, compared with deciding goals without context data. We also show that the proposed methods can automatically adapt to the movement of objects within the environment.
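The goal-selection idea in the abstract can be sketched in code: each object instance in the semantic map carries its DNN confidence and counts of re-detections and misses, and a candidate goal is chosen by combining these with a uniqueness term before resolving the command into a room-then-object goal. This is a minimal illustrative sketch, not the thesis's actual method; the class names, the multiplicative scoring formula, and the threshold value are all assumptions made here for illustration.

```python
from dataclasses import dataclass

@dataclass
class ObjectInstance:
    label: str
    room: str
    position: tuple        # (x, y) cell in the occupancy grid
    confidence: float      # DNN detection confidence in [0, 1]
    redetections: int = 0  # times re-observed by the perception pipeline
    missing: int = 0       # times expected in view but not detected

def instance_score(obj, instances):
    """Combine confidence, uniqueness, and persistence (illustrative weighting)."""
    # Uniqueness: the fewer same-label instances exist, the more trustworthy each is.
    same_label = sum(1 for o in instances if o.label == obj.label)
    uniqueness = 1.0 / same_label
    # Persistence: re-detections raise the score, misses lower it.
    persistence = (1 + obj.redetections) / (1 + obj.redetections + obj.missing)
    return obj.confidence * uniqueness * persistence

def select_goal(label, instances, threshold=0.3):
    """Resolve a command like 'take me to the fridge' into (room, position)."""
    candidates = [o for o in instances if o.label == label]
    if not candidates:
        return None
    best = max(candidates, key=lambda o: instance_score(o, instances))
    if instance_score(best, instances) < threshold:
        return None  # likely a false positive; refuse rather than misnavigate
    # Hierarchical goal: first the containing room, then the object itself.
    return best.room, best.position

# Example map: one well-confirmed fridge, one likely false positive.
semantic_map = [
    ObjectInstance("fridge", "kitchen", (12, 4), 0.92, redetections=5),
    ObjectInstance("fridge", "garage", (40, 7), 0.41, missing=3),
]
```

With this map, `select_goal("fridge", semantic_map)` picks the kitchen instance: the garage detection is penalised both by its low confidence and by having repeatedly gone missing, mirroring the abstract's claim that context and re-detection history suppress false positives.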
Date of Award: 2024
Original language: English
Awarding Institution
  • Aberystwyth University
Supervisors: Patricia Shaw & Frédéric Labrosse

Keywords

  • assistive robotics
  • smart wheelchairs
  • semantic mapping
  • computer vision
