Smart wheelchairs: Visual perception pipeline to improve prediction of user intention

Research output: Chapter in Book/Report/Conference Proceedings › Conference Proceedings (Non-Journal types)


Wheelchairs aid people with physical disabilities by assisting with mobility, thus improving their independence. Autonomous assistance on wheelchairs is limited to prototypes that provide ‘smart functionality’ by completing tasks such as docking or terrain adaptation. The biggest constraint is navigating within dynamic environments, such as the home.

This paper describes the data pipeline to automate the wheelchair navigation process, from classifying an object, to estimating the user’s intention via verbal command (e.g. “take me to the fridge”), to navigating towards a goal. Object locations are registered within a map whilst contextual metadata is calculated. A combination of object classification confidence and object instance counts is used to calculate the uniqueness of all identifiable objects, which assists in predicting the user’s intention. For example, if a “go to the fridge” request is received, the wheelchair will know that the fridge is located within the kitchen, and therefore drive to the kitchen and then the fridge.
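The uniqueness calculation could be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the exact scoring formula (mean classification confidence divided by instance count), and the room-context bookkeeping are all assumptions introduced for clarity.

```python
# Hypothetical sketch of a uniqueness score combining classification
# confidence with the number of observed instances per object class.
# All names and the formula are assumptions, not taken from the paper.
from collections import defaultdict

def uniqueness_scores(detections):
    """detections: list of (label, confidence, room) tuples from a classifier.

    Returns (scores, rooms): a per-label uniqueness score and the set of
    rooms each label was seen in. A class seen once with high confidence
    (e.g. 'fridge') scores higher than a frequent class (e.g. 'chair')."""
    counts = defaultdict(int)
    conf_sum = defaultdict(float)
    rooms = defaultdict(set)
    for label, conf, room in detections:
        counts[label] += 1
        conf_sum[label] += conf
        rooms[label].add(room)

    scores = {}
    for label in counts:
        mean_conf = conf_sum[label] / counts[label]
        # Fewer registered instances => more unique, so a verbal request
        # naming this object is less ambiguous for goal selection.
        scores[label] = mean_conf / counts[label]
    return scores, dict(rooms)

detections = [
    ("fridge", 0.95, "kitchen"),
    ("chair", 0.90, "kitchen"),
    ("chair", 0.85, "lounge"),
    ("chair", 0.80, "lounge"),
]
scores, rooms = uniqueness_scores(detections)
# The single high-confidence fridge scores higher than the three chairs,
# and its room context ("kitchen") is stored alongside for navigation.
```

Under this scheme, a "go to the fridge" request resolves to a single, highly unique map entry whose stored room context ("kitchen") supplies the intermediate navigation goal.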

Results show that utilising contextual data reduces the likelihood of false-positive object detections being registered by the navigation pipeline, which in turn makes the system more likely to interpret the user’s intention accurately.
Original language: English
Title: UKRAS22 Conference “Robotics for Unconstrained Environments” Proceedings
Publisher: UK-RAS Network
Number of pages: 2
Status: Published - 26 Aug 2022
