An Evaluation of Image-Based Robot Orientation Estimation

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding (Non-Journal item)

1 Citation (Scopus)

Abstract

This paper describes a novel image-based method for robot orientation estimation using a single omnidirectional camera. Orientation is estimated by finding the best pixel-wise match between images as a function of the rotation of the second image. This is done either using the first image as the reference image or with a moving reference image. Three datasets were collected in different scenarios along a “Gummy Bear” path in outdoor environments. This carefully designed path has the appearance of a gummy bear in profile, and provides many curves and sets of image pairs that are challenging for visual robot localisation. We compare our method to a feature-based method using SIFT and to another appearance-based visual compass. Experimental results demonstrate that the appearance-based methods perform well and more consistently than the feature-based method, especially when the compared images were captured at positions far apart.
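As a rough illustration of the appearance-based visual-compass idea described in the abstract, the sketch below estimates the relative heading between two unwrapped panoramic images by testing every horizontal (column) shift of the second image and keeping the one that minimises a pixel-wise matching cost. The function name, the sum-of-squared-differences cost, and the assumption that the images are already unwrapped so that the columns span a full 360 degrees are illustrative choices, not the authors' exact implementation.

import numpy as np

def estimate_rotation(ref_img, cur_img):
    """Estimate the relative heading between two unwrapped panoramic images.

    Both images are H x W grayscale arrays whose columns span 360 degrees,
    so a column shift of the second image corresponds to a rotation of the
    robot. Returns the rotation (in degrees) that best aligns cur_img with
    ref_img under a pixel-wise sum-of-squared-differences cost.
    """
    assert ref_img.shape == cur_img.shape
    height, width = ref_img.shape
    ref = ref_img.astype(np.float64)
    best_shift, best_ssd = 0, np.inf
    for shift in range(width):
        # Rotate the second image by `shift` columns and score the
        # pixel-wise match against the reference image.
        rotated = np.roll(cur_img.astype(np.float64), shift, axis=1)
        ssd = np.sum((ref - rotated) ** 2)
        if ssd < best_ssd:
            best_ssd, best_shift = ssd, shift
    return 360.0 * best_shift / width

In the "moving reference" variant mentioned in the abstract, ref_img would be replaced by a more recently captured image rather than the first image of the sequence, and the per-step rotations accumulated.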
Original language: English
Title of host publication: Towards Autonomous Robotic Systems
Publisher: Springer Nature
Pages: 135-147
Number of pages: 23
ISBN (Print): 9783662436448, 3662436442
DOIs
Publication status: Published - 31 Jul 2013
Event: 14th Annual Conference, TAROS 2013 - Oxford, United Kingdom of Great Britain and Northern Ireland
Duration: 28 Aug 2013 - 30 Aug 2013

Publication series

Name: Lecture Notes in Artificial Intelligence

Conference

Conference: 14th Annual Conference, TAROS 2013
Country/Territory: United Kingdom of Great Britain and Northern Ireland
City: Oxford
Period: 28 Aug 2013 - 30 Aug 2013
