Automated ground-plane estimation for trajectory rectification

Ian Hales, David Hogg, Kia Ng, Roger Boyle

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding (Non-Journal item)


We present a system to determine ground-plane parameters in densely crowded scenes, where the use of geometric features such as parallel lines, or of reliable estimates of agent dimensions, is not possible. Using feature points tracked over short intervals, together with some plausible scene assumptions, we can estimate the parameters of the ground plane accurately enough to provide a useful correction for perspective distortion. This paper describes feasibility studies conducted on controlled, simulated data to establish how different levels and types of noise affect the accuracy of the estimation, and a verification of the approach on live data, showing that the method can estimate ground-plane parameters and thereby improve the accuracy of trajectory analysis.
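As an illustrative sketch only (not the algorithm described in the paper): once ground-plane parameters have been estimated, perspective rectification of a trajectory can be expressed as applying a 3x3 image-to-ground homography to each tracked point. The matrix `H` below is a made-up example, not a value from the paper.

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) to ground-plane coordinates via the
    projective transform H (a 3x3 matrix given as nested lists)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    gx = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    gy = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return gx, gy

def rectify_trajectory(H, points):
    """Rectify an image-space trajectory onto the ground plane."""
    return [apply_homography(H, x, y) for x, y in points]

# Hypothetical homography for a camera tilted toward the ground,
# chosen purely for demonstration.
H = [[1.0, 0.0, -320.0],
     [0.0, 1.0, -240.0],
     [0.0, 0.002, 1.0]]

trajectory = [(300.0, 200.0), (310.0, 220.0), (325.0, 245.0)]
print(rectify_trajectory(H, trajectory))
```

After rectification, distances and speeds along the trajectory are measured in consistent ground-plane units rather than foreshortened image pixels, which is what makes downstream trajectory analysis more accurate.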

Original language: English
Title of host publication: Computer Analysis of Images and Patterns - 15th International Conference, CAIP 2013, Proceedings
Publisher: Springer Nature
Number of pages: 8
Edition: PART 2
ISBN (Print): 9783642402456
Publication status: Published - 07 Aug 2013
Externally published: Yes
Event: 15th International Conference on Computer Analysis of Images and Patterns, CAIP 2013 - York, United Kingdom of Great Britain and Northern Ireland
Duration: 27 Aug 2013 - 29 Aug 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 8048 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 15th International Conference on Computer Analysis of Images and Patterns, CAIP 2013
Country/Territory: United Kingdom of Great Britain and Northern Ireland
Period: 27 Aug 2013 - 29 Aug 2013


  • crowd-motion
  • ground-plane
  • rectification
  • trajectory


