Formalization of the General Video Temporal Synchronization Problem

Authors

  • Anthony Whitehead Carleton University
  • Robert Laganiere University of Ottawa
  • Prosenjit Bose Carleton University

Abstract

In this work, we present a theoretical formalization of the temporal synchronization problem and a method to temporally synchronize multiple stationary video cameras with overlapping views of the same scene. The method uses a two-stage approach that first approximates the synchronization by tracking moving objects and identifying curvature points. It then refines the estimate using a consensus-based matching heuristic that finds the frames best agreeing with camera geometries pre-computed from stationary background image features. By using the fundamental matrix and the trifocal tensor in the second, refinement step, we improve the estimate from the first step and handle a broader, more generic range of input scenarios and camera conditions. The method is relatively simple compared to current techniques, requiring only feature tracking in stage one and accurate geometry computation in stage two. We also provide a robust method to assist synchronization in the presence of inaccurate geometry computation, and a theoretical limit on the accuracy that can be expected from any synchronization system.
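The refinement idea described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a pre-computed fundamental matrix `F` between two cameras and per-frame tracked point positions (`track_a`, `track_b`) as hypothetical inputs, and scores candidate integer frame offsets by the median symmetric epipolar error, keeping the offset most consistent with the geometry:

```python
import numpy as np

def epipolar_error(F, x1, x2):
    """Symmetric point-to-epipolar-line distance for homogeneous points
    (3 x N arrays x1, x2) under fundamental matrix F."""
    l2 = F @ x1          # epipolar lines in image 2
    l1 = F.T @ x2        # epipolar lines in image 1
    d2 = np.abs(np.sum(x2 * l2, axis=0)) / np.hypot(l2[0], l2[1])
    d1 = np.abs(np.sum(x1 * l1, axis=0)) / np.hypot(l1[0], l1[1])
    return d1 + d2

def best_offset(F, track_a, track_b, max_shift):
    """Score each integer frame offset by the median epipolar error of the
    tracked points it aligns, and return the most consensual offset."""
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        pairs = [(track_a[t], track_b[t + s])
                 for t in range(len(track_a))
                 if 0 <= t + s < len(track_b)]
        if len(pairs) < 8:   # too few correspondences to trust the score
            continue
        x1 = np.array([np.append(p, 1.0) for p, _ in pairs]).T
        x2 = np.array([np.append(q, 1.0) for _, q in pairs]).T
        err = np.median(epipolar_error(F, x1, x2))
        if err < best_err:
            best, best_err = s, err
    return best
```

The median acts as the consensus statistic: a frame offset is preferred when most aligned point pairs satisfy the epipolar constraint, so a few mistracked points do not dominate the score.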

Keywords

Motion, Tracking, Video Analysis, Machine Vision, Video Surveillance, Remote Sensing

Published

2010-04-21
