Scalable Monocular SLAM
Eade, E.; Drummond, T.
Cambridge University
This paper appears in: Computer Vision and Pattern Recognition, 2006 IEEE Computer Society Conference on
Publication Date: 17-22 June 2006
Volume: 1, On page(s): 469-476
ISSN: 1063-6919
ISBN: 0-7695-2597-0
Digital Object Identifier: 10.1109/CVPR.2006.263
Current Version Published: 2006-07-05
Ethan Eade & Tom Drummond
Machine Intelligence Laboratory
Division of Information Engineering, Cambridge University Engineering Department
monocular SLAM
particle filter + top-down search => real-time, large number of landmarks
the first to apply this FastSLAM-type particle filter to single-camera SLAM
1. Introduction
SLAM = Simultaneous Localization and Mapping
: process of causally estimating both egomotion and structure in an online system
SLAM using visual data in computer vision
SFM (= structure from motion): reconstructing scene geometry
+ causal or recursive estimation techniques
perspective-projection cameras
filtering methods to allow indirect observation models
Kalman filtering framework
Extended Kalman filter = EKF (linearizes the observation and dynamics models of the system; see the sketch below)
causal estimation with recursive algorithms (i.e., estimation depending only on observations up to the current time)
=> online operation (cf. SFM based on global nonlinear optimization)
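To make the EKF idea above concrete, here is a minimal predict/update cycle in Python (my own generic sketch, not the paper's formulation); f, h, F_jac, H_jac, Q, R are assumed user-supplied dynamics/observation models, their Jacobians, and noise covariances.

import numpy as np

def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
    # Predict: propagate the mean through the nonlinear dynamics and the
    # covariance through the dynamics Jacobian (the linearization step).
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q

    # Update: linearize the observation model around the predicted state.
    z_pred = h(x_pred)
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - z_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new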
Davison's SLAM with a single camera
> EKF estimation framework
> top-down Bayesian estimation approach searching for landmarks in image regions constrained by estimate uncertainty, instead of performing extensive bottom-up image processing and feature matching (see the active-search sketch below)
> Bayesian partial-initialization scheme for incorporating new landmarks
- cannot scale to large environments
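A hedged sketch of the top-down active search idea (the general pattern, not Davison's exact code): project each landmark's predicted position into the image, turn the projected uncertainty into a gating region, and only run template matching inside it. project and project_jacobian are hypothetical camera-model callables.

import numpy as np

def search_region(landmark_mean, landmark_cov, project, project_jacobian):
    # Predicted pixel location of the landmark and its 2x2 image-space
    # uncertainty, obtained by pushing the 3D covariance through the
    # projection Jacobian.
    u = project(landmark_mean)
    J = project_jacobian(landmark_mean)     # 2x3 Jacobian of the projection
    S = J @ landmark_cov @ J.T
    return u, S

def inside_gate(p, u, S, n_sigma=3.0):
    # Accept a candidate pixel p only if it lies within n_sigma of the
    # prediction in the Mahalanobis sense; correlation is run only there.
    d = p - u
    return d @ np.linalg.solve(S, d) <= n_sigma ** 2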
EKF = the Extended Kalman filter
- N*N covariance matrix for N landmarks
- updated with O(N^2) computation cost per step (rough illustration below)
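A rough back-of-the-envelope illustration (my own numbers, not from the paper) of why the full-covariance EKF stops scaling: with a camera state plus N point landmarks, the joint covariance has O(N^2) entries and each update touches all of them.

camera_dim = 13                      # e.g. pose + velocities (assumed, for illustration)
for N in (10, 100, 1000):
    state_dim = camera_dim + 3 * N   # 3D position per landmark
    print(N, "landmarks ->", state_dim * state_dim, "covariance entries")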
> SLAM system using a single camera as the only sensor
> frame-rate operation with many landmarks
> FastSLAM-style particle filter (the first use of such an approach in a monocular SLAM setting)
> top-down active search
> an efficient algorithm for discovering the depth of new landmarks that avoids linearization errors (see the depth-hypothesis sketch below)
> a novel method for using partially initialized landmarks to help constrain camera pose
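For the depth-discovery contribution, here is a hedged illustration of the general "depth hypotheses along the first viewing ray" idea (the paper's own algorithm may differ in detail): keep a discrete set of candidate depths, reweight each by how well its reprojection matches later observations, and collapse to a Gaussian landmark once the distribution is peaked, so the highly nonlinear depth/observation relation is never linearized. project is a hypothetical camera-projection callable.

import numpy as np

def init_depth_hypotheses(d_min=0.5, d_max=10.0, n=100):
    depths = np.linspace(d_min, d_max, n)    # candidate depths along the ray
    weights = np.full(n, 1.0 / n)            # uniform prior over depth
    return depths, weights

def reweight(depths, weights, ray_origin, ray_dir, project, observed_px, sigma_px=2.0):
    # Score every hypothesis by the reprojection error of its 3D point in a
    # new frame, then renormalize the weights.
    new_w = weights.copy()
    for i, d in enumerate(depths):
        point = ray_origin + d * ray_dir
        err = np.linalg.norm(project(point) - observed_px)
        new_w[i] *= np.exp(-0.5 * (err / sigma_px) ** 2)
    s = new_w.sum()
    return depths, (new_w / s if s > 0 else weights)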
FastSLAM
: based on the Rao-Blackwellized Particle Filter
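A minimal sketch of what a FastSLAM-style particle carries (assumed structure, not the paper's exact data layout): one camera-pose hypothesis, an importance weight, and an independent small Gaussian per landmark, so no joint N x N covariance is ever stored.

import numpy as np
from dataclasses import dataclass, field

@dataclass
class LandmarkEstimate:
    mean: np.ndarray    # 3-vector, landmark position
    cov: np.ndarray     # 3x3 covariance, independent of all other landmarks

@dataclass
class Particle:
    pose: np.ndarray    # one camera-pose hypothesis (e.g. a 6-DoF vector)
    weight: float = 1.0
    landmarks: dict[int, LandmarkEstimate] = field(default_factory=dict)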
2. Background
2.1 Scalable SLAM
> submap
bounded complexity -> bounded computation and space requirements
Montemerlo & Thrun
If the entire camera motion is known then the estimates of the positions of different landmarks become independent of each other.
Rao-Blackwellized Particle Filter
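In symbols (the standard FastSLAM factorization, as I understand it): conditioning on the whole camera trajectory x_{1:t} makes the landmark posteriors independent, so the joint posterior over the trajectory and landmarks m_1, ..., m_N factors as

p(x_{1:t}, m_{1:N} \mid z_{1:t}) = p(x_{1:t} \mid z_{1:t}) \prod_{i=1}^{N} p(m_i \mid x_{1:t}, z_{1:t})

The Rao-Blackwellized particle filter samples the trajectory factor with particles and handles each p(m_i | x_{1:t}, z_{1:t}) with a small independent EKF.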
ZNCC = the Zero-mean Normalized Cross-Correlation function
epipolar constraint
http://en.wikipedia.org/wiki/Epipolar_geometry
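A small, self-contained ZNCC implementation (the standard formula; presumably equivalent to what the paper uses for template matching): subtract each patch's mean, then take the normalized dot product, giving a score in [-1, 1] that is invariant to affine brightness changes.

import numpy as np

def zncc(patch_a, patch_b):
    a = patch_a.astype(np.float64).ravel()
    b = patch_b.astype(np.float64).ravel()
    a -= a.mean()                    # zero-mean both patches
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0                   # flat patch: correlation undefined, neutral score
    return float(np.dot(a, b) / denom)

Candidate matches would be scored with zncc inside the search region (or, for a partially initialized landmark, along the epipolar line) and the best score above a threshold accepted.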