Search results for: monocular.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6

6 Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

Authors: Maohai Li, Bingrong Hong, Zesu Cai, Ronghua Luo

Abstract:

This paper presents a novel Rao-Blackwellised particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF) to extend the path posterior by sampling new poses that integrate the current observation, which drastically reduces the uncertainty about the robot pose. Landmark position estimation and update are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which substantially reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are constructed from matched Scale Invariant Feature Transform (SIFT) feature pairs, and the matching of the high-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log₂ N). Experimental results with a real robot in an indoor environment show the advantages of our methods over previous approaches.
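The adaptive resampling step described above is commonly implemented by monitoring the effective sample size of the particle weights and resampling only when it falls below a threshold. The following sketch illustrates that idea in Python/NumPy; it is our illustration rather than the authors' code, and the threshold ratio of 0.5 is an assumption.

```python
# A minimal sketch of adaptive resampling for an RBPF, assuming the usual
# effective-sample-size criterion; an illustration, not the authors' code.
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalized weights w_i."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def systematic_resample(particles, weights, rng=None):
    """Systematic resampling; returns resampled particles and uniform weights."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    positions = (rng.random() + np.arange(n)) / n
    indexes = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return [particles[i] for i in indexes], np.full(n, 1.0 / n)

def maybe_resample(particles, weights, threshold_ratio=0.5):
    """Resample only when N_eff drops below threshold_ratio * N (adaptive resampling)."""
    n = len(particles)
    if effective_sample_size(weights) < threshold_ratio * n:
        return systematic_resample(particles, weights)
    return particles, np.asarray(weights, dtype=float) / np.sum(weights)
```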

Keywords: Mobile robot, simultaneous localization and mapping, Rao-Blackwellised particle filter, evolution strategies, scale invariant feature transform.

5 Single-Camera EKF-vSLAM

Authors: ML. Benmessaoud, A. Lamrani, K. Nemra, AK. Souici

Abstract:

This paper presents an Extended Kalman Filter implementation of a single-camera Visual Simultaneous Localization and Mapping (vSLAM) algorithm, addressing a problem widely studied in the mobile robotics field. The algorithm is vision- and odometry-based. Odometry data is incremental and therefore accumulates error over time, since the robot may slip or be lifted; consequently, odometry alone cannot accurately estimate the robot position. In this paper we show that combining odometry and visual landmarks via the extended Kalman filter can improve the robot position estimate. We use a Pioneer II robot and a motorized pan-tilt camera to implement the algorithm.
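As a rough illustration of the described odometry plus visual-landmark fusion, the sketch below implements a generic planar-robot EKF with an odometry prediction step and a range-bearing landmark update. It is not the paper's implementation; the motion model, the landmark observation model, and the noise covariances are assumptions.

```python
# A minimal illustrative sketch of fusing odometry and a visual landmark observation
# with an extended Kalman filter for a planar robot (state x = [px, py, theta]).
import numpy as np

def ekf_predict(x, P, u, Q, dt):
    """Odometry prediction with control u = [v, w] (linear and angular velocity)."""
    px, py, th = x
    v, w = u
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_landmark(x, P, z, landmark, R):
    """Correction with a range-bearing observation z = [r, phi] of a known landmark."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx ** 2 + dy ** 2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [dy / q, -dx / q, -1.0]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap the bearing innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P
```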

Keywords: Mobile Robot, Navigation, vSLAM, EKF, monocular.

4 Vehicle Position Estimation for Driver Assistance System

Authors: Hyun-Koo Kim, Sangmoon Lee, Ho-Youl Jung, Ju H. Park

Abstract:

We present a system that finds road boundaries and constructs a virtual lane based on fused data from a laser and a monocular sensor, and detects the forward vehicle position even when lane markers are absent or environmental conditions are poor. When the road environment is dark or many vehicles are parked on both sides of the road, it is difficult to detect the lane and road boundary. For this reason, we fuse the laser and vision sensors to extract the road boundary and acquire three-dimensional data. We use a parabolic road model, based on the vehicle and sensor state parameters, to calculate the road boundaries and construct the virtual lane. We then determine the vehicle position in each lane.
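The parabolic road model mentioned above can be illustrated with a simple least-squares fit of the fused boundary points, from which a virtual lane is obtained by a lateral offset. The sketch below is an illustration under stated assumptions, not the authors' system; the boundary points, lane width, and offset direction are made up for demonstration.

```python
# A minimal sketch, assuming the road boundary is modelled as a parabola
# x = a*z**2 + b*z + c in vehicle coordinates (z forward, x lateral).
import numpy as np

def fit_parabolic_boundary(points):
    """Least-squares fit of lateral offset x as a parabola in range z."""
    z, x = points[:, 0], points[:, 1]
    a, b, c = np.polyfit(z, x, deg=2)
    return a, b, c

def virtual_lane_centre(boundary_coeffs, lane_width, z):
    """Construct a virtual lane centreline by laterally offsetting the fitted boundary."""
    a, b, c = boundary_coeffs
    x_boundary = a * z ** 2 + b * z + c
    return x_boundary - lane_width / 2.0   # centre of the lane adjacent to the boundary

# Example: boundary points fused from laser and vision, then the virtual lane at z = 15 m.
coeffs = fit_parabolic_boundary(np.array([[5.0, 1.9], [10.0, 2.1], [20.0, 2.6], [30.0, 3.3]]))
print(virtual_lane_centre(coeffs, lane_width=3.5, z=np.array([15.0])))
```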

Keywords: Vehicle Detection, Adaboost, Haar-like Feature, Road Boundary Detection.

3 Visual Odometry and Trajectory Reconstruction for UAVs

Authors: Sandro Bartolini, Alessandro Mecocci, Alessio Medaglini

Abstract:

The growing popularity of systems based on Unmanned Aerial Vehicles (UAVs) is highlighting their vulnerability, particularly in relation to the positioning system used. Typically, UAV architectures use civilian GPS, which is exposed to a number of attacks, such as jamming or spoofing. It is therefore important to develop alternative methodologies to accurately estimate the actual UAV position without relying on GPS measurements alone. In this paper we propose a position estimation method for UAVs based on monocular visual odometry. We have developed a flight control system capable of keeping track of the entire trajectory travelled, with a reduced dependency on the availability of the GPS signal. Moreover, the simplicity of the developed solution makes it applicable to a wide range of commercial drones. The final goal is to allow for safer flights in all conditions, even under cyber-attacks that try to deceive the drone.
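A monocular visual-odometry pipeline of the kind the abstract alludes to typically tracks features between consecutive frames, estimates the essential matrix, and accumulates the recovered relative pose. The OpenCV sketch below shows this generic pipeline; it is not the authors' flight-control code, the camera matrix K and the per-step scale must be supplied externally, and monocular odometry recovers translation only up to scale.

```python
# A generic monocular visual-odometry step with OpenCV; an illustration only.
import cv2
import numpy as np

def vo_step(prev_gray, gray, K, R_total, t_total, scale=1.0):
    # Track corners from the previous frame into the current one.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
    ok = status.ravel() == 1
    good0, good1 = p0[ok], p1[ok]
    # Relative pose between the two frames from the essential matrix.
    E, _ = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good1, good0, K)
    # Accumulate the trajectory; 'scale' must come from another cue (e.g. an altimeter).
    t_total = t_total + scale * (R_total @ t)
    R_total = R @ R_total
    return R_total, t_total
```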

Keywords: Visual odometry, autonomous UAV, position measurement, autonomous outdoor flight.

2 An Efficient Fundamental Matrix Estimation for Moving Object Detection

Authors: Yeongyu Choi, Ju H. Park, S. M. Lee, Ho-Youl Jung

Abstract:

In this paper, an improved method for estimating the fundamental matrix is proposed and applied effectively to monocular-camera-based moving object detection. The method consists of corner point detection, motion estimation for moving objects, and fundamental matrix calculation. The corner points are obtained with the Harris corner detector, and the motions of moving objects are calculated with the pyramidal Lucas-Kanade optical flow algorithm. The fundamental matrix is then calculated through epipolar geometry analysis using RANSAC. In this method, we improve moving object detection performance by using two threshold values that determine whether a point is an inlier or an outlier. Through simulations, we compare the performance while varying the two threshold values.
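A generic version of the pipeline described above (Harris corners, pyramidal Lucas-Kanade flow, RANSAC fundamental-matrix estimation, and two epipolar-distance thresholds separating static-scene inliers from moving-object candidates) can be sketched with OpenCV as follows. The threshold values and the exact way the two thresholds are used are assumptions, not the paper's specification.

```python
# A sketch of the described steps with OpenCV; thresholds are placeholders.
import cv2
import numpy as np

def detect_moving_points(prev_gray, gray, t_inlier=1.0, t_outlier=3.0):
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1000, qualityLevel=0.01,
                                 minDistance=7, useHarrisDetector=True, k=0.04)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
    ok = status.ravel() == 1
    pts0, pts1 = p0[ok].reshape(-1, 2), p1[ok].reshape(-1, 2)
    # Fundamental matrix of the static scene, estimated robustly with RANSAC.
    F, _ = cv2.findFundamentalMat(pts0, pts1, cv2.FM_RANSAC, t_inlier, 0.99)
    # Distance of each tracked point to its epipolar line in the current frame.
    lines = cv2.computeCorrespondEpilines(pts0.reshape(-1, 1, 2), 1, F).reshape(-1, 3)
    d = np.abs(np.sum(lines[:, :2] * pts1, axis=1) + lines[:, 2]) / np.linalg.norm(lines[:, :2], axis=1)
    static = pts1[d < t_inlier]    # consistent with the epipolar geometry (inliers)
    moving = pts1[d > t_outlier]   # candidates belonging to moving objects (outliers)
    return F, static, moving
```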

Keywords: Corner detection, optical flow, epipolar geometry, RANSAC.

1 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is very critical, as there are many uncertainties in this phase that can easily lead to a hard landing or even a crash. In this paper, the estimation of the relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Accurate measurement sensors can be very expensive (e.g., LIDAR) or have a limited operational range (e.g., ultrasonic sensors). Additionally, absolute positioning systems like GPS or IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured during the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement. The second approach uses the feature's projection on the camera plane (its pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of the projected point as the process model to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed is used to compare the performance of the proposed algorithms. The case studies show that the image quality introduces considerable noise, which reduces the performance of the first approach. The projected feature position, on the other hand, is much less sensitive to this noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
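In the spirit of the second approach described above, a minimal EKF can use the relative height and vertical velocity as the state and the pixel position of a tracked ground feature as the measurement under a pinhole projection model. The sketch below is an illustrative assumption, not the paper's implementation; the focal length, the feature's horizontal offset, and the noise covariances are placeholders.

```python
# A minimal sketch: EKF with state [h, v] (relative height and vertical velocity)
# and a pixel-position measurement of one tracked ground feature, u = f * r / h.
import numpy as np

F_PX = 800.0   # assumed focal length in pixels
R_M = 0.5      # assumed horizontal offset of the tracked ground feature (metres)

def predict(x, P, Q, dt):
    """Constant-velocity descent model: dh/dt = v."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    return F @ x, F @ P @ F.T + Q

def update(x, P, u_pix, R):
    """Measurement: pixel offset u = f * r / h of the tracked feature."""
    h = x[0]
    u_hat = F_PX * R_M / h
    H = np.array([[-F_PX * R_M / h ** 2, 0.0]])   # Jacobian [du/dh, du/dv]
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    innovation = np.array([u_pix - u_hat])
    return x + K @ innovation, (np.eye(2) - K @ H) @ P
```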

Keywords: Automatic landing, multirotor, nonlinear control, parameter estimation, optical flow.
