Partial 3D Reconstruction using Evolutionary Algorithms

Authors: Mónica Pérez-Meza, Rodrigo Montúfar-Chaveznava

Abstract:

When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through the theory of stereo vision. Stereo vision is a method that reconstructs a scene from only a pair of images of that scene plus some computation. From several images of a scene captured from different positions, stereo vision can recover the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, in analogy to the mammalian visual system. In this work we employ only one camera, which is translated along a path and captures images at fixed intervals. Since we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses "flies" to reconstruct the principal characteristics of the world following certain evolutionary rules.
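To make the idea above concrete, the following sketch (not taken from the paper) illustrates how a fly-algorithm style reconstruction can be organized. It assumes a simplified setting: two grayscale images taken at different positions along the camera path, known 3x4 projection matrices P1 and P2 for those positions, and a fitness based on the similarity of the pixel neighbourhoods seen by each fly's two projections. All names, parameter values (population size, mutation scale, search bounds) and the fitness formula are hypothetical; the actual fly algorithm of Louchet et al. uses its own fitness normalization and Parisian evolution operators.

# Illustrative sketch of a fly-algorithm style partial 3D reconstruction.
# Assumptions (not from the paper): pinhole cameras with known 3x4 projection
# matrices P1 and P2, grayscale NumPy images, SSD-based fitness, truncation
# selection and Gaussian mutation. Parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def project(P, xyz):
    """Project a 3D point (x, y, z) with a 3x4 matrix P; return pixel (u, v)."""
    p = P @ np.append(xyz, 1.0)
    return p[:2] / p[2]

def patch(img, uv, r=2):
    """Return the (2r+1)x(2r+1) neighbourhood around pixel uv, or None if outside."""
    u, v = int(round(uv[0])), int(round(uv[1]))
    if r <= v < img.shape[0] - r and r <= u < img.shape[1] - r:
        return img[v - r:v + r + 1, u - r:u + r + 1].astype(float)
    return None

def fitness(fly, img1, img2, P1, P2):
    """High fitness when both projections of the fly see similar pixels,
    i.e. the fly is likely to lie on a visible surface of the scene."""
    w1 = patch(img1, project(P1, fly))
    w2 = patch(img2, project(P2, fly))
    if w1 is None or w2 is None:
        return 0.0
    ssd = np.sum((w1 - w2) ** 2)
    return 1.0 / (1.0 + ssd)

def evolve(img1, img2, P1, P2, n_flies=500, n_generations=50,
           bounds=((-2, 2), (-2, 2), (1, 6)), sigma=0.05):
    """Evolve a population of 3D 'flies'; survivors concentrate on scene surfaces."""
    lo, hi = np.array(bounds).T
    flies = rng.uniform(lo, hi, size=(n_flies, 3))
    for _ in range(n_generations):
        f = np.array([fitness(p, img1, img2, P1, P2) for p in flies])
        order = np.argsort(f)[::-1]
        parents = flies[order[:n_flies // 2]]                     # truncation selection
        children = parents + rng.normal(0, sigma, parents.shape)  # Gaussian mutation
        flies = np.vstack([parents, children])
    return flies  # the fittest flies sketch the main 3D structure of the scene

After a few generations, the surviving flies cluster on surfaces that look consistent from both viewpoints, which is the kind of partial reconstruction the abstract refers to; because each fly is evaluated independently, the population size can be tuned to the available real-time budget.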

Keywords: 3D Reconstruction, Computer Vision, Evolutionary Algorithms, Stereo Vision.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1328778


References:


[1] Louchet, J. 2000. "Stereo analysis using individual evolution strategy". Proceedings of the International Conference on Pattern Recognition, Barcelona, September 2000.
[2] Boumaza, A. M., Louchet, J. 2001. "Dynamic Flies: Using Real-Time Parisian Evolution in Robotics". EvoWorkshops 2001, pp. 288-297.
[3] Louchet, J., Guyon, M., Lesot, M.-J., Boumaza, A. 2002. "Dynamic Flies: a new pattern recognition tool applied to stereo sequence processing". Pattern Recognition Letters, vol. 23, pp. 335-345.