Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is critical: it involves many uncertainties that can easily lead to a hard landing or even a crash. In this paper, the estimation of the relative distance and velocity to the ground, one of the most important tasks during the landing phase, is studied. Accurate ranging sensors are an alternative, but they are either expensive, such as LIDAR, or limited in operational range, such as ultrasonic sensors. Moreover, absolute positioning systems such as GPS, and inertial sensors such as IMUs, cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured during the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature-tracking technique is employed to extract the most suitable features from a series of images taken during the UAV landing. Two different approaches based on the Extended Kalman Filter (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement. The second approach instead uses the feature's projection on the camera plane (its pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of the projected point as the process model, to estimate both the relative distance and the relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed is used to compare the performance of the proposed algorithms. The case studies show that the low quality of the images introduces considerable noise, which degrades the performance of the first approach. The second approach, based on the projected feature position, is much less sensitive to this noise and estimates the distance and velocity with relatively high accuracy. It can also predict the future projected feature position, which drastically decreases the computational workload, an important criterion for real-time applications.
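
To make the second approach concrete, the sketch below implements a minimal EKF whose state combines the UAV's vertical kinematics with the dynamics of a ground feature's projected pixel position, and whose measurement is that pixel position. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the pinhole model u = f·d/h, the state layout [h, v, u], the noise levels, and the flare-like test trajectory are all assumptions introduced here for demonstration.

```python
# Minimal sketch (not the paper's code) of an EKF that fuses vertical
# kinematics with projected-point dynamics and measures the pixel position.
# State x = [h, v, u]: h = height above ground (m), v = vertical velocity
# (m/s, negative = descent), u = feature pixel offset from the image center.
# Assumed pinhole model over a flat ground feature: u = f_px * d / h, with a
# fixed lateral offset d, so u_dot = -u * v / h (f_px and d cancel out).
import numpy as np

def ekf_step(x, P, a, z, dt, q, r):
    """One predict/update cycle; a = known vertical acceleration input,
    z = measured pixel position of the tracked feature."""
    h, v, u = x
    # --- Predict: UAV kinematics plus projected-point dynamics ---
    x_pred = np.array([h + v * dt,
                       v + a * dt,
                       u - (u * v / h) * dt])
    F = np.array([[1.0, dt, 0.0],
                  [0.0, 1.0, 0.0],
                  [u * v / h**2 * dt, -u / h * dt, 1.0 - v / h * dt]])
    P_pred = F @ P @ F.T + q * np.eye(3)
    # --- Update: the measurement is the pixel position itself (linear) ---
    H = np.array([[0.0, 0.0, 1.0]])
    y = z - x_pred[2]                      # innovation
    S = H @ P_pred @ H.T + r
    K = P_pred @ H.T / S                   # Kalman gain (3x1)
    x_new = x_pred + (K * y).ravel()
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

# Toy vertical-landing run: decelerating (flare-like) descent, noisy pixels.
rng = np.random.default_rng(0)
f_px, d = 300.0, 1.0                       # assumed focal length (px), offset (m)
dt = 0.05
h_true, v_true, a_true = 10.0, -1.2, 0.3
x = np.array([7.0, -0.5, f_px * d / 7.0])  # deliberately wrong initial guess
P = np.diag([9.0, 1.0, 25.0])
for _ in range(80):
    v_true += a_true * dt
    h_true += v_true * dt
    z = f_px * d / h_true + rng.normal(0.0, 2.0)   # noisy pixel measurement
    x, P = ekf_step(x, P, a=a_true, z=z, dt=dt, q=1e-4, r=4.0)
print(f"true h={h_true:.2f} m, v={v_true:.2f} m/s | est h={x[0]:.2f}, v={x[1]:.2f}")
```

In this sketch the measurement model is linear while the nonlinearity moves into the process model, which illustrates why feeding the filter the projected pixel position can be less exposed to raw optical-flow noise than feeding it the computed flow. Note also that a known, nonzero acceleration input is used here to resolve the monocular scale ambiguity between h and v, a standard requirement for metric estimates from a single camera.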

Keywords: Automatic landing, multirotor, nonlinear control, parameter estimation, optical flow.
