Driver Readiness in Autonomous Vehicle Take-Overs

Authors: Abdurrahman Arslanyilmaz, Salman Al Matouq, Durmus V. Doner

Abstract:

Level 3 autonomous vehicles are able to take full responsibility for controlling the vehicle until a system boundary is reached or a system failure occurs, at which point the driver is expected to take over control. When this happens, the driver is often unaware of the traffic situation or engaged in a secondary task. Factors shown to affect the duration and quality of take-overs in these situations include the type and nature of the secondary task, traffic density, take-over request (TOR) time, and TOR warning type and modality. However, to the best of the authors' knowledge, no prior study has examined the time buffer for TORs when a system failure occurs immediately before an intersection. The first objective of this study is to investigate the effect of time buffer (3 versus 7 seconds) on the duration and quality of take-overs when a system failure occurs just prior to intersections. In addition, eye tracking has become one of the most popular methods for reporting what individuals look at, in what order, for how long, and how often, and it has been used in driving simulations for a variety of purposes. However, again to the best of the authors' knowledge, no study has compared drivers' eye-gaze behavior under the two time buffers to examine drivers' attention to and comprehension of salient information. The second objective is therefore to understand drivers' attention to and comprehension of salient traffic-related information presented on different parts of the dashboard and on the road.
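
To make the two measures concrete, the following is a minimal Python sketch, not the authors' actual analysis pipeline, of how take-over time and per-area-of-interest (AOI) gaze dwell times could be computed; the log format, AOI labels, and timestamps below are hypothetical.

# Minimal sketch (hypothetical log format): derive take-over time and per-AOI
# gaze dwell times from simulator and eye-tracker records.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class GazeSample:
    t: float    # timestamp in seconds
    aoi: str    # area of interest the gaze falls on, e.g. "dashboard", "road"

def take_over_time(tor_onset: float, first_input_time: float) -> float:
    """Take-over time: delay between the TOR and the driver's first control input."""
    return first_input_time - tor_onset

def dwell_times(samples: list[GazeSample]) -> dict[str, float]:
    """Accumulate gaze dwell time per AOI from consecutive gaze samples."""
    totals: dict[str, float] = defaultdict(float)
    for prev, curr in zip(samples, samples[1:]):
        totals[prev.aoi] += curr.t - prev.t
    return dict(totals)

# Example with made-up data: TOR issued at t = 10.0 s, first steering input at t = 12.4 s.
gaze = [GazeSample(10.0, "dashboard"), GazeSample(10.5, "road"),
        GazeSample(11.8, "mirror"), GazeSample(12.4, "road")]
print(take_over_time(10.0, 12.4))   # take-over time of about 2.4 s
print(dwell_times(gaze))            # seconds of gaze per AOI

The same dwell-time tallies, computed separately for the 3-second and 7-second buffer conditions, are the kind of quantities one would compare to contrast attention across the two groups.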

Keywords: Autonomous vehicles, driving simulation, eye gaze, attention, comprehension, take-over duration, take-over quality, time buffer.

