Vorm, Eric S., & Andrew D. Miller (2020, July). Modeling user information needs to enable successful human-machine teams: Designing transparency for autonomous systems. In Dylan D. Schmorrow & Cali M. Fidopiastis (Eds.), Augmented cognition. Human cognition and behavior (14th International Conference, AC 2020, held as part of the 22nd HCI [Human-Computer Interaction] International Conference, HCII 2020, Copenhagen, Denmark, July 19–24, 2020, Proceedings, Part II, pp. 445–465). Cham, Switzerland: Springer Nature. (doi: 10.1007/978-3-030-50439-7_31) This publication is also part of Springer's Lecture Notes in Computer Science book series (LNCS, volume 12197).

Abstract: Intelligent autonomous systems are quickly becoming part of everyday life. Efforts to design systems whose behaviors are transparent and explainable to users are stymied by models that are increasingly complex and interdependent, and compounded by an ever-expanding scope of autonomy that permits more autonomous decision making and action than ever before. Previous efforts toward designing transparency in autonomous systems have focused largely on explanations of algorithms for the benefit of programmers and back-end debugging. Less emphasis has been placed on modeling the information needs of end-users, or on evaluating which features most affect end-user trust and promote positive user engagement in the context of human-machine teaming. This study investigated user information preferences and priorities directly by presenting users with an interaction scenario that depicted ambiguous, unexpected, and potentially unsafe system behaviors. We then elicited which features these users most desired from the system to resolve these interaction conflicts (i.e., what information users most need in order to trust the system and continue using it in the described scenario). Using factor analysis, we built detailed user typologies that arrange and prioritize user information needs and communication strategies. This typology can be adapted as a user model to guide design decisions for autonomous systems. This mixed-methods approach to modeling user interactions with complex sociotechnical systems revealed design strategies that have the potential to increase user understanding of system behaviors, which may in turn improve user trust in complex autonomous systems.

Eric S. Vorm <esvorm@gmail.com> is in the Navy Center for Applied Research in Artificial Intelligence, US Naval Research Laboratory, Washington, DC; and Andrew D. Miller <andrewm@iupui.edu> is in the School of Informatics and Computing, Indiana University–Purdue University Indianapolis, Indianapolis, IN (USA).