Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/288107 
Year of Publication: 
2023
Citation: 
[Journal:] International Journal of Selection and Assessment [ISSN:] 1468-2389 [Volume:] 31 [Issue:] 3 [Publisher:] Wiley [Place:] Hoboken, NJ [Year:] 2023 [Pages:] 388-402
Publisher: 
Wiley, Hoboken, NJ
Abstract: 
Research has examined trust in humans and trust in automated decision support. Although hybrid human‐automation teams reflect a likely realization of decision support in high‐risk tasks such as personnel selection, trust in such teams has thus far received limited attention. In two experiments (N1 = 170, N2 = 154), we compare trust, trustworthiness, and trusting behavior for different types of decision support (automated, human, hybrid) across two assessment contexts (personnel selection, bonus payments). We additionally examined a possible trust violation by presenting one group of participants with a preselection that included predominantly male candidates, thus reflecting possible unfair bias. Whereas fully automated decisions were trusted less, results suggest that trust in hybrid decision support was similar to trust in human‐only support. Trust violations were not perceived differently based on the type of support. We discuss theoretical implications (e.g., trust in hybrid support) and practical implications (e.g., keeping humans in the loop to prevent negative reactions).
Subjects: 
artificial intelligence
decision‐support
human‐automation collaboration
personnel selection
trust
Creative Commons License: 
cc-by-nc
Document Type: 
Article
Document Version: 
Published Version
