Please use this link to cite or link to this publication: https://hdl.handle.net/10419/288107
Year of Publication: 
2023
Published in: 
[Journal:] International Journal of Selection and Assessment [ISSN:] 1468-2389 [Volume:] 31 [Issue:] 3 [Publisher:] Wiley [Place:] Hoboken, NJ [Year:] 2023 [Pages:] 388-402
Publisher: 
Wiley, Hoboken, NJ
Abstract: 
Research has examined trust in humans and trust in automated decision support. Although hybrid human-automation teams reflect a likely realization of decision support in high-risk tasks such as personnel selection, trust in such teams has thus far received limited attention. In two experiments (N1 = 170, N2 = 154), we compare trust, trustworthiness, and trusting behavior for different types of decision support (automated, human, hybrid) across two assessment contexts (personnel selection, bonus payments). We additionally examined a possible trust violation by presenting one group of participants a preselection that included predominantly male candidates, thus reflecting possible unfair bias. Whereas fully automated decisions were trusted less, results suggest that trust in hybrid decision support was similar to trust in human-only support. Trust violations were not perceived differently based on the type of support. We discuss theoretical (e.g., trust in hybrid support) and practical implications (e.g., keeping humans in the loop to prevent negative reactions).
Keywords: 
artificial intelligence
decision‐support
human‐automation collaboration
personnel selection
trust
Persistent Identifier of the first edition: 
Creative Commons License: 
cc-by-nc
Document Type: 
Article
Document Version: 
Published Version

File(s):
Size: 919.04 kB
Publications in EconStor are protected by copyright.