Abstract
An important question for new systems, especially safety-critical ones, is whether human users can trust them. This problem should already be addressed during requirements engineering for such systems. However, trust is a complex psychological and sociological concept, so it cannot simply be treated as a desired or required property of a system. We propose to learn from the Human Factors discipline's work on trust in technical systems. In particular, we argue that both undertrust and overtrust need to be avoided. The challenge is to determine, in the course of requirements engineering, the system properties and activities that achieve this. We conjecture that both actual properties such as safety and subjective assessments such as perceived safety will be important, and that they will have to be balanced to avoid undertrust and overtrust.
| Original language | British English |
|---|---|
| Journal | CEUR Workshop Proceedings |
| Volume | 2376 |
| State | Published - 2019 |
| Event | 2019 Joint International Conference on Requirements Engineering: Foundation for Software Quality Workshops, Doctoral Symposium, Live Studies Track, and Poster Track, REFSQ-JP 2019 - Essen, Germany. Duration: 18 Mar 2019 → … |