John F. Tripp, PhD, Assistant Professor, Baylor University

Technology, Humanness, And Trust: Rethinking Trust-in-Technology

Information systems (IS) research has demonstrated that humans can and do trust
technology. In the current trust-in-technology literature, two very different types of trust-in-technology
constructs are employed. Some researchers use human-like trust constructs (e.g.,
benevolence, integrity, and ability), while other researchers use system-like trust constructs
(e.g., helpfulness, reliability, and functionality). Interestingly, past research shows that both sets
of measures influence important dependent variables, but the literature does not explain when
one type should be used instead of the other type. In this paper we use trust, social presence,
and affordance theories to shed light on this research problem. We report on two studies. In
Study 1, we first argue that technologies vary in their perceived “humanness.” Second, we
argue that because users perceive technologies to differ in humanness, they will develop
trust in each technology differently, i.e., along more human-like or more system-like
criteria. We study two technologies that vary in humanness to explore these differences
empirically and theoretically. We demonstrate that when the trust construct aligns with
the technology's perceived humanness, it produces stronger effects on selected outcome variables
than a misaligned trust construct does. In Study 2, we assess whether these technologies differ in
humanness based on social presence, social affordances, and affordances for sociality. We find
that these factors distinguish whether a technology is perceived as more human-like or more system-like.
Implications for trust-in-technology research are provided.