This is the author’s version of a work that was published in the following source
Friemel, C.; Morana, S.; Pfeiffer, J.; and Maedche, A. (2017) "On the Role of
User's Cognitive-Affective States for User Assistance Invocation", to appear in:
Proceedings of the Gmunden Retreat on NeuroIS 2017.
Please note: Copyright is owned by the author and / or the publisher.
Commercial use is not allowed.
Institute of Information Systems and Marketing (IISM)
Fritz-Erler-Strasse 23
76133 Karlsruhe, Germany
Karlsruhe Service Research Institute (KSRI)
Kaiserstraße 89
76133 Karlsruhe, Germany
© 2017. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.
On the Role of Users’ Cognitive-Affective States for User
Assistance Invocation
Celina Friemel1, Stefan Morana1, Jella Pfeiffer1, and Alexander Maedche1
1Institute of Information Systems and Marketing (IISM), Karlsruhe Institute of Technology
(KIT), Karlsruhe, Germany
{celina.friemel, stefan.morana, jella.pfeiffer,
Abstract. User assistance systems are often invoked automatically based on simple triggers (e.g., the assistant pops up after the user has been idle for some time) or they require users to invoke them manually. Both invocation modes have their weaknesses. Therefore, we argue that, ideally, the assistance should be invoked intelligently based on the users’ actual need for assistance. In this paper, we propose a research project investigating the role of users’ cognitive-affective states when providing assistance using NeuroIS measurements. Drawing on the theoretical foundations of the Attentional Control Theory, we propose an experiment that helps to understand how cognitive-affective states can serve as indicators for the best point of time for the invocation of user assistance systems. The research described in this paper will ultimately help to design intelligent invocation of user assistance systems.

Keywords: Assistance; invocation; NeuroIS; Attentional Control Theory; cognitive-affective user states; affect; mental effort
1 Introduction
Digital assistants like Siri or Alexa, chatbots like the ones on WeChat [1], and other forms of user assistance have developed strongly over the last years, and the trend towards providing advanced user assistance in digital services is still growing [2, 3]. The common idea of user assistance systems is to support users in performing their tasks better [4]. One of the early attempts to create such an assistant was Microsoft’s Clippy. Yet, Clippy is a famous and regularly trending example of the dismal failure of such user assistance [5]. One of Clippy’s most severe design mistakes was its proactive invocation mode. Proactively offering assistance at the right point can be a helpful feature to relieve the user’s effort, ensure successful task performance, and avoid errors [2, 6]. However, Clippy appeared at the most inappropriate moments and interrupted users when not required. This led to Clippy’s rapid downfall [5], which demonstrates the importance of a careful invocation design for assistance. The example shows that communication via assistants needs to be well designed and adapted to the user in order to enhance trust and usage, and ultimately performance [7, 8]. Thus, the right timing of assistance invocation is an important design aspect [9, 10]. Invocation design in this context describes how assistance is activated. Some researchers [10] suggest a more advanced invocation mode in which the system “monitors the user in some way” (p. 504). However, existing approaches to assistance invocation are dominated by either automatic activation or manual user requests. Automatic provisioning is often designed with static, predefined rules, e.g., by applying explicit user modelling that incorporates the users’ goals, needs, or other preferences to detect their need for assistance [11]. Both modes have proven not to be entirely sufficient [6, 12], possibly because users are not always aware of when they need help and likewise frequently do not know how to use assistance effectively [13–15]. Furthermore, the “right time to intervene is [still] difficult to predict” [16] for the systems, and consequently users get annoyed or lose their flow when being interrupted at the wrong moments [5, 17]. As user assistance serves to relieve users’ working memory [10], the system should not additionally burden users with interruptions at the wrong time.
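The three invocation modes discussed above (automatic triggers, manual requests, and the state-based invocation argued for here) can be contrasted in a minimal sketch. This is illustrative only; all names, fields, and thresholds are hypothetical and not part of the paper.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    idle_seconds: float      # simple observable used by static triggers
    requested_help: bool     # explicit manual invocation
    negative_valence: bool   # sensed cognitive-affective state (hypothetical)
    mental_effort: float     # normalized 0..1 proxy (hypothetical)

def invoke_static(state: UserState) -> bool:
    # Automatic mode: fires on a predefined rule, regardless of actual need.
    return state.idle_seconds > 30

def invoke_manual(state: UserState) -> bool:
    # Manual mode: relies on users knowing that (and when) they need help.
    return state.requested_help

def invoke_state_based(state: UserState) -> bool:
    # State-based mode argued for in this paper: act on sensed user states.
    # The 0.7 cutoff is an invented placeholder for an empirical threshold.
    return state.negative_valence and state.mental_effort > 0.7
```

An idle but struggling user would be interrupted by the static rule and ignored by the manual mode; only the state-based mode ties invocation to the sensed need for assistance.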
To address this research gap of providing intelligent invocation of user assistance [18], we argue for taking into account the cognitive-affective states of the user in real-time. With the term cognitive-affective states we refer to user states that involve both affective and cognitive activity [19]. These states heavily impact the interaction between humans and technology, users’ need for assistance, and consequently their task performance [20–23]. User states that influence users’ interaction with IT are, in particular, task-dependent negative cognitive-affective states, such as frustration or anxiety [6], as well as high mental effort [24]. Thus, we assume that these user states correspondingly influence users’ need for assistance [21, 22]. Drawing on the theoretical assumptions of the Attentional Control Theory [20], we argue that the assessment of the users’ negative cognitive-affective states with neurophysiological data is an important design aspect to further improve user assistance invocation [6, 25]. Ultimately, systems can automatically adjust assistance invocation to sensed user states to increase the users’ efficiency, performance, and satisfaction [6]. In our research we follow a NeuroIS approach [8, 26]. One major advantage with regard to the outlined problem is the opportunity to observe latent variables, such as the users’ need for assistance, “directly from body signals” [27]. Thus, the research question guiding this work is:

Can we identify users’ need for assistance by unobtrusive and real-time measurement and analysis of cognitive-affective states of the user?

With this we want to expand and add value to the NeuroIS literature as well as user assistance research by investigating psychophysiological correlates that reliably detect, in a timely manner, the users’ need for assistance and the IT-related behavior of assistance usage [26]. The research described in this paper will ultimately help to design intelligent invocation of user assistance systems.
2 Conceptual and Theoretical Foundations
2.1 Assistance and Invocation Modes
Assistance systems are provided in order to support users in performing their tasks better [4]. Assistance tends to become more and more tailored to the users’ needs in order to increase performance at the right time and in the right context [2, 4]. Moreover, varying in their degree of system intelligence (e.g., provision of context-aware assistance) and the interaction enabled by the system (e.g., offering highly sophisticated dialog interfaces), assistance systems can exhibit different maturity levels in terms of sensing the users’ current environment and activities [4].

One critical design aspect when providing adequate user assistance is to determine when a user actually wants or needs assistance [16]. The right point of interrupting users has been studied extensively in the context of notifications [13, 28, 29]. Badly timed interruptions can cause deteriorated performance and decision-making, negative user states (like annoyance, frustration, or cognitive overload), and ultimately distrust in the system’s competency and reduced usage [28–31]. Correspondingly, the timing, or invocation, of assistance is crucial for its success. Recent research examining the right time to provide assistance agrees that it should be guided by the users’ characteristics, needs, and context [6, 16, 25, 32]. Early approaches to such user modeling [11, 16] did not provide accurate determinants for assistance invocation [12]. In line with other research [6, 33–35], we argue that users’ cognitive-affective states determine the need for assistance. Modeling approaches based on these determinants exist [6, 35], yet they focus on user modeling or use manually pre-determined thresholds for assistance invocation. This reveals the lack of assistance whose timing of intervention acts on real-time sensing of users’ cognitive-affective states.
2.2 Theorizing on Invocation Determinants: A NeuroIS Perspective
Assistance and NeuroIS. In order to gain a better theoretical understanding of human behavior, NeuroIS research [36] seeks to find psychophysiological and neural correlates for already established IT constructs [26]. Especially in the context of offering user assistance, traditional IS research methods like surveys and interviews encounter difficulties in reliably predicting users’ need for assistance [15].

As users have been found to not always be aware of when they need help, to underestimate their need for assistance, or occasionally not to want to admit that they need assistance [13, 15, 37], a NeuroIS research approach can potentially shed light on this issue. However, objective and unobtrusive measurement methods for users’ need for assistance are still absent. Neurophysiological tools offer great potential for new insights on user states by measuring direct responses of the human body to stimuli [27]. Applying this approach makes it possible to capture unconscious processes that users might not be able to introspect and to gain insights on determinants of behavior that users are uncomfortable reporting on [26]. Moreover, the possibility of obtaining real-time as well as continuous data enables the analysis of temporal aspects and the measurement of simultaneous processes of constructs [26]. Consequently, this helps to design adaptive IT artifacts that consider the user states which determine IT behavior [38].

With regard to designing assistance invocation, this integration of psychophysiological determinants of users’ assistance needs is absent from existing research. First attempts to include affective user states in the invocation determination of user assistance exist [6, 35, 39, 40]. Yet, these studies primarily address the issue only from a theoretical view. In particular, to our knowledge, there is no published research on determining assistance invocation by empirically integrating psychophysiological measurements in order to monitor users’ states. However, not only from a NeuroIS perspective but also from a psychological view, this approach can unlock great potential for reliably detecting users’ need for assistance.
Attentional Control Theory (ACT). For decades, researchers have agreed that affective as well as cognitive states have motivational properties that lead to observable behavior in IS [38, 41, 42] as well as non-IS contexts [20, 43].

With their psychological theory, Eysenck et al. [20, 44] offer valuable insights into the effects of affect and cognition on users’ need for assistance. The theory describes the influence of especially negative affective states on people’s task performance and related behavior, in particular their coping strategies. In order to prevent a performance loss due to experienced negative affect and increased cognitive effort, people adjust their behavior by, for instance, searching for assistance. Eysenck et al. revealed that provoked anxiety impairs people’s processing efficiency when working on a goal-directed task because people shift their attentional focus from the current task to the threatening stimulus. This increases their cognitive resource utilization. When responding to this change, people need additional resources (internal or external) to cope with the situation in order not to experience a loss in performance effectiveness. However, this leads to a decrease in processing efficiency. One possibility to protect people from such an efficiency loss caused by negative affective states is the provision of auxiliary processing resources [20], e.g., by offering assistance.
Cognitive-Affective States. The ACT mainly focuses on anxiety as the negative affective state which impairs attentional control over a current task and ultimately efficiency and the resulting performance [20]. Nevertheless, the theoretical implications of ACT have already been applied in the IS context and expanded to negative affective states in general, which evidently influence IT-related behavior [42, 45]. By definition, an affective state arises from an individual’s reaction to an event and influences cognitive, physiological, as well as behavioral components [6, 46]. Especially in the context of human-computer interaction, affective states play an important role when explaining user behavior [19, 21, 47]. Moreover, user states of high cognitive activity are often related to emotional responses, either in parallel or as interacting occurrences [26]. Monitoring user states that involve both affective and cognitive activity can reveal new insights on the users and their needs [26]. Baker et al. [19] refer to the latter as cognitive-affective user states. Within the context of assistance invocation, negative cognitive-affective states in particular are assumed to reveal important insights [23]. They are characterized by a negative affective valence, which can be assessed with the help of facial electromyography tools or facial expression analysis [48]. Examples of negative cognitive-affective user states are frustration or boredom [49]. Likewise, anxiety can be categorized as such a cognitive-affective state [50]. As antecedents of negative cognitive-affective user states, certain stressors, such as time pressure caused by dropped network connections and high task complexity, have been found to increase the need for assistance [37].
3 Research Propositions
Drawing on ACT, we assume that negative cognitive-affective user states influence users’ related behavior in general and specifically the usage of assistance. This assumption is based on the influence of negative cognitive-affective user states on the users’ attentional focus. As this focus shifts from executing the task to coping with the affect-evoking stimuli, the user has to invest more cognitive resources [20]. This increase in resource utilization is represented by users’ mental effort, i.e., the amount of cognitive resources required to manage the workload demanded by a task [51]. As users experience a negative cognitive-affective state, we therefore assume that their level of mental effort increases accordingly:

Proposition P1: A negative cognitive-affective state increases users’ mental effort.

This increase in mental effort leads to a potential loss in users’ efficiency and ultimately task performance [20]. In order to prevent this undesirable outcome, users are hypothesized to utilize coping strategies to decrease mental effort, i.e., to restore efficiency and performance [20, 52, 53]. In the case of our research project, this is represented by the usage of a user assistance system that offers additional information [54, 55]:

Proposition P2: Increased mental effort increases assistance usage.
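The proposed mediation chain (a negative state raises mental effort, P1; raised effort raises assistance usage, P2) can be illustrated with a toy numeric sketch. The baseline value, effect size, and link function below are invented for illustration only; the propositions would have to be estimated empirically.

```python
def mental_effort(negative_state: bool, baseline: float = 0.4) -> float:
    """P1 (toy form): a negative cognitive-affective state adds to mental effort."""
    return min(1.0, baseline + (0.3 if negative_state else 0.0))

def p_assistance_use(effort: float) -> float:
    """P2 (toy form): higher mental effort raises the probability of assistance use."""
    return effort ** 2  # any monotonically increasing link would serve here

# The chain: negative state -> higher effort -> higher usage probability.
low = p_assistance_use(mental_effort(False))
high = p_assistance_use(mental_effort(True))
```

Under any monotone link, the negative-state condition yields a strictly higher usage probability, which is exactly the ordering the two propositions jointly predict.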
4 Proposed Methodology
In a first step, we propose to examine the effect of negative cognitive-affective user states on the users’ IT-related behavior (in this case, the usage of the offered user assistance). To test whether the theoretically derived propositions prove valid, we plan to conduct a laboratory experiment.

We measure the users’ cognitive-affective state with a combination of psychophysiological tools. We assess users’ emotional valence via facial expressions captured with webcams [48] and users’ mental effort via heart rate measured with ECG [8, 38]. ECG is one possibility among others to assess mental effort and has been found to be a reliable predictor of people’s mental effort [56]. Furthermore, compared to other methods, such as EEG [57], it constitutes a minimally invasive measurement method [26]. Together with existing technical restrictions, this led to our decision to approximate mental effort via ECG measures. As the experimental context, we chose a travel booking scenario and formulate it as a goal-directed task according to the ACT [20] by providing participants with incentives for successful task execution. The experimental task will comprise the configuration of a trip under specific constraints with respect to budget and time. The participants’ task is to configure the optimal trip in order to fulfill the experiment’s objective. A virtual travel agency will offer the assistance, which the user can consult manually if needed. The participants will be randomly assigned to one of two treatment groups or a control group. The treatment structure will be composed of a low negative affect condition and a high negative affect condition. The treatments differ with respect to the amount of task features that stimulate negative cognitive-affective states. We then observe when participants use the assistance and how this depends on the treatment.
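A minimal sketch of the ECG-based mental-effort proxy described above: mean heart rate computed from R-R intervals and compared against a resting baseline. The interval values and the 10%-above-baseline criterion are illustrative assumptions, not thresholds from the proposed design.

```python
def mean_heart_rate(rr_intervals_ms: list[float]) -> float:
    # Heart rate in bpm derived from the mean R-R interval in milliseconds.
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60_000.0 / mean_rr

def elevated_effort(task_rr_ms: list[float], baseline_rr_ms: list[float],
                    ratio: float = 1.10) -> bool:
    # Flag elevated mental effort when task HR exceeds baseline HR by `ratio`.
    # The 10% criterion is a hypothetical placeholder, not an empirical value.
    return mean_heart_rate(task_rr_ms) > ratio * mean_heart_rate(baseline_rr_ms)

baseline = [850, 870, 860, 855]   # ~70 bpm at rest (hypothetical values)
task = [700, 690, 710, 705]       # ~86 bpm under load (hypothetical values)
print(elevated_effort(task, baseline))  # → True
```

In a neuro-adaptive invocation design, a flag like this (fused with the webcam-based valence signal) would be one candidate trigger for offering assistance.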
5 Expected Contribution and Future Work
In the next steps, we will finalize the experimental design and carry out the experiment. Conducting the experiment and subsequently evaluating the experimental data will reveal valuable insights into the determinants of users’ assistance needs. With this, we will gain first design knowledge on the appropriate invocation timing of user assistance systems. We aim at validating the two suggested constructs of user states as neurophysiological correlates for assistance need in order to design neuro-adaptive invocation of user assistance in later stages of this research [26]. The final objective is to test and evaluate the resulting design knowledge in a follow-up experiment. Thereby, we contribute to collecting initial design knowledge towards finding the right moments for user assistance invocation. By ultimately testing the effects of such an invocation on the user, we will further contribute to research on user assistance in general. The hypothesized positive outcomes of applying timely user assistance [6] as well as possible negative effects caused by interrupting the user with the assistance itself [58] will be identified. Conceivably, the expected results will identify the optimal time to offer user assistance even before the user experiences any negative cognitive-affective state. User assistance in the future can then be designed to detect whether the user is trending towards a negative cognitive-affective state in order to prevent any associated performance loss by offering timely user assistance.

Furthermore, the proposed experiment has some limitations that open up opportunities for future work. As we will use an ECG measurement approach for assessing participants’ mental effort, future research on the topic could evaluate other measurement methodologies in comparison. The approach of Eye-Fixation-Related Potentials (EFRP) proposed by Léger et al. [59] could offer further insights into participants’ mental effort during task execution and help to identify when to offer user assistance. Moreover, we are aware that not only users’ negative cognitive-affective states might constitute a need for user assistance. Other factors should be investigated in future research on the topic. Positive affective states have been found to influence task performance, too [60]. Together with examining the role of users’ attention, this could complement the proposed research on finding the optimal timing for offering user assistance.
References

1. Mou, Y., Xu, K.: The media inequality: Comparing the initial human-human and human-AI social interactions. Comput. Human Behav. 72, 432–440 (2017).
2. Sarikaya, R.: The Technology Behind Personal Digital Assistants: An overview of the system architecture and key components. IEEE Signal Process. Mag. 34, 67–81 (2017).
3. Baig, E.C.: Personal digital assistants are on the rise (and they want to talk),
4. Maedche, A., Morana, S., Schacht, S., Werth, D., Krumeich, J.: Advanced User
Assistance Systems. Bus. Inf. Syst. Eng. 58, 25 (2016).
5. Veletsianos, G.: Cognitive and Affective Benefits of an Animated Pedagogical Agent: Considering Contextual Relevance and Aesthetics. J. Educ. Comput. Res. 36, 373–377 (2007).
6. Liao, W., Zhang, W., Zhu, Z., Ji, Q., Gray, W.D.: Toward a decision-theoretic framework for affect recognition and user assistance. Int. J. Hum. Comput. Stud. 64, 847–873 (2006).
7. Parasuraman, R., Wilson, G.F.: Putting the brain to work: neuroergonomics past, present, and future. Hum. Factors. 50, 468–474 (2008).
8. vom Brocke, J., Riedl, R., Léger, P.-M.: Application Strategies for Neuroscience in Information Systems Design Science Research. J. Comput. Inf. Syst. 53, 1–13 (2013).
9. Silver, M.S.: On the Design Features of Decision Support Systems: The Role of System Restrictiveness and Decisional Guidance. In: Handbook on Decision Support Systems 2: Variations. pp. 261–293 (2008).
10. Gregor, S., Benbasat, I.: Explanations from Intelligent Systems: Theoretical Foundations and Implications for Practice. MIS Q. 23, 497–530 (1999).
11. Horvitz, E., Breese, J., Heckerman, D., Hovel, D., Rommelse, K.: The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users. Fourteenth Conf. Uncertain. Artif. Intell. 256–265 (1998).
12. Antwarg, L., Lavie, T., Rokach, L., Shapira, B., Meyer, J.: Highlighting items as means of adaptive assistance. Behav. Inf. Technol. 32, 1–17 (2012).
13. Schiaffino, S., Amandi, A.: Polite personal agents. IEEE Intell. Syst. 21, 12–19
14. Babin, L.M., Tricot, A., Mariné, C.: Seeking and providing assistance while learning to use information systems. Comput. Educ. 53, 1029–1039 (2009).
15. Novick, D.G., Elizalde, E., Bean, N.: Toward a More Accurate View of When and How People Seek Help with Computer Applications. ACM 25th Int. Conf. Des. Commun. 95–102 (2007).
16. Ginon, B., Stumpf, S., Jean-Daubias, S.: Towards the Right Assistance at the Right Time for Using Complex Interfaces. In: Proceedings of the International Working Conference on Advanced Visual Interfaces - AVI ’16. pp. 240–243. ACM Press, New York, New York, USA (2016).
17. McFarlane, D., Latorella, K.: The Scope and Importance of Human Interruption in Human-Computer Interaction Design. Human-Computer Interact. 17, 161
18. Morana, S., Schacht, S., Scherp, A., Maedche, A.: A review of the nature and effects of guidance design features. Decis. Support Syst. 97, 31–42 (2017).
19. Baker, R.S.J. d., D’Mello, S.K., Rodrigo, M.M.T., Graesser, A.C.: Better to be frustrated than bored: The incidence, persistence, and impact of learners’ cognitive-affective states during interactions with three different computer-based learning environments. Int. J. Hum. Comput. Stud. 68, 223–241 (2010).
20. Eysenck, M.W., Derakshan, N., Santos, R., Calvo, M.G.: Anxiety and cognitive performance: attentional control theory. Emotion. 7, 336–353 (2007).
21. Klein, J., Moon, Y., Picard, R.W.: This computer responds to user frustration: Theory, design, and results. Interact. Comput. 14, 119–140 (2002).
22. Picard, R.W.: Affective computing: Challenges. Int. J. Hum. Comput. Stud. 59, 55–64 (2003).
23. Fairclough, S.H.: Fundamentals of physiological computing. Interact. Comput. 21, 133–145 (2009).
24. Fehrenbacher, D.D., Djamasbi, S.: Information systems and task demand: An exploratory pupillometry study of computerized decision making. Decis. Support Syst. (2017).
25. Yorke-Smith, N., Saadati, S., Myers, K.L., Morley, D.N.: The Design of a Proactive Personal Agent for Task Management. Int. J. Artif. Intell. Tools. 21, 1250004-130 (2012).
26. Dimoka, A., Banker, R.D., Benbasat, I., Davis, F.D., Dennis, A.R., Gefen, D., Gupta, A., Ischebeck, A., Kenning, P., Pavlou, P.A., Müller-Putz, G.R., Riedl, R., vom Brocke, J., Weber, B.: On the Use of Neurophysiological Tools in IS Research: Developing a Research Agenda for NeuroIS. MIS Q. 36, 679–702
27. Liang, T.-P., vom Brocke, J.: Special Issue: Neuroscience in Information Systems Research. J. Manag. Inf. Syst. 30, 7–12 (2014).
28. Bailey, B.P., Konstan, J.A.: On the need for attention-aware systems: Measuring effects of interruption on task performance, error rate, and affective state. Comput. Human Behav. 22, 685–708 (2006).
29. Adamczyk, P.D., Bailey, B.P.: If not now when?: the effects of interruption at different moments within task execution. Proc. SIGCHI Conf. Hum. factors Comput. Syst. 6, 271–278 (2004).
30. Speier, C., Valacich, J.S., Vessey, I.: The Influence of Task Interruption on Individual Decision Making: An Information Overload Perspective. Decis. Sci. 30, 337–360 (1999).
31. LeeTiernan, S., Cutrell, E.B., Czerwinski, M., Hoffman, H.: Effective Notification Systems Depend on User Trust. Human-Computer Interact. - INTERACT 2001 Conf. Proc. 684–685 (2001).
32. Wang, J., Wang, Y., Wang, Y.: CAPFF: A Context-Aware Assistant for Paper Form Filling. IEEE Trans. Human-Machine Syst. 16 (2016).
33. Hudlicka, E.: To feel or not to feel: The role of affect in human-computer interaction. Int. J. Hum. Comput. Stud. 59, 1–32 (2003).
34. Liao, W., Zhang, W., Zhu, Z., Ji, Q.: A decision theoretic model for stress recognition and user assistance. Proc. Twent. Natl. Conf. Artif. Intell. Seventeenth Innov. Appl. Artif. Intell. Conf. 529–534 (2005).
35. Li, X., Ji, Q.: Active Affective State Detection and User Assistance With Dynamic Bayesian Networks. IEEE Trans. Syst. Man, Cybern. A Syst. Humans. 35, 93–105 (2005).
36. Riedl, R., Davis, F.D., Hevner, A.R.: Towards a NeuroIS Research Methodology: Intensifying the Discussion on Methods, Tools, and Measurement. J. Assoc. Inf. Syst. 15, i–xxxv (2014).
37. Ellis, S., Tyre, M.J.: Helping relations between technology users and developers: A vignette study. IEEE Trans. Eng. Manag. 48, 56–69 (2001).
38. Riedl, R., Léger, P.-M.: Fundamentals of NeuroIS - Information Systems and the Brain. (2016).
39. Dong, J., Yang, H.I., Oyama, K., Chang, C.K.: Human desire inference process based on affective computing. Proc. - Int. Comput. Softw. Appl. Conf. 347–350 (2010).
40. Frank, K., Robertson, P., Gross, M., Wiesner, K.: Sensor-based identification of human stress levels. 2013 IEEE Int. Conf. Pervasive Comput. Commun. Work. PerCom Work. 2013. 127–132 (2013).
41. Gregor, S., Lin, A.C.H., Gedeon, T., Riaz, A., Zhu, D.: Neuroscience and a Nomological Network for the Understanding and Assessment of Emotions in Information Systems Research. J. Manag. Inf. Syst. 30, 13–48 (2014).
42. Hibbeln, M., Jenkins, J.L., Schneider, C., Valacich, J.S., Weinmann, M.: How is your user feeling? Inferring emotion through Human-Computer Interaction devices. MIS Q. 41, 1–22 (2017).
43. Brown, J.S., Farber, I.E.: Emotions conceptualized as intervening variables - with suggestions toward a theory of frustration. Psychol. Bull. 48, 465–495
44. Eysenck, M.W., Derakshan, N.: New perspectives in attentional control theory. Pers. Individ. Dif. 50, 955–960 (2011).
45. Anderson, B.B., Vance, A., Kirwan, C.B., Eargle, D., Jenkins, J.L.: How users perceive and respond to security messages: a NeuroIS research agenda and empirical study. Eur. J. Inf. Syst. 2016, 1–27 (2016).
46. Nass, C., Brave, S.: Emotion in Human-Computer Interaction. In: IEEE Transactions on Pattern Analysis and Machine Intelligence. pp. 77–92 (2007).
47. Scheirer, J., Fernandez, R., Klein, J., Picard, R.W.: Frustrating the user on purpose: a step toward building an affective computer. Interact. Comput. 14, 93–118 (2002).
48. Ekman, P.: Facial expression and emotion. Am. Psychol. 48, 384–392 (1993).
49. D’Mello, S., Graesser, A.: The half-life of cognitive-affective states during complex learning. Cogn. Emot. 25, 1299–1308 (2011).
50. Schlenker, B.R., Leary, M.R.: Social anxiety and self-presentation: a conceptualization and model. Psychol. Bull. 92, 641–669 (1982).
51. Sweller, J., van Merrienboer, J., Paas, F.: Cognitive architecture and instructional design. Educ Psychol Rev. 10, 251–296 (1998).
52. Peavler, S.W.: Pupil Size, Information Overload and Performance Differences. Psychophysiology. 11, 559–566 (1974).
53. Paas, F., Van Merrienboer, J.J.G.: The Efficiency of Instructional Conditions: An Approach to Combine Mental Effort and Performance Measures. Hum. Factors. 35, 737–743 (1993).
54. Folkman, S., Moskowitz, J.T.: Positive affect and the other side of coping. Am. Psychol. 55, 647–654 (2000).
55. Beaudry, A., Pinsonneault, A.: Understanding User Responses to Information Technology: A Coping Model of User Adaptation. MIS Q. 29, 493–524 (2005).
56. Haapalainen, E., Kim, S., Forlizzi, J.F., Dey, A.K.: Psycho-Physiological Measures for Assessing Cognitive Load. Proc. 12th ACM Int. Conf. Ubiquitous Comput. 301–310 (2010).
57. Mirhoseini, S.M.M., Léger, P.-M., Sénécal, S.: The Influence of Task
Characteristics on Multiple Objective and Subjective Cognitive Load
Measures. Presented at the (2017).
58. Jenkins, J.L., Anderson, B.B., Vance, A., Kirwan, C.B., Eargle, D.: More Harm
Than Good? How Messages That Interrupt Can Make Us Vulnerable. Inf. Syst.
Res. (2016).
59. Léger, P.M., Titah, R., Sénecal, S., Fredette, M., Courtemanche, F., Labonte-Lemoyne, Él., De Guinea, A.O.: Precision is in the eye of the beholder: Application of eye fixation-related potentials to information systems research. J. Assoc. Inf. Syst. 15, 651–678 (2014).
60. Jung, N., Wranke, C., Hamburger, K., Knauff, M.: How emotions affect logical
reasoning: evidence from experiments with mood-manipulated participants,
spider phobics, and people with exam anxiety. Front. Psychol. 5, (2014).
Conference Paper
Full-text available
The main purpose of this research is to develop a framework of trust determinants in the interactions between people and cognitive assistants (CAs). We define CAs as new decision tools, able to provide people with high quality recommendations and help them make data-driven decisions understanding the environment around people. We also define trust as the belief of people that CAs will help them reach a desired decision. An extensive review on trust in psychology, sociology, economics and policy making, organizational science, automation, and robotics is conducted to determine the factors influence people’s trust on CAs. On the basis of this review, we develop a framework of trust determinants in people’s interaction with CAs where reliability, attractiveness, and emotional attachment positively affect the intention of people in society to use CAs. Our framework also shows that innovativeness positively moderates the intention to use CAs. Finally, in this paper, we suggest future research directions for developing and validating more concrete scales in measuring trust determinants in the interactions between people and CAs.
Information systems (IS) play an important role in successful execution of organizational decisions, and the ensuing tasks that rely on those decisions. Because decision making models show that cognitive load has a significant impact on how people use information systems, objective measurement of cognitive load becomes both relevant and important in IS research. In this paper, we manipulate task demand during a decision making task in four different ways. We then investigate how increasing task demand affects a user's pupil data during interaction with a computerized decision aid. Our results suggest that pupillometry has the potential to serve as a reliable, objective, continuous and unobtrusive measure of task demand and that the adaptive decision making theory may serve as a suitable framework for studying user pupillary responses in the IS domain.
Guidance design features in information systems are used to help people in decision-making, problem solving, and task execution. Various information systems instantiate guidance design features, which have specifically been researched in the field of decision support systems for decades. However, due to the lack of a common conceptualization, it is difficult to compare the research findings on guidance design features from different literature streams. This article reviews and analyzes the work of the research streams of decisional guidance, explanations, and decision aids conducted in the last 25 years. Building on and grounded by the analyzed literature, we theorize an integrated taxonomy of guidance design features. Applying the taxonomy, we discuss existing empirical results, identify effects of different guidance design features, and propose opportunities for future research. Overall, this article contributes to research and practice. The taxonomy allows researchers to describe their work by using a set of dimensions and characteristics and to systematically compare existing research on guidance design features. From a practice-oriented perspective, we provide an overview of design features to support implementing guidance in various types of information systems.
As human-machine communication has yet to become prevalent, the rules of interaction between humans and intelligent machines need to be explored. This study investigates a specific question: during human users' initial interactions with artificial intelligence, do they reveal their personality traits and communicative attributes differently than in human-human interactions? A sample of 245 participants was recruited to view twelve conversation transcripts from six targets on a social media platform: half with Microsoft's chatbot Little Ice, and half with human friends. The findings suggested that when the targets interacted with Little Ice, they demonstrated personality traits and communication attributes different from those shown when interacting with humans. Specifically, users tended to be more open, more agreeable, more extroverted, more conscientious, and more self-disclosing when interacting with humans than with AI. The findings not only echo Mischel's cognitive-affective processing system model but also complement the Computers Are Social Actors paradigm. Theoretical implications are discussed.
We have long envisioned that one day computers will understand natural language and anticipate what we need, when and where we need it, and proactively complete tasks on our behalf. As computers get smaller and more pervasive, how humans interact with them is becoming a crucial issue. Despite numerous attempts over the past 30 years to make language understanding (LU) an effective and robust natural user interface for computer interaction, success has been limited and scoped to applications that were not particularly central to everyday use. However, speech recognition and machine learning have continued to be refined, and structured data served by applications and content providers has emerged. These advances, along with increased computational power, have broadened the application of natural LU to a wide spectrum of everyday tasks that are central to a user's productivity. We believe that as computers become smaller and more ubiquitous [e.g., wearables and Internet of Things (IoT)], and the number of applications increases, both system-initiated and user-initiated task completion across various applications and web services will become indispensable for personal life management and work productivity. In this article, we give an overview of personal digital assistants (PDAs); describe the system architecture, key components, and technology behind them; and discuss their future potential to fully redefine human-computer interaction.
Using electroencephalography (EEG), this study aims at extracting three features from an instantaneous mental workload measure and linking them to different aspects of the workload construct. An experiment was designed to investigate the effect of two workload inductors (task difficulty and uncertainty) on the extracted features, along with a subjective measure of mental workload. Results suggest that both subjective and objective measures of workload are able to capture the effect of task difficulty; however, only accumulated load was found to be sensitive to task uncertainty. We argue that the three EEG measures derived from instantaneous workload can be used as criteria for designing more efficient information systems.
System-generated alerts are ubiquitous in personal computing and, with the proliferation of mobile devices, daily activity. While these interruptions provide timely information, research shows they come at a high cost in terms of increased stress and decreased productivity. This is due to dual-task interference (DTI), a cognitive limitation in which even simple tasks cannot be simultaneously performed without significant performance loss. Although previous research has examined how DTI impacts the performance of a primary task (the task that was interrupted), no research has examined the effect of DTI on the interrupting task. This is an important gap because in many contexts, failing to heed an alert—the interruption itself—can introduce critical vulnerabilities. Using security messages as our context, we address this gap by using functional magnetic resonance imaging (fMRI) to explore how (1) DTI occurs in the brain in response to interruptive alerts, (2) DTI influences message security disregard, and (3) the effects of DTI can be mitigated by finessing the timing of the interruption. We show that neural activation is substantially reduced under a condition of high DTI, and the degree of reduction in turn significantly predicts security message disregard. Interestingly, we show that when a message immediately follows a primary task, neural activity in the medial temporal lobe is comparable to when attending to the message is the only task. Further, we apply these findings in an online behavioral experiment in the context of a web-browser warning. We demonstrate a practical way to mitigate the DTI effect by presenting the warning at low-DTI times, and show how mouse cursor tracking and psychometric measures can be used to validate low-DTI times in other contexts. Our findings suggest that although alerts are pervasive in personal computing, they should be bounded in their presentation. 
The timing of interruptions strongly influences the occurrence of DTI in the brain, which in turn substantially impacts alert disregard. This paper provides a theoretically grounded, cost-effective approach to reduce the effects of DTI for a wide variety of interruptive messages that are important but do not require immediate attention.
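The timing strategy described above — holding non-urgent alerts until a low-DTI moment such as the completion of a primary task — can be sketched as a simple queue-and-flush scheduler. The class, method names, and urgency flag below are hypothetical illustrations of the general approach, not the system evaluated in the study.

```python
# Illustrative sketch of deferring non-urgent alerts to low-DTI moments
# (e.g., right after a primary task finishes). Names are assumptions.

class AlertScheduler:
    def __init__(self):
        self.pending = []    # alerts waiting for a low-DTI moment
        self.delivered = []  # alerts already shown to the user

    def raise_alert(self, message, urgent=False):
        # Urgent alerts interrupt immediately; others wait for a boundary.
        if urgent:
            self.delivered.append(message)
        else:
            self.pending.append(message)

    def on_task_boundary(self):
        # A finished primary task is a low-DTI moment: flush queued alerts.
        self.delivered.extend(self.pending)
        self.pending.clear()

sched = AlertScheduler()
sched.raise_alert("Update available")                   # deferred
sched.raise_alert("Certificate invalid", urgent=True)   # shown at once
sched.on_task_boundary()
print(sched.delivered)  # → ['Certificate invalid', 'Update available']
```

Detecting task boundaries in practice is the hard part; the study above validates candidate low-DTI times with mouse cursor tracking and psychometric measures.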
Many users struggle when they have to use complex interfaces to complete everyday computing tasks. Offering intelligent, proactive assistance is becoming commonplace, yet determining the right time to provide help is still difficult. We conducted an empirical study that aimed to uncover which user factors influence following advice. Our results suggest that a user's background and expectations play a role in heeding assistance. Our work is a step towards understanding how to provide the right assistance at the right time and towards building proactive assistance systems that are personalized for individual users.
Users are vital to the information security of organizations. In spite of technical safeguards, users make many critical security decisions. An example is users' responses to security messages - discrete communications designed to persuade users to take actions that either impair or improve their security status. Research shows that although users are highly susceptible to malicious messages (e.g., phishing attacks), they are highly resistant to protective messages such as security warnings. Research is therefore needed to better understand how users perceive and respond to security messages. In this article, we argue for the potential of NeuroIS - cognitive neuroscience applied to Information Systems - to shed new light on users' reception of security messages in the areas of (1) habituation, (2) stress, (3) fear, and (4) dual-task interference. We present an illustrative study that shows the value of using NeuroIS to investigate one of our research questions. This example uses eye tracking to gain unique insight into how habituation occurs when people repeatedly view security messages, allowing us to design more effective security messages. Our results indicate that the eye movement-based memory (EMM) effect is a cause of habituation to security messages - a phenomenon in which people unconsciously scrutinize previously seen stimuli less than novel stimuli. We show that after only a few exposures to a warning, this neural aspect of habituation sets in rapidly, and continues with further repetitions. We also created a polymorphic warning that continually updates its appearance and found that it is effective in substantially reducing the rate of habituation as measured by the EMM effect. Our research agenda and empirical example demonstrate the promise of using NeuroIS to gain novel insight into users' responses to security messages that will encourage more secure user behaviors and facilitate more effective security message designs.