Christine Searle’s research while affiliated with Concordia University Ann Arbor and other places

Publications (2)


Figure 3. Illustration of target pictures a, b, c, …
Figure 5. Illustration of the two-alternative forced-choice (2AFC) test.
Figure 8. Mean and standard error (SE) values of trust increment in Patterns 2, 3, and 7 (*** p < .001).
Figure 9. Mean and standard error (SE) values of trust decrement in Patterns 0, 4, and 5 (*** p < .001).
Figure 10. Comparison of the magnitude of trust adjustment between (a) Patterns 0 and 2 and (b) Patterns 5 and 7 (*** p < .001).


Toward Quantifying Trust Dynamics: How People Adjust Their Trust After Moment-to-Moment Interaction With Automation
  • Article
  • Full-text available

August 2021 · 239 Reads · 74 Citations

Human Factors: The Journal of the Human Factors and Ergonomics Society

Christopher Schemanske · Christine Searle

Objective: We examine how human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation.

Background: Most existing studies measured trust by administering questionnaires at the end of an experiment. Only a limited number of studies have viewed trust as a dynamic variable that can strengthen or decay over time.

Method: Seventy-five participants took part in an aided memory recognition task. Participants viewed a series of images and later performed 40 recognition trials, identifying a target image presented alongside a distractor. In each trial, participants performed the initial recognition by themselves, received a recommendation from an automated decision aid, and then performed the final recognition. After each trial, participants reported their trust on a visual analog scale.

Results: Outcome bias and the contrast effect significantly influence human operators' trust adjustments. An automation failure leads to a larger trust decrement if the final outcome is undesirable, and a marginally larger trust decrement if the human operator succeeds at the task unaided. An automation success engenders a greater trust increment if the human operator fails the task. Additionally, automation failures have a larger effect on trust adjustment than automation successes.

Conclusion: Human operators adjust their trust in automation as a result of their moment-to-moment interaction with it. Their trust adjustments are significantly influenced by decision-making heuristics and biases.

Application: Understanding the trust adjustment process enables accurate prediction of operators' moment-to-moment trust in automation and informs the design of trust-aware adaptive automation.
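The abstract and figure captions describe trial-level trust adjustments grouped into 8 (2 × 2 × 2) performance patterns (initial recognition × automation recommendation × final outcome). As a rough illustration only, the Python sketch below shows one way such an analysis could be set up. The Trial fields, the 3-bit pattern encoding, and the first-difference definition of a trust adjustment are assumptions made for this sketch, not details taken from the paper.

```python
from collections import defaultdict
from dataclasses import dataclass
from math import sqrt
from statistics import mean, stdev


@dataclass
class Trial:
    initial_correct: bool     # operator's unaided recognition
    automation_correct: bool  # decision aid's recommendation
    final_correct: bool       # final outcome after seeing the recommendation
    trust: float              # post-trial trust rating (visual analog scale)


def pattern_id(t: Trial) -> int:
    """Encode a trial as a 3-bit pattern: (initial, automation, final).

    This 0-7 numbering is one plausible encoding, not necessarily the
    paper's; it merely gives each of the 2 x 2 x 2 outcomes a label.
    """
    return (int(t.initial_correct) << 2) | (int(t.automation_correct) << 1) | int(t.final_correct)


def trust_adjustments(trials: list[Trial]) -> dict[int, list[float]]:
    """Collect per-trial trust changes (current minus previous rating) by pattern."""
    by_pattern: dict[int, list[float]] = defaultdict(list)
    for prev, curr in zip(trials, trials[1:]):
        by_pattern[pattern_id(curr)].append(curr.trust - prev.trust)
    return by_pattern


def summarize(by_pattern: dict[int, list[float]]) -> None:
    """Print mean and standard error of the trust change for each pattern."""
    for pid in sorted(by_pattern):
        deltas = by_pattern[pid]
        se = stdev(deltas) / sqrt(len(deltas)) if len(deltas) > 1 else float("nan")
        print(f"Pattern {pid}: mean dTrust = {mean(deltas):+.2f}, SE = {se:.2f}, n = {len(deltas)}")
```

Under this encoding, a positive mean for a pattern would correspond to a trust increment and a negative mean to a decrement, mirroring the increment/decrement comparisons reported in Figures 8-10.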


Figure 1. The static snapshot view of trust versus the dynamic view of trust: taking a snapshot at time t, both agents have the same trust level, but their trust dynamics differ.
Figure 2. Flow chart of the aided memory recognition task.
Figure 3. Illustration of target pictures A, B, C, …
8 (2 × 2 × 2) possible performance patterns based on the …
Toward quantifying trust dynamics: How people adjust their trust after moment-to-moment interaction with automation

July 2021 · 115 Reads


Citations (1)


... 3. The AI systems provide personalized services that enhance my tourism experience. (Yang et al., 2023) Please indicate your level of agreement with the following statements regarding trust dynamics in the context of AI-driven tourism technologies (1 = Strongly Disagree to 5 = Strongly Agree): ...

Reference:

Assessing the interplay of trust dynamics, personalization, ethical AI practices, and tourist behavior in the adoption of AI-driven smart tourism technologies
Toward Quantifying Trust Dynamics: How People Adjust Their Trust After Moment-to-Moment Interaction With Automation

Human Factors: The Journal of the Human Factors and Ergonomics Society