Psychological Injury and Law (2023) 16:83–97
https://doi.org/10.1007/s12207-022-09462-0
On the Use of Eye Movements in Symptom Validity Assessment of Feigned Schizophrenia
Francesca Ales1 · Luciano Giromini1 · Lara Warmelink2 · Megan Polden2 · Thomas Wilcockson3 · Claire Kelly4 · Christina Winters5 · Alessandro Zennaro1 · Trevor Crawford2
Received: 15 April 2022 / Accepted: 24 August 2022 / Published online: 5 September 2022
© The Author(s) 2022
Abstract
Assessing the credibility of reported mental health problems is critical in a variety of assessment situations, particularly in
forensic contexts. Previous research has examined how the assessment of performance validity can be improved through
the use of bio-behavioral measures (e.g., eye movements). To date, however, there is a paucity of literature on the use of eye
tracking technology in assessing the validity of presented symptoms of schizophrenia, a disorder that is known to be associ-
ated with oculomotor abnormalities. Thus, we collected eye tracking data from 83 healthy individuals during the completion of the Inventory of Problems – 29 and investigated whether the oculomotor behavior of participants instructed to feign schizophrenia would differ from that of control participants asked to respond honestly. Results showed that feigners had a longer dwell time and a greater number of fixations on the feigning-keyed response options, regardless of whether they eventually endorsed those options (d > 0.80). Implications for how eye tracking technology can deepen our understanding of simulation strategies are discussed, as well as the potential of eye movement research to advance the field of symptom validity assessment.
Keywords Malingering · Schizophrenia · Symptom validity assessment · Eye movements

* Francesca Ales
francesca.ales@unito.it
Luciano Giromini
luciano.giromini@unito.it
Lara Warmelink
l.warmelink@lancaster.ac.uk
Megan Polden
m.polden@lancaster.ac.uk
Thomas Wilcockson
t.wilcockson@lboro.ac.uk
Claire Kelly
c.kelly8@newcastle.ac.uk
Christina Winters
C.L.Winters@tilburguniversity.edu
Alessandro Zennaro
alessandro.zennaro@unito.it
Trevor Crawford
t.crawford@lancaster.ac.uk
1 Department of Psychology, University of Turin, Via Verdi 10, 10123 Turin, TO, Italy
2 Department of Psychology, Lancaster University, Lancaster, UK
3 Department of Psychology, Loughborough University, Loughborough, UK
4 School of Psychology, Newcastle University, Newcastle upon Tyne, UK
5 Tilburg Institute for Law, Technology, and Society (TLS), Tilburg University, Tilburg, The Netherlands
Introduction
The term malingering refers to the conscious fabrication or
exaggeration of mental or physical symptoms in order to
gain secondary personal benefits or financial compensation,
avoid school, work or military service, receive drugs or med-
ication, or obtain mitigation of criminal charges (American Psychiatric Association, 2013). Failure to detect malingering
results in enormous social costs and places a heavy bur-
den on the healthcare system (Shapiro & Teasell, 1998). As
such, evaluating the credibility of presented symptoms has
become a key issue for almost all psychological injury evalu-
ations (Bush etal., 2014; Giromini etal., 2022; Sherman
etal., 2020; Sweet etal., 2021; Young, 2014).
Psychotic symptoms are especially likely to be feigned in the context of criminal trials. A study conducted in the Los Angeles County jail, which is considered the largest jail system in the USA, reported that almost a third of inmates malingered psychotic symptoms in order to be
prescribed psychoactive drugs (Pierre etal., 2004). Further-
more, given that being diagnosed with a mental illness often
leads to mitigation of punishment, defendants charged with
serious crimes may be particularly tempted to pretend to suf-
fer from psychosis (Resnick, 1999). Given that, and because
clinical judgment alone is not sufficient to identify the pres-
ence of malingering (Dandachi-FitzGerald etal., 2017), in
these settings, professionals are expected to include addi-
tional assessments specifically developed to test the validity
of presented mental health problems (Giromini etal., 2022;
Sherman etal., 2020; Sweet etal., 2021). These are typi-
cally referred to as Symptom Validity Tests (SVTs) when
employing a self-report format and Performance Validity
Tests (PVTs) when they present the test-taker with cognitive
problems or tasks to solve (Larrabee, 2012).
Symptom Validity Assessment
Symptom validity assessment consists of evaluating the
overall credibility of the mental health problems reported
by the examinee. In essence, SVTs and PVTs assist pro-
fessionals in determining whether the examinee has pro-
vided an accurate and truthful picture of their symptoms
and mental health problems without feigning or exagger-
ating their health status (Bush etal., 2005). To this end,
current guidelines recommend administering multiple SVTs
and multiple PVTs, and experts agree that decisions about
symptom validity should not be based on a single validity
test (Giromini etal., 2022; Sherman etal., 2020; Sweet etal.,
2021). In addition, several other sources of information need
to be considered too, such as observational materials and
interview-related behaviors.
The Structured Interview of Reported Symptoms (SIRS;
Rogers etal., 1992; SIRS-2; Rogers etal., 2010) and the
Miller Forensic Assessment of Symptoms Test (M-FAST;
Miller, 2001) are two well-known examples of interview-
based SVTs. In addition, a list of widely used and/or psy-
chometrically sound self-report SVTs has been reviewed
recently in a special issue of Psychological Injury and
Law (Giromini etal., 2022). These include, among others,
the Structured Inventory of Malingered Symptomatology
(SIMS; Smith & Burger, 1997), the Inventory of Problems
– 29 (IOP-29; Viglione & Giromini, 2020), and the validity
scales of the Minnesota Multiphasic Personality Inventory
(MMPI-2-RF; Ben-Porath & Tellegen, 2008; MMPI-3; Ben-
Porath & Tellegen, 2020a, b) and Personality Assessment
Inventory (PAI; Morey, 1991, 2007).
Eye Movements and Feigning
In recent years, technological advancement has prompted
researchers to find other measures able to detect non-
credible symptom presentations to use alongside self-
reports. For example, reaction times were found to be
useful in the assessment of invalid responding in both
symptom and performance validity tests (Hartman, 2008;
Vendemia etal., 2005; Willison & Tombaugh, 2006).
More specifically, it has been shown that reaction times
tend to be slower during feigning attempts compared to
honest responding (Browndyke, 2013; Johnson etal.,
2003), suggesting a delay when the respondent has to
plan a simulation strategy and then endorse a non-genuine
response (Willison & Tombaugh, 2006).
Other research has investigated individuals’ attempts of
feigning by means of psychophysiological and neurophysi-
ological techniques such as electroencephalography (EEG)
and functional magnetic resonance imaging (fMRI). The rationale
behind these studies is that the brain activity and neural
processes of individuals who cooperate with the assess-
ment process might differ from those of individuals who
feign. Thus, some studies suggested that the EEG signals
of individuals who feign are characterized by excessive
cognitive load (Vagnini etal., 2008). Similarly, Kozel etal.
(2005) conducted an fMRI study and showed that specific
brain regions (i.e., anterior cingulate, orbitofrontal cortex,
and dorsolateral prefrontal cortex) are involved in deception
attempt mechanisms. Other studies have examined the role
of Event-Related Potentials (ERPs) in malingering assess-
ment and detection. However, although early scientific evi-
dence suggested that the use of ERPs may be especially
useful in detecting feigned memory impairment (Ellwanger et al., 1996, 1999; Rosenfeld et al., 1996, 1998, 1999; Tardif et al., 2000, 2002), the results of ERP research have been
mixed, overall. Finally, another important line of psycho-
physiological research focused on malingering involves the
study of electrodermal activity (EDA) during deception.
In particular, Kozel etal. (2005) conducted a pilot study
showing that changes in EDA correlated with activation of
specific brain regions, i.e., the orbitofrontal cortex and the
anterior cingulate.
Among all these other technological advances, oculomo-
tor measures seem particularly promising for the detection of
attempts at invalid responding (Hannula et al., 2012). Eye tracking technology allows non-invasive measurement of eye position and behavior, providing a detailed window into the cognitive processes of both healthy adults and clinical populations (Duchowski, 2007). A number of studies
demonstrated that eye movements are associated with cogni-
tive processing, executive functions, attention deployment,
working memory, and response inhibition (Barnes, 2008;
Gooding & Basso, 2008; Hutton, 2008; Müri & Nyffeler,
2008; Olk & Kingstone, 2003; Pierrot-Deseilligny etal.,
2004; Sharpe, 2008). Additionally, abnormalities in eye
movements are typical of some neurological conditions and
mental disorders, such as dementia, Parkinson’s disease,
Alzheimer's disease, and schizophrenia (Crawford et al.,
2005; Heitger etal., 2009; Maruta etal., 2010; van Stockum
etal., 2008). The latter, in particular, has been studied exten-
sively in relation to eye movements. The first study report-
ing abnormalities in the eye movements of individuals with
schizophrenia dates back to 1908 (Diefendorf & Dodge, 1908),
and the visual scanning behavior of these patients has been
studied ever since. Abnormalities in the oculomotor patterns
of individuals suffering from schizophrenia cover a wide
range of eye movements, from smooth pursuit to anti-
saccadic movements to more exploratory search patterns, i.e.,
visual search (for a more exhaustive treatise on this topic,
see the next section “Eye Movements and Schizophrenia”).
Currently, non-invasive eye-tracking systems using video
cameras are available. Recent advances in the performance
of eye-tracking cameras allow us to measure eye movements
with high temporal and spatial resolution. Thus, research on the eye movements of individuals with mental illnesses, including schizophrenia, has been actively conducted. In the "Eye Movements and Schizophrenia" section below, we review the oculomotor characteristics of schizophrenia and the prospects for eye movements as biomarkers for mental illness.
The study of eye movements is a valuable source of infor-
mation in both clinical and research settings. However, eye
tracking technology is still underutilized in malingering-
related research. One of the few studies using eye move-
ments to better understand the phenomenon of malingering
is an unpublished doctoral dissertation (Bashem, 2016). In
this work, the author examined the eye movements of individuals genuinely suffering from mild Traumatic Brain Injury (mTBI) and of individuals feigning mTBI symptoms while they took the Test of Memory Malingering (TOMM; Tombaugh, 1996). Results indicated
that certain oculomotor patterns could provide incremen-
tal validity over the classification accuracy of the TOMM,
supporting the hypothesis that eye tracking technology
might add a significant contribution to symptom validity
assessment.
Similar results were found in a recent study (Kanser etal.,
2020) that investigated the incremental validity of eye move-
ments on PVTs in identifying individuals instructed to feign
cognitive deficits. Kanser etal. (2020) found that feigners
showed multiple eye tracking indexes of greater cognitive
effort compared to both healthy controls and individuals
with genuine TBI. Results of this study also indicated that
eye movements were the best predictor in discriminating
group membership. In light of these findings, Kanser etal.
(2020) suggested that the investigation of eye movements
may be an important addition to performance validity assessment, and that including eye movement measurement in routine cognitive evaluations would provide reliable,
bio-behavioral data able to improve sensitivity to feigned
deficits.
Another recent study (Tomer etal., 2018) also highlighted
the potential of eye movements to detect feigned cognitive
impairment by using eye tracking technology in conjunction
with the Word Memory Test (WMT; Green etal., 1996).
Results showed that eye movements used along with the
WMT were able to predict group membership (simulators vs
honest controls), with eye movements uniquely contributing
to this prediction. Tomer etal. (2018) thus concluded that
eye movements represent a promising addition to perfor-
mance validity assessment and that they are able to shed
light on the strategies used by simulators when attempting
to exaggerate or fabricate a cognitive deficit or a mental
disorder.
A similar pattern of findings was reported in a study that inspected eye movements in combination with the Binomial Forced-Choice Digit Memory Test (BFDMT; Liu et al., 2001), a tool widely used in China for testing performance validity (Zhong et al., 2021). Specifically, feigners showed
longer dwell time and more fixations compared to honest
controls, suggesting that various eye tracking parameters
may be potential markers to detect simulators.
Eye Movements and Schizophrenia
Taken together, all of the findings described above suggest
that oculomotor patterns may be useful for understanding the
cognitive processes underlying feigning and over-reporting.
To date, the literature has focused mainly on the use of eye
movements to detect feigned brain damage, and no study has
tested their efficacy in mental disorders in which eye move-
ment abnormalities are also detected, such as schizophre-
nia. Indeed, individuals with schizophrenia are known to
exhibit oculomotor anomalies in both simple subconscious
eye movements, such as smooth pursuit, and more complex
cognitive tasks such as the anti-saccade task and visual
search (Morita etal., 2019). With regard to the former, dur-
ing smooth pursuit eye movements, individuals are required
to follow a moving target (usually horizontally, vertically,
or elliptically) with their eyes. Individuals suffering from
schizophrenia are not able to smoothly follow the target as
their eyes cannot keep up with its speed (Lencer etal., 2015;
O’Driscoll & Callahan, 2008). With regard to the latter, sac-
cades are rapid eye movements that humans make constantly (3–4 saccades per second, on average) to bring the area of interest onto the fovea; they can occur as an involuntary
reflex, or as a voluntary movement to redirect fixation. This
second kind of eye movement can be assessed with the anti-saccade task, which is based on the premise that when a stimulus appears in our visual field, we are naturally inclined to perform a saccade directly toward it rather than toward any distracters. In the anti-saccade task, the subject is asked
to inhibit this involuntary saccadic reflex and look in the
opposite direction (e.g., if the distractor cue appears to the
left, the subject should look to the right). Previous literature
is consistent in supporting that patients with schizophrenia
have lower performance on the anti-saccade task compared
to control participants (Benson etal., 2012; Radant etal.,
2015).
Finally, Kojima etal. (1990) identified deficits in explora-
tory movements (i.e., visual search) of patients with schizo-
phrenia. This type of eye movement is strongly associated
with cognitive processing (Thomas & Lleras, 2007; Van der
Stigchel etal., 2006), which is equally impaired in patients
with schizophrenia (Silverstein & Keane, 2011).
First-degree relatives of patients suffering from schizo-
phrenia also underperform in smooth pursuit, anti-saccade,
and exploratory eye movement tasks (Kikuchi etal., 2018;
Levy etal., 2010; Takahashi etal., 2008), and candidate
genes related to these oculomotor abnormalities have been
identified (Greenwood etal., 2007, 2011). In fact, specific
eye movement patterns have been suggested as neuropsycho-
logical biomarkers for schizophrenia (Calkins etal., 2007;
Kojima etal., 2001; Light etal., 2012; Suzuki etal., 2009).
For all these reasons, examination of eye movements may
prove particularly informative in assessing the credibility of
schizophrenia-related symptoms.
To our knowledge, only one study (Ales et al., 2021) has so far investigated whether experimental simulators could
reproduce eye movement abnormalities typical of patients
suffering from schizophrenia. More specifically, eye move-
ment data were registered during two tasks widely used to
evaluate oculomotor deficits in schizophrenia (i.e., smooth
pursuit and anti-saccade) in order to test whether eye move-
ments of experimental feigners would differ from those of
honest participants. Results were also compared with those
reported in two major studies (O’Driscoll & Callahan, 2008;
Radant etal., 2015) that had collected eye movements’
data in a very large sample of schizophrenia patients. Ales
etal. (2021) observed that individuals who attempted to
feign schizophrenia were only partially able to reproduce
eye movement abnormalities typically shown by genuine
patients suffering from schizophrenia. It was therefore con-
cluded that eye movements’ investigation may be a valuable
addition to detect malingered schizophrenia.
The current study aimed to provide additional evidence
that eye tracking technology may contribute to symptom
validity assessment. More specifically, we investigated
whether the eye movements of healthy individuals taking an
SVT with the instruction to feign schizophrenia would differ
from those of control participants taking the same test but
with the instruction to respond honestly. In order to address
this research question, we recorded eye movements of 83
healthy volunteers taking the IOP-29. Approximately half
were instructed to respond honestly, whereas the other group
was instructed to feign schizophrenia. Our hypothesis was
that experimental feigners would spend more time fixating
on the different response options of the same items, com-
pared to control participants instructed to respond honestly.
In other words, we hypothesized that the extra uncertainty
and cognitive effort associated with feigning would lead to
extra consideration of the answer options (hypothesis 1).
Additionally, we also speculated that feigners would focus
more than controls on those response options that the IOP-
29 identifies as more indicative of feigning, whether or
not they actually endorsed those options (hypothesis 2). It
should be noted that although these hypotheses have not
yet been tested, the same data set has been used before to
evaluate some other hypotheses related to eye movements,
and the results of these other analyses have been described
in another article (Ales etal., 2021).
Methods
Participants
The demographic composition of the sample is described
in more detail in Ales etal. (2021). Briefly, the sample con-
sisted of 83 participants whose native language was English.
Sixty-four were women, and the mean age was 23.35 years
(SD = 6.84, range = 18–57). The sample was collected in
the north of England via an advertisement on the university
website and snowball sampling. The advertisement on the
website provided a brief description of the experiment and
inclusion criteria and informed potential participants that all
of them would be paid £5 upon completion of the experi-
ment and that some of them could potentially win an addi-
tional £25 (see below). Exclusion criteria for participation
in the study were (a) not being a native English speaker, (b)
presence of mental and/or neurological diseases, (c) history
of psychiatric disorders, and (d) presence of pathological
conditions related to the visual system. No statistical differ-
ences were found between the two groups in terms of age
[t(57) = 1.26, p = 0.20] and gender [χ2 = 0.007, p = 0.93].
Materials and Measures
The Inventory of Problems-29 (IOP-29)
The IOP-29 is a self-administered test measuring a range
of emotional, cognitive, and social experiences (Viglione
etal., 2017). Out of the 29 items, 27 provide three possible
response options, i.e., True, False, and Doesn’t make sense.
The other two items are open-ended questions that involve
calculations or logical reasoning. Ultimately, IOP-29 results
are interpreted on the basis of the False Disorder Probability
Score (FDS), which provides a probability value of finding
a given score within a reference sample of genuine patients
vs a reference sample of experimental feigners. The higher
the FDS, the lower the credibility of the presented com-
plaints. Viglione and Giromini (2020) set the FDS cutoff
at ≥ 0.50.
The algorithm underlying the FDS of IOP-29 is not
discussed in detail here for reasons of test security, so as
not to compromise its effectiveness. However, for the pre-
sent article, it is important to note that each IOP-29 item
contains one or more feigning-keyed response options, the
endorsement of which suggests a possible exaggeration or
negative response bias. In addition, the FDS uses a scaling
approach that incorporates a multiple-weighting procedure,
which is discussed in detail in the first IOP-29 validation
article (Viglione etal., 2017). Thus, an item keyed True
might have a weighting score of +2 for True, a weighting
score of +1 for Doesn’t Make Sense, and a weighting score
of −1 for False. For another item, in contrast, the response
option False may have a weighting score of +1 whereas the
response options True and Doesn’t make sense might have
both a weighting score of 0. In the current study, for those
items where more than one feigning-keyed response option is available, we considered the response option with the highest weight to be the "target" feigning-keyed response option
for the item.
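To make the multiple-weighting idea concrete, the following sketch sums hypothetical per-option weights of the kind described above and picks the highest-weighted option as the "target." The item labels and weight values are invented for illustration only; they are not the actual IOP-29 weights or the FDS algorithm, which is deliberately not disclosed.

```python
# Illustrative only: hypothetical per-item weights of the kind described above.
# These are NOT the real IOP-29 weights, nor the real FDS scoring algorithm.
HYPOTHETICAL_WEIGHTS = {
    "item_A": {"True": 2, "Doesn't make sense": 1, "False": -1},
    "item_B": {"True": 0, "Doesn't make sense": 0, "False": 1},
}

def raw_feigning_score(responses):
    """Sum the weight attached to the endorsed option of each item."""
    return sum(HYPOTHETICAL_WEIGHTS[item][answer] for item, answer in responses.items())

def target_option(item):
    """The 'target' feigning-keyed option is the response option with the highest weight."""
    weights = HYPOTHETICAL_WEIGHTS[item]
    return max(weights, key=weights.get)

if __name__ == "__main__":
    answers = {"item_A": "Doesn't make sense", "item_B": "False"}
    print(raw_feigning_score(answers))  # 1 + 1 = 2
    print(target_option("item_A"))      # "True"
```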
The validity of the IOP-29 has been demonstrated in
several studies. In particular, its effectiveness in detect-
ing feigned schizophrenia has been observed in several countries and regions, including North America (Viglione et al., 2017),
the UK (Winters etal., 2021), Italy (Di Girolamo etal.,
2021; Giromini etal., 2018, 2020b; Pignolo etal., 2021),
Slovenia (Šömen etal., 2021), and France (Banovic etal.,
2021). These studies demonstrated that the IOP-29 is
valid, reliable, and easily adaptable across cultures and
languages (see also Boskovic etal., 2022; Ilgunaite etal.,
2022). In fact, in a recent quantitative review, Giromini
and Viglione (2022) showed that the same cutoffs yielded
similar results in different cultures, populations, and con-
texts. This undoubtedly simplifies the use of the test and
minimizes errors due to different cutoff interpretations.
Importantly, IOP-29 ecological (Roma etal., 2019) and
incremental validity (Giromini etal., 2019, 2020a) has
also been demonstrated by empirical research. Indeed, sev-
eral studies consistently indicated that using the IOP-29
with other SVTs and PVTs improved classification accu-
racy (for a quantitative literature review, see Giromini &
Viglione, 2022).
Parallel Version of the Inventory of Problems-29
We created a parallel version of the IOP-29 to ensure that the
control and feigning groups did not differ from each other in
their visual scanning approach to the questions and response options of a test when asked to respond honestly.
Said differently, we wanted to rule out the possibility that the
participants in the control and feigning groups had generally
different eye movement approaches regardless of the condi-
tion to which they had been assigned.
The 29 items of the parallel version were drawn from the same pool of 181 items from which the standard IOP-29 items were extracted. Indeed, to develop the False Disorder Probability Score, Viglione et al. (2018) conducted a series of simulation studies utilizing a longer version of the IOP-29 (namely, the IOP) comprising a broader pool of 181 items.
Additional information concerning these items may be found
in Viglione etal. (2018). Similar to the standard IOP-29, the
parallel version has two items with open-ended response
options, whereas all other items have the three aforemen-
tioned response options (i.e., True, False, and Doesn’t make
sense).
Eye Tracker
An EyeLink 1000 Plus Desktop Mount tracker was used
to record participants’ eye behavior, using a chin rest to
minimize head movements. Consistent with the guidelines
reported in the EyeLink manual, the participant sat 40 cm away from the camera. Eye movements were sampled at 500 Hz, which allows the eyes' location to be reported every 2 ms (in eye tracking systems, temporal resolution is equivalent to the sampling rate, i.e., the number of times per second that the location of gaze is reported) with an accuracy within 0.25–0.50° of visual angle. The EyeLink 1000 Plus provided a spatial resolution of 0.01° of visual angle (spatial resolution may be defined as the precision level of the instrument and, as such, indexes its reliability). Before each task (i.e., the IOP-29 and its parallel version), all participants were asked to complete a 9-point calibration and validation in order to set the eye tracker for an accurate gaze point calculation tailored to each participant's eyes. Figure 1 shows the experimental setup, the EyeLink 1000 Plus apparatus, and a prototypical participant looking at the screen.

Fig. 1 Experimental setup showing the EyeLink 1000 Plus apparatus and a prototypical participant looking at the screen
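As a rough illustration of what these specifications mean in practice, the snippet below converts the sampling rate into the inter-sample interval and translates the 0.25–0.50° accuracy figure into an approximate on-screen distance at the reported 40 cm viewing distance (assuming the screen sits at roughly the same distance as the camera). The conversion is standard trigonometry; the resulting numbers are indicative only.

```python
import math

SAMPLING_RATE_HZ = 500       # sampling rate reported above
VIEWING_DISTANCE_CM = 40.0   # chin rest to camera distance reported above

# Temporal resolution: one gaze sample every 1000 / 500 = 2 ms.
inter_sample_interval_ms = 1000.0 / SAMPLING_RATE_HZ

def visual_angle_to_cm(angle_deg, distance_cm=VIEWING_DISTANCE_CM):
    """Approximate on-screen extent (cm) subtended by a given visual angle."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

if __name__ == "__main__":
    print(f"{inter_sample_interval_ms:.1f} ms between gaze samples")  # 2.0 ms
    for accuracy_deg in (0.25, 0.50):
        print(f"{accuracy_deg:.2f} deg ~ {visual_angle_to_cm(accuracy_deg):.2f} cm on screen")
    # 0.25 deg ~ 0.17 cm; 0.50 deg ~ 0.35 cm
```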
Procedure
A malingering experimental paradigm (sometimes named
“analogue” or “simulation” study) was implemented. Prior
to participants’ recruitment, the study was approved by the
Lancaster University Research Ethics Committee. Partici-
pants volunteered to take part in the study and were ran-
domly assigned to the “feigning” condition (i.e., instructed
to feign schizophrenia) or the “honest” condition (i.e., asked
to complete the entire procedure honestly). Specifically, par-
ticipants assigned to the feigning group (n = 43) received
a vignette (the vignettes used are reported in Ales et al., 2021) describing a scenario in which they might find
it convenient to simulate schizophrenia. The most typical
symptoms and manifestations of schizophrenia were also
reported at the end of the vignette. Experimental feigners
were warned not to exaggerate the symptom presentation to
avoid being detected as simulators. To this end, they were
told that if they could appear to be genuinely suffering from schizophrenia without being identified as simulators, they would have the chance to win £25. Said differently, prior to starting the experimental procedure, feigners were informed that they could win £25 if they could produce test results that looked like those of patients with schizophrenia. In this way, the
potential £25 award served as an external incentive. Experi-
mental feigners were also administered the parallel version
of the IOP-29 but with the instruction to answer honestly.
Participants assigned to the “honest” condition (i.e., con-
trol group, n = 40) received a vignette describing the same
scenario, but they were not asked to put themselves in the
shoes of someone willing to feign schizophrenia. More spe-
cifically, the vignette they were presented with was about
someone else feigning schizophrenia, and they were asked to
read and memorize it. This was done in order to ensure they
actually read and processed it. Then, they were instructed to
complete the IOP-29 and its parallel version honestly, fol-
lowing standard instructions. Honest responders were also
informed, prior to the beginning of the experiment, that they could have the chance to win £25 if they completed both tasks.
For the entire duration of the experiment (i.e., both while
filling out the standard and the parallel IOP-29), partici-
pants’ eyes were being tracked and their eye movements
recorded. The layout of both the IOP-29 and its parallel ver-
sion closely resembled the layout of the IOP-29 in its online
administration format (Fig.2). Order of administration of
the two IOP-29 versions (i.e., standard and parallel) was
randomized and counterbalanced. Once the experiment was
completed, all participants were paid £5 and were asked to
provide their email so that they could be contacted in case
they turned out to be the winners of the £25 award.
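The assignment scheme described above can be sketched as follows; the random seed, the exact 50/50 split, and the alternation used for counterbalancing are assumptions made for illustration, not the exact procedure of the study.

```python
import random

def build_schedule(participant_ids, seed=0):
    """Randomly assign participants to conditions and counterbalance the order
    of the standard and parallel IOP-29 (illustrative sketch only)."""
    rng = random.Random(seed)  # seed chosen only to make the example reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    schedule = []
    for i, pid in enumerate(ids):
        condition = "feigning" if i % 2 == 0 else "honest"
        order = ("standard", "parallel") if (i // 2) % 2 == 0 else ("parallel", "standard")
        schedule.append({"participant": pid, "condition": condition, "iop_order": order})
    return schedule

if __name__ == "__main__":
    for row in build_schedule(range(1, 7)):
        print(row)
```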
Data Analysis
Preliminary Analyses
First, to rule out the possibility that the two groups simply
had a different visual approach when attending to the items and response options of a test, we compared the mean dwell time (measured in ms), the number of fixations, and the number of runs from one response option to another made by feigners vs controls while taking the parallel form of the IOP-29.
Because both groups were instructed to respond honestly
to the parallel IOP-29, no between-group differences were
expected. Next, to ensure that feigners made an effort to
follow instructions and to respond to the items of the stand-
ard IOP-29 as if they were suffering from schizophrenia,
we inspected the scores of the IOP-29 FDS produced by
the two groups. Given that the IOP-29 has demonstrated
strong validity in discriminating bona fide from experimen-
tally feigned schizophrenia (Giromini & Viglione, 2022),
we anticipated significant between-group differences, with
a large or very large effect size (consistent with Rogers et al., 2003, because feigning studies typically produce substantial effect sizes, we characterized Cohen's d effect sizes ≥ 0.75 as "moderate," ≥ 1.25 as "large," and ≥ 1.75 as "very large").
Main Analyses
To evaluate whether feigners scanned the text and response
options of the IOP-29 items differently from honest controls
(hypothesis 1), we calculated five key indicators (a computational sketch of these indicators follows the list):
1. The average dwell time (ms) spent on reading the text of
each item of the IOP-29 (Mean Dwell (Items’ Text)).
2. The average dwell time (ms) spent on visually scanning
the three response options (i.e., “True,” “False,” and
“Doesn’t Make Sense”) of the 27 multiple-choice items
of the IOP-29 (Mean Dwell (Response Options)).
3. The average number of fixations made while reading the
text of each item of the IOP-29 (Mean Fixations Count
(Items’ Text)).
4. The average number of fixations made by the participant
while visually scanning the three response options of the
27 multiple-choice items of the IOP-29 (Mean Fixations
Count (Response Options)).
5. The average number of times the eyes of the exami-
nee moved from the outside to the inside of any given
response option area, across all of the 27 multiple-
choice items of the IOP-29 (Mean Run Count (Response
Options)). This may be conceived of as an index of
uncertainty by the participant, given it is based on the
number of times the examinee moves their eyes from
one response option to another within the same IOP-29
item.
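A minimal sketch of how these dwell-time, fixation-count, and run-count indicators can be computed from AOI-labeled fixation data is given below. The data structure (a per-item list of fixations, each carrying an AOI label and a duration in milliseconds) and the AOI labels are assumptions for illustration; actual EyeLink exports provide equivalent information.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Fixation:
    aoi: str            # assumed labels: "item_text", "true", "false", "dms", or "outside"
    duration_ms: float

RESPONSE_AOIS = {"true", "false", "dms"}

def dwell_time(fixations, aois):
    """Total time (ms) spent fixating inside the given AOIs for one item."""
    return sum(f.duration_ms for f in fixations if f.aoi in aois)

def fixation_count(fixations, aois):
    """Number of fixations landing inside the given AOIs for one item."""
    return sum(1 for f in fixations if f.aoi in aois)

def run_count(fixations, aois):
    """Number of times gaze enters one of the given AOIs from outside that AOI
    (the 'uncertainty' index described in point 5)."""
    runs, previous_aoi = 0, None
    for f in fixations:
        if f.aoi in aois and f.aoi != previous_aoi:
            runs += 1
        previous_aoi = f.aoi
    return runs

def participant_means(items, aois=RESPONSE_AOIS):
    """Average each indicator across the 27 multiple-choice items of one participant."""
    return {
        "mean_dwell": mean(dwell_time(item, aois) for item in items),
        "mean_fixations": mean(fixation_count(item, aois) for item in items),
        "mean_runs": mean(run_count(item, aois) for item in items),
    }
```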
Next, we inspected whether our experimental feigners
focused their visual attention more on the feigning-keyed
response options than did the controls (hypothesis 2). For
example, for an item stating "I have never smiled in my life" (an item not included in the actual IOP-29, used here only to demonstrate the principle), the feigning-keyed option would be "True," because feigners are expected to endorse it more frequently than honest responders do, since it is really unlikely for a person to authentically claim that they have never smiled in their life. Thus, our hypothesis 2 states that for an item like this, feigners would focus their visual attention more on the response option "True" than would controls. Additionally, we also
tested whether experimental feigners spent more time, com-
pared to honest controls, scanning those feigning-keyed
response options, even when they eventually decided not
to endorse them. This was done because feigning-keyed
response options are obviously more likely to be chosen by
feigners than by controls, so we were concerned that feign-
ers might focus more on these response options than con-
trols simply because they endorsed them more, rather than
because they thought about them more or for longer time.
To test these hypotheses, we performed a series of t-tests
assessing possible between-group differences on the average
dwell time (ms), fixations count, and number of runs made
from one response option to another on IOP-29 feigning-keyed
response options.
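The between-group comparisons described above can be reproduced with standard routines. The sketch below uses SciPy to run an independent-samples t-test (with the Welch correction when equal variances cannot be assumed) and computes a pooled-SD Cohen's d; the example arrays are fabricated placeholders, not the study data.

```python
import numpy as np
from scipy import stats

def cohens_d(x, y):
    """Cohen's d based on the pooled standard deviation of the two groups."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

def compare_groups(feigners, controls, equal_var=True):
    """t-test (Student or Welch) plus effect size for one eye movement indicator."""
    t, p = stats.ttest_ind(feigners, controls, equal_var=equal_var)
    return {"t": float(t), "p": float(p), "d": float(cohens_d(feigners, controls))}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feigners = rng.normal(180, 100, 43)  # fabricated dwell times (ms) for illustration
    controls = rng.normal(90, 60, 40)
    print(compare_groups(feigners, controls, equal_var=False))  # Welch correction
```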
Fig. 2 Prototypical image of how the IOP-29 items were presented on the screen. Note: To protect test security, we did not report an actual item from the IOP-29; this is a representation of how the items were portrayed on the screen. The set-up corresponds to the online administration format of the IOP-29. In order to test our hypotheses, and prior to data collection, we created four Areas of Interest (AOIs) corresponding to the four "boxes" in the image, i.e., AOI #1 = "Question or Statement" box; AOI #2 = "True or Mostly True" box; AOI #3 = "False or Mostly False" box; AOI #4 = "Does not make sense" box. The three response option boxes measured 5 cm × 2 cm; the question/statement box measured 22.5 cm × 3.5 cm.
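Because the analyses hinge on the four AOIs described in the caption above, a minimal sketch of how gaze coordinates can be mapped to those AOIs is shown below. The caption gives physical box sizes in centimeters rather than pixel positions, so the pixel rectangles used here are placeholders for illustration.

```python
# Placeholder pixel rectangles (left, top, right, bottom) for the four AOIs.
# Only the labels mirror the caption above; the coordinates are assumptions.
AOIS = {
    "question_or_statement": (100, 100, 1000, 240),
    "true_or_mostly_true": (150, 400, 350, 480),
    "false_or_mostly_false": (450, 400, 650, 480),
    "does_not_make_sense": (750, 400, 950, 480),
}

def aoi_of(x, y):
    """Return the label of the AOI containing gaze point (x, y), or None."""
    for label, (left, top, right, bottom) in AOIS.items():
        if left <= x <= right and top <= y <= bottom:
            return label
    return None

if __name__ == "__main__":
    print(aoi_of(500, 440))  # "false_or_mostly_false"
    print(aoi_of(20, 20))    # None
```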
Results
Results of Preliminary Analyses
Consistent with the hypothesis that the experimental group
did make an effort to fake schizophrenia, and in line with
previous research on this matter (Giromini & Viglione,
2022), the IOP-29 FDS scores of our experimental feigners
(M = 0.82; SD = 0.22) were significantly higher than those
of our honest controls (M = 0.13; SD = 0.12), t(67.3) = 17.83,
p < 0.001 (because homoscedasticity could not be assumed, the Welch-Satterthwaite method was used to adjust degrees of freedom). Cohen's d was 3.84, which is in line with previous research comparing honest responders against experimental feigners of schizophrenia (Giromini & Viglione, 2022). The area under the Receiver Operating Characteristic (ROC) curve was 0.98 (SE = 0.01; see Figs. 3 and 4 and Table 1).

Fig. 3 Receiver Operating Characteristic (ROC) curve of the IOP-29 FDS. Note: The ROC curve illustrates the diagnostic accuracy of the IOP-29 by showing the true-positive rate (sensitivity) and the true-negative rate (specificity); the area under the curve is thus a graphical measure of the accuracy of the IOP-29.

Fig. 4 Distribution of IOP-29 FDS scores by group. Note: The figure shows the distribution of the IOP-29 FDS scores obtained in the two conditions, i.e., controls and experimental feigners. The Y-axis represents the range of FDS scores, whereas the X-axis represents the frequency of participants that obtained a specific score. The dotted line indicates the IOP-29 FDS value of 0.50.

Table 1 Sensitivity and specificity of the IOP-29 based on three commonly inspected cutoffs

                     Sensitivity   Specificity
IOP-29 FDS ≥ 0.65    88.4%         100.0%
IOP-29 FDS ≥ 0.50    88.4%         97.5%
IOP-29 FDS ≥ 0.30    93.0%         90.0%

Note: Based on the professional manual of the IOP-29 (Viglione & Giromini, 2020), a cutoff score of ≥ 0.65 is recommended to obtain a specificity of about 90%, a cutoff score of ≥ 0.50 is recommended to obtain both specificity and sensitivity of about 80%, and a cutoff score of ≥ 0.30 is recommended to obtain a sensitivity of about 90%.

Furthermore, as hypothesized, the average dwell time (ms), the fixations count, and the number of runs from one response option to another during visual inspection of the parallel IOP-29 did not differ by group (all p's > 0.05). These findings indicate that when instructed to respond honestly, the two groups did not significantly differ in their approach to visually scanning the items and response options of the parallel IOP-29.
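For readers who wish to reproduce these classification statistics, the sketch below computes the AUC through its equivalence with the pairwise-comparison (Mann–Whitney) formulation and derives sensitivity and specificity at the three FDS cutoffs reported in Table 1; the FDS arrays are fabricated placeholders, not the study data.

```python
import numpy as np

def auc(feigner_scores, control_scores):
    """AUC as the probability that a randomly drawn feigner scores above a control."""
    f = np.asarray(feigner_scores, float)[:, None]
    c = np.asarray(control_scores, float)[None, :]
    return float(np.mean(f > c) + 0.5 * np.mean(f == c))

def sens_spec(feigner_scores, control_scores, cutoff):
    """Sensitivity and specificity when FDS >= cutoff is classified as feigning."""
    sensitivity = float(np.mean(np.asarray(feigner_scores) >= cutoff))
    specificity = float(np.mean(np.asarray(control_scores) < cutoff))
    return sensitivity, specificity

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fds_feigners = rng.beta(8, 2, 43)  # fabricated FDS-like scores in [0, 1]
    fds_controls = rng.beta(2, 8, 40)
    print(f"AUC = {auc(fds_feigners, fds_controls):.2f}")
    for cut in (0.65, 0.50, 0.30):
        sens, spec = sens_spec(fds_feigners, fds_controls, cut)
        print(f"FDS >= {cut:.2f}: sensitivity {sens:.1%}, specificity {spec:.1%}")
```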
Results of Main Analyses
As shown in Table 2, on average, feigners spent more time than controls looking at the text of the IOP-29 items (Cohen's d = 0.48), but no statistically significant differences emerged when considering the total time spent on the response options, nor when considering possible runs from one response option to another. However, as hypothesized, experimental feigners focused their visual attention on feigning-keyed response options more than controls did, regardless of whether they eventually decided to endorse those response options (Table 3). Therefore, feigners spent more time observing feigning-keyed response options and returned to those response options more frequently than controls did. Crucially, this finding holds true even when the average dwell time (ms), average fixations count, and average number of runs from one response option to another are computed only for the feigning-keyed response options that the respondent did not endorse. Said differently, our experimental feigners paid more attention to the feigning-keyed response options even if they eventually decided not to endorse them. The size of these differences ranges from d = 0.86 to 1.11.

Table 2 Visual inspection of IOP-29 items and response options by controls and feigners

                                           Controls (n = 40)      Feigners (n = 43)
                                           M         SD           M         SD         t(81)   p      d
Mean dwell (items' text)                   2487.4    801.1        2887.4    868.4      2.18    0.03   0.48
Mean dwell (response options)              446.5     229.2        488.7     271.1      0.76    0.45   0.17
Mean fixations count (items' text)         13.00     3.75         14.54     3.77       1.87    0.07   0.41
Mean fixations count (response options)    2.34      1.11         2.51      1.34       0.61    0.54   0.13
Mean run count (response options)          1.85      1.49         2.88      1.66       0.60    0.60   0.12

Note: The unit of measurement of the mean dwell is milliseconds (ms).

Table 3 Visual inspection of feigning-keyed response options by controls and feigners

                                                      Controls (n = 40)    Feigners (n = 43)
                                                      M       SD           M       SD          t      df     p        d
Mean dwell (feigning-keyed response options)
  Regardless of endorsement                           93.3    59.3         183.0   107.3       4.76   66.4   < 0.01   1.03
  Feigning-keyed responses not endorsed               76.0    51.3         150.8   109.9       4.02   60.4   < 0.01   0.86
Mean fixations count (feigning-keyed response options)
  Regardless of endorsement                           0.50    0.28         0.98    0.54        5.17   64.3   < 0.01   1.11
  Feigning-keyed responses not endorsed               0.39    0.25         0.82    0.61        4.24   56.5   < 0.01   0.91
Mean run count (feigning-keyed response options)
  Regardless of endorsement                           0.34    0.18         0.61    0.32        4.49   66.9   < 0.01   1.01
  Feigning-keyed responses not endorsed               0.29    0.17         0.53    0.34        4.04   63.9   < 0.01   0.87

Note: The unit of measurement of the mean dwell is milliseconds (ms). For all comparisons, because homoscedasticity could not be assumed, the Welch-Satterthwaite method was used to adjust degrees of freedom.
Discussion
Assessing symptom validity is a crucial step in order to draw useful conclusions about an examinee's health status, make accurate diagnoses, and plan appropriate medical treatments. This is especially true in high-stakes forensic contexts, in which there is a significant risk of encountering false or exaggerated symptom presentations. Thus, to detect
possible over-reporting, clinical and forensic psychologists
are expected to utilize several SVTs and PVTs in their daily
practice.
Our study sought to examine the utility of pairing a bio-behavioral measure with an SVT in order to improve the detection of feigned psychiatric conditions. Specifically, we conducted a simulation study to investigate eye movements during the administration of the IOP-29 in a sample of 83 healthy individuals, half of whom were asked to simulate schizophrenia (the other half served as a control group).
To date, a few studies have addressed the use of eye
movements in relation to feigned cognitive deficits, but none
have examined eye behavior in relation to those psychiatric
disorders whose eye-tracking abnormalities are well-estab-
lished (e.g., schizophrenia). Only one study (Ales etal.,
2021) attempted to address this topic but its focus was on
PVTs and not SVTs. Therefore, the current study aimed to
investigate the oculomotor behavior of healthy participants
who were asked to feign schizophrenia while completing an
SVT (i.e., the IOP-29), comparing their eye movements to
control participants who were asked to complete the same
test honestly.
The results of this study indicate that overall, compared to
controls, feigners spent more time looking at the text of the
IOP-29 items and that they focused longer on and returned
more frequently to feigning-keyed response options. Taken
together, these results suggest that tracking an examinee’s
eye movements while taking an SVT can provide informa-
tion about the credibility of their responses.
Experimental feigners spent slightly more time than
controls looking at the text of the IOP-29 items, suggesting that they may have been deliberating about which option they
should endorse. There is consensus that fixation duration
in a task is associated with the duration of the cognitive
processes and the degree of engagement in that same task
(e.g., Irwin, 2004). This is consistent with the theory that
deception increases cognitive load and the effort required by
the participant (Blandón-Gitlin etal., 2014; Sporer, 2016;
Vrij etal., 2011), partly through inhibition of the truthful
response (Lane & Wegner, 1995). In fact, the two groups
(i.e., controls and feigners) did not differ when they were
both asked to respond honestly, suggesting that it is deception that drives the delay.
Additionally, compared to honest responders, experimen-
tal feigners spent more time and made a higher number of
fixations and a higher number of runs from one response
option to another on the feigning-keyed response options, even when they did not eventually endorse that option.
The extra time experimental feigners took may be due to the
effort required by high-level decision-making and problem-
solving strategies. When requested to respond to the items,
patients with schizophrenia or control participants may ask
themselves if that particular item represents them or their
experience of their own symptoms, whereas feigners have
to (a) consider whether a particular item could reflect the
experience of a genuine patient affected by schizophrenia;
(b) reason out how to respond in order to appear schizo-
phrenic but, simultaneously, not be detected; and (c) sup-
press thoughts about their own experience.
It is worth mentioning that this study also has some
important limitations. First and most importantly, there
was no direct comparison with patients with schizophrenia.
Because our study used a simulation design, small incentives were
offered to experimental feigners and their performance was
compared with that of honest controls. As such, one might
question the generalizability of our results to clinical settings
since no comparison was made with patients with a genu-
ine schizophrenic symptomatology. Future studies should
compare results of experimental feigners to those of genu-
ine patients in order to test whether individuals attempting
to malinger are able to feign schizophrenic-like symptoms
without being detected, offering a higher generalizability of
findings. Second, our sample consisted mostly of women.
With regard to eye movements, research on sex differences in visual scanning has generated mixed results. Recently, Mathew
etal. (2020) investigated sex differences in oculomotor tasks
and their results showed no significant differences. However,
some studies have observed slight differences using specific
stimuli or tasks. As for the IOP-29, no significant sex differ-
ences were reported suggesting no gender influences on IOP-
29 results (Carvalho etal., 2021; Giromini etal., 2020a).
Nevertheless, one might question the generalizability of our
results and future studies should take sex of the participants
into account. Third, our study was designed as a malinger-
ing experimental paradigm and, although we made an effort
to engage participants assigned to the feigning group (e.g., financial incentives, a relevant vignette scenario), they were nonetheless instructed to feign schizophrenia, so the external
validity of our study might be questioned, given the differ-
ences with real-life forensic contexts. In addition, we did
not employ a post-manipulation check. Rogers (2008) rec-
ommended using post-test questions in simulation studies
to verify that the participant understood their task and was
compliant with the instructions. Therefore, this certainly
represents a limitation of our study. However, the IOP-29
performed almost the same in this study as in other similar
studies in which a manipulation check was implemented (for
a review, see Giromini & Viglione, 2022), suggesting that
our results should not have been compromised. Somewhat
related, our internal validity should not have been affected
by the experimental design we implemented, given that
our participants were not suspected malingerers but rather
were openly instructed to feign symptoms of schizophre-
nia. Moreover, as mentioned above, the use of role simu-
lation and random assignments to the feigning condition
should have preserved internal validity of our study. Fourth,
although the items of the IOP-29 include feigning-keyed
response options, the ultimate feigning score of the IOP-29
is generated by considering a multitude of factors, including
the consistency between one response and another (Viglione
et al., 2017). In addition, test-takers are unlikely to be aware of which of the IOP-29 response options are more likely to suggest bona fide schizophrenia vs deliberate feigning. Accordingly, future studies in which fewer and more straightforward
response options are provided for each item (e.g., the SIMS)
would be beneficial. Finally, a technical limitation worth
mentioning is that our experiment was designed so that the
participant chose the response option (i.e., True, False, and
Doesn’t make sense) by clicking one out of three keys on the
numeric keypad (i.e., 1, 2, and 3). This was done to prevent
the participant from looking away from the screen—which
would have compromised data acquisition—as could have
happened using the mouse. However, this may have resulted
in an automated process in which the participant was not
prompted to look at the area of interest corresponding to
the response option. On one hand, this would explain the
absence of significant differences between experimental
simulators and honest participants in the number and dura-
tion of fixations on the IOP-29 response options overall; on
the other hand, it makes it even more remarkable that feign-
ers paid more attention than controls to the feigning-keyed
response options. Indeed, using the keyboard instead of the
mouse may have led to an underestimation of participants' eye behavior in terms of the duration and number of fixations. Thus, it is perhaps remarkable that, despite this possible underestimation, we were able to objectively discern that our experimental feigners paid more attention to the feigning-keyed response options overall than the control group did.
Despite these limitations, our study sought to provide pre-
liminary evidence that eye movements may improve symp-
tom validity assessment. Indeed, the use of eye-tracking
technology in conjunction with the administration of the
IOP-29 has the potential to improve our understanding of the
cognitive load of experimental feigners during item inspec-
tion, as well as the simulation strategies used by individuals
instructed to pretend to be mentally ill. Our results contribute
to a deeper understanding of the decision-making and cogni-
tive processes underlying deception mechanisms and simula-
tion attempts. Although eye-tracking technology—and other
neuropsychological measures as well—is advancing both in terms of cost effectiveness and usability, we believe that, to date, these tools are not yet ready to be paired with symptom validity assessment in real-world settings. Nevertheless, they
certainly represent a resource to refine available SVTs and
PVTs. As such, our study may represent a proof-of-concept
that the use of bio-behavioral measures such as eye move-
ments is extremely useful in validity assessment contexts,
given the increasing demand for valid and reliable instru-
ments that would enhance the quality of clinical and foren-
sic assessments, facilitate the practice, and encourage gold
standards in delivering psychological services (APA, 2013).
Perhaps in the future these technologies will be more acces-
sible and can be paired with self-reports for malingering
detection. Overall, our findings indicate that eye tracking
technology may be a promising adjunct for assessing symp-
tom validity.
Funding Open access funding provided by Università degli Studi di
Torino within the CRUI-CARE Agreement.
Declarations
Ethical Approval All procedures performed in studies involving human
participants were in accordance with the ethical standards of the insti-
tutional and/or national research committee and with the 1964 Helsinki
Declaration and its later amendments or comparable ethical standards.
Consent to Participate Informed consent was obtained from all indi-
vidual participants included in the study.
Competing Interests The second author, Luciano Giromini, declares
that he owns a share in the corporate (LLC) that possesses the rights to
Inventory of Problems. All other eight authors declare that they have
no conflict of interest.
Open Access This article is licensed under a Creative Commons Attri-
bution 4.0 International License, which permits use, sharing, adapta-
tion, distribution and reproduction in any medium or format, as long
as you give appropriate credit to the original author(s) and the source,
provide a link to the Creative Commons licence, and indicate if changes
were made. The images or other third party material in this article are
included in the article's Creative Commons licence, unless indicated
otherwise in a credit line to the material. If material is not included in
the article's Creative Commons licence and your intended use is not
permitted by statutory regulation or exceeds the permitted use, you will
need to obtain permission directly from the copyright holder. To view a
copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
References
Ales, F., Giromini, L., Warmelink, L., Polden, M., Wilcockson, T., Kelly,
C., Winters, C., Zennaro, A., & Crawford, T. (2021). An eye track-
ing study on feigned schizophrenia. Psychological Injury and Law,
14(3), 213–226. https:// doi. org/ 10. 1007/ s12207- 021- 09421-1
American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Arlington, VA: American Psychiatric Association.
American Psychological Association. (2013). Specialty guidelines for
forensic psychology. American Psychologist, 68(1), 7–19. https://
doi. org/ 10. 1037/ a0029 889
Banovic, I., Filippi, F., Viglione, D. J., Scrima, F., Zennaro, A., Zappalà,
A., & Giromini, L. (2021). Detecting coached feigning of schizo-
phrenia with the inventory of problems – 29 (IOP-29) and its mem-
ory module (IOP-M): A simulation study on a French community
sample. International Journal of Forensic Mental Health. https://
doi. org/ 10. 1080/ 14999 013. 2021. 19067 98
Barnes, G. (2008). Cognitive processes involved in smooth pursuit eye
movements. Brain and Cognition, 68(3), 309–326. https:// doi.
org/ 10. 1016/j. bandc. 2008. 08. 020
Bashem, J. R. (2016). Performance validity assessment of bona fide
and malingered traumatic brain injury using novel eye-tracking
systems. (Unpublished doctoral dissertation). Wayne State Uni-
versity, Detroit, Michigan.
Ben-Porath, Y. S., & Tellegen, A. (2008). Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF): Manual for administration, scoring, and interpretation. University of Minnesota Press.
Ben-Porath, Y. S., & Tellegen, A. (2020a). MMPI-3 manual for adminis-
tration, scoring, and interpretation. University of Minnesota Press.
Ben-Porath, Y. S., & Tellegen, A. (2020b). MMPI-3 technical manual.
University of Minnesota Press.
Benson, P. J., Beedie, S. A., Shephard, E., Giegling, I., Rujescu, D., &
Clair, D. S. (2012). Simple viewing tests can detect eye move-
ment abnormalities that distinguish schizophrenia cases from
controls with exceptional accuracy. Biological Psychiatry, 72(9),
716–724. https:// doi. org/ 10. 1016/j. biops ych. 2012. 04. 019
Blandón-Gitlin, I., Fenn, E., Masip, J., & Yoo, A. H. (2014). Cogni-
tive-load approaches to detect deception: Searching for cogni-
tive mechanisms. Trends in Cognitive Sciences, 18(9), 441–444.
https:// doi. org/ 10. 1016/j. tics. 2014. 05. 004
Boskovic, I., Akca, A. Y. E., & Giromini, L. (2022). Symptom coaching
and symptom validity tests: An analog study using the structured
inventory of malingered symptomatology, self-report symptom
inventory, and inventory of problems-29. Applied Neuropsychol-
ogy: Adult. https:// doi. org/ 10. 1080/ 23279 095. 2022. 20578 56
Browndyke, J. N. (2013). Functional neuroanatomical bases of decep-
tive behavior and malingering. In D. A. Carone & S. S. Bush
(Eds.), Mild traumatic brain injury: Symptom validity assessment
and malingering (pp. 303–321). Springer Publishing.
Bush, S. S., Heilbronner, R. L., & Ruff, R. M. (2014). Psychologi-
cal assessment of symptom and performance validity, response
bias, and malingering: Official position of the Association for
Scientific Advancement in Psychological Injury and Law. Psy-
chological Injury and Law, 7, 197–205. https:// doi. org/ 10. 1007/
s12207- 014- 9198-7
Bush, S. S., Ruff, R. M., Tröster, A. I., Barth, J. T., Koffler, S. P., Pliskin,
N. H., Reynolds, C. R., & Silver, C. H. (2005). Symptom validity
assessment: Practice issues and medical necessity NAN Policy
and Planning Committee. Archives of Clinical Neuropsychology,
20(4), 419–426. https:// doi. org/ 10. 1016/j. acn. 2005. 02. 002
Calkins, M. E., Dobie, D. J., Cadenhead, K. S., Olincy, A., Freedman,
R., Green, M. F., Greenwood, T. A., Gur, R. E., Gur, R. C., Light,
G. A., Mintz, J., Nuechterlein, K. H., Radant, A. D., Schork, N.
J., Seidman, L. J., Siever, L. J., Silverman, J. M., Stone, W. S.,
Swerdlow, N. R., … Braff, D. L. (2007). The consortium on the
genetics of endophenotypes in schizophrenia: Model recruitment,
assessment, and endophenotyping methods for a multisite col-
laboration. Schizophrenia Bulletin, 33(1), 33–48. https:// doi. org/
10. 1093/ schbul/ sbl044
Carvalho, L. D. F., Reis, A., Colombarolli, M. S., Pasian, S. R.,
Miguel, F. K., Erdodi, L. A., & Giromini, L. (2021). Discrimi-
nating feigned from credible PTSD symptoms: A validation of
a Brazilian version of the Inventory of Problems-29 (IOP-29).
Psychological Injury and Law, 14(1), 58–70. https:// doi. org/ 10.
1007/ s12207- 021- 09403-3
Crawford, T. J., Higham, S., Renvoize, T., Patel, J., Dale, M., Suriya,
A., & Tetley, S. (2005). Inhibitory control of saccadic eye move-
ments and cognitive impairment in Alzheimer’s disease. Biological
Psychiatry, 57(9), 1052–1060. https:// doi. org/ 10. 1016/j. biops ych.
2005. 01. 017
Dandachi-FitzGerald, B., Merckelbach, H., & Ponds, R. W. (2017).
Neuropsychologists’ ability to predict distorted symptom pres-
entation. Journal of Clinical and Experimental Neuropsychology,
39(3), 257–264. https:// doi. org/ 10. 1080/ 13803 395. 2016. 12232 78
Di Girolamo, M., Giromini, L., Bosi, J., Warmelink, L., La Scala,
I., Loiacono, C., Miraglia, F., & Zennaro, A. (2021). The role
played by theory of mind and empathy in the feigning of psy-
chopathology. International Journal of Forensic Mental Health.
https:// doi. org/ 10. 1080/ 14999 013. 2021. 20074 32
Diefendorf, A. R., & Dodge, R. (1908). An experimental study of the
ocular reactions of the insane from photographic records. Brain,
31(3), 451–489.
Duchowski, A. T. (2007). Eye tracking methodology: Theory and prac-
tice. Springer.
Ellwanger, J., Rosenfeld, J. P., Sweet, J. J., & Bhatt, M. (1996). Detecting simulated amnesia for autobiographical and recently learned information using the P300 event-related potential. International Journal of Psychophysiology, 23, 9–23. https://doi.org/10.1016/0167-8760(96)00035-9
Ellwanger, J., Tenhula, W. N., Rosenfeld, J. P., & Sweet, J. J. (1999).
Identifying simulators of cognitive deficit through combined use
of neuropsychological test performance and event-related poten-
tials. Journal of Clinical and Experimental Neuropsychology, 21,
866–879. https:// doi. org/ 10. 1076/ jcen. 21.6. 866. 850
Giromini, L., & Viglione, D. J. (2022). Assessing negative response
bias with the Inventory of Problems-29 (IOP-29): A quantitative
literature review. Psychological Injury and Law. https:// doi. org/
10. 1007/ s12207- 021- 09437-7
Giromini, L., Barbosa, F., Coga, G., Azeredo, A., Viglione, D. J., &
Zennaro, A. (2020a). Using the inventory of problems-29 (IOP-
29) with the Test of Memory Malingering (TOMM) in symp-
tom validity assessment: A study with a Portuguese sample of
experimental feigners. Applied Neuropsychology: Adult, 27(6),
504–516. https:// doi. org/ 10. 1080/ 23279 095. 2019. 15709 29
Giromini, L., Carfora Lettieri, S., Zizolfi, S., Zizolfi, D., Viglione, D. J.,
Brusadelli, E., Perfetti, B., di Carlo, D. A., & Zennaro, A. (2019).
Beyond rare-symptoms endorsement: A clinical comparison
simulation study using the Minnesota Multiphasic Personality
Inventory-2 (MMPI-2) with the Inventory of Problems-29 (IOP-
29). Psychological Injury and Law, 12(3), 212–224. https:// doi.
org/ 10. 1007/ s12207- 019- 09357-7
Giromini, L., Viglione, D. J., Pignolo, C., & Zennaro, A. (2018). A clin-
ical comparison, simulation study testing the validity of SIMS
and IOP-29 with an Italian sample. Psychological Injury and
Law, 11, 340–350. https:// doi. org/ 10. 1007/ s12207- 018- 9314-1
Giromini, L., Viglione, D. J., Pignolo, C., & Zennaro, A. (2020b). An
Inventory of Problems-29 sensitivity study investigating feigning
of four different symptom presentations via malingering experi-
mental paradigm. Journal of Personality Assessment, 102(4),
563–572. https:// doi. org/ 10. 1080/ 00223 891. 2019. 15669 14
Giromini, L., Young, G., & Sellbom, M. (2022). Assessing negative response bias using self-report measures: New articles, new issues. Psychological Injury and Law, 15(1), 1–21. https://doi.org/10.1007/s12207-022-09444-2
Gooding, D. C., & Basso, M. A. (2008). The tell-tale tasks: A review
of saccadic research in psychiatric patient populations. Brain
and Cognition, 68(3), 371–390. https:// doi. org/ 10. 1016/j. bandc.
2008. 08. 024
Green, P., Allen, L., & Astner, K. (1996). Manual for computerised
word memory test. Durham, NC: CogniSyst.
Greenwood, T. A., Braff, D. L., Light, G. A., Cadenhead, K. S., Calkins,
M. E., Dobie, D. J., Freedman, R., Green, M. F., Gur, R. E., Gur,
R. C., Mintz, J., Nuechterlein, K. H., Olincy, A., Radant, A. D.,
Seidman, L. J., Siever, L. J., Silverman, J. M., Stone, W. S., Swerd-
low, N. R., … Schork, N. J. (2007). Initial heritability analyses
of endophenotypic measures for schizophrenia: The consortium
on the genetics of schizophrenia. Archives of General Psychiatry,
64(11), 1242–1250. https:// doi. org/ 10. 1001/ archp syc. 64. 11. 1242
Greenwood, T. A., Lazzeroni, L. C., Murray, S. S., Cadenhead, K. S.,
Calkins, M. E., Dobie, D. J., Green, M. F., Gur, R. E., Gur, R. C.,
Hardiman, G., Kelsoe, J. R., Leonard, S., Light, G. A., Nuechterlein,
K. H., Olincy, A., Radant, A. D., Schork, N. J., Seidman, L. J., Siever,
L. J., Silverman, J. M., Stone, W. S., Swerdlow, N. R., Tsuang, D.
W., Tsuang, M. T., Turetsky, B. I., Freedman, R., & Braff, D. L.
(2011). Analysis of 94 candidate genes and 12 endophenotypes for
schizophrenia from the Consortium on the Genetics of Schizophre-
nia. American Journal of Psychiatry, 168(9), 930–946. https:// doi.
org/ 10. 1176/ appi. ajp. 2011. 10050 723
Hannula, D. E., Baym, C. L., Warren, D. E., & Cohen, N. J. (2012).
The eyes know: Eye movements as a veridical index of memory.
Psychological Science, 23(3), 278–287. https:// doi. org/ 10. 1177/
09567 97611 429799
Hartman, D. E. (2008). The Computerized Test of Information Process-
ing (CTIP) by Tom Tombaugh. Applied Neuropsychology, 15,
226–227. https:// doi. org/ 10. 1080/ 09084 28080 23245 72
Heitger, M. H., Jones, R. D., Macleod, A. D., Snell, D. L., Frampton, C.
M., & Anderson, T. J. (2009). Impaired eye movements in post-
concussion syndrome indicate suboptimal brain function beyond
the influence of depression, malingering or intellectual ability.
Brain, 132(10), 2850–2870. https:// doi. org/ 10. 1093/ brain/ awp181
Hutton, S. (2008). Cognitive control of saccadic eye movements. Brain
and Cognition, 68(3), 327–340. https:// doi. org/ 10. 1016/j. bandc.
2008. 08. 021
Ilgunaite, G., Giromini, L., Bosi, J., Viglione, D. J., & Zennaro, A.
(2022). A clinical comparison simulation study using the Inven-
tory of Problems-29 (IOP-29) with the Center for Epidemiologic
Studies Depression Scale (CES-D) in Lithuania. Applied Neu-
ropsychology: Adult, 29(2), 155–162.
Irwin, D. E. (2004). Fixation location and fixation duration as indices
of cognitive processing. In J. M. Henderson & F. Ferreira (Eds.),
The interface of language, vision, and action: Eye movements
and the visual world (pp. 105–133). Psychology Press.
Johnson, R., Barnhardt, J., & Zhu, J. (2003). The deceptive response:
Effects of response conflict and strategic monitoring on the late
positive component and episodic memory-related brain activ-
ity. Biological Psychology, 64(3), 217–253. https:// doi. org/ 10.
1016/j. biops ycho. 2003. 07. 006
Kanser, R. J., Bashem, J. R., Patrick, S. D., Hanks, R. A., & Rapport, L. J. (2020). Detecting feigned traumatic brain injury with eye tracking during a test of performance validity. Neuropsychology, 34(3).
Kikuchi, M., Miura, K., Morita, K., Yamamori, H., Fujimoto, M.,
Ikeda, M., Yasuda, Y., Nakaya, A., & Hashimoto, R. (2018).
Genome-wide association analysis of eye movement dysfunc-
tion in schizophrenia. Scientific Reports, 8(1), 1–9. https:// doi.
org/ 10. 1038/ s41598- 018- 30646-9
Kojima, T., Matsushima, E., Nakajima, K., Shiraishi, H., Ando, K.,
Ando, H., & Shimazono, Y. (1990). Eye movements in acute,
chronic, and remitted schizophrenics. Biological Psychiatry,
27(9), 975–989. https:// doi. org/ 10. 1016/ 0006- 3223(90) 90035-Z
Kojima, T., Matsushima, E., Ohta, K., Toru, M., Han, Y. H., Shen, Y.
C., Moussaoui, D., David, I., Sato, K., Yamashita, I., Kathmann,
N., Hippius, H., Thavundavil, J. X., Lal, S., Vasavan Nair, N. P.,
Potkin, S. G., & Prilipko, L. (2001). Stability of exploratory eye
movements as a marker of schizophrenia – A WHO multi-center
study. Schizophrenia Research, 52(3), 203–213. https:// doi. org/
10. 1016/ s0920- 9964(00) 00181-x
Kozel, F. A., Johnson, K. A., Mu, Q., Grenesko, E. L., Laken, S. J., &
George, M. S. (2005). Detecting deception using functional mag-
netic resonance imaging. Biological Psychiatry, 58, 605–613.
https:// doi. org/ 10. 1016/j. biops ych. 2005. 07. 040
Lane, J. D., & Wegner, D. M. (1995). The cognitive consequences of
secrecy. Journal of Personality and Social Psychology, 69(2),
237–253. https:// doi. org/ 10. 1037/ 0022- 3514. 69.2. 237
Larrabee, G. J. (2012). Performance validity and symptom validity
in neuropsychological assessment. Journal of the International
Neuropsychological Society, 18(4), 625–630. https:// doi. org/ 10.
1017/ S1355 61771 20002 40
Lencer, R., Sprenger, A., Reilly, J. L., McDowell, J. E., Rubin, L. H., Badner, J. A., Keshavan, M. S., Pearlson, G. D., Tamminga, C. A., Gershon, E. S., Clementz, B. A., & Sweeney, J. A. (2015). Pursuit eye movements as an intermediate phenotype across psychotic disorders: Evidence from the B-SNIP study. Schizophrenia Research, 169(1–3), 326–333. https://doi.org/10.1016/j.schres.2015.09.032
Levy, D. L., Sereno, A. B., Gooding, D. C., & O’Driscoll, G. A. (2010).
Eye tracking dysfunction in schizophrenia: Characterization and
pathophysiology. Behavioral Neurobiology of Schizophrenia and
Its Treatment. https:// doi. org/ 10. 1007/ 7854_ 2010_ 60
Light, G. A., Swerdlow, N. R., Rissling, A. J., Radant, A., Sugar, C. A.,
Sprock, J., Pela, M., Geyer, M. A., & Braff, D. L. (2012). Char-
acterization of neurophysiologic and neurocognitive biomarkers
for use in genomic and clinical outcome studies of schizophrenia.
PLoS ONE, 7(7), e39434. https:// doi. org/ 10. 1371/ journ al. pone.
00394 34
Liu, R., Gao, B., Li, Y., & Sheng, L. (2001). Simulated malingering: A preliminary trial on Hiscock's Forced-Choice Digit Memory Test. Chinese Journal of Clinical Psychology.
Maruta, J., Suh, M., Niogi, S. N., Mukherjee, P., & Ghajar, J. (2010).
Visual tracking synchronization as a metric for concussion
screening. Journal of Head Trauma Rehabilitation, 25(4), 293–
305. https:// doi. org/ 10. 1097/ HTR. 0b013 e3181 e67936
Mathew, J., Masson, G. S., & Danion, F. R. (2020). Sex differences in
visuomotor tracking. Scientific Reports, 10(1), 1–12. https:// doi.
org/ 10. 1038/ s41598- 020- 68069-0
Miller, H. A. (2001). M-FAST: Miller forensic assessment of symptoms
test professional manual. Odessa, FL: Psychological Assessment
Resources.
Morey, L. C. (1991). Personality assessment inventory (PAI). Pro-
fessional manual. Odessa, FL: Psychological Assessment
Resources.
Morey, L. C. (2007). Personality assessment inventory (PAI). Profes-
sional manual (2nd ed.). Psychological Assessment Resources.
Morita, K., Miura, K., Kasai, K., & Hashimoto, R. (2019). Eye move-
ment characteristics in schizophrenia: A recent update with clini-
cal implications. Neuropsychopharmacology Reports, 40(1), 2–9.
https:// doi. org/ 10. 1002/ npr2. 12087
Müri, R. M., & Nyffeler, T. (2008). Neurophysiology and neuroanat-
omy of reflexive and volitional saccades as revealed by lesion
studies with neurological patients and transcranial magnetic
stimulation (TMS). Brain and Cognition, 68(3), 284–292. https://
doi. org/ 10. 1016/j. bandc. 2008. 08. 018
O’Driscoll, G. A., & Callahan, B. L. (2008). Smooth pursuit in schizo-
phrenia: A meta-analytic review of research since 1993. Brain
and Cognition, 68(3), 359–370. https:// doi. org/ 10. 1016/j. bandc.
2008. 08. 023
Olk, B., & Kingstone, A. (2003). Why are antisaccades slower than
prosaccades? A novel finding using a new paradigm. Neu-
roreport, 14(1), 151–155. https:// doi. org/ 10. 1097/ 00001 756-
20030 1200- 00028
Pierre, J. M., Shnayder, I., Wirshing, D. A., & Wirshing, W. C. (2004).
Intranasal quetiapine abuse. American Journal of Psychiatry,
161, 1718. https:// doi. org/ 10. 1176/ appi. ajp. 161.9. 1718
Pierrot-Deseilligny, C., Milea, D., & Müri, R. M. (2004). Eye move-
ment control by the cerebral cortex. Current Opinion in Neurology,
17(1), 17–25. https:// doi. org/ 10. 1097/ 00019 052- 20040 2000- 00005
Pignolo, C., Giromini, L., Ales, F., & Zennaro, A. (2021). Detection
of feigning of different symptom presentations with the PAI and
IOP-29. Assessment. https:// doi. org/ 10. 1177/ 10731 91121 10612 82
Radant, A. D., Millard, S. P., Braff, D. L., Calkins, M. E., Dobie,
D. J., Freedman, R., Green, M. F., Greenwood, T. A., Gur, R.
E., Gur, R. C., Lazzeroni, L. C., Light, G. A., Meichle, S. P.,
Nuechterlein, K. H., Olincy, A., Seidman, L. J., Siever, L. J.,
Silverman, J. M., Stone, W. S., … Tsuang, D. W. (2015). Robust
differences in antisaccade performance exist between COGS
schizophrenia cases and controls regardless of recruitment
strategies. Schizophrenia Research, 163(1–3), 47–52. https://
doi. org/ 10. 1016/j. schres. 2014. 12. 016
Resnick, P. J. (1999). The detection of malingered psychosis. Psychi-
atric Clinics of North America, 22(1), 159–172.
Rogers, R. (2008). Detection strategies for malingering and defensive-
ness. In R. Rogers (Ed.), Clinical assessment of malingering and
deception (pp. 14–35). Guilford Press.
Rogers, R., Bagby, R. M., & Dickens, S. E. (1992). Structured Inter-
view of Reported Symptoms (SIRS) and professional manual.
Odessa, FL: Psychological Assessment Resources.
Rogers, R., Sewell, K. W., & Gillard, N. D. (2010). Structured Inter-
view of Reported Symptoms, second edition: Professional test
manual (2nd ed.). Psychological Assessment Resources.
Rogers, R., Sewell, K. W., Martin, M. A., & Vitacco, M. J. (2003).
Detection of feigned mental disorders: A meta-analysis of the
MMPI-2 and malingering. Assessment, 10, 160–177. https:// doi.
org/ 10. 1177/ 10731 91103 01000 2007
Roma, P., Giromini, L., Burla, F., Ferracuti, S., Viglione, D. J., &
Mazza, C. (2019). Ecological validity of the Inventory of
Problems-29 (IOP-29): An Italian study of court-ordered, psy-
chological injury evaluations using the Structured Inventory
of Malingered Symptomatology (SIMS) as criterion variable.
Psychological Injury and Law, 13, 57–65. https:// doi. org/ 10.
1007/ s12207- 019- 09368-4
Rosenfeld, J. P., Ellwanger, J. W., Nolan, K., Wu, S., Bermann, R. G., & Sweet, J. (1999). P300 scalp amplitude distribution as an index of deception in a simulated cognitive deficit model. International Journal of Psychophysiology, 33, 3–19. https://doi.org/10.1016/S0167-8760(99)00021-5
Rosenfeld, J. P., Reinhart, A. M., Bhatt, M., Ellwanger, J., Gora, K., Sekera, M., et al. (1998). P300 correlates of simulated amnesia on a matching-to-sample task: Topographic analyses of deception vs. truth-telling responses. International Journal of Psychophysiology, 28, 233–248. https://doi.org/10.1016/S0167-8760(97)00084-6
Rosenfeld, J. P., Sweet, J. J., Chuang, J., Ellwanger, J., & Song, L.
(1996). Detection of simulated malingering using forced choice
recognition enhanced with event-related potential recording.
The Clinical Neuropsychologist, 10, 163–179. https:// doi. org/
10. 1080/ 13854 04960 84066 78
Shapiro, A. P., & Teasell, R. W. (1998). Misdiagnosis of chronic pain as hysteria and malingering. Current Review of Pain, 2, 19–28. https://doi.org/10.1007/s11916-998-0059-5
Sharpe, J. A. (2008). Neurophysiology and neuroanatomy of smooth
pursuit: Lesion studies. Brain and Cognition, 68(3), 241–254.
https:// doi. org/ 10. 1016/j. bandc. 2008. 08. 015
Sherman, E. M. S., Slick, D. J., & Iverson, G. L. (2020). Multidimensional malingering criteria for neuropsychological assessment: A 20-year update of the malingered neuropsychological dysfunction criteria. Archives of Clinical Neuropsychology, 35(6), 735–764. https://doi.org/10.1093/arclin/acaa019
Silverstein, S. M., & Keane, B. P. (2011). Vision science and schizophrenia research: Toward a re-view of the disorder. Editors' introduction to special section. Schizophrenia Bulletin, 37(4), 681–689. https://doi.org/10.1093/schbul/sbr053
Smith, G. P., & Burger, G. K. (1997). Detection of malingering: Validation of the Structured Inventory of Malingered Symptomatology (SIMS). Journal of the American Academy of Psychiatry and the Law, 25, 183–189. https://doi.org/10.1037/t04573-000
Šömen, M. M., Lesjak, S., Majaron, T., Lavopa, L., Giromini, L.,
Viglione, D. J., & Podlesek, A. (2021). Using the Inventory of
Problems-29 (IOP-29) with the Inventory of Problems Memory
(IOP-M) in malingering-related assessments: A study with a Slo-
venian sample of experimental feigners. Psychological Injury and
Law, 14, 104–113. https:// doi. org/ 10. 1007/ s12207- 021- 09412-2
Sporer, S. L. (2016). Deception and cognitive load: Expanding our
horizon with a working memory model. Frontiers in Psychology,
7, 420. https:// doi. org/ 10. 3389/ fpsyg. 2016. 00420
Suzuki, M., Takahashi, S., Matsushima, E., Tsunoda, M., Kurachi, M.,
Okada, T., Hayashi, T., Ishii, Y., Morita, K., Maeda, H., Katayama,
S., Kawahara, R., Otsuka, T., Hirayasu, Y., Sekine, M., Okubo,
Y., Motoshita, M., Ohta, K., Uchiyama, M., & Kojima, T. (2009).
Exploratory eye movement dysfunction as a discriminator for schiz-
ophrenia. European Archives of Psychiatry and Clinical Neurosci-
ence, 259(3), 186–194. https:// doi. org/ 10. 1007/ s00406- 008- 0850-7
Sweet, J. J., Heilbronner, R. L., Morgan, J. E., Larrabee, G. J., Rohling, M. L., Boone, K. B., Kirkwood, M. W., Schroeder, R. W., Suhr, J. A., & Conference Participants (2021). American Academy of Clinical Neuropsychology (AACN) 2021 consensus statement on validity assessment: Update of the 2009 AACN consensus conference statement on neuropsychological assessment of effort, response bias, and malingering. The Clinical Neuropsychologist, 35(6), 1053–1106. https://doi.org/10.1080/13854046.2021.1896036
Takahashi, S., Tanabe, E., Yara, K., Matsuura, M., Matsushima, E., &
Kojima, T. (2008). Impairment of exploratory eye movement in
schizophrenia patients and their siblings. Psychiatry and Clinical
Neurosciences, 62(5), 487–493. https:// doi. org/ 10. 1111/j. 1440-
1819. 2008. 01840.x
Tardif, H. P., Barry, R. J., & Johnstone, S. J. (2002). Event-related
potentials reveal processing differences in honest vs. malingered
memory performance. International Journal of Psychophysiology,
46, 147–158. https:// doi. org/ 10. 1016/ S0167- 8760(02) 00090-9
Tardif, H. P., Barry, R. J., Fox, A. M., & Johnstone, S. J. (2000). Detec-
tion of feigned recognition memory impairment using the old
new effect of the event-related potential. International Journal
of Psychophysiology, 36, 1–9. https:// doi. org/ 10. 1016/ S0167-
8760(00) 00083-0
Thomas, L. E., & Lleras, A. (2007). Moving eyes and moving thought:
On the spatial compatibility between eye movements and cogni-
tion. Psychonomic Bulletin & Review, 14(4), 663–668. https://
doi. org/ 10. 3758/ BF031 96818
Tombaugh, T. N. (1996). Test of Memory Malingering (TOMM). Multi
Health Systems.
Tomer, E., Lupu, T., Golan, L., Wagner, M., & Braw, Y. (2018). Eye
tracking as a mean to detect feigned cognitive impairment in
the word memory test. Applied Neuropsychology: Adult, 27(1),
49–61. https:// doi. org/ 10. 1080/ 23279 095. 2018. 14804 83
Vagnini, V. L., Berry, D. T., Clark, J. A., & Jiang, Y. (2008). New
measures to detect malingered neurocognitive deficit: Applying
reaction time and event-related potentials. Journal of Clinical
and Experimental Neuropsychology, 30(7), 766–776. https:// doi.
org/ 10. 1080/ 13803 39070 17547 46
Van der Stigchel, S., Meeter, M., & Theeuwes, J. (2006). Eye move-
ment trajectories and what they tell us. Neuroscience & Biobe-
havioral Reviews, 30(5), 666–679. https:// doi. org/ 10. 1016/j.
neubi orev. 2005. 12. 001
van Stockum, S., MacAskill, M., Anderson, T., & Dalrymple-Alford, J. (2008). Don't look now or look away: Two sources of saccadic disinhibition in Parkinson's disease? Neuropsychologia, 46(13), 3108–3115. https://doi.org/10.1016/j.neuropsychologia.2008.07.002
Vendemia, J. M. C., Buzan, R. F., & Simon-Dack, S. L. (2005). Reac-
tion time of motor responses in two-stimulus paradigms involv-
ing deception and congruity with varying levels of difficulty.
Behavioural Neurology, 16(1), 25–36. https:// doi. org/ 10. 1155/
2005/ 804026
Viglione, D. J., & Giromini, L. (2020). Inventory of Problems – 29:
Professional manual. IOP-Test, LLC.
Viglione, D. J., Giromini, L., & Landis, P. (2017). The development of
the Inventory of Problems-29: A brief self-administered meas-
ure for discriminating bona fide from feigned psychiatric and
cognitive complaints. Journal of Personality Assessment, 99,
534–544. https:// doi. org/ 10. 1080/ 00223 891. 2016. 12338 82
Viglione, D. J., Giromini, L., Landis, P., McCullaugh, J. M., Piztiz, T.
D., O’Brien, S., Wood, S., Connell, K., & Abramsky, A. (2018).
Development and validation of the False Disorder Score: The focal
scale of the Inventory of Problems. Journal of Personality Assess-
ment, 2, 1–9. https:// doi. org/ 10. 1080/ 00223 891. 2018. 14924 13
Vrij, A., Granhag, P. A., Mann, S., & Leal, S. (2011). Outsmarting the
liars: Toward a cognitive lie detection approach. Current Direc-
tions in Psychological Science, 20(1), 28–32. https:// doi. org/ 10.
1177/ 09637 21410 391245
Willison, J., & Tombaugh, T. N. (2006). Detecting simulation of attention
deficits using reaction time tests. Archives of Clinical Neuropsy-
chology, 21, 41–52. https:// doi. org/ 10. 1016/j. acn. 2005. 07. 005
Winters, C. L., Giromini, L., Crawford, T. J., Ales, F., Viglione, D. J.,
& Warmelink, L. (2021). An Inventory of Problems–29 (IOP–29)
study investigating feigned schizophrenia and random responding
in a British community sample. Psychiatry, Psychology and Law,
28, 235–254. https:// doi. org/ 10. 1080/ 13218 719. 2020. 17677 20
Young, G. (2014). Malingering, feigning, and response bias in psychi-
atric/psychological injury: Implications for practice and court.
Springer.
Zhong, S., Liang, X., Wang, J., Mellsop, G., Zhou, J., & Wang, X.
(2021). Simulated malingering on binomial forced-choice digit
memory test – Using eye movements to understand faking cogni-
tion impairment process. The Journal of Forensic Psychiatry &
Psychology, 32(6), 808–824. https:// doi. org/ 10. 1080/ 14789 949.
2021. 19182 11
Publisher's Note Springer Nature remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.