Kaisa Tiippana

University of Helsinki | HY · Psychology and Logopedics

PhD
Research and teaching in multisensory perception and learning

About

81 Publications
26,700 Reads
1,689 Citations
Citations since 2017: 31 Research Items, 823 Citations
[Chart: citations per year, 2017–2023]
Introduction
Kaisa Tiippana works in the Department of Psychology and Logopedics, University of Helsinki. She does research in Cognitive Psychology, Experimental Psychology, and Psychophysics. Her current project is Multisensory Perception and Learning, including audiovisual speech perception. University of Helsinki homepage: https://researchportal.helsinki.fi/en/persons/kaisa-tiippana ORCID: https://orcid.org/0000-0002-2305-8104
Additional affiliations
January 2009 - present
University of Helsinki
September 1998 - December 2008
Helsinki University of Technology
September 1995 - August 1998


Publications (81)
Article
Full-text available
Habituated response tendency associated with affordance of an object is automatically inhibited if this affordance cue is extracted from a non-target object. This study presents two go/no-go experiments investigating whether this response control operates in response selection processes and whether it is linked to conflict-monitoring mechanisms. In...
Article
Full-text available
Theatre-based practices, such as improvisation, are frequently applied to simulate everyday social interactions. Although the improvisational context is acknowledged as fictional, realistic emotions may emerge, a phenomenon labelled the ‘paradox of fiction’. This study investigated how manipulating the context (real-life versus fictional) modulates...
Article
Full-text available
Laterality effects generally refer to an advantage for verbal processing in the left hemisphere and for non-verbal processing in the right hemisphere, and are often demonstrated in memory tasks in vision and audition. In contrast, their role in haptic memory is less understood. In this study, we examined haptic recognition memory and laterality for...
Article
Full-text available
Even though some individuals subjectively associate various symptoms with infrasound, there are very few systematic studies on the contribution of infrasound to the perception, annoyance, and physiological reactions elicited by wind turbine sound. In this study, sound samples were selected among long-term measurement data from wind power plant and...
Article
Full-text available
Objectives Teaching involves multiple performance situations, potentially causing psychosocial stress. Since the theater-based improvisation method is associated with diminished social stress, we investigated whether improvisation lessened student teachers’ stress responses using the Trier Social Stress Test (TSST; preparatory phase, public speech,...
Article
Full-text available
The left hemisphere is known to be generally predominant in verbal processing and the right hemisphere in non-verbal processing. We studied whether verbal and non-verbal lateralization is present in haptics by comparing discrimination performance between letters and nonsense shapes. We addressed stimulus complexity by introducing lower case letters...
Technical Report
Full-text available
Some individuals have reported various symptoms that they have intuitively associated with infrasound from wind turbines. Scientific evidence on the potential association or studies focusing directly on the health effects of wind turbine infrasound are lacking. This research project aimed at assessing whether wind turbine infrasound has harmful eff...
Article
It has been shown recently that when participants are required to pronounce a vowel at the same time with the hand movement, the vocal and manual responses are facilitated when a front vowel is produced with forward-directed hand movements and a back vowel is produced with backward-directed hand movements. This finding suggests a coupling between s...
Article
Full-text available
When touched, dissimilar materials, such as metal and wood, evoke different thermal sensations when both are maintained at room temperature due to the inherent differences in their thermo-physical properties. In this study, we employed psychophysical experiments to quantify the tactile perception of surface temperature using pine wood, oak wood and...
Article
Full-text available
This study examined the effects of a theater-based improvisation method for promoting student teachers' self-rated social interaction competence. Thirty-nine healthy undergraduate students participated in an intervention study applying the improvisation method in the context of teacher education. The intervention group (N=19) was trained in the basics of i...
Article
Research has shown connections between articulatory mouth actions and manual actions. This study investigates whether forward–backward hand movements could be associated with vowel production processes that programme tongue fronting/backing, lip rounding/spreading (Experiment 1), and/or consonant production processes that programme tongue tip and t...
Article
We developed a computerized audiovisual training programme for school-aged children with specific language impairment (SLI) to improve their phonological skills. The programme included various tasks requiring phonological decisions. Spoken words, pictures, letters and written syllables were used as training material. Spoken words were presented eit...
Article
Full-text available
Some classical studies on temporal order judgments (TOJ) suggested a single central process comparing stimulus onsets across modalities. The prevalent current view suggests that there is modality-specific timing estimation followed by a cross-modal stage. If the latter view is correct, TOJs may vary depending on stimulus modality. Further, if TOJ...
Article
Full-text available
The brain’s left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain due to contra-lateral innervations, this functional lateralization is reflected in a hand advantage during certain functions. Findings regarding the hand-hemisphere advant...
Article
Seeing articulatory gestures enhances speech perception. Perception of auditory speech can even be changed by incongruent visual gestures, which is known as the McGurk effect (e.g., dubbing a voice saying /mi/ onto a face articulating /ni/, observers often hear /ni/). In children, the McGurk effect is weaker than in adults, but no previous knowledg...
Article
In the course of normal aging, memory functions show signs of impairment. Studies of memory in the elderly have previously focused on a single sensory modality, although multisensory encoding has been shown to improve memory performance in children and young adults. In this study, we investigated how audiovisual encoding affects auditory recognitio...
Article
Audiovisual semantic congruency during memory encoding has been shown to facilitate later recognition memory performance. However, it is still unclear whether this improvement is due to multisensory semantic congruency or just semantic congruency per se. We investigated whether dual visual encoding facilitates recognition memory in the same way as...
Article
We investigated the effects of audiovisual semantic congruency on recognition memory performance. It has been shown previously that memory performance is better for semantically congruent stimuli that are presented together in different modalities (e.g., a dog's bark with a picture of the dog) during encoding, compared to stimuli presented together...
Article
Full-text available
The shape and size-related sound symbolism phenomena assume that, for example, the vowel [i] and the consonant [t] are associated with sharp-shaped and small-sized objects, whereas [ɑ] and [m] are associated with round and large objects. It has been proposed that these phenomena are mostly based on the involvement of articulatory processes in repre...
Article
Manual actions and speech are connected: for example, grip execution can influence simultaneous vocalizations and vice versa. Our previous studies show that the consonant [k] is associated with the power grip and the consonant [t] with the precision grip. Here we studied whether the interaction between speech sounds and grips could operate already...
Article
Full-text available
We have recently shown in Finnish speakers that articulation of certain vowels and consonants has a systematic influence on simultaneous grasp actions as well as on forward and backward hand movements. Here we studied whether these effects generalize to another language, namely Czech. We reasoned that if the results generalized to another language...
Article
Full-text available
Previous studies on tactile experiences have investigated a wide range of material surfaces across various skin sites of the human body in self-touch or other touch modes. Here, we investigate whether the sensory and emotional aspects of touch are related when evaluating wooden surfaces using fingertips in the absence of other sensory modalities. T...
Article
Contraction of a muscle modulates not only the corticospinal excitability (CSE) of the contracting muscle but also that of different muscles. We investigated to what extent the CSE of a hand muscle is modulated during preparation and execution of teeth clenching and ipsilateral foot dorsiflexion either separately or in combination. Hand-muscle CSE...
Article
Purpose: Lipreading and its cognitive correlates were studied in school-age children with typical language development and delayed language development due to specific language impairment (SLI). Method: Forty-two children with typical language development and 20 children with SLI were tested by using a word-level lipreading test and an extensive...
Article
Full-text available
Previous research has shown that precision and power grip performance is consistently influenced by simultaneous articulation. For example, power grip responses are performed relatively fast with the open-back vowel [a], whereas precision grip responses are performed relatively fast with the close-front vowel [i]. In the present study, the particip...
Data
Datasets for Experiment 2 used for statistical analyses. (XLSX)
Data
Datasets for Experiment 1 used for statistical analyses. (XLSX)
Article
Full-text available
Previous studies have shown congruency effects between specific speech articulations and manual grasping actions. For example, uttering the syllable [kɑ] facilitates power grip responses in terms of reaction time and response accuracy. A similar association of the syllable [ti] with precision grip has also been observed. As these congruency effects...
Article
Previous studies have shown a congruency effect between manual grasping and syllable articulation. For instance, a power grip is associated with syllables whose articulation involves the tongue body and/or large mouth aperture ([kɑ]) whereas a precision grip is associated with articulations that involve the tongue tip and/or small mouth aperture ([...
Article
Full-text available
Recent studies have shown that articulatory gestures are systematically associated with specific manual grip actions. Here we show that executing such actions can influence performance on a speech-categorization task. Participants watched and/or listened to speech stimuli while executing either a power or a precision grip. Grip performance influenc...
Data
Datasets used for statistical analyses. (XLSX)
Article
Some theories concerning speech mechanisms assume that overlapping representations are involved in programming certain articulatory gestures and hand actions. The present study investigated whether planning of movement direction for articulatory gestures and manual actions could interact. The participants were presented with written vowels (Experim...
Article
Full-text available
Although we live in a multisensory world, children's memory has been usually studied concentrating on only one sensory modality at a time. In this study, we investigated how audiovisual encoding affects recognition memory. Children (n = 114) from three age groups (8, 10 and 12 years) memorized auditory or visual stimuli presented with a semanticall...
Article
Full-text available
Studies of memory and learning have usually focused on a single sensory modality, although human perception is multisensory in nature. In the present study, we investigated the effects of audiovisual encoding on later unisensory recognition memory performance. The participants were to memorize auditory or visual stimuli (sounds, pictures, spoken wo...
Article
Full-text available
Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects whi...
Article
Full-text available
It has been proposed that articulatory gestures are shaped by tight integration in planning mouth and hand acts. This hypothesis is supported by recent behavioral evidence showing that response selection between the precision and power grip is systematically influenced by simultaneous articulation of a syllable. For example, precision grip response...
Article
Full-text available
The present study was motivated by a theory, which proposes that speech includes articulatory gestures that are connected to particular hand actions. We hypothesized that certain articulatory gestures would be more associated with the precision grip than with the power grip, and vice versa. In the study, the participants pronounced a syllable and p...
Article
Information from the acoustic speech signal and the talking face is integrated into a unified percept. This is demonstrated in the McGurk effect, in which discrepant visual articulation changes the auditory perception of a consonant. We studied acoustic (A) and visual (V) phonetic features that contribute to audiovisual speech perception by measuri...
Article
Full-text available
Purpose: The effect of the signal-to-noise ratio (SNR) on the perception of audiovisual speech in children with and without developmental language disorder (DLD) was investigated by varying the noise level and the sound intensity of acoustic speech. The main hypotheses were that the McGurk effect (in which incongruent visual speech alters the audi...
Article
Full-text available
Audiovisual speech perception was studied in adults with Asperger syndrome (AS), by utilizing the McGurk effect, in which conflicting visual articulation alters the perception of heard speech. The AS group perceived the audiovisual stimuli differently from age, sex and IQ matched controls. When a voice saying /p/ was presented with a face articulat...
Article
In the best-known example of the McGurk effect, an auditory consonant /b/ that is presented with a face articulating /g/ is heard as a fusion /d/. However, sometimes this kind of stimulus is heard as /g/, i.e., a visual-dominant percept. We explored the stimulus features giving rise to these percepts by using two different stimuli at various levels o...
Article
Full-text available
Individuals with Asperger syndrome (AS) have problems in following conversation, especially in the situations where several people are talking. This might result from impairments in audiovisual speech perception, especially from difficulties in focusing attention to speech-relevant visual information and ignoring distracting information. We studied...
Article
Full-text available
Audiovisual speech perception has been considered to operate independent of sound location, since the McGurk effect (altered auditory speech perception caused by conflicting visual speech) has been shown to be unaffected by whether speech sounds are presented in the same or different location as a talking face. Here we show that sound location effe...
Article
Full-text available
The McGurk effect has been shown to be modulated by attention. However, it remains unclear whether attentional effects are due to changes in unisensory processing or in the fusion mechanism. In this paper, we used published experimental data showing that distraction of visual attention weakens the McGurk effect, to fit either the Fuzzy Logical Mode...
Article
Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive but recent reports have challenged this view. Here we...
Article
Full-text available
The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features....
Article
In face-to-face conversation speech is perceived by ear and eye. We studied the prerequisites of audio-visual speech perception by using perceptually ambiguous sine wave replicas of natural speech as auditory stimuli. When the subjects were not aware that the auditory stimuli were speech, they showed only negligible integration of auditory and visu...
Article
Maximum likelihood models of multisensory integration are theoretically attractive because the goals and assumptions of sensory information processing are explicitly stated in such optimal models. When subjects perceive stimuli categorically, as opposed to on a continuous scale, Maximum Likelihood Integration (MLI) can occur before or after categor...
Article
Full-text available
Information processing in auditory and visual modalities interacts in many circumstances. Spatially and temporally coincident acoustic and visual information are often bound together to form multisensory percepts [B.E. Stein, M.A. Meredith, The Merging of the Senses, A Bradford Book, Cambridge, MA, (1993), 211 pp.; Psychol. Bull. 88 (1980) 638]. Sh...
Article
Full-text available
Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive but recent reports have challenged this view. Here we...
Article
Normal-learning children (NL) and children with learning disabilities (LD) reported their perceptions of unisensory (auditory or visual), concordant audiovisual (e.g. visual /apa/ and auditory /apa/) and conflicting (e.g. visual /aka/ and auditory /apa/) speech stimuli in quiet and noise (0 dB and -12 dB signal-to-noise ratio, SNR). In normal popul...
Article
Full-text available
Speech perception is audiovisual, as demonstrated by the McGurk effect in which discrepant visual speech alters the auditory speech percept. We studied the role of visual attention in audiovisual speech perception by measuring the McGurk effect in two conditions. In the baseline condition, attention was focused on the talking face. In the distracte...
Article
Full-text available
We tested whether listener's knowledge about the nature of the auditory stimuli had an effect on audio-visual (AV) integration of speech. First, subjects were taught to categorize two sine-wave (sw) replicas of the real speech tokens /omso/ and /onso/ into two arbitrary nonspeech categories without knowledge of the speech-like nature of the sounds....
Article
Full-text available
Seeing a talker's articulatory gestures may affect the observer's auditory speech percept. Observing congruent articulatory gestures may enhance the recognition of speech sounds [J. Acoust. Soc. Am. 26 (1954) 212], whereas observing incongruent gestures may change the auditory percept phonetically as occurs in the McGurk effect [Nature 264 (1976) 7...
Conference Paper
Full-text available
We present three models of audiovisual speech perception at varying signal-to-noise ratios (SNR). The first model is Massaro's Fuzzy Logical Model of Perception (FLMP) applied at each SNR. The second model imposes the constraint that the visual response probabilities are the same regardless of the SNR. Both models describe the data well. Root Mean...
Article
Full-text available
Choice reaction times (CRTs) to contrast differences were measured and compared with contrast increment thresholds obtained from concurrently measured psychometric functions at pedestal contrasts in the vicinity of detection threshold. Contrast discrimination functions had a classical dipper shape. The main finding was that CRTs were shorter at low...
Article
Full-text available
Contrast matching was performed with isoluminant red-green and s-cone gratings at spatial frequencies ranging from 0.5 to 8 c/deg. Contrast threshold curves were low-pass in shape, in agreement with previous findings. Contrast matching functions resembled threshold curves at low contrast levels, but became flat and independent of spatial frequency...