CIBR seminar on Mon 28.5. at 14-15: Piia Astikainen, Orsolya Kolozsvari, Weiyong Xu

  • Time: 28.05.2018, 14.00–15.00 (Europe/Helsinki / UTC+3)
  • Place: Agora Delta
  • Contact person's name
  • Contact person's phone number: 040 805 3533

We are delighted to have University Researcher Piia Astikainen, Doctoral Students Orsolya Kolozsvari and Weiyong Xu from the University of Jyväskylä as our speakers.

Piia will give a presentation titled Automatic processing of facial emotions in dysphoria:

It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. To this end, we used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck Depression Inventory score ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, with larger response amplitudes for sad faces than for happy faces. Group differences were found for the M300, indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than for happy deviant faces, reflecting a negative bias in deviance detection that was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit, in dysphoria.

Orsolya's presentation is titled Familiarity and congruency interact in audio-visual speech perception:

During speech perception, listeners rely on multi-modal input and make use of both visual and auditory information. When presented with contrasts of syllables, however, the differences in brain responses are not caused merely by the acoustic or visual features of the stimuli. The familiarity of a syllable, i.e. whether it appears in the viewer-listener's native language or not, may also modulate brain responses. We investigated how the familiarity of the presented stimuli affects brain responses to audio-visual speech in Finnish native speakers and Chinese native speakers. The stimuli (syllables) presented were audio-visual (congruent or incongruent), audio only, or visual only. Source waveforms were examined in three time windows: 75–125 ms, 125–300 ms, and 300–475 ms. We found significant differences in congruency comparisons for Chinese native speakers and in familiarity comparisons for Finnish native speakers. Our results suggest that congruency and familiarity have an interactive but distinct influence on audio-visual speech perception, specifically in the auditory and visual areas.

Weiyong's presentation is titled Audiovisual processing of Chinese characters elicits integration and congruency effects in MEG:

Associating written letters/characters with speech sounds is crucial for the initial stage of reading acquisition. Most previous studies have focused only on audiovisual integration in alphabetic languages; less is known about logographic languages such as Chinese. Here we investigated how long-term learning affects the neural mechanisms underlying audiovisual integration in a logographic language, by examining integration and congruency effects in an active audiovisual task in both native Chinese and native Finnish speakers with magnetoencephalography (MEG). The two groups showed distinct suppressive integration effects, with left lateralization (left angular and supramarginal gyri, inferior frontal and superior temporal cortices) in the Chinese group and right lateralization (right inferior parietal and occipital cortices) in the Finnish group. A congruency effect was observed only in the Chinese group, in the left inferior frontal and superior temporal cortex in a late time window (about 600–700 ms), probably reflecting both audiovisual integration and semantic processing. This MEG study indicates that learning a logographic language has a large impact on the audiovisual processing of written characters, and it provides additional insight into audiovisual integration in logographic languages. Audiovisual integration in the logographic language showed a clear resemblance to alphabetic languages in the left superior temporal cortex, but a unique aspect of the processing of logographic stimuli was observed in the left inferior frontal cortex.

Everyone interested in MEG research is warmly welcome!