There will be a departmental talk on 10 Aug (4pm) by Donald Derrick. More details are provided below.
Date/Time: 10 August 2010 (Tuesday), 4pm
Title: Aerotactile Integration in Speech Perception
Recently, we demonstrated that perceivers integrate naturalistic tactile information during auditory speech perception without previous training. Drawing on the observation that some speech sounds produce tiny bursts of aspiration (such as English ‘p’), we applied slight, inaudible air puffs to participants’ skin at one of three locations: the right hand, the neck, or the ankle. Syllables heard simultaneously with cutaneous air puffs were more likely to be heard as aspirated (for example, causing participants to mishear ‘b’ as ‘p’). These results demonstrate that perceivers integrate event-relevant tactile information in auditory perception in much the same way as they do visual information. They do so using the whole body, but integration requires that the stimuli be unambiguously relatable to the speech event. We also demonstrated that asynchronous air puffs asymmetrically enhance speech perception. We are currently working on a follow-up study examining brain activity during aerotactile integration in speech perception, as well as studies on the ecology and nature of aerotactile integration in speech perception.
About the speaker:
I am a PhD candidate in UBC’s Department of Linguistics. I study phonetics, laboratory phonology, articulatory synthesis, and visualization. I am a member of the ArtiSynth research group, which consists of electrical and mechanical engineers and linguists from the ISRL. I work on the experimental side of airflow simulation and the anatomical side of tongue modeling. I also work on linguistic visualization and am the author of TreeForm, a popular syntax tree editing tool.
Bryan Gick and I published an article, “Aero-tactile integration in speech perception”, in Nature on 26 November 2009. Since then, we have had follow-up research accepted to Interspeech 2010 in Japan and to the Journal of the Acoustical Society of America Express Letters (JASA-EL). These represent the first publications from our research program on multimodal integration in speech perception.
I am concurrently working on a research program on speech motor programming and planning for my dissertation. This research emphasizes the importance of subphonemic planning in each speech utterance and the mismatch between articulation and phonology.
A zipped file of the multimedia presentation can be downloaded at: