23 September 2008

Ingmar Steiner

Data-driven articulatory resynthesis

One possible interface for data-driven, high-level control of an articulatory synthesizer is the resynthesis of electromagnetic articulography (EMA) trajectories. By aligning the articulatory gestures (which are transformed into control point trajectories in the synthesizer's vocal tract model) so that they closely match the original motion-captured speech, we obtain a training corpus suitable for HMM-based synthesis of control trajectories for unseen utterances. This talk presents preliminary results in automatic EMA-based articulatory resynthesis and introduces new vocal tract MRI data (acquired on 16/09 at SBIRC) which will be used to adapt the vocal tract model to a new speaker.
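As a rough illustration of the alignment step, the sketch below pairs EMA frames with model control point frames using dynamic time warping. This is a generic alignment technique chosen for illustration, not necessarily the method used in the work described; the function and data names (dtw_align, ema, model) are hypothetical stand-ins.

    import numpy as np

    def dtw_align(ema, model):
        """Align two trajectories (frames x channels) by dynamic time warping.
        Returns the warping path as a list of (ema_frame, model_frame) pairs."""
        n, m = len(ema), len(model)
        # Pairwise Euclidean distances between frames
        dist = np.linalg.norm(ema[:, None, :] - model[None, :, :], axis=2)
        # Cumulative cost matrix with an infinite border for the boundary case
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost[i, j] = dist[i - 1, j - 1] + min(
                    cost[i - 1, j - 1],  # match
                    cost[i - 1, j],      # insertion
                    cost[i, j - 1],      # deletion
                )
        # Backtrack from the end to recover the optimal warping path
        path, i, j = [], n, m
        while i > 0 and j > 0:
            path.append((i - 1, j - 1))
            step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        return path[::-1]

    # Hypothetical data: 200 EMA frames (3 coils, x/y each) aligned against
    # 180 frames of synthesizer control point trajectories.
    ema = np.random.randn(200, 6)
    model = np.random.randn(180, 6)
    path = dtw_align(ema, model)
    print("warping path length:", len(path))

Once each utterance is aligned in this fashion, the time-warped control point trajectories can serve as training material for the HMM-based synthesis step mentioned above.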

