Formant-tracking linear prediction models for speech processing in noisy environments

Yan, Q., Vaseghi, S. V., Zavarehei, E. and Milner, B. P. (2005) Formant-tracking linear prediction models for speech processing in noisy environments. In: 9th European Conference on Speech Communication and Technology, 2005-09-04 - 2005-09-08.

Full text not available from this repository.

Abstract

This paper presents a formant-tracking method for estimating the time-varying trajectories of a linear prediction (LP) model of speech in noise. The main focus of this work is the modelling of the non-stationary temporal trajectories of the formants of speech for improved LP model estimation in noise. The proposed approach provides a systematic framework for modelling the inter-frame correlation of speech parameters across successive frames, while the intra-frame correlations are modelled by the LP parameters. Formant-tracking LP model estimation is composed of two stages: (a) a pre-cleaning intra-frame spectral amplitude estimation stage, in which an initial estimate of the magnitude frequency response of the LP model of clean speech is obtained, and (b) an inter-frame signal processing stage, in which formant classification and Kalman filtering are combined to estimate the formant trajectories. The effects of car and train noise on the observation and estimation of formant tracks are investigated, and the average formant-tracking errors at different signal-to-noise ratios (SNRs) are computed. The evaluation results demonstrate that noise reduction followed by Kalman filtering significantly reduces the formant-tracking errors.
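The inter-frame stage described in the abstract smooths per-frame formant estimates with a Kalman filter. As a minimal sketch of that idea (not the paper's actual implementation, whose state model, noise variances, and classification step are not given here), the following tracks a single formant frequency across frames with a constant-velocity Kalman filter; the state vector, `q`, and `r` are illustrative assumptions.

```python
import numpy as np

def kalman_track_formant(observations, q=50.0, r=200.0, dt=1.0):
    """Smooth a noisy per-frame formant-frequency track (Hz) with a
    constant-velocity Kalman filter.

    State x = [frequency, frequency slope]; one noisy formant-frequency
    measurement per frame.  q (process-noise intensity) and r
    (measurement-noise variance) are illustrative values, not values
    from the paper.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # observe frequency only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process-noise covariance
    R = np.array([[r]])                          # measurement-noise variance

    x = np.array([observations[0], 0.0])         # initial state
    P = np.eye(2) * 1e4                          # initial state uncertainty
    smoothed = []
    for z in observations:
        # predict: propagate state and covariance one frame forward
        x = F @ x
        P = F @ P @ F.T + Q
        # update: correct the prediction with the new measurement
        y = z - H @ x                            # innovation
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        smoothed.append(x[0])
    return np.array(smoothed)
```

For example, feeding the filter a noisy track of raw per-frame estimates around a slowly varying formant yields a trajectory whose frame-to-frame jitter is substantially reduced, which is the effect the paper evaluates via average formant-tracking error at different SNRs.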

Item Type: Conference or Workshop Item (Paper)
Faculty \ School: Faculty of Science > School of Computing Sciences
UEA Research Groups: Faculty of Science > Research Groups > Interactive Graphics and Audio
Faculty of Science > Research Groups > Smart Emerging Technologies
Depositing User: Vishal Gautam
Date Deposited: 14 Jun 2011 15:17
Last Modified: 22 Apr 2023 02:45
URI: https://ueaeprints.uea.ac.uk/id/eprint/23271
