Evaluation of Information Contained in Spectral Features

Abstract:

In this paper we estimate the information that spectral features carry for recognizing phones represented by HMM states. The features investigated are derivatives of MFCCs. The information is defined as the ‘Phone Entropy’, i.e. Shannon’s conditional entropy applied to phones. Related to this entropy are bounds on the minimum phone error rate achievable by a Bayes-optimal recognizer; we use the Fano and Golić bounds as the lower and upper bound, respectively. The paper focuses on estimating the Phone Entropy and the underlying probability functions. We also investigate an approach to go beyond the first-order Markov assumption used in state-of-the-art HMM technology. Experimental results are presented in the AURORA framework, where we determine the Phone Entropy and phone error rates at different noise levels, and compare the phone error rates to the bounds given by Fano and Golić.
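To make the two central quantities concrete, the following is a minimal sketch (our own illustration, not the paper's code) of how a conditional entropy H(X|Y) can be computed from a joint distribution, and how Fano's inequality, H(X|Y) ≤ h_b(P_e) + P_e·log2(M−1), yields a lower bound on the error probability P_e of any recognizer over M classes:

```python
import numpy as np

def conditional_entropy(joint):
    """H(X|Y) in bits, given a joint distribution P(X, Y) (rows: X, cols: Y)."""
    p_y = joint.sum(axis=0)
    h = 0.0
    for j in range(joint.shape[1]):
        if p_y[j] > 0:
            p_x_given_y = joint[:, j] / p_y[j]
            nz = p_x_given_y[p_x_given_y > 0]
            h += p_y[j] * -np.sum(nz * np.log2(nz))
    return h

def fano_lower_bound(h_cond, m, grid=100000):
    """Smallest P_e consistent with Fano's inequality
    H(X|Y) <= h_b(P_e) + P_e * log2(M - 1) for M classes."""
    def hb(p):  # binary entropy in bits
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    # The right-hand side is increasing on [0, (M-1)/M], so scan upward.
    for i in range(grid + 1):
        pe = i / grid * (m - 1) / m
        if hb(pe) + pe * np.log2(m - 1) >= h_cond:
            return pe
    return (m - 1) / m

# Example: if features tell us nothing (X and Y independent, 2 classes),
# H(X|Y) = 1 bit and Fano forces an error rate of at least 0.5.
joint = np.full((2, 2), 0.25)
h = conditional_entropy(joint)
pe_min = fano_lower_bound(h, m=2)
```

The paper's upper counterpart, the Golić bound, plays the opposite role: it limits how large the Bayes error can be for a given Phone Entropy, so the two bounds bracket the achievable phone error rate.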


Year: 2009
In session: Signalverarbeitung
Pages: 115 to 122