In the training of hidden Markov models (HMMs), the topology of an HMM, which includes the number of states and the connectivity between states, is usually pre-set based on experience or heuristics. However, it is uncertain whether a given topology is optimal for a particular task. Many model complexity measures have been proposed for model selection, such as the Bayesian information criterion, minimum description length, and minimum message length. In this thesis, an alternative approach of reducing the complexity of well-trained HMMs is proposed. The classical neural network weight-pruning technique called the Optimal Brain Surgeon (OBS) is adapted to prune HMMs. In this method, an HMM component is pruned first if its removal results in a minimal decrease in the total log-likelihood of the training data. The decrease in total log-likelihood is approximated by a second-order Taylor series of the total log-likelihood function. Experimental evaluation showed that pruning HMM transitions with the OBS method successfully reduces the topology of well-trained, though perhaps over-fitted, HMMs. The algorithm removed transitions optimally; as a result, memory and computation costs were reduced, and the recognition performance of the pruned HMM on unseen test data improved in some cases and did not degrade in any case. The OBS method for HMM transition deletion is further extended to HMM state deletion by removing all incoming and outgoing transitions of a state simultaneously. This extension also greatly reduces the computation cost of OBS by deleting multiple transitions in a single iteration. It was applied to a more complex HMM used in multi-band automatic speech recognition. The results showed that OBS state deletion saves memory and computation costs by reducing model complexity and is more efficient than OBS transition deletion, since it requires fewer iterations.
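The core idea of OBS-style transition pruning can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: it assumes an approximate Hessian `H` of the negative total log-likelihood with respect to the flattened transition probabilities is already available, scores each nonzero transition by the standard OBS saliency derived from the second-order Taylor expansion, deletes the lowest-saliency transition, and (as a simplifying assumption, in place of the full OBS weight update) renormalizes the affected row so it remains a probability distribution.

```python
import numpy as np

def obs_prune_transition(A, H):
    """One OBS-style pruning step on an HMM transition matrix.

    A : (n, n) row-stochastic transition matrix of a trained HMM.
    H : (n*n, n*n) approximate Hessian of the negative total
        log-likelihood w.r.t. the flattened transition parameters.

    Returns the pruned, renormalized matrix and the flat index pruned.
    Illustrative sketch only; the real method must respect all HMM
    constraints when forming H and updating the surviving parameters.
    """
    w = A.flatten()
    H_inv = np.linalg.inv(H)
    # OBS saliency: estimated loss increase from deleting parameter q,
    # from the second-order Taylor expansion: s_q = w_q^2 / (2 [H^-1]_qq)
    saliency = np.full(w.shape, np.inf)
    active = w > 0
    saliency[active] = w[active] ** 2 / (2.0 * np.diag(H_inv)[active])
    q = int(np.argmin(saliency))        # cheapest transition to remove
    w[q] = 0.0
    A_new = w.reshape(A.shape)
    # Renormalize rows so each remains a valid probability distribution.
    A_new = A_new / A_new.sum(axis=1, keepdims=True)
    return A_new, q

# Toy usage with a 3-state HMM and, purely as a stand-in, an identity
# Hessian (so saliency reduces to w_q^2 / 2 and the smallest nonzero
# transition probability is pruned).
A = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.2, 0.6]])
A_pruned, q = obs_prune_transition(A, np.eye(9))
```

State deletion, as described above, amounts to applying the same saliency machinery to all incoming and outgoing transitions of one state jointly, which removes many parameters in a single iteration.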