

On Architectural Issues of Neural Networks in Speech Recognition
Course URL: http://videolectures.net/interACT2016_ney_neural_networks/
Lecturer: Hermann Ney
Institution: RWTH Aachen University
Date: 2016-07-31
Language: English
Abstract: Recently, artificial neural networks (ANN) have been able to improve the performance of speech recognition systems dramatically. There have been more than 25 years of extensive research on neural networks in speech recognition. Despite this huge effort, a number of open issues remain concerning the architecture of ANN-based systems for speech recognition. Examples of such issues are: 1) Unlike the hybrid approach of replacing the emission probability function by an ANN, there is the possibility of a direct approach that models the posterior distribution over sequences of (phonetic) labels directly, without using the generative concepts of classical hidden Markov models (HMMs). 2) In the CTC approach (connectionist temporal classification), the HMM is simplified by using only a single label per phoneme (or per character in handwriting recognition). The CTC training criterion is the sum, over all possible label sequences, of their posterior probabilities. 3) Recently, so-called attention-based approaches have appeared that replace the conventional HMM formalism by a recurrent neural network. In all three cases, we face the question of how these ANN-based approaches compare with the conventional discriminative framework of hybrid HMMs. We will discuss the advantages and disadvantages of these approaches in more detail and compare them with conventional hybrid HMMs.
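As a sketch of point 2 above, the CTC criterion can be written in its standard published form (this is the usual formulation from the CTC literature, not necessarily the exact notation used in the lecture):

```latex
% CTC posterior of a label sequence a_1^N given acoustics x_1^T.
% B is the collapsing function that removes blanks and label repetitions;
% the sum runs over all frame-level alignments pi_1^T mapping to a_1^N.
\[
  p(a_1^N \mid x_1^T)
    \;=\; \sum_{\pi_1^T \,:\, B(\pi_1^T) = a_1^N}
      \;\prod_{t=1}^{T} p(\pi_t \mid x_1^T)
\]
% Training minimizes the negative log posterior of the reference
% label sequence, accumulated over the training corpus:
\[
  F \;=\; -\sum_{(x,\,a)\,\in\,\text{corpus}} \log p(a \mid x)
\]
```

The marginalization over alignments is what the abstract refers to as the "sum over all possible label sequences"; it is computed efficiently with a forward-backward recursion rather than by explicit enumeration.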
Keywords: artificial neural networks; speech recognition systems; architecture
Source: VideoLectures.NET
Data collected: 2021-11-26:zkj
Last reviewed: 2021-11-26:zkj
Views: 57