Fast Likelihood Computation in Speech Recognition using Matrices
Acoustic modeling using mixtures of multivariate Gaussians is the prevalent approach for many speech processing problems. Computing likelihoods against a large set of Gaussians is required as part of many speech processing systems, and it is the computationally dominant phase for Large Vocabulary Continuous Speech Recognition (LVCSR) systems. We express the likelihood computation as a multiplication of matrices representing augmented feature vectors and Gaussian parameters. The computational gain of this approach over traditional methods comes from exploiting the structure of these matrices and from efficient implementation of their multiplication. In particular, we explore direct low-rank approximation of the Gaussian parameter matrix and indirect derivation of low-rank factors of the Gaussian parameter matrix by optimum approximation of the likelihood matrix. We show that both methods lead to similar speedups, but the latter has a far smaller impact on recognition accuracy. Experiments on the 1,138-word vocabulary RM1 task and the 6,224-word vocabulary TIMIT task using the Sphinx 3.7 system show that, for a typical case, the matrix-multiplication-based approach leads to an overall speedup of 46 % on RM1 and 115 % on TIMIT.
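The matrix formulation described in the abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's code: it assumes diagonal-covariance Gaussians, and all sizes and variable names (`T`, `G`, `d`, `X`, `mu`, `var`) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
T, G, d = 200, 64, 13                     # frames, Gaussians, feature dim (illustrative)

X = rng.standard_normal((T, d))           # one feature vector per frame
mu = rng.standard_normal((G, d))          # Gaussian means
var = rng.uniform(0.5, 2.0, (G, d))       # diagonal covariances

# Baseline: T*G log-likelihoods computed Gaussian by Gaussian.
const = -0.5 * d * np.log(2 * np.pi) - 0.5 * np.log(var).sum(axis=1)
direct = const[None, :] - 0.5 * (
    ((X[:, None, :] - mu[None, :, :]) ** 2) / var[None, :, :]
).sum(axis=2)

# Matrix formulation: augment each frame to [x^2, x, 1] and fold the Gaussian
# parameters into one matrix, so all T*G likelihoods become a single GEMM.
A = np.hstack([X ** 2, X, np.ones((T, 1))])                 # T x (2d+1)
W = np.hstack([
    -0.5 / var,                                             # pairs with x^2
    mu / var,                                               # pairs with x
    (const - 0.5 * (mu ** 2 / var).sum(axis=1))[:, None],   # pairs with 1
])                                                          # G x (2d+1)
L = A @ W.T                                                 # T x G log-likelihoods

assert np.allclose(L, direct)

# Low-rank trade-off: truncate the SVD of W to rank k, replacing one
# T x (2d+1) x G multiply with two thinner ones (accuracy for speed).
k = 8
U, s, Vt = np.linalg.svd(W, full_matrices=False)
L_approx = (A @ Vt[:k].T) @ (U[:, :k] * s[:k]).T            # T x G, rank-k
```

The exact GEMM reproduces the per-Gaussian computation to floating-point precision; the rank-`k` variant is the kind of direct low-rank approximation of the parameter matrix that the abstract trades against recognition accuracy.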
Published in: | Journal of Signal Processing Systems, 2013-02, Vol. 70 (2), p. 219-234 |
---|---|
Main Authors: | Gajjar, Mrugesh R.; Sreenivas, T. V.; Govindarajan, R. |
Format: | Article |
Language: | English |
container_end_page | 234 |
---|---|
container_issue | 2 |
container_start_page | 219 |
container_title | Journal of signal processing systems |
container_volume | 70 |
creator | Gajjar, Mrugesh R. Sreenivas, T. V. Govindarajan, R. |
description | Acoustic modeling using mixtures of multivariate Gaussians is the prevalent approach for many speech processing problems. Computing likelihoods against a large set of Gaussians is required as part of many speech processing systems, and it is the computationally dominant phase for Large Vocabulary Continuous Speech Recognition (LVCSR) systems. We express the likelihood computation as a multiplication of matrices representing augmented feature vectors and Gaussian parameters. The computational gain of this approach over traditional methods comes from exploiting the structure of these matrices and from efficient implementation of their multiplication. In particular, we explore direct low-rank approximation of the Gaussian parameter matrix and indirect derivation of low-rank factors of the Gaussian parameter matrix by optimum approximation of the likelihood matrix. We show that both methods lead to similar speedups, but the latter has a far smaller impact on recognition accuracy. Experiments on the 1,138-word vocabulary RM1 task and the 6,224-word vocabulary TIMIT task using the Sphinx 3.7 system show that, for a typical case, the matrix-multiplication-based approach leads to an overall speedup of 46 % on RM1 and 115 % on TIMIT. Our low-rank approximation methods provide a way of trading off recognition accuracy for a further increase in computational performance, extending overall speedups up to 61 % for RM1 and 119 % for TIMIT, for an increase in word error rate (WER) from 3.2 % to 3.5 % on RM1 and no increase in WER on TIMIT. We also express the pairwise Euclidean distance computation phase in Dynamic Time Warping (DTW) in terms of matrix multiplication, leading to a saving of approximately
of computational operations. In our experiments using an efficient implementation of matrix multiplication, this leads to a speedup of 5.6× in computing the pairwise Euclidean distances and an overall speedup of up to 3.25× for DTW. |
doi_str_mv | 10.1007/s11265-012-0704-4 |
format | article |
publisher | Springer US |
fulltext | fulltext |
identifier | ISSN: 1939-8018 |
ispartof | Journal of signal processing systems, 2013-02, Vol.70 (2), p.219-234 |
issn | 1939-8018 (print); 1939-8115 (electronic) |
language | eng |
recordid | cdi_proquest_miscellaneous_1429872332 |
source | Springer Link |
subjects | Approximation; Circuits and Systems; Computation; Computer Imaging; Electrical Engineering; Engineering; Gaussian; Image Processing and Computer Vision; Mathematical analysis; Mathematical models; Multiplication; Pattern Recognition; Pattern Recognition and Graphics; Signal, Image and Speech Processing; Speech processing; Tasks; Vision |
title | Fast Likelihood Computation in Speech Recognition using Matrices |
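The DTW phase mentioned in the abstract rests on a standard identity: ‖x − y‖² = ‖x‖² + ‖y‖² − 2 x·y, so the cross terms for all frame pairs collapse into a single matrix multiplication. A minimal sketch (sizes and names are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, d = 300, 250, 13                 # frames of two utterances, feature dim
X = rng.standard_normal((n, d))        # frames of utterance 1
Y = rng.standard_normal((m, d))        # frames of utterance 2

# Naive: n*m independent distance computations over explicit differences.
naive = np.sqrt(((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2))

# Matrix form: ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y, so the cross term
# for every frame pair is one GEMM, X @ Y.T.
sq = (X ** 2).sum(axis=1)[:, None] + (Y ** 2).sum(axis=1)[None, :] - 2.0 * (X @ Y.T)
gemm = np.sqrt(np.maximum(sq, 0.0))    # clamp tiny negatives from round-off

assert np.allclose(naive, gemm)
```

The resulting `n × m` distance matrix is exactly what the DTW alignment step consumes; the speedup reported in the abstract comes from letting an optimized GEMM routine do the bulk of this work.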