Conditional leaving-one-out and cross-validation for discount estimation in Kneser-Ney-like extensions
The smoothing of n-gram models is a core technique in language modelling (LM). Modified Kneser-Ney (mKN) ranks among the best smoothing techniques. It discounts a fixed quantity from the observed counts in order to approximate the Turing-Good (TG) counts. Although the TG counts optimise the leaving-one-out (L1O) criterion, the discounting parameters introduced in mKN do not. Moreover, the approximation to the TG counts for large counts is heavily simplified. In this work, both ideas are addressed: the estimation of the discounting parameters by L1O and better functional forms to approximate larger TG counts. The L1O performance is compared with cross-validation (CV) and the mKN baseline on two large-vocabulary tasks.
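For context on the discounting the abstract refers to: the Turing-Good estimate replaces an observed count r with r* = (r + 1) · n_{r+1} / n_r, where n_r is the number of distinct n-grams seen exactly r times, and mKN approximates the removed mass with fixed discounts. The sketch below computes TG counts and the standard closed-form mKN discounts (Chen & Goodman); it is an illustration only, not the L1O/CV estimation proposed in the paper, and the count-of-counts values are invented.

```python
from collections import Counter

# Hypothetical count-of-counts table: n[r] = number of distinct n-grams
# observed exactly r times in some training corpus (values invented here).
n = Counter({1: 120_000, 2: 40_000, 3: 21_000, 4: 13_500})

def turing_good(r, n):
    """Turing-Good adjusted count: r* = (r + 1) * n_{r+1} / n_r."""
    return (r + 1) * n[r + 1] / n[r] if n[r] else float(r)

def mkn_discounts(n):
    """Closed-form modified Kneser-Ney discounts (Chen & Goodman):
    Y = n_1 / (n_1 + 2 n_2),  D_r = r - (r + 1) * Y * n_{r+1} / n_r
    for r = 1, 2, 3+. These are the fixed discounts the paper's
    baseline uses; they are not optimised under the L1O criterion."""
    y = n[1] / (n[1] + 2 * n[2])
    return tuple(r - (r + 1) * y * n[r + 1] / n[r] for r in (1, 2, 3))

for r in (1, 2, 3):
    r_star = turing_good(r, n)
    # mKN replaces the count-dependent mass r - r* with a fixed discount.
    print(f"r={r}: TG count r*={r_star:.3f}, TG discount r-r*={r - r_star:.3f}")

d1, d2, d3 = mkn_discounts(n)
print(f"mKN discounts: D1={d1:.3f}, D2={d2:.3f}, D3+={d3:.3f}")
```

The gap between the fixed discounts and the count-dependent TG discounts, visible in the printed values, is exactly what motivates the paper: estimating the discounting parameters by L1O or CV rather than taking the closed form above.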
Main Authors: | Andres-Ferrer, J.; Sundermeyer, M.; Ney, H. |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Approximation methods; Computational modeling; Cross Validation; Estimation; Language Modelling; Leaving-One-Out; modified Kneser-Ney smoothing; Optimization; Smoothing methods; Training |
Online Access: | Request full text |
container_end_page | 5016 |
---|---|
container_start_page | 5013 |
container_title | 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) |
creator | Andres-Ferrer, J.; Sundermeyer, M.; Ney, H. |
description | The smoothing of n-gram models is a core technique in language modelling (LM). Modified Kneser-Ney (mKN) ranks among the best smoothing techniques. It discounts a fixed quantity from the observed counts in order to approximate the Turing-Good (TG) counts. Although the TG counts optimise the leaving-one-out (L1O) criterion, the discounting parameters introduced in mKN do not. Moreover, the approximation to the TG counts for large counts is heavily simplified. In this work, both ideas are addressed: the estimation of the discounting parameters by L1O and better functional forms to approximate larger TG counts. The L1O performance is compared with cross-validation (CV) and the mKN baseline on two large-vocabulary tasks. |
doi_str_mv | 10.1109/ICASSP.2012.6289046 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1520-6149; EISSN: 2379-190X; ISBN: 1467300454; ISBN: 9781467300452; EISBN: 1467300446; EISBN: 1467300462; EISBN: 9781467300445; EISBN: 9781467300469; DOI: 10.1109/ICASSP.2012.6289046 |
ispartof | 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2012, p.5013-5016 |
issn | 1520-6149; 2379-190X |
language | eng |
recordid | cdi_ieee_primary_6289046 |
source | IEEE Xplore All Conference Series |
subjects | Approximation methods; Computational modeling; Cross Validation; Estimation; Language Modelling; Leaving-One-Out; modified Kneser-Ney smoothing; Optimization; Smoothing methods; Training |
title | Conditional leaving-one-out and cross-validation for discount estimation in Kneser-Ney-like extensions |