
Properties of Jeffreys Mixture for Markov Sources

We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet $\{0,1\}$ is not of the form $(n_{x|s}+\alpha)/(n_s+\beta)$, where $n_{x|s}$ is the number of occurrences of the symbol $x$ following the context $s \in \{0,1\}$ and $n_s = n_{0|s} + n_{1|s}$. Finally, we propose a method to compute our minimax strategy, which is a combination of a Monte Carlo method and the approximation formula, where the former is used for earlier stages in the data, while the latter is used for later stages.
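For context, the quantity contrasted in the abstract is the add-constant conditional predictor of the form $(n_{x|s}+\alpha)/(n_s+\beta)$; the paper's point is that the Jeffreys-mixture prediction for the binary Markov model does not reduce to this form. The sketch below is not taken from the paper; the function name, the Krichevsky-Trofimov-style choice $\alpha = 1/2$, $\beta = 1$, and the fixed initial context are illustrative assumptions. It only shows what such an add-constant predictor looks like for a first-order binary Markov source.

    from collections import defaultdict

    def add_constant_predictor(sequence, alpha=0.5, beta=1.0):
        # Sequential add-constant prediction for a binary first-order Markov source:
        # P(next = 1 | context s) = (n_{1|s} + alpha) / (n_s + beta),
        # i.e., the form that the paper shows the Jeffreys mixture does NOT take.
        counts = defaultdict(lambda: [0, 0])   # counts[s][x] = n_{x|s}
        probs = []
        prev = 0                               # assumed initial context (illustrative)
        for x in sequence:
            n_s = sum(counts[prev])
            probs.append((counts[prev][1] + alpha) / (n_s + beta))
            counts[prev][x] += 1               # update n_{x|s}, then shift the context
            prev = x
        return probs

    # Example: predictive probabilities of "1" along a short binary sequence.
    print(add_constant_predictor([0, 1, 1, 0, 1, 0, 0, 1]))

With $\alpha = 1/2$ and $\beta = 1$ this is the per-state Krichevsky-Trofimov estimator; the paper's approximation formula shows that the exact Jeffreys-mixture predictor for Markov sources is not of this fixed-$(\alpha, \beta)$ form.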

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2013-01, Vol. 59 (1), p. 438-457
Main Authors: Takeuchi, J., Kawabata, T., Barron, A. R.
Format: Article
Language: English
DOI: 10.1109/TIT.2012.2219171
ISSN: 0018-9448
EISSN: 1557-9654
Source: IEEE Electronic Library (IEL) Journals
Subjects: Applied sciences
Approximation
Approximation methods
Asymptotic properties
Bayes code
Coding, codes
Context
Data compression
Encoding
Exact sciences and technology
Information theory
Information, signal and communications theory
Jeffreys prior
Markov analysis
Markov models
Markov processes
Mathematical analysis
Mathematical models
Maximum likelihood estimation
minimax regret
Minimax technique
Monte Carlo simulation
Probability
Signal and communications theory
stochastic complexity
Strategy
Telecommunications and information theory
universal source coding
Upper bound
Vectors