The use of information and information gain in the analysis of attribute dependencies

This paper demonstrates the possible conclusions that can be drawn from an analysis of entropy and information. Because of its universality, entropy can be applied widely across disciplines, especially in biomedicine. Based on simulated data, the similarities and differences between the grouping of attributes and the testing of their independence are shown; it follows that a complete exploration of a data set requires both of these elements. A new concept introduced in the paper is the normed information gain, which allows any logarithm base to be used in the definition of entropy.
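
The record gives only the abstract, not the paper's formulas, so the following is a minimal Python sketch of the quantities mentioned there: Shannon entropy with an arbitrary logarithm base, information gain, and one plausible "normed" gain obtained by dividing by the target attribute's entropy. The function names, the toy data, and the choice of normalization are assumptions made for illustration, not the authors' definitions.

```python
import math
from collections import Counter

def entropy(values, base=2.0):
    """Shannon entropy of a discrete attribute, for an arbitrary logarithm base."""
    n = len(values)
    return -sum((c / n) * math.log(c / n, base) for c in Counter(values).values())

def information_gain(target, attribute, base=2.0):
    """Reduction in the entropy of `target` obtained by splitting on `attribute`."""
    n = len(target)
    groups = {}
    for a, t in zip(attribute, target):
        groups.setdefault(a, []).append(t)
    h_conditional = sum(len(g) / n * entropy(g, base) for g in groups.values())
    return entropy(target, base) - h_conditional

def normed_information_gain(target, attribute, base=2.0):
    # Normalizing by the entropy of the target is an illustrative assumption;
    # the record does not state the paper's exact definition. Dividing one
    # entropy-based quantity by another makes the result base-independent.
    h = entropy(target, base)
    return information_gain(target, attribute, base) / h if h > 0 else 0.0

# Toy simulated binary attributes
target = [0, 0, 1, 1, 1, 0, 1, 0]
attr   = [0, 0, 1, 1, 1, 1, 1, 0]
print(normed_information_gain(target, attr, base=2))       # ~0.55
print(normed_information_gain(target, attr, base=math.e))  # same value, base-free
```

Because the gain and the normalizing entropy scale by the same factor when the logarithm base changes, the normed value is identical for base 2, e, or 10, which appears to be the property the abstract alludes to.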


Bibliographic Details
Published in: Biometrical Letters, 2012-12, Vol. 49 (2), p. 149-158
Main Authors: Moliński, Krzysztof; Dobek, Anita; Tomaszyk, Kamila
Format: Article
Language: English
Publisher: De Gruyter Poland, Poznań
ISSN: 1896-3811
EISSN: 2199-577X
DOI: 10.2478/bile-2013-0011
Online Access: https://www.proquest.com/docview/3155454225 (Publicly Available Content Database, free to read)