Independent Factor Analysis

We introduce the independent factor analysis (IFA) method for recovering independent hidden sources from their observed mixtures. IFA generalizes and unifies ordinary factor analysis (FA), principal component analysis (PCA), and independent component analysis (ICA), and can handle not only square noiseless mixing but also the general case where the number of mixtures differs from the number of sources and the data are noisy. IFA is a two-step procedure. In the first step, the source densities, mixing matrix, and noise covariance are estimated from the observed data by maximum likelihood. For this purpose we present an expectation-maximization (EM) algorithm, which performs unsupervised learning of an associated probabilistic model of the mixing situation. Each source in our model is described by a mixture of gaussians; thus, all the probabilistic calculations can be performed analytically. In the second step, the sources are reconstructed from the observed data by an optimal nonlinear estimator. A variational approximation of this algorithm is derived for cases with a large number of sources, where the exact algorithm becomes intractable. Our IFA algorithm reduces to the one for ordinary FA when the sources become gaussian, and to an EM algorithm for PCA in the zero-noise limit. We derive an additional EM algorithm specifically for noiseless IFA. This algorithm is shown to be superior to ICA since it can learn arbitrary source densities from the data. Beyond blind separation, IFA can be used for modeling multidimensional data by a highly constrained mixture of gaussians and as a tool for nonlinear signal encoding.
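The abstract describes a generative model in which each hidden source is drawn from a one-dimensional mixture of gaussians and the observations are a noisy linear mixture of the sources. A minimal sketch of sampling from such a model is shown below; the mixture parameters, the mixing matrix `A`, and the noise level are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sources(n_samples, mog_params):
    """Draw i.i.d. samples of independent sources, each modeled as a
    1-D mixture of gaussians given as (weights, means, std devs)."""
    sources = []
    for w, mu, sigma in mog_params:
        # pick a mixture component for each sample, then draw from it
        comp = rng.choice(len(w), size=n_samples, p=w)
        sources.append(rng.normal(np.asarray(mu)[comp], np.asarray(sigma)[comp]))
    return np.stack(sources, axis=1)           # shape (n_samples, n_sources)

def mix(sources, A, noise_std):
    """Noisy linear mixing: x = A s + eps, with isotropic gaussian noise."""
    noise = rng.normal(0.0, noise_std, size=(sources.shape[0], A.shape[0]))
    return sources @ A.T + noise               # shape (n_samples, n_mixtures)

# two non-gaussian sources observed through three noisy mixtures
mog_params = [
    ([0.5, 0.5], [-2.0, 2.0], [0.5, 0.5]),     # bimodal source
    ([0.8, 0.2], [0.0, 3.0], [1.0, 0.3]),      # skewed source
]
A = np.array([[1.0, 0.5],
              [0.3, 1.0],
              [0.8, 0.8]])                     # 3 x 2 mixing matrix
s = sample_sources(10_000, mog_params)
x = mix(s, A, noise_std=0.1)
```

Fitting this model (the paper's first step) would estimate the mixture parameters, `A`, and the noise covariance from `x` alone by EM; the sketch above only illustrates the assumed data-generating process.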

Bibliographic Details
Published in: Neural computation, 1999-05, Vol. 11 (4), p. 803-851
Main Author: Attias, H.
Format: Article
Language:English
DOI: 10.1162/089976699300016458
Publisher: MIT Press (One Rogers Street, Cambridge, MA 02142-1209, USA)
PMID: 10226184
ISSN: 0899-7667
EISSN: 1530-888X
Source: MIT Press Journals
Subjects: Algorithms; Computer Simulation; Exact sciences and technology; Factor Analysis, Statistical; Learning; Likelihood Functions; Mathematics; Models, Statistical; Multivariate analysis; Normal Distribution; Probability; Probability and statistics; Sciences and techniques of general use; Statistics