
TPCNet: Representation learning for H i mapping

We introduce TPCNet, a neural network predictor that combines Convolutional and Transformer architectures with Positional encodings for neutral atomic hydrogen (H i) spectral analysis. Trained on synthetic datasets, our models predict the cold neutral gas fraction ($f_\mathrm{CNM}$) and the H i opacity correction factor ($\mathcal{R}_{\mathrm{H{\small I}}}$) from emission spectra, based on the learned relationships between the desired output parameters and the observables (optically thin column density and peak brightness). As a follow-up to the shallow Convolutional Neural Network (CNN) of Murray et al. (2020), we construct deep CNN models and compare them with TPCNet models. TPCNet outperforms the deep CNNs, achieving a 10 per cent average increase in testing accuracy, algorithmic (training) stability, and convergence speed. Our findings highlight the robustness of the proposed model with sinusoidal positional encoding applied directly to the spectral input, which addresses perturbations from training dataset shuffling and convolutional network weight initializations. Higher spectral resolutions with more spectral channels offer advantages, albeit with increased training time. Diverse synthetic datasets enhance model performance and generalization, as demonstrated by $f_\mathrm{CNM}$ and $\mathcal{R}_{\mathrm{H{\small I}}}$ values consistent with the evaluation ground truths. Applications of TPCNet to observed emission data reveal strong agreement between its predictions and Gaussian decomposition-based estimates (from emission and absorption surveys), emphasizing its potential for H i spectral analysis.

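The kind of hybrid architecture the abstract describes (a convolutional front end feeding a Transformer encoder, with sinusoidal positional encoding injected along the spectral channels, regressing two quantities from an emission spectrum) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the layer sizes, channel count, two-target head, class and function names, and the placement of the positional encoding (added here to the convolutional embedding rather than the raw spectrum) are all assumptions chosen for readability.

```python
# Illustrative sketch only: a hypothetical TPCNet-like CNN + Transformer regressor
# for 1-D H I emission spectra. Hyperparameters and names are assumptions.
import math
import torch
import torch.nn as nn


def sinusoidal_encoding(n_channels: int, d_model: int) -> torch.Tensor:
    """Fixed sinusoidal positional encoding, shape (n_channels, d_model)."""
    position = torch.arange(n_channels, dtype=torch.float32).unsqueeze(1)
    div_term = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float32)
                         * (-math.log(10000.0) / d_model))
    pe = torch.zeros(n_channels, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe


class TPCNetSketch(nn.Module):
    """Hypothetical TPCNet-like regressor: Conv1d features + Transformer encoder,
    mapping an H I emission spectrum to two targets (f_CNM, R_HI)."""

    def __init__(self, n_channels: int = 256, d_model: int = 64, n_targets: int = 2):
        super().__init__()
        self.embed = nn.Conv1d(1, d_model, kernel_size=5, padding=2)  # local spectral features
        self.register_buffer("pos", sinusoidal_encoding(n_channels, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(n_channels * d_model, n_targets))

    def forward(self, spectrum: torch.Tensor) -> torch.Tensor:
        # spectrum: (batch, n_channels) brightness temperature per velocity channel
        x = self.embed(spectrum.unsqueeze(1)).transpose(1, 2)  # (batch, n_channels, d_model)
        x = x + self.pos                                       # inject channel positions
        x = self.encoder(x)
        return self.head(x)                                    # (batch, n_targets)


# Example: a batch of 8 synthetic spectra with 256 velocity channels.
model = TPCNetSketch()
print(model(torch.randn(8, 256)).shape)  # torch.Size([8, 2])
```

A model of this kind would be trained with a standard regression loss (e.g. mean squared error) against synthetic $f_\mathrm{CNM}$ and $\mathcal{R}_{\mathrm{H{\small I}}}$ labels; the printed shape confirms two outputs per input spectrum.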

Bibliographic Details
Published in: Monthly Notices of the Royal Astronomical Society, 2024-12
Main Authors: Nguyen, Hiep; Tang, Haiyang; Alger, Matthew; Marchal, Antoine; Muller, Eric G. M.; Ong, Cheng Soon; McClure-Griffiths, N. M.
Format: Article
Language: English
DOI: 10.1093/mnras/stae2631
ISSN: 0035-8711
EISSN: 1365-2966
Source: Oxford Journals Open Access Collection; EZB Electronic Journals Library