A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations
Published in: | Neural networks, 2023-01, Vol. 158, p. 331-343 |
---|---|
Main Authors: | Benfenati, Alessandro; Marta, Alessio |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Classification; Deep learning; Degenerate metrics; Image Processing, Computer-Assisted - methods; Neural networks; Neural Networks, Computer; Riemann geometry |
creator | Benfenati, Alessandro; Marta, Alessio |
description | Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis. The strategies employed to investigate their theoretical properties mainly rely on Euclidean geometry, but in recent years new approaches based on Riemannian geometry have been developed. Motivated by some open problems, we study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric. We investigate the structures induced through pullbacks on the other manifolds of the sequence and on some related quotients. In particular, we show that the pullback of the final Riemannian metric to any manifold of the sequence is a degenerate Riemannian metric inducing a structure of pseudometric space. We prove that the Kolmogorov quotient of this pseudometric space yields a smooth manifold, which is the base space of a particular vertical bundle. We investigate the theoretical properties of the maps of such a sequence; eventually, we focus on the case of maps between manifolds implementing neural networks of practical interest, and we present some applications of the geometric framework introduced in the first part of the paper.
•We consider Neural Networks as sequences of smooth maps between manifolds.•We investigate the structures induced through pull-backs on the manifolds.•The pull-back of the final Riemannian metric to any manifold is a degenerate metric.•The theoretical study leads to the construction of equivalence classes in the input manifold. |
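The degeneracy result stated in the abstract can be illustrated numerically. This sketch is not code from the paper, only a standard linear-algebra illustration: for a smooth map with Jacobian J into a manifold carrying a metric g, the pullback metric is Jᵀ g J, and it loses rank (becomes degenerate) whenever J does, e.g. for a layer mapping a higher-dimensional space to a lower-dimensional one.

```python
import numpy as np

def pullback_metric(jacobian, g):
    """Pullback of the metric g along a map with the given Jacobian: J^T g J."""
    return jacobian.T @ g @ jacobian

# Toy layer mapping R^3 -> R^2: the pullback of a 2x2 metric to R^3
# has rank at most 2, hence it is necessarily degenerate on R^3.
J = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
g = np.eye(2)                      # Euclidean metric on the output manifold

g_pulled = pullback_metric(J, g)   # 3x3, symmetric positive *semi*-definite
rank = np.linalg.matrix_rank(g_pulled)
print(rank)  # 2 < 3: the pullback metric is degenerate
```

Vectors in the kernel of g_pulled have zero pseudo-length, which is exactly what induces the pseudometric (rather than metric) structure on the input manifold and motivates the Kolmogorov quotient studied in the paper.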
doi_str_mv | 10.1016/j.neunet.2022.11.022 |
format | article |
publisher | Elsevier Ltd |
pmid | 36509003 |
orcidid | https://orcid.org/0000-0002-2985-374X |
identifier | ISSN: 0893-6080 |
ispartof | Neural networks, 2023-01, Vol.158, p.331-343 |
issn | 0893-6080; 1879-2782 (electronic) |
language | eng |
source | ScienceDirect Freedom Collection |
subjects | Algorithms; Classification; Deep learning; Degenerate metrics; Image Processing, Computer-Assisted - methods; Neural networks; Neural Networks, Computer; Riemann geometry |
title | A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations |