
Chaotic Convergence of Newton's Method

In 1680 Newton proposed an algorithm for finding the roots of polynomials. His method has since evolved, but the core concept remains intact. The convergence of Newton's method has been widely characterized as unstable, or even chaotic. Here we briefly review this evolution and consider the question of stable convergence. Newton's method may be applied to any complex analytic function, such as a polynomial. Its derivation is based on a Taylor series expansion in the Laplace frequency s = \sigma + \jmath\omega. The convergence of Newton's method depends on the Region of Convergence (RoC). Under certain conditions, nonlinear (NL) limit-cycles appear, reducing the rate of convergence to a root. Since Newton's method is inherently complex analytic (that is, linear and convergent), it is important to establish the source of this NL divergence, which we show is due to violations of the Nyquist sampling theorem, also known as aliasing. Here the conditions and method for uniform convergence are explored, and the source of the nonlinear limit-cycle is explained in terms of aliasing. We numerically demonstrate that reducing the step-size always results in a more stable convergence; the downside is that it also results in sub-optimal (slower) convergence. It follows that a dynamic step-size would be ideal: slowly increase the step-size until convergence fails, then decrease it in small steps until the iteration converges again. Finding the optimal step-size in this way is a reasonable solution.
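
The abstract's step-size discussion maps naturally onto a damped Newton iteration. The sketch below is not from the paper; the function names, the grow/shrink factors, and the residual-based acceptance rule are illustrative assumptions. It shows the classical update z ← z − h·f(z)/f′(z) with a fixed step-size h, plus a simple dynamic schedule that grows h while the residual keeps shrinking and backs off when a step fails, in the spirit of the strategy the abstract proposes.

```python
def damped_newton(f, df, z0, h=1.0, tol=1e-12, max_iter=200):
    """Newton iteration z <- z - h*f(z)/df(z) with a fixed step-size h.

    h = 1 is the classical method; h < 1 is more stable but converges
    more slowly (the sub-optimal trade-off noted in the abstract).
    """
    z = complex(z0)
    for _ in range(max_iter):
        step = f(z) / df(z)
        if abs(step) < tol:
            break
        z -= h * step
    return z


def adaptive_newton(f, df, z0, h0=0.1, grow=1.5, shrink=0.5,
                    tol=1e-12, max_iter=500):
    """Dynamic step-size sketch: grow h while the residual |f(z)| keeps
    decreasing, shrink it when a trial step makes things worse.
    The grow/shrink factors are illustrative choices, not from the paper.
    """
    z, h = complex(z0), h0
    resid = abs(f(z))
    for _ in range(max_iter):
        step = f(z) / df(z)
        if abs(step) < tol:
            break
        trial = z - h * step
        if abs(f(trial)) < resid:       # progress: accept, enlarge step
            z, resid = trial, abs(f(trial))
            h = min(1.0, h * grow)
        else:                           # failure: retry with a smaller step
            h *= shrink
    return z


# Example: p(s) = s^3 - 1, whose full-step Newton map produces the
# classic chaotic basin boundaries (the Newton fractal).
p, dp = (lambda s: s**3 - 1), (lambda s: 3 * s**2)
print(damped_newton(p, dp, 1 + 1j))           # full step, fast near a root
print(damped_newton(p, dp, 1 + 1j, h=0.5))    # damped: slower, more stable
print(adaptive_newton(p, dp, 1 + 1j))         # dynamic step-size schedule
```

With h = 1 the basin boundaries form the familiar Newton fractal; damping smooths the trajectories at the cost of extra iterations, which is the trade-off the paper demonstrates numerically.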

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, 2024, Vol. 72, p. 1683-1690
Main Author: Allen, Jont B.
Format: Article
Language: English
Subjects: Algorithms; Aliasing; Analytic functions; analytic-roots; Convergence; Divergence; Evolution; Fractals; Limit-cycles; Newton method; Newton methods; Nyquist-sampling; Poles and zeros; Polynomials; regions of convergence (RoC); Series expansion; Taylor series; Trajectory
Online Access: https://doi.org/10.1109/TSP.2024.3376088
DOI: 10.1109/TSP.2024.3376088
ISSN: 1053-587X
EISSN: 1941-0476
Source: IEEE Electronic Library (IEL) Journals