Extensions and enhancements of decoupled extended Kalman filter training
We describe here three useful and practical extensions and enhancements of the decoupled extended Kalman filter (DEKF) neural network weight update procedure, which has served as the backbone for much of our applications-oriented research for the last six years. First, we provide a mechanism that constrains weight values to a pre-specified range during training to allow for fixed-point deployment of trained networks. Second, we examine modifications of DEKF training for alternative cost functions; as an example, we show how to use DEKF training to minimize a measure of relative entropy, rather than mean squared error, for pattern classification problems. Third, we describe an approximation of DEKF training that allows a multiple-output training problem to be treated with single-output training complexity.
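To illustrate the kind of procedure the abstract refers to (not the authors' exact formulation), here is a minimal single-output DEKF sketch in NumPy: the weights of a toy linear model are split into decoupled groups, each maintaining its own small covariance matrix instead of one full joint covariance, with the groups coupled only through a global scalar. All variable names and the linear model are illustrative assumptions.

```python
import numpy as np

def dekf_step(groups, P, x, target, R=1.0, Q=1e-6):
    """One decoupled EKF update for a linear model y = w . x.
    groups: list of 1-D weight arrays (updated in place).
    P: list of per-group covariance matrices (updated in place).
    R: assumed measurement-noise variance; Q: small process noise
    that keeps the covariances from collapsing to zero."""
    w = np.concatenate(groups)
    err = target - w @ x                     # innovation (scalar output)
    # Per-group derivative of the output w.r.t. that group's weights:
    H = np.split(x, np.cumsum([g.size for g in groups])[:-1])
    # Groups interact only through this global scalar (the "decoupling"):
    A = 1.0 / (R + sum(h @ Pi @ h for h, Pi in zip(H, P)))
    for g, Pi, h in zip(groups, P, H):
        K = Pi @ h * A                       # Kalman gain for this group
        g += K * err                         # weight update
        Pi -= np.outer(K, h @ Pi)            # covariance update
        Pi += Q * np.eye(Pi.shape[0])
    return err

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.5, 3.0])
groups = [np.zeros(2), np.zeros(2)]          # two decoupled weight groups
P = [100.0 * np.eye(2) for _ in groups]      # large initial uncertainty
for _ in range(200):
    x = rng.normal(size=4)
    dekf_step(groups, P, x, true_w @ x)
print("learned weights:", np.concatenate(groups))
```

Because the cross-group covariance blocks are discarded, each update costs only as much as the per-group problems, which is the trade-off that motivates decoupling in the first place.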
Main Authors: Puskorius, G.V.; Feldkamp, L.A.
Format: Conference Proceeding
Language: English
Subjects: Backpropagation; Cost function; Covariance matrix; Entropy; Equations; Laboratories; Neural networks; Pattern classification; Recurrent neural networks; Spine
Online Access: Request full text
field | value
---|---
container_start_page | 1879
container_end_page | 1883
container_volume | 3
creator | Puskorius, G.V.; Feldkamp, L.A.
description | We describe here three useful and practical extensions and enhancements of the decoupled extended Kalman filter (DEKF) neural network weight update procedure, which has served as the backbone for much of our applications-oriented research for the last six years. First, we provide a mechanism that constrains weight values to a pre-specified range during training to allow for fixed-point deployment of trained networks. Second, we examine modifications of DEKF training for alternative cost functions; as an example, we show how to use DEKF training to minimize a measure of relative entropy, rather than mean squared error, for pattern classification problems. Third, we describe an approximation of DEKF training that allows a multiple-output training problem to be treated with single-output training complexity.
doi | 10.1109/ICNN.1997.614185
format | conference_proceeding
fulltext | fulltext_linktorsrc
identifier | ISBN: 0780341228; ISBN: 9780780341227
ispartof | Proceedings of International Conference on Neural Networks (ICNN'97), 1997, Vol.3, pp. 1879-1883
language | eng
recordid | cdi_ieee_primary_614185
source | IEEE Electronic Library (IEL) Conference Proceedings
subjects | Backpropagation; Cost function; Covariance matrix; Entropy; Equations; Laboratories; Neural networks; Pattern classification; Recurrent neural networks; Spine
title | Extensions and enhancements of decoupled extended Kalman filter training
url | https://ieeexplore.ieee.org/document/614185