
FBN: Federated Bert Network with client-server architecture for cross-lingual signature verification

Bibliographic Details
Published in: Pattern Recognition, 2023-10, Vol. 142, p. 109681, Article 109681
Main Authors: Xie, Liyang; Wu, Zhongcheng; Zhang, Xian; Li, Yong
Format: Article
Language: English
Subjects: Client-server architecture; Federated Average Algorithm with Reward-Punishment Mechanism; Federated Bert Network; Length Alignment Algorithm; Online signature verification
ISSN: 0031-3203
EISSN: 1873-5142
DOI: 10.1016/j.patcog.2023.109681
Source: ScienceDirect Freedom Collection
Description

Highlights:
• Bert is embedded into a federated learning framework with client-server (C-S) architecture.
• The Federated Bert Network integrates an improved local-model aggregation algorithm.
• A Length Alignment Algorithm adapts multi-attribute time series for batch computation.
• Our model is superior in cross-lingual verification and the protection of private data.

Online signature verification remains a great challenge because deep learning techniques perform poorly on cross-lingual datasets under privacy constraints. In this paper, we propose a novel Federated Bert Network (FBN) by embedding Bidirectional Encoder Representations from Transformers (Bert) into a Federated Learning (FL) framework with a client-server architecture. A new Length Alignment Algorithm is employed to unify the sequence length of signature pairs, and the input representations are fed into the different clients to complete the independent learning of local models. In addition, the server (coordinator) uses the improved Federated Average Algorithm with Reward-Punishment Mechanism (FedAvgRP) to aggregate these local models and generate a global model. After multiple iterations, the optimal model is obtained and cross-tested on four datasets (SVC 2004, MCYT-330, BiosecurID, and Ours), achieving skilled-forgery (random-forgery) EERs of 7.65% (4.76%), 10.73% (8.46%), 10.09% (7.13%), and 8.28% (5.74%), respectively, a substantial improvement over the independent learning of state-of-the-art methods. Compared with domain-adaptation and improved FL models, our FBN model performs best in both random- and skilled-forgery scenarios. Moreover, the FedAvgRP algorithm helps our model maintain high performance in the face of data attacks.
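The Length Alignment Algorithm itself is not reproduced in this record, so the following is only a minimal sketch of the general idea: resampling each multi-attribute signature time series (e.g. x, y, and pen-pressure channels) to a common length so that signature pairs can be batched. The function names, the linear-interpolation strategy, and the target length of 256 are illustrative assumptions, not the authors' implementation.

import numpy as np

def align_length(signature, target_len=256):
    """signature: float array of shape (T, C) -- T time steps, C attributes
    such as x, y and pen pressure. Returns an array of shape (target_len, C)."""
    T, C = signature.shape
    src = np.linspace(0.0, 1.0, num=T)
    dst = np.linspace(0.0, 1.0, num=target_len)
    # Resample every attribute channel onto the common time grid.
    return np.stack([np.interp(dst, src, signature[:, c]) for c in range(C)], axis=1)

def align_pair(reference, query, target_len=256):
    """Bring both signatures of a verification pair to the same length so the
    pair can be packed into one fixed-size input representation."""
    return align_length(reference, target_len), align_length(query, target_len)

Zero-padding with an attention mask would be an equally plausible choice; the key point is that every pair ends up with the same fixed-size input representation before it is fed to the Bert encoder on each client.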