
A Trust Inference Method Employing Combinatorial Strategies

Bibliographic Details
Published in: Wireless Communications and Mobile Computing, 2023-02, Vol. 2023, p. 1-10
Main Authors: Xiao, Ruili; Tong, Xiangrong
Format: Article
Language:English
Description
Summary: Most trust inference algorithms based on Q-learning and learning automata suffer from insufficient storage space, and algorithms that rely on the shortest trust path do not fully utilize the available trust information. To address these problems, a trust inference algorithm based on deep reinforcement learning, DuelingDQNTrust, is proposed. It uses Dueling DQN to compute Q values, which reduces the required storage space, and it does not limit the traversal depth. Because the way a user's trusted neighbors are identified affects the reliability of the trust path, three combination strategies are proposed to find trusted neighbors more accurately. The first strategy defines a priority metric that combines node importance and user similarity and selects the user's top k neighbor users as the trusted neighbor set. The other two strategies consider node importance and user similarity in different orders of priority: each first uses one metric to select the top k neighbor users, then filters these k users with the other metric to determine the trusted neighbor set. Evaluation on the FilmTrust dataset indicates that the algorithm outperforms several existing trust inference algorithms.
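The three neighbor-selection strategies summarized above can be illustrated with a short sketch. The Python below is a hypothetical illustration only, not the authors' implementation: the importance and similarity dictionaries stand in for whatever node-importance and user-similarity metrics the paper defines, and alpha, k, and m are assumed parameters. (The Dueling DQN component itself uses the standard dueling decomposition Q(s, a) = V(s) + A(s, a) - mean over a' of A(s, a'), which is what lets it estimate Q values without a full Q table.)

    from typing import Callable, Dict, List

    def top_k(users: List[str], score: Callable[[str], float], k: int) -> List[str]:
        """Return the k users ranked highest under the given metric."""
        return sorted(users, key=score, reverse=True)[:k]

    def combined_strategy(neighbors: List[str],
                          importance: Dict[str, float],
                          similarity: Dict[str, float],
                          k: int,
                          alpha: float = 0.5) -> List[str]:
        """Strategy 1 (sketch): rank neighbors by a single priority metric
        that blends node importance and user similarity, then keep the top k.
        alpha is an assumed blending weight."""
        priority = lambda u: alpha * importance[u] + (1 - alpha) * similarity[u]
        return top_k(neighbors, priority, k)

    def sequential_strategy(neighbors: List[str],
                            first_metric: Dict[str, float],
                            second_metric: Dict[str, float],
                            k: int,
                            m: int) -> List[str]:
        """Strategies 2 and 3 (sketch): pre-select the top k neighbors on one
        metric, then filter those k down to m on the other metric. Swapping
        which metric comes first gives the two different priority orders."""
        shortlist = top_k(neighbors, lambda u: first_metric[u], k)
        return top_k(shortlist, lambda u: second_metric[u], m)

    # Toy scores for illustration only.
    neighbors = ["u1", "u2", "u3", "u4"]
    importance = {"u1": 0.9, "u2": 0.4, "u3": 0.7, "u4": 0.2}
    similarity = {"u1": 0.3, "u2": 0.8, "u3": 0.6, "u4": 0.5}

    print(combined_strategy(neighbors, importance, similarity, k=2))
    print(sequential_strategy(neighbors, importance, similarity, k=3, m=2))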
ISSN: 1530-8669, 1530-8677
DOI: 10.1155/2023/2929449