
A Hierarchical Network with User Memory Matrix for Long Sequence Recommendation

Bibliographic Details
Published in: Wireless Communications and Mobile Computing, 2022-01, Vol. 2022, p. 1-12
Main Authors: Dong, Jiawei, Sun, Fuzhen, Wu, Tianhui, Wu, Xiangshuai, Zhang, Wenlong, Wang, Shaoqing
Format: Article
Language:English
Summary: In many recommendation scenarios, the interactions between users and items are divided into a series of sessions according to the time interval. Traditional Recurrent Neural Networks have shortcomings such as limited memory capacity, inflexible access to stored memory, and clearly insufficient feature capture for long sequences. To address these issues, we propose a hierarchical network with a user memory matrix, named HNUM2, which uses a memory network to store users' long-term and short-term interests. The memory network offers more flexible access to stored data, which resolves the insufficient capture of long-sequence features. The proposed model is a hierarchical recommendation algorithm consisting of two layers. The first layer is a session-level GRU model, which captures the sequential characteristics of the current session to predict the next item. The second layer is a user-level memory network that exploits an attention mechanism and incorporates a write module and a read module. Experimental results on two publicly available datasets show that HNUM2 achieves significant performance improvements compared with state-of-the-art methods.
ISSN: 1530-8669, 1530-8677
DOI: 10.1155/2022/5457044
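The user-level memory layer described in the abstract (an attention-weighted read plus a write module over a per-user memory matrix) can be sketched roughly as follows. This is a minimal NumPy illustration under our own assumptions; the class and parameter names (`UserMemory`, `slots`, `erase`, `add`) are hypothetical and the paper's actual parameterization is not reproduced here:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D vector
    e = np.exp(x - x.max())
    return e / e.sum()

class UserMemory:
    """Hypothetical sketch of a user memory matrix: `slots` rows of
    width `dim`, read via attention over slots, written with an
    erase/add update keyed by the same attention weights."""

    def __init__(self, slots, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.M = rng.normal(scale=0.1, size=(slots, dim))

    def read(self, q):
        # attention weights over memory slots, then a weighted sum of rows
        w = softmax(self.M @ q)
        return w @ self.M

    def write(self, q, erase, add):
        # erase a fraction of each attended slot, then add new content
        w = softmax(self.M @ q)
        self.M = self.M * (1.0 - np.outer(w, erase)) + np.outer(w, add)

# Example: a session-level query vector (in the paper this would come
# from the session GRU's hidden state) reads the user's interests.
mem = UserMemory(slots=4, dim=8)
q = np.ones(8)
interest = mem.read(q)          # shape (8,)
mem.write(q, erase=np.full(8, 0.5), add=np.ones(8))
```

In a full model, the read vector would be combined with the GRU's session representation to score candidate items, and the write step would run after each session to refresh the user's long-term interest store.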