Title | Sequence recommendation using multi-level self-attention network with gated spiking neural P systems |
Publication Type | Journal Papers |
Year of Publication | 2023 |
Authors | Bai, X., Huang, Y., Peng, H., Wang, J., Yang, Q., Orellana-Martín, D., Ramírez-de-Arellano, A., & Pérez-Jiménez, M. J. |
Journal Title | Information Sciences |
Publisher | Elsevier |
Place Published | Amsterdam (The Netherlands) |
Pages | Article 119916 |
Abstract | Sequence recommendation predicts the next items and behaviors likely to interest a user. It focuses not only on the user's independent interaction behaviors but also on the user's historical behavior sequence. However, sequence recommendation still faces challenges: existing models fall short in capturing long-term dependencies and in fully exploiting contextual information. To address these challenges, we propose a four-channel model based on a multi-level self-attention network with gated spiking neural P (GSNP) systems, termed the SR-MAG model. The four channels are divided into two groups, each composed of an attention channel and a GSNP attention channel. The two groups process long-term and short-term sequences respectively, yielding long-term or short-term attention-channel features. These features are then passed through a self-attention network to effectively extract user context information. The proposed SR-MAG model is evaluated on three real-world datasets and compared with 10 baseline methods. Experimental results demonstrate the effectiveness of the proposed SR-MAG model in sequence recommendation tasks. |
Keywords | Gated spiking neural P systems, Nonlinear spiking neural P systems, Self-attention mechanism, Sequence recommendation |
URL | https://www.sciencedirect.com/science/article/pii/S0020025523015013 |
ISSN Number | 0020-0255 |
DOI | https://doi.org/10.1016/j.ins.2023.119916 |