Deep self-attention for sequential recommendation

Beichuan Zhang, Zhijiao Xiao, Shenghua Zhong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Sequential recommendation aims to recommend the next item a user is likely to interact with by capturing useful sequential patterns from the user's historical behaviors. Recently, it has become an important and popular component of various e-commerce platforms. As a successful architecture, the Transformer has been widely used to adaptively capture the dynamics of users' historical behaviors for sequential recommendation. In recommender systems, however, the embedding size is usually small, and under small embeddings the dot-product in the Transformer may be limited in modeling the complex relevance between keys and queries. To address this common but neglected issue, we present a new model, Deep Self-Attention for Sequential Recommendation (DSASrec), which introduces a chunking deep attention to compute attention weights. The chunking deep attention has two modules: a deep module and a chunking module. The deep module improves the nonlinearity of the attention function. The chunking module computes attention weights several times, analogous to multi-head attention in the Transformer. Extensive experiments on three benchmark datasets show that our model achieves state-of-the-art results. Our implementation is available in PyTorch.
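
To make the idea concrete, below is a minimal PyTorch sketch of a chunking deep attention layer as described in the abstract. It is an illustrative assumption, not the authors' released implementation: the class name ChunkingDeepAttention and the parameters n_chunks and hidden_dim are hypothetical. The deep module is realized here as a small MLP that scores each query-key pair in place of the dot-product, and the chunking module splits the embedding into chunks that are attended over independently, analogous to multi-head attention.

# Minimal sketch of a chunking deep attention layer, assuming the
# architecture described in the abstract. Names (ChunkingDeepAttention,
# n_chunks, hidden_dim) are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChunkingDeepAttention(nn.Module):
    def __init__(self, embed_dim: int, n_chunks: int = 2, hidden_dim: int = 32):
        super().__init__()
        assert embed_dim % n_chunks == 0
        self.n_chunks = n_chunks
        self.chunk_dim = embed_dim // n_chunks
        self.W_q = nn.Linear(embed_dim, embed_dim)
        self.W_k = nn.Linear(embed_dim, embed_dim)
        self.W_v = nn.Linear(embed_dim, embed_dim)
        # Deep module: a small MLP replaces the dot-product so that the
        # query-key relevance function is nonlinear.
        self.score_mlp = nn.Sequential(
            nn.Linear(2 * self.chunk_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        B, L, _ = x.shape
        # Chunking module: split queries/keys/values into n_chunks pieces,
        # each attended over independently (like heads in multi-head attention).
        q = self.W_q(x).view(B, L, self.n_chunks, self.chunk_dim).transpose(1, 2)
        k = self.W_k(x).view(B, L, self.n_chunks, self.chunk_dim).transpose(1, 2)
        v = self.W_v(x).view(B, L, self.n_chunks, self.chunk_dim).transpose(1, 2)
        # Score every (query, key) pair with the MLP:
        # shapes expand to (B, n_chunks, L, L, chunk_dim).
        q_exp = q.unsqueeze(3).expand(B, self.n_chunks, L, L, self.chunk_dim)
        k_exp = k.unsqueeze(2).expand(B, self.n_chunks, L, L, self.chunk_dim)
        scores = self.score_mlp(torch.cat([q_exp, k_exp], dim=-1)).squeeze(-1)
        # Causal mask so position i only attends to positions <= i,
        # as is standard in sequential recommendation.
        mask = torch.triu(torch.ones(L, L, dtype=torch.bool, device=x.device), 1)
        scores = scores.masked_fill(mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        out = torch.matmul(attn, v)  # (B, n_chunks, L, chunk_dim)
        return out.transpose(1, 2).reshape(B, L, -1)

Note that scoring every query-key pair with an MLP costs more than a dot-product (one MLP evaluation per pair instead of a single multiply-accumulate); under this reading, that is the price paid for a nonlinear relevance function when embeddings are small.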

Original language: English (US)
Title of host publication: Proceedings - SEKE 2021
Subtitle of host publication: 33rd International Conference on Software Engineering and Knowledge Engineering
Publisher: Knowledge Systems Institute Graduate School
Pages: 321-326
Number of pages: 6
ISBN (Electronic): 1891706527
State: Published - 2021
Event: 33rd International Conference on Software Engineering and Knowledge Engineering, SEKE 2021 - Pittsburgh, United States
Duration: Jul 1 2021 - Jul 10 2021

Publication series

Name: Proceedings of the International Conference on Software Engineering and Knowledge Engineering, SEKE
Volume: 2021-July
ISSN (Print): 2325-9000
ISSN (Electronic): 2325-9086

Conference

Conference: 33rd International Conference on Software Engineering and Knowledge Engineering, SEKE 2021
Country/Territory: United States
City: Pittsburgh
Period: 7/1/21 - 7/10/21

Keywords

  • Chunking representation
  • Deep learning
  • Dot-product
  • Recommender system
  • Transformer

ASJC Scopus subject areas

  • Software
