TY - JOUR
T1 - IntentRec
T2 - Incorporating latent user intent via contrastive alignment for sequential recommendation
AU - Hwang, Seonjin
AU - Lee, Younghoon
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/9/1
Y1 - 2025/9/1
N2 - Predicting the next item a user will interact with is a core task in sequential recommendation (SR). Traditional approaches predominantly focus on modeling patterns in item purchase sequences, yet often fall short in uncovering the underlying motivations behind user behavior. To overcome this limitation, we introduce IntentRec, a novel SR framework designed to incorporate latent user intent signals extracted from user-written reviews. Unlike conventional models that treat item sequences in isolation, IntentRec bridges the semantic gap between review content and behavioral data by aligning their representations in a shared embedding space through contrastive learning. Review sequences, chronologically ordered texts reflecting users’ thoughts, serve as a rich source of intent, which is fused into the item sequence representation during training. To ensure practicality in real-time recommendation scenarios, our method excludes review inputs at inference time, acknowledging that reviews naturally occur after item interactions. IntentRec employs BERT, a pre-trained language model, to extract nuanced user intent from textual reviews, and introduces a cross-attention-enhanced contrastive loss to tightly couple review-derived signals with item-based preferences. Extensive experiments conducted on four widely used SR benchmarks demonstrate that IntentRec consistently outperforms eight state-of-the-art baselines. Further ablation studies confirm the crucial role of review-based user intent in improving sequential recommendation accuracy.
AB - Predicting the next item a user will interact with is a core task in sequential recommendation (SR). Traditional approaches predominantly focus on modeling patterns in item purchase sequences, yet often fall short in uncovering the underlying motivations behind user behavior. To overcome this limitation, we introduce IntentRec, a novel SR framework designed to incorporate latent user intent signals extracted from user-written reviews. Unlike conventional models that treat item sequences in isolation, IntentRec bridges the semantic gap between review content and behavioral data by aligning their representations in a shared embedding space through contrastive learning. Review sequences, chronologically ordered texts reflecting users’ thoughts, serve as a rich source of intent, which is fused into the item sequence representation during training. To ensure practicality in real-time recommendation scenarios, our method excludes review inputs at inference time, acknowledging that reviews naturally occur after item interactions. IntentRec employs BERT, a pre-trained language model, to extract nuanced user intent from textual reviews, and introduces a cross-attention-enhanced contrastive loss to tightly couple review-derived signals with item-based preferences. Extensive experiments conducted on four widely used SR benchmarks demonstrate that IntentRec consistently outperforms eight state-of-the-art baselines. Further ablation studies confirm the crucial role of review-based user intent in improving sequential recommendation accuracy.
KW - Contrastive learning
KW - Language model
KW - Review text
KW - Review-based recommendation
KW - Sequential recommendation
UR - https://www.scopus.com/pages/publications/105008924573
U2 - 10.1016/j.elerap.2025.101522
DO - 10.1016/j.elerap.2025.101522
M3 - Article
AN - SCOPUS:105008924573
SN - 1567-4223
VL - 73
JO - Electronic Commerce Research and Applications
JF - Electronic Commerce Research and Applications
M1 - 101522
ER -