

STAMP: Short-Term Attention/Memory Priority Model for Session-based Recommendation
Course URL: http://videolectures.net/kdd2018_zeng_stamp/
Lecturer: Yifu Zeng
Institution: University of Science and Technology of China
Date: 2018-11-23
Language: English
Course description: Predicting users' actions based on anonymous sessions is a challenging problem in web-based behavioral modeling research, mainly due to the uncertainty of user behavior and the limited information available. Recent advances in recurrent neural networks have led to promising approaches to this problem, with the long short-term memory (LSTM) model proving effective at capturing users' general interests from their previous clicks. However, none of the existing approaches explicitly takes into account the effect of a user's current actions on their next moves. In this study, we argue that a long-term memory model may be insufficient for modeling long sessions, which usually contain user-interest drift caused by unintended clicks. As a remedy, a novel short-term attention/memory priority model is proposed, which captures a user's general interests from the long-term memory of the session context while taking into account the user's current interests from the short-term memory of the last clicks. The validity and efficacy of the proposed attention mechanism are extensively evaluated on three benchmark datasets from the RecSys Challenge 2015 and the CIKM Cup 2016. The numerical results show that our model achieves state-of-the-art performance in all the tests.
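The core idea in the description above is to weight a session's long-term memory (all clicked items) by its short-term memory (the last click). A minimal illustrative sketch of this idea using plain dot-product attention is shown below; note that it is an assumption for illustration only, not the lecture's exact formulation, which uses a learned attention network, and the function name `stamp_style_attention` and the toy embeddings are invented here:

```python
import numpy as np

def stamp_style_attention(session_emb, last_click_emb):
    """Attention over a session's item embeddings.

    Scores every clicked item against the last click (the short-term
    signal) and returns an attention-weighted summary of the whole
    session (the long-term signal).
    """
    # Dot-product score of each session item against the last click.
    scores = session_emb @ last_click_emb          # shape: (n_items,)
    # Softmax normalization, stabilized by subtracting the max score.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Convex combination of item embeddings: the session summary.
    return weights @ session_emb                   # shape: (d,)

# Toy session: 5 clicks, 4-dimensional embeddings.
rng = np.random.default_rng(0)
session = rng.normal(size=(5, 4))
summary = stamp_style_attention(session, session[-1])
print(summary.shape)  # (4,)
```

Because the weights form a softmax distribution, the summary is a convex combination of the session's item embeddings, so items similar to the last click dominate it; a next-item score could then be computed against candidate item embeddings.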
Keywords: STAMP; session-based recommendation; recurrent neural networks; web-based behavioral modeling
Source: VideoLectures.NET
Data collected: 2023-02-09:cyh
Last reviewed: 2023-02-09:cyh
Views: 77