Frontiers of Information Technology & Electronic Engineering
ISSN 2095-9184 (print), ISSN 2095-9230 (online)
2020 Vol.21 No.8 P.1206-1216
HAM: a deep collaborative ranking method incorporating textual information
Abstract: The recommendation task with a textual corpus aims to model customer preferences from both user feedback and item textual descriptions. It is highly desirable to explore a very deep neural network to capture such complicated nonlinear preferences. However, training a deeper recommender is not as simple as adding layers: a deeper recommender suffers from the gradient vanishing/exploding problem and cannot be easily trained by gradient-based methods. Moreover, textual descriptions may contain noisy word sequences, and directly extracting feature vectors from them can harm the recommender's performance. To overcome these difficulties, we propose a new recommendation method named the HighwAy recoMmender (HAM). HAM uses a highway mechanism to stabilize gradient-based training, a multi-head attention mechanism to automatically denoise textual information, and a block coordinate descent method to train a deep neural recommender. Empirical studies show that the proposed method significantly outperforms state-of-the-art methods in terms of accuracy.
Key words: Deep learning, Recommendation system, Highway network, Block coordinate descent
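The abstract names two architectural ingredients without giving their formulas: a highway mechanism that keeps gradients stable in a deep recommender, and a multi-head attention encoder that down-weights noisy words in item descriptions. The paper's exact layer definitions are not reproduced on this page, so the following is only a minimal PyTorch sketch of the two ideas; the dimensions, the ReLU transform, the gate-bias initialization, and the mean pooling are illustrative assumptions, not HAM's published configuration.

```python
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    """One highway layer: y = t * h(x) + (1 - t) * x.
    The transform gate t decides, per dimension, how much of the
    nonlinear transform h(x) replaces the identity (carry) path."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        # Assumed choice: bias the gate toward carrying the input
        # through at initialization, which keeps gradients stable
        # when many layers are stacked.
        nn.init.constant_(self.gate.bias, -2.0)

    def forward(self, x):
        t = torch.sigmoid(self.gate(x))    # gate values in (0, 1)
        h = torch.relu(self.transform(x))  # candidate transform
        return t * h + (1.0 - t) * x

class TextDenoiser(nn.Module):
    """Multi-head self-attention over word embeddings; the attention
    weights let the encoder down-weight noisy words before pooling
    a single item-text feature vector."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, words):
        # words: (batch, seq_len, dim) embeddings of an item description
        attended, _ = self.attn(words, words, words)
        return attended.mean(dim=1)  # pooled, denoised text feature
```

The negative gate bias makes each highway layer behave almost like an identity map at initialization, which is the property that lets gradient-based methods train many stacked layers.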
1Zhejiang Provincial Key Laboratory of Big Data Intelligent Computing, Hangzhou 310027, China
2State Key Laboratory of Computer Aided Design and Computer Graphics, Zhejiang University, Hangzhou 310027, China
3College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
Abstract: The recommendation task based on a textual corpus aims to model user preferences by mining user feedback data and item textual descriptions. Researchers are keen to explore deep neural networks that capture complicated nonlinear preferences. However, a recommender with a deeper architecture cannot be trained by simply stacking more layers: it faces the gradient vanishing/exploding problem, which prevents training with gradient-based methods. In addition, item textual descriptions may contain noisy word sequences, and extracting features directly from them can degrade the recommender's performance. To solve these problems, we propose a new ranking-oriented recommendation method built on a very deep neural network: the HighwAy recoMmender (HAM). First, we design a new neural recommendation framework based on highway networks, which effectively stabilizes the gradient flow of a deep recommender. Second, we use a multi-head attention encoder to automatically denoise textual information. Finally, we propose a new block coordinate descent based method that trains a recommender with a deeper architecture more effectively. Experimental results show that HAM outperforms state-of-the-art methods.
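The final step above, block coordinate descent, partitions the network's parameters into blocks and updates one block at a time while the others stay fixed, which can be easier than optimizing a very deep model jointly. The paper's concrete block partition and update rule are not given on this page; the sketch below is a generic, hedged illustration in PyTorch, assuming a hypothetical split of the model into per-layer parameter groups and a plain Adam step per block.

```python
import torch

def bcd_train(model, loss_fn, loader, blocks, epochs=10, lr=1e-3):
    """Generic block coordinate descent (illustrative, not HAM's
    published procedure): cycle over parameter blocks and optimize
    each one with every other block frozen."""
    optimizers = [torch.optim.Adam(block, lr=lr) for block in blocks]
    for _ in range(epochs):
        for active, opt in enumerate(optimizers):
            # Freeze all parameters outside the active block.
            for i, block in enumerate(blocks):
                for p in block:
                    p.requires_grad_(i == active)
            for inputs, targets in loader:
                opt.zero_grad()
                loss = loss_fn(model(inputs), targets)
                loss.backward()
                opt.step()

# A hypothetical per-layer partition of a deep recommender:
# blocks = [list(layer.parameters()) for layer in model.children()]
```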
DOI: 10.1631/FITEE.1900382
CLC number: TP181
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2020-07-13