CLC number: TP181
On-line Access: 2020-08-10
Received: 2019-07-28
Revision Accepted: 2019-12-20
Crosschecked: 2020-07-13
Cheng-wei Wang, Teng-fei Zhou, Chen Chen, Tian-lei Hu, Gang Chen. HAM: a deep collaborative ranking method incorporating textual information[J]. Frontiers of Information Technology & Electronic Engineering, 2020, 21(8): 1206-1216.
@article{wang2020ham,
title="HAM: a deep collaborative ranking method incorporating textual information",
author="Cheng-wei Wang, Teng-fei Zhou, Chen Chen, Tian-lei Hu, Gang Chen",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="21",
number="8",
pages="1206-1216",
year="2020",
publisher="Zhejiang University Press & Springer",
issn="2095-9184",
doi="10.1631/FITEE.1900382"
}
Abstract: The recommendation task with a textual corpus aims to model customer preferences from both user feedback and item textual descriptions. A deep neural network is desirable for capturing complicated nonlinear preferences, but training a deeper recommender is not as simple as adding layers: a deeper recommender suffers from the vanishing/exploding gradient issue and cannot be trained reliably by gradient-based methods. Moreover, textual descriptions often contain noisy word sequences, and extracting feature vectors directly from them can harm the recommender's performance. To overcome these difficulties, we propose a new recommendation method named the HighwAy recoMmender (HAM). HAM uses a highway mechanism to stabilize gradient-based training and devises a multi-head attention mechanism to denoise textual information automatically. Moreover, a block coordinate descent method is devised to train the deep neural recommender. Empirical studies show that the proposed method significantly outperforms state-of-the-art methods in terms of accuracy.
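The highway mechanism the abstract refers to gates each layer's nonlinear transform against an identity path, which is what keeps gradients usable in deep stacks. A minimal NumPy sketch of one such layer follows; the function names, dimensions, and initialization are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, W_h, b_h, W_t, b_t):
    """One highway layer: output = t * h + (1 - t) * x.

    The transform gate t in (0, 1) decides how much of the nonlinear
    transform h to use versus passing the input x through unchanged,
    so gradients can flow along the identity path in deep stacks.
    """
    h = np.tanh(x @ W_h + b_h)      # candidate nonlinear transform
    t = sigmoid(x @ W_t + b_t)      # transform gate
    return t * h + (1.0 - t) * x    # gated mix; carry gate is 1 - t

# Stack several layers; input and output widths must match
# so the identity (carry) path is well defined.
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal((4, d))
for _ in range(10):                 # a 10-layer highway stack
    W_h = rng.standard_normal((d, d)) * 0.1
    W_t = rng.standard_normal((d, d)) * 0.1
    # Negative gate bias biases early layers toward carrying the
    # input through, a common trick for stable training.
    x = highway_layer(x, W_h, np.zeros(d), W_t, np.zeros(d) - 1.0)
print(x.shape)  # (4, 8)
```

With the gate bias set negative, each layer initially behaves close to the identity, so stacking many layers does not blow up or wash out the signal the way a plain feed-forward stack can.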