Regularizing Matrix Factorization with User and Item Embeddings for Recommendation

August 31, 2018 · Entered Twilight · 🏛 International Conference on Information and Knowledge Management

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 6.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, MODELS, MultiProcessParallelSolver.py, README.md, __init__.py, batched_inv_joblib.py, cofactor.py, content_wmf.py, data, global_constants.py, model_runner.py, parallel_rme.py, produce_negative_cooccurrence.py, produce_negative_embedding.py, produce_positive_cooccurrence.py, rec_eval.py, rme_preprocess.py, rme_rec.py, text_utils.py, utils.py, wmf.py

Authors: Thanh Tran, Kyumin Lee, Yiming Liao, Dongwon Lee
arXiv ID: 1809.00979
Category: cs.IR (Information Retrieval)
Cross-listed: cs.AI
Citations: 71
Venue: International Conference on Information and Knowledge Management
Repository: https://github.com/thanhdtran/RME.git
⭐ 46
Last Checked: 2 months ago
Abstract
Following recent successes in exploiting both latent factor and word embedding models in recommendation, we propose a novel Regularized Multi-Embedding (RME) based recommendation model that simultaneously encapsulates the following ideas via decomposition: (1) which items a user likes, (2) which two users co-like the same items, (3) which two items users often co-like, and (4) which two items users often co-dislike. In experimental validation, RME outperforms competing state-of-the-art models on both explicit and implicit feedback datasets, significantly improving Recall@5 by 5.9~7.0%, NDCG@20 by 4.3~5.6%, and MAP@10 by 7.9~8.9%. In addition, under the cold-start scenario for users with the lowest number of interactions, RME improves NDCG@5 over the competing models by 20.2% and 29.4% on the MovieLens-10M and MovieLens-20M datasets, respectively. Our datasets and source code are available at: https://github.com/thanhdtran/RME.git.
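The abstract's four signals can be pictured as one joint factorization: a user-item preference matrix plus three co-occurrence matrices (user co-like, item co-like, item co-dislike) that share and thereby regularize the user and item embeddings. The sketch below is a minimal illustration of that idea, not the authors' implementation (their repo uses ALS-style solvers such as `parallel_rme.py`): all names (`theta`, `beta`, the context factors) are hypothetical, the toy data is random, and the paper's SPPMI co-occurrence matrices are approximated here by raw co-count matrices for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative only, not the paper's datasets)
n_users, n_items, k = 8, 10, 4

# Binary user-item preference matrix X (1 = liked)
X = (rng.random((n_users, n_items)) < 0.3).astype(float)

# Stand-ins for the abstract's three co-occurrence signals.
# The paper builds SPPMI matrices; raw co-counts are used here for brevity.
U_co = X @ X.T                    # (2) users who co-like the same items
I_like = X.T @ X                  # (3) items users co-like
I_dis = (1 - X).T @ (1 - X)      # (4) items users co-dislike

# Shared embeddings regularized by all terms, plus context factors
theta = 0.1 * rng.standard_normal((n_users, k))   # user embeddings
beta = 0.1 * rng.standard_normal((n_items, k))    # item embeddings
theta_ctx = 0.1 * rng.standard_normal((n_users, k))
gamma = 0.1 * rng.standard_normal((n_items, k))   # liked-item context
delta = 0.1 * rng.standard_normal((n_items, k))   # disliked-item context

lam, lr = 0.1, 1e-3

def total_loss():
    """Sum of the four squared-error terms plus L2 regularization."""
    return (np.sum((X - theta @ beta.T) ** 2)
            + np.sum((U_co - theta @ theta_ctx.T) ** 2)
            + np.sum((I_like - beta @ gamma.T) ** 2)
            + np.sum((I_dis - beta @ delta.T) ** 2)
            + lam * sum(np.sum(M ** 2)
                        for M in (theta, beta, theta_ctx, gamma, delta)))

loss0 = total_loss()
for _ in range(300):
    # Residuals of each factorization term
    E = theta @ beta.T - X
    Eu = theta @ theta_ctx.T - U_co
    El = beta @ gamma.T - I_like
    Ed = beta @ delta.T - I_dis
    # Gradient steps: theta and beta receive gradients from every term
    # they appear in, which is how the co-occurrence matrices act as
    # regularizers on the shared embeddings.
    theta -= lr * 2 * (E @ beta + Eu @ theta_ctx + lam * theta)
    beta -= lr * 2 * (E.T @ theta + El @ gamma + Ed @ delta + lam * beta)
    theta_ctx -= lr * 2 * (Eu.T @ theta + lam * theta_ctx)
    gamma -= lr * 2 * (El.T @ beta + lam * gamma)
    delta -= lr * 2 * (Ed.T @ beta + lam * delta)
loss1 = total_loss()
```

Because `beta` is shared across the preference term and both item co-occurrence terms, an item's embedding is pulled toward items it is co-liked (and away from items it is co-disliked) with, even for users with very few interactions, which is the intuition behind the reported cold-start gains.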
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt · Information Retrieval