Privileged Graph Distillation for Cold Start Recommendation


Abstract

The cold start problem in recommender systems is a long-standing challenge: new users (items) must be recommended based on attributes alone, without any historical interaction records. In these systems, warm users (items) have privileged collaborative signals from interaction records compared to cold start users (items), and these Collaborative Filtering (CF) signals have been shown to deliver competitive recommendation performance. Since user and item categorical attributes are available on many online platforms, many researchers have proposed learning the correlation between the collaborative signal embedding space and the attribute embedding space to improve cold start recommendation. However, cold start recommendation is still limited by the separate modeling of the two embedding spaces and by simplistic assumptions about the transformation between them. As user-item interaction behaviors and user (item) attributes naturally form a heterogeneous graph structure, in this paper we propose a privileged graph distillation model (PGD). The teacher model is a heterogeneous graph over warm users and items that includes the privileged CF links; the student model is an entity-attribute graph without CF links. Specifically, the teacher model learns better embeddings for each entity by injecting complex higher-order relationships from the constructed heterogeneous graph, while the student model learns a distilled output with privileged CF embeddings from the teacher. Our proposed model is generally applicable to different cold start scenarios: new user, new item, or new user-new item. Finally, extensive experimental results on real-world datasets clearly show the effectiveness of our proposed model on different types of cold start problems, with average improvements of $6.6\%$, $5.6\%$, and $17.1\%$ over state-of-the-art baselines on three datasets, respectively.
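To make the teacher-student idea concrete, below is a minimal, hypothetical sketch of one distillation step, not the authors' implementation. It assumes the teacher embeddings have already been produced by a graph model trained with the privileged CF links, and shows a student (here called `AttributeStudent`, with a `distill_loss` helper and weight `alpha`, all names introduced for illustration) that maps categorical attributes into the CF embedding space and is pulled toward the teacher output by an MSE term on top of a BPR ranking loss.

```python
# Minimal sketch (assumed setup, not the paper's code): attribute-only student
# distilled toward privileged teacher embeddings, written in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttributeStudent(nn.Module):
    """Maps categorical attribute IDs to the CF embedding space."""

    def __init__(self, num_attr_values: int, attr_dim: int, embed_dim: int):
        super().__init__()
        self.attr_embed = nn.Embedding(num_attr_values, attr_dim)
        self.proj = nn.Sequential(
            nn.Linear(attr_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, embed_dim),
        )

    def forward(self, attr_ids: torch.Tensor) -> torch.Tensor:
        # attr_ids: (batch, n_attrs); mean-pool attribute embeddings, then project.
        return self.proj(self.attr_embed(attr_ids).mean(dim=1))


def distill_loss(student_emb, teacher_emb, pos_item_emb, neg_item_emb, alpha=0.5):
    """BPR ranking loss on the student output plus MSE distillation to the teacher."""
    pos_score = (student_emb * pos_item_emb).sum(-1)
    neg_score = (student_emb * neg_item_emb).sum(-1)
    bpr = -F.logsigmoid(pos_score - neg_score).mean()
    distill = F.mse_loss(student_emb, teacher_emb.detach())  # teacher is frozen
    return bpr + alpha * distill


# Toy usage with random tensors standing in for a real dataset.
student = AttributeStudent(num_attr_values=100, attr_dim=16, embed_dim=32)
attr_ids = torch.randint(0, 100, (8, 4))   # 8 users, 4 categorical attributes each
teacher_emb = torch.randn(8, 32)           # privileged embeddings from the teacher graph
pos_items, neg_items = torch.randn(8, 32), torch.randn(8, 32)
loss = distill_loss(student(attr_ids), teacher_emb, pos_items, neg_items)
loss.backward()
```

At inference time for a cold start user (item), only the attribute path is needed, which is the point of distilling the privileged CF signal into the student.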
