ACM Transactions on the Web

Pre-Training Across Different Cities for Next POI Recommendation


Abstract

The degrees of absolute and relative sparsity in Point-of-Interest (POI) transition behaviors differ greatly from city to city. It is therefore intuitive to transfer knowledge across cities to alleviate these data sparsity and imbalance problems for next POI recommendation. Recently, pre-training over a large-scale dataset has achieved great success in many relevant fields, such as computer vision and natural language processing. By devising various self-supervised objectives, pre-training models can produce more robust representations for downstream tasks. However, it is not trivial to directly adopt such existing pre-training techniques for next POI recommendation, due to the lack of common semantic objects (users or items) across different cities. In this paper, we tackle the new research problem of pre-training across different cities for next POI recommendation. Specifically, to overcome the key challenge that different cities do not share any common object, we propose a novel pre-training model named CATUS, which transfers category-level universal transition knowledge across cities. First, we build two self-supervised objectives in CATUS: next category prediction and next POI prediction, to obtain universal transition knowledge across different cities and POIs. Then, we design a category-transition oriented sampler on the data level and an implicit and explicit transfer strategy on the encoder level to enhance this transfer process. At the fine-tuning stage, we propose a distance oriented sampler to better align the POI representations with the local context of each city. Extensive experiments on two large datasets covering four cities demonstrate the superiority of our proposed CATUS over the state-of-the-art alternatives. The code and datasets are available at https://github.com/NLPWM-WHU/CATUS.
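The key idea above is that POIs are city-specific, but POI *categories* (e.g., cafe, office) are shared, so category-level transition statistics can be pooled across cities. The following is a minimal illustrative sketch of that pooling idea using simple transition counts; it is not the CATUS model itself (which uses neural encoders and self-supervised training), and the city names and category labels are hypothetical examples.

```python
from collections import defaultdict, Counter

def learn_category_transitions(checkin_sequences):
    """Pool category-to-category transition counts across all cities.

    Unlike POIs, categories are shared across cities, so their
    transition statistics can be aggregated city-agnostically.
    Each element of `checkin_sequences` is one user's ordered
    sequence of visited categories.
    """
    counts = defaultdict(Counter)
    for seq in checkin_sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next_category(counts, current):
    """Return the most frequent next category after `current`."""
    if current not in counts:
        return None
    return counts[current].most_common(1)[0][0]

# Hypothetical check-in sequences from two different cities:
# no POI is shared, but the category vocabulary is.
city_a = [["cafe", "office", "gym"], ["cafe", "office", "restaurant"]]
city_b = [["cafe", "office", "bar"]]

model = learn_category_transitions(city_a + city_b)
print(predict_next_category(model, "cafe"))  # office
```

Pooling the two cities yields a stronger estimate of the cafe-to-office transition than either city alone, which is the intuition behind transferring category-level knowledge before fine-tuning on city-specific POIs.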

