Addressing cold start with dataset transfer in e-commerce learning to rank
2021
Learning-to-rank (LTR) models are a crucial component of relevance ranking in e-commerce applications. When properly trained, LTR models yield state-of-the-art ranking performance. A popular approach to training LTR models is to minimize a loss over historical customer interactions, which are preferred over manually annotated data because the latter is costly to acquire and quickly becomes stale. However, historical interactions are not always available, for example when a new e-commerce platform is launched or an existing platform is extended to new products. In this paper, we present the use of inverse propensity weighting as a strategy to create training datasets where no historical customer interactions are available. We report results of an online A/B test that demonstrate the efficacy of the proposed strategy. Because the strategy is simple to apply in practice, we believe it can benefit LTR practitioners.
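The abstract does not detail the method, but inverse propensity weighting for click-based LTR data is commonly done by reweighting observed clicks by the inverse of an estimated examination probability. The sketch below is a minimal illustration under a position-based examination model; all names (`position_propensities`, `build_ipw_dataset`), the `eta` exponent, and the log format are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def position_propensities(max_rank, eta=1.0):
    # Assumed position-based examination model: propensity ~ (1/rank)^eta.
    return (1.0 / np.arange(1, max_rank + 1)) ** eta

def build_ipw_dataset(logs, propensities, clip=10.0):
    """Turn click logs into (features, label, weight) training examples.

    Each log entry is (feature_vector, displayed_rank, clicked). Clicked
    items receive weight 1/propensity(rank), clipped to limit variance.
    """
    X, y, w = [], [], []
    for features, rank, clicked in logs:
        X.append(features)
        y.append(1.0 if clicked else 0.0)
        weight = min(1.0 / propensities[rank - 1], clip) if clicked else 1.0
        w.append(weight)
    return np.array(X), np.array(y), np.array(w)

# Hypothetical click logs: (features, displayed rank, clicked?)
logs = [
    ([0.9, 0.1], 1, True),   # top position, clicked
    ([0.4, 0.7], 3, True),   # lower position, clicked -> upweighted by 3x
    ([0.2, 0.2], 2, False),  # shown but not clicked
]
props = position_propensities(max_rank=5)
X, y, w = build_ipw_dataset(logs, props)
print(w)  # -> [1. 3. 1.]
```

The resulting per-example weights can then be passed to any LTR trainer that accepts sample weights, so clicks at low-visibility positions count more when fitting the model.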