Adaptive Online Hyper-Parameters Tuning for Ad Event-Prediction Models

Publication
Apr 6, 2017
Abstract

Gemini native is one of Yahoo's fastest growing businesses, reaching a run-rate of 600 million USD in the past year. Driving the Gemini native models used to predict both click probability (pCTR) and conversion probability (pCONV) is OFFSET, a feature-enhanced collaborative-filtering (CF) based event-prediction algorithm. OFFSET is a one-pass algorithm that updates its model for every new batch of logged data using a stochastic gradient descent (SGD) based approach. Like all learning algorithms, OFFSET includes several hyper-parameters that can be tuned to provide the best performance under given system conditions. Since the marketplace environment is highly dynamic and influenced by seasonality and other temporal factors, a single fixed set of hyper-parameters (or configuration) for the learning algorithm is sub-optimal.
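To make the one-pass setting concrete, here is a minimal sketch of a learner that consumes each batch of logged events exactly once. It is not the OFFSET implementation (OFFSET is a CF model with richer features and loss); a plain logistic model stands in, and the names `step_size` and `reg_lambda` are illustrative placeholders for the kind of hyper-parameters the abstract refers to.

```python
import numpy as np

class OnePassSGDModel:
    """Hypothetical one-pass SGD learner; a stand-in for OFFSET, not its actual model."""

    def __init__(self, dim, step_size=0.1, reg_lambda=1e-4):
        self.w = np.zeros(dim)        # model weights
        self.step_size = step_size    # SGD learning rate (hyper-parameter)
        self.reg_lambda = reg_lambda  # L2 regularization (hyper-parameter)

    def update(self, X, y):
        """One pass over a single batch of logged events (features X, labels y)."""
        for x_i, y_i in zip(X, y):
            p = 1.0 / (1.0 + np.exp(-self.w @ x_i))        # predicted event probability
            grad = (p - y_i) * x_i + self.reg_lambda * self.w
            self.w -= self.step_size * grad

    def log_loss(self, X, y):
        """Log-loss on a batch, used below to compare configurations."""
        p = 1.0 / (1.0 + np.exp(-X @ self.w))
        eps = 1e-12
        return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
```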
In this work we present an online hyper-parameter tuning algorithm that takes advantage of the system's parallel map-reduce based architecture and strives to adapt the hyper-parameter set to provide the best performance at a specific time interval. Online evaluation via bucket testing of the tuning algorithm showed a significant 5% revenue lift across all sections, and a staggering 12% lift for the Yahoo HomePage section specifically. Since then, the tuning algorithm has been pushed into production, tuning both click- and conversion-prediction models, and is generating a hefty estimated lift of 30 million USD yearly for Gemini native.
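The following sketch illustrates the adaptive-tuning idea as described in the abstract: maintain several replicas of the learner, each with a different hyper-parameter configuration, score them on each incoming batch, and adopt whichever configuration currently performs best. The candidate grid, progressive-validation scoring, and selection rule here are assumptions for illustration, not the paper's exact mechanism; in production each configuration would train in parallel (e.g., one map-reduce task per replica). It reuses the `OnePassSGDModel` sketch above.

```python
import itertools

def tune_online(batches, dim, step_sizes=(0.01, 0.1, 0.5), lambdas=(1e-5, 1e-4)):
    """Yield the best-performing replica per batch; illustrative, not OFFSET's tuner."""
    configs = list(itertools.product(step_sizes, lambdas))
    replicas = [OnePassSGDModel(dim, s, l) for s, l in configs]
    for X, y in batches:
        # Progressive validation: score each replica on the batch *before*
        # it trains on that batch, so evaluation data is always unseen.
        losses = [m.log_loss(X, y) for m in replicas]
        best = replicas[losses.index(min(losses))]
        for m in replicas:
            m.update(X, y)
        yield best  # the winning configuration for this time interval
```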
The proposed tuning mechanism can easily be generalized to any learning algorithm that learns continuously from incoming streaming data, adapting its hyper-parameters to temporal changes.

  • International World Wide Web Conference (WWW 2017 Industrial Track)
  • Conference/Workshop Paper
