Xingyou (Richard) Song - OmniPred: Towards Universal Regressors with Language Models

Title: OmniPred: Towards Universal Regressors with Language Models

Abstract:
Over the broad landscape of experimental design, regression has been a powerful tool to accurately predict the outcome metrics of a system or model given a set of parameters, but has been traditionally restricted to methods which are only applicable to a specific task. In this paper, we introduce OmniPred, a framework for training language models as universal end-to-end regressors over (x, y) evaluation data from diverse real world experiments. Using data sourced from Google Vizier, one of the largest blackbox optimization databases in the world, our extensive experiments demonstrate that through only textual representations of mathematical parameters and values, language models are capable of very precise numerical regression, and if given the opportunity to train over multiple tasks at scale, can significantly outperform traditional regression models.
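To make the abstract's idea concrete, below is a minimal Python sketch (not the official OmniPred code; the prompt format and float tokenization are illustrative assumptions) of how (x, y) evaluation data might be serialized into text so that a language model can be trained as a regressor:

import math

def serialize_params(params: dict) -> str:
    # Render a parameter dict as a flat "key:value" string; one convention among many.
    return ",".join(f"{k}:{v}" for k, v in sorted(params.items()))

def serialize_value(y: float, mantissa_digits: int = 4) -> str:
    # Render a float as sign / exponent / mantissa-digit tokens, so every digit
    # becomes a discrete token the model can decode.
    sign = "+" if y >= 0 else "-"
    y = abs(y)
    exponent = 0 if y == 0 else math.floor(math.log10(y))
    mantissa = 0.0 if y == 0 else y / (10 ** exponent)
    digits = f"{mantissa:.{mantissa_digits - 1}f}".replace(".", "")
    return f"{sign} E{exponent} {' '.join(digits)}"

# Example: one (x, y) pair from a hypothetical hyperparameter-tuning task.
x_text = serialize_params({"learning_rate": 1e-3, "batch_size": 128})
y_text = serialize_value(0.8734)
print(x_text)  # batch_size:128,learning_rate:0.001
print(y_text)  # + E-1 8 7 3 4

Under this kind of setup, a text-to-text model would be trained with standard cross-entropy to decode the y-string conditioned on the x-string (plus any task metadata), which is what allows a single model to regress over many different tasks; see the paper and code links below for the actual formulation.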

Speaker: Xingyou (Richard) Song https://xingyousong.github.io/

Paper: https://arxiv.org/abs/2402.14547

Code: https://github.com/google-research/op...
