Preprint, Working Paper. Year: 2020

Diversity-Preserving K-Armed Bandits, Revisited

Abstract

We consider the bandit-based framework for diversity-preserving recommendations introduced by Celis et al. (2019), who approached it mainly by a reduction to the setting of linear bandits. We design a UCB algorithm using the specific structure of the setting and show that it enjoys a bounded distribution-dependent regret in the natural cases when the optimal mixed actions put some probability mass on all actions (i.e., when diversity is desirable). Simulations illustrate this fact. We also provide regret lower bounds and briefly discuss distribution-free regret bounds.
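To make the setting concrete, the sketch below simulates a diversity-preserving UCB strategy of the kind the abstract describes, in the simplest instance where the mixed actions are distributions over the K arms with a common lower bound ℓ on every coordinate. The index, the constraint set, and all names (`diversity_preserving_ucb`, `ell`) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def diversity_preserving_ucb(arms, horizon, ell, rng=None):
    """Sketch of a UCB strategy over mixed actions constrained to put
    probability mass at least `ell` on each arm (hypothetical
    instantiation, not the paper's exact algorithm).

    arms:    list of callables; arms[a]() returns a reward in [0, 1]
    horizon: number of rounds
    ell:     common lower bound on each arm's probability, K*ell <= 1
    """
    rng = rng or np.random.default_rng()
    K = len(arms)
    assert K * ell <= 1.0, "the lower bounds must fit in the simplex"
    counts = np.zeros(K)  # number of times each arm was drawn
    sums = np.zeros(K)    # cumulative reward collected on each arm
    rewards = []
    for t in range(1, horizon + 1):
        # Classical UCB1-style index (an assumption; the paper may
        # tailor the index to the diversity constraints differently).
        safe = np.maximum(counts, 1)
        ucb = np.where(
            counts > 0,
            sums / safe + np.sqrt(2.0 * np.log(t) / safe),
            np.inf,  # unexplored arms get the highest priority
        )
        # The expected-index objective p . ucb is linear in p, so over
        # {p : p_a >= ell for all a, sum_a p_a = 1} its maximizer is
        # explicit: mass ell on every arm, the remainder on the arm
        # with the largest index.
        p = np.full(K, ell)
        p[int(np.argmax(ucb))] += 1.0 - K * ell
        a = rng.choice(K, p=p)  # draw the arm from the mixed action
        r = arms[a]()
        counts[a] += 1
        sums[a] += r
        rewards.append(r)
    return np.array(rewards)

if __name__ == "__main__":
    # Three Bernoulli arms with hypothetical means; with ell > 0 the
    # mixed action keeps mass on every arm, which is the regime in
    # which the bounded regret is claimed.
    rng = np.random.default_rng(0)
    means = [0.3, 0.5, 0.7]
    arms = [lambda m=m: float(rng.random() < m) for m in means]
    rewards = diversity_preserving_ucb(arms, horizon=5000, ell=0.1, rng=rng)
    print("average reward:", rewards.mean())
```

Because the objective is linear in the mixed action, the constrained maximizer has a closed form in this box-constrained case; this illustrates how exploiting the specific structure of the setting can avoid the general reduction to linear bandits used by Celis et al. (2019).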
Main file: FairBandits-ALT20.pdf (1.4 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02957485, version 1 (05-10-2020)
hal-02957485, version 2 (05-04-2024)

Identifiers

HAL Id: hal-02957485

Cite

Hédi Hadiji, Sébastien Gerchinovitz, Jean-Michel Loubes, Gilles Stoltz. Diversity-Preserving K-Armed Bandits, Revisited. 2020. ⟨hal-02957485v1⟩