Conference paper, 2021

GAN Based Data Augmentation for Indoor Localization Using Labeled and Unlabeled Data

Wafa Njima
Marwa Chafii
Raed M Shubair

Abstract

Machine learning techniques allow accurate indoor localization with low online complexity. However, a large number of collected data samples is needed to properly train the deep neural network (DNN) model used for localization. In this paper, we propose to generate fake fingerprints using generative adversarial networks (GANs) based on a small number of collected data samples. We consider an indoor scenario where the collected labeled data samples are rare and insufficient to generate fake samples of sufficient quantity and diversity to provide good localization accuracy. Thus, both labeled and unlabeled fingerprints are provided to the GAN so that more realistic fake data samples are generated. Then, a DNN model is trained on a mixed dataset comprising real collected labeled and pseudo-labeled fingerprints as well as fake generated pseudo-labeled fingerprints. The data augmentation based on real measurements leads to a mean localization accuracy improvement of 9.66% in comparison to the conventional semi-supervised localization algorithm.
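The abstract describes the pipeline only at a high level. The sketch below illustrates one way the GAN-based augmentation step could look; it is not the authors' implementation. The fingerprint dimension (e.g. 520 access points, as in the public UJIIndoorLoc dataset), the latent size, the network architectures, the training hyperparameters, and the pseudo-labeling step are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): a vanilla GAN is trained
# on labeled + unlabeled RSSI fingerprints, and its generator is then used to
# produce fake fingerprints that augment the localization DNN's training set.
import torch
import torch.nn as nn

N_APS, NOISE_DIM = 520, 64   # assumed fingerprint dimension and latent size

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 128), nn.ReLU(),
            nn.Linear(128, N_APS), nn.Tanh())   # fake RSSI fingerprint in [-1, 1]
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_APS, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1), nn.Sigmoid())    # probability that input is real
    def forward(self, x):
        return self.net(x)

def train_gan(real_fps, epochs=200, batch=64):
    """real_fps: labeled + unlabeled fingerprints, scaled to [-1, 1]."""
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCELoss()
    for _ in range(epochs):
        idx = torch.randint(0, real_fps.size(0), (batch,))
        real = real_fps[idx]
        fake = G(torch.randn(batch, NOISE_DIM))
        # Discriminator step: push real samples toward 1 and fakes toward 0.
        opt_d.zero_grad()
        loss_d = bce(D(real), torch.ones(batch, 1)) + \
                 bce(D(fake.detach()), torch.zeros(batch, 1))
        loss_d.backward()
        opt_d.step()
        # Generator step: try to make the discriminator label fakes as real.
        opt_g.zero_grad()
        loss_g = bce(D(fake), torch.ones(batch, 1))
        loss_g.backward()
        opt_g.step()
    return G

if __name__ == "__main__":
    # Placeholder data standing in for collected fingerprints.
    fingerprints = torch.rand(500, N_APS) * 2 - 1
    G = train_gan(fingerprints)
    fake_fps = G(torch.randn(1000, NOISE_DIM)).detach()
    # The generated fingerprints would then be pseudo-labeled (e.g. by k-NN or
    # an initial DNN trained on the real labeled set) and merged with the real
    # data to train the final localization DNN.
```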
Main file
BalkanCom_FV.pdf (246.39 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03347456 , version 1 (17-09-2021)

Identifiers

  • HAL Id : hal-03347456 , version 1

Cite

Wafa Njima, Marwa Chafii, Raed M Shubair. GAN Based Data Augmentation for Indoor Localization Using Labeled and Unlabeled Data. Fourth International Balkan Conference on Communications and Networking (BalkanCom 2021), Sep 2021, Novi Sad, Serbia. ⟨hal-03347456⟩