Research at St Andrews

Unsupervised domain adaptation in activity recognition: a GAN-based approach

Research output: Contribution to journal › Article › peer-review


Andrea Rosales Sanabria, Franco Zambonelli, Juan Ye

Sensor-based human activity recognition (HAR) has a significant impact on a wide range of applications in smart cities, smart homes, and personal healthcare. Such wide deployment of HAR systems often faces the annotation-scarcity challenge: most HAR techniques, especially deep learning techniques, require a large amount of training data, while annotating sensor data is very time- and effort-consuming. Unsupervised domain adaptation has been successfully applied to tackle this challenge, where activity knowledge from a well-annotated domain is transferred to a new, unlabelled domain. However, existing techniques do not perform well on highly heterogeneous domains. This paper proposes shift-GAN, which integrates bidirectional generative adversarial networks (Bi-GAN) and kernel mean matching (KMM) in an innovative way to learn intrinsic, robust feature transfer between two heterogeneous domains. Bi-GAN consists of two GANs bound by a cyclic constraint, which enables more effective feature transfer than a classic, single-GAN model. KMM is a powerful non-parametric technique for correcting covariate shift, which further improves feature-space alignment. Through a series of comprehensive empirical evaluations, shift-GAN not only achieves superior performance over 10 state-of-the-art domain adaptation techniques but also demonstrates its effectiveness in learning activity-independent, intrinsic feature mappings between two domains, robustness to sensor noise, and low sensitivity to the amount of training data.
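To illustrate the covariate-shift correction the abstract refers to, below is a minimal NumPy sketch of kernel mean matching: it re-weights source samples so that their mean embedding in an RBF kernel's feature space matches the target's. This is an illustrative assumption, not the paper's implementation — the standard KMM formulation solves a constrained quadratic programme, whereas this sketch uses a regularised, unconstrained closed-form solve and clips negative weights; all function names and parameters here are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kmm_weights(X_src, X_tgt, gamma=1.0, reg=1e-3):
    """Importance weights beta matching the source mean embedding to the
    target mean embedding in the RBF kernel's RKHS: minimise
        0.5 * beta^T K beta - kappa^T beta
    via a regularised closed-form solve, then clip to non-negative."""
    n, m = len(X_src), len(X_tgt)
    K = rbf_kernel(X_src, X_src, gamma)                            # n x n source Gram matrix
    kappa = (n / m) * rbf_kernel(X_src, X_tgt, gamma).sum(axis=1)  # length-n target term
    beta = np.linalg.solve(K + reg * np.eye(n), kappa)
    return np.clip(beta, 0.0, None)

# Toy check: source ~ N(0, 1), target shifted to N(1, 1). KMM should
# up-weight source samples lying closer to the target distribution.
rng = np.random.default_rng(0)
X_src = rng.normal(0.0, 1.0, size=(200, 1))
X_tgt = rng.normal(1.0, 1.0, size=(200, 1))
beta = kmm_weights(X_src, X_tgt, gamma=0.5)
print(beta[X_src[:, 0] > 0].mean() > beta[X_src[:, 0] <= 0].mean())
```

In the toy check, source points on the target's side of the shift receive larger weights on average, which is the behaviour a covariate-shift correction should exhibit.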


Original language: English
Pages (from-to): 19421-19438
Number of pages: 18
Journal: IEEE Access
Publication status: Published - 22 Jan 2021

    Research areas

  • Human activity recognition, Domain adaptation, Ensemble learning, Generative adversarial networks, Covariate shift, Kernel mean matching

Related by author

  1. ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning

    Rosales Sanabria, A., Zambonelli, F., Dobson, S. A. & Ye, J., 6 Nov 2021, (E-pub ahead of print) In: Pervasive and Mobile Computing. In Press, p. 1-34 34 p., 101477.

    Research output: Contribution to journal › Article › peer-review

  2. Collaborative activity recognition with heterogeneous activity sets and privacy preferences

    Civitarese, G., Ye, J., Zampatti, M. & Bettini, C., 4 Nov 2021, (E-pub ahead of print) In: Journal of Ambient Intelligence and Smart Environments. Pre-press, p. 1-20 20 p.

    Research output: Contribution to journal › Article › peer-review

  3. Investigating multisensory integration in emotion recognition through bio-inspired computational models

    Mansouri Benssassi, E. & Ye, J., 19 Aug 2021, (E-pub ahead of print) In: IEEE Transactions on Affective Computing. Early Access, 13 p.

    Research output: Contribution to journal › Article › peer-review

  4. Continual learning in sensor-based human activity recognition: an empirical benchmark analysis

    Jha, S., Schiemer, M., Zambonelli, F. & Ye, J., 16 Apr 2021, (E-pub ahead of print) In: Information Sciences. In Press, p. 1-35 35 p.

    Research output: Contribution to journal › Article › peer-review

  5. Continual activity recognition with generative adversarial networks

    Ye, J., Nakwijit, P., Schiemer, M., Jha, S. & Zambonelli, F., 27 Mar 2021, In: ACM Transactions on Internet of Things. 2, 2, p. 1-25 25 p., 9.

    Research output: Contribution to journal › Article › peer-review

Related by journal

  1. Radar signatures of drones equipped with heavy payloads and dynamic payloads generating inertial forces

    Rahman, S., Robertson, D. A. & Govoni, M. A., 18 Dec 2020, In: IEEE Access. 8, p. 220542-220556 15 p.

    Research output: Contribution to journal › Article › peer-review

ID: 272509284