Jingjing Gao’s research while affiliated with Hangzhou Dianzi University and other places


Publications (1)


Figures: Flowchart of the proposed method (HSLC) · Intra-class and inter-class discriminative structure preservation · Parameter sensitivity analysis for α · Parameter sensitivity analysis for β · (+3 more)

Hybrid structure with label consistency for unsupervised domain adaptation
Article · Publisher preview available · March 2025 · 2 Reads · Signal, Image and Video Processing

Wantian Zhang · Jingjing Gao · [...]
Unsupervised domain adaptation (UDA) is a technique for learning from a label-rich source domain and transferring the learned knowledge to an unlabeled target domain. Current research on feature-based UDA methods usually uses pseudo labels to find new feature representations that minimize the distribution discrepancy between the two domains. However, inaccurate pseudo labels may prevent the precise intrinsic structures from being exploited, leading to poor performance. In addition, some theoretical work suggests that the transferability of features may be compromised while feature representations are being learned. To address these problems, we propose hybrid structure with label consistency (HSLC) for UDA. First, in a dynamically updated low-dimensional space, HSLC adaptively captures the local connectivity of the target data with a local manifold self-learning strategy and explores the discriminative information of the source domain by minimizing the intra-class distance. Pseudo labels for the target domain are then obtained by class centroid matching. Furthermore, we exploit between-domain and within-domain label consistency by training multiple class-wise domain classifiers to reweight target samples, which improves the quality of the pseudo labels by accounting for between-domain sample correlation and the geometric structure of the target domain. Finally, the refined pseudo labels are used to maximize the inter-class distance across the two domains, which both reduces the impact of inaccurate pseudo labels on preserving the discriminative structure and helps to explore various intrinsic properties. Extensive experiments on benchmark datasets demonstrate that our method is competitive with state-of-the-art UDA methods.
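The pseudo-labeling and reweighting steps described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes pre-extracted features in a shared space, plain Euclidean distances for class centroid matching, and a scikit-learn logistic regression as the class-wise domain classifier; the function names (centroid_matching_pseudo_labels, classwise_domain_weights) are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def centroid_matching_pseudo_labels(Xs, ys, Xt, num_classes):
    """Pseudo-label target samples by nearest source class centroid.

    Xs: (n_s, d) source features, ys: (n_s,) source labels
    Xt: (n_t, d) target features
    """
    # One centroid per source class in the shared feature space.
    centroids = np.stack([Xs[ys == c].mean(axis=0) for c in range(num_classes)])
    # Euclidean distance from each target sample to each class centroid.
    dists = np.linalg.norm(Xt[:, None, :] - centroids[None, :, :], axis=2)
    # Each target sample takes the label of its nearest centroid.
    return dists.argmin(axis=1)


def classwise_domain_weights(Xs, ys, Xt, yt_pseudo, num_classes):
    """Reweight target samples with per-class domain classifiers.

    For each class, a binary classifier separates source (0) from target (1)
    samples of that class; a target sample's predicted probability of being
    source-like serves as a confidence weight for its pseudo label.
    """
    weights = np.ones(len(Xt))
    for c in range(num_classes):
        Xs_c, Xt_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(Xs_c) == 0 or len(Xt_c) == 0:
            continue
        X = np.vstack([Xs_c, Xt_c])
        d = np.concatenate([np.zeros(len(Xs_c)), np.ones(len(Xt_c))])
        clf = LogisticRegression(max_iter=1000).fit(X, d)
        # Probability of "looking like source": higher means the pseudo
        # label for this target sample is more trustworthy.
        weights[yt_pseudo == c] = clf.predict_proba(Xt_c)[:, 0]
    return weights
```

In the method described above, such weights would then modulate the discriminative terms computed from the refined pseudo labels; here they simply score how trustworthy each target pseudo label appears.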
