Research on transfer learning theory and methods in hybrid network fusion high-precision indoor positioning

  • Ansari, Nirwan (PI)
  • 陈章鑫 (CoPI)
  • 段林甫 (CoPI)
  • 邹晶 (CoPI)
  • 朱世林 (CoPI)
  • 李林 (CoPI)
  • 徐峰 (CoPI)
  • 八木, 裕一郎 (PI)

Project: Research project

Project Details


With the rapid development of Internet of Things technology, the ubiquitous hybrid network environment provides a rich fingerprint space for target positioning. Compared with the positioning environments of homogeneous and heterogeneous networks, hybrid network fingerprints have the following characteristics: (1) location fingerprints exhibit hardware, temporal, spatial, and dimensional differences; (2) location fingerprints are a mixture of labeled and unlabeled data; (3) location fingerprint labels are imprecise, inconsistent, and contradictory; (4) location fingerprints have cross-network and cross-domain distribution differences. In view of these characteristics, this project proposes research on transfer learning theory and methods for hybrid network fusion, mainly including: (1) a manifold alignment method for hybrid network fingerprints; (2) a cross-network and cross-domain mutually constrained self-learning knowledge transfer method; (3) a computationally efficient domain-adaptive transfer learning method; and (4) a robust positioning method for cross-network and cross-domain fusion. The research in this project will overcome the insufficient utilization of environmental information in traditional homogeneous-network fusion positioning, substantially improve the utilization efficiency of the fingerprint space, and establish a new high-precision indoor positioning framework based on hybrid network fusion.
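To illustrate the kind of cross-network distribution shift the project targets, the following is a minimal sketch (not the project's actual method): synthetic RSSI fingerprints from a "source" network with known positions are used to localize readings from a "target" network whose hardware biases and rescales the signal. A simple per-feature statistical alignment of the target fingerprints to the source distribution, followed by k-nearest-neighbor matching, stands in for the domain-adaptive transfer step. All names, the signal model, and the alignment rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_src, n_tgt, d = 200, 50, 5   # fingerprints and number of access points (assumed)

# Shared (hypothetical) propagation model: RSSI falls off linearly with a
# random projection of the 2-D position, plus measurement noise.
W = rng.uniform(0.5, 1.5, (2, d))
src_pos = rng.uniform(0, 10, size=(n_src, 2))          # labeled source positions
src_rssi = -40 - 3 * src_pos @ W + rng.normal(0, 1, (n_src, d))

# Target network: same geometry, but readings are rescaled and biased
# (a stand-in for hardware and environmental differences between networks).
tgt_pos = rng.uniform(0, 10, size=(n_tgt, 2))          # held out, for evaluation only
tgt_rssi = 1.2 * (-40 - 3 * tgt_pos @ W) + 5 + rng.normal(0, 1, (n_tgt, d))

def knn_locate(query, ref_feats, ref_pos, k=3):
    """Estimate a position as the mean of the k nearest reference fingerprints."""
    d2 = ((ref_feats - query) ** 2).sum(axis=1)
    idx = np.argsort(d2)[:k]
    return ref_pos[idx].mean(axis=0)

# Per-feature alignment: match target mean/std to the source distribution.
aligned = (tgt_rssi - tgt_rssi.mean(0)) / tgt_rssi.std(0) \
          * src_rssi.std(0) + src_rssi.mean(0)

def mean_err(feats):
    est = np.array([knn_locate(f, src_rssi, src_pos) for f in feats])
    return np.linalg.norm(est - tgt_pos, axis=1).mean()

err_raw = mean_err(tgt_rssi)       # match raw target readings directly
err_aligned = mean_err(aligned)    # match after distribution alignment
print(f"mean error raw: {err_raw:.2f} m, aligned: {err_aligned:.2f} m")
```

Matching the raw target readings fails because the bias and rescaling push them off the source fingerprint manifold; after the (deliberately crude) alignment, nearest-neighbor matching recovers positions far more accurately, which is the basic effect the project's manifold-alignment and domain-adaptation research aims to achieve in a principled way.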

Effective start/end date: 1/1/86 – 12/31/21


  • National Natural Science Foundation of China: $101,377.00

