Yulin Yang

University of Delaware · Department of Mechanical Engineering

Doctor of Engineering

About

43 Publications · 19,960 Reads
704 Citations
Additional affiliations
July 2015 – November 2015 · University of Delaware
Position: Research Assistant

Publications (43)
Preprint
Full-text available
In this paper, we study in-depth the problem of online self-calibration for robust and accurate visual-inertial state estimation. In particular, we first perform a complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including IMU and camera intrinsics and IMU-camera spatial-temp...
Article
Full-text available
When studying the right invariant error states for a consistent visual-inertial navigation system (VINS), we discover that although invariant extended Kalman filter (EKF) based VINS can consistently model system uncertainties, the features in the state can cause high computational cost in state covariance propagation compared to the standard EKF. To...
Conference Paper
Full-text available
In this paper, we introduce a novel visual-inertial-wheel odometry (VIWO) system for ground vehicles, which efficiently fuses multi-modal visual, inertial and 2D wheel odometry measurements in a sliding-window filtering fashion. As multi-sensor fusion requires both intrinsic and extrinsic (spatiotemporal) calibration parameters which may vary over...
Preprint
Full-text available
Multi-sensor fusion of multi-modal measurements from commodity inertial, visual and LiDAR sensors to provide robust and accurate 6DOF pose estimation holds great potential in robotics and beyond. In this paper, building upon our prior work (i.e., LIC-Fusion), we develop a sliding-window filter based LiDAR-Inertial-Camera odometry with online spatio...
Conference Paper
Full-text available
Batch optimization based inertial measurement unit (IMU) and visual sensor fusion enables high rate localization for many robotic tasks. However, it remains a challenge to ensure that the batch optimization is computationally efficient while being consistent for high rate IMU measurements without marginalization. In this paper, we derive inspirati...
Conference Paper
Full-text available
In this paper, we present an open platform, termed OpenVINS, for visual-inertial estimation research for both the academic community and practitioners from industry. The open sourced codebase provides a foundation for researchers and engineers to quickly start developing new capabilities for their visual-inertial systems. This codebase has out of t...
Article
In this paper, we present a real‐time high‐precision visual localization system for an autonomous vehicle which employs only low‐cost stereo cameras to localize the vehicle with an a priori map built using a more expensive 3D LiDAR sensor. To this end, we construct two different visual maps: a sparse feature visual map for visual odometry (VO) based m...
Chapter
Due to the increasing proliferation of autonomous vehicles, securing robot navigation against malicious attacks has become a matter of urgent societal interest: attackers can fool these vehicles by manipulating their sensors, exposing us to unprecedented vulnerabilities and ever-increasing possibilities for attack. To address this issue...
Conference Paper
Full-text available
In this paper, we present a tightly-coupled monocular visual-inertial navigation system (VINS) using points and lines with degenerate motion analysis for 3D line triangulation. Based on line segment measurements from images, we propose two sliding window based 3D line triangulation algorithms and compare their performance. Analysis of the proposed...
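As a rough illustration of the classical geometry this abstract builds on (not the paper's sliding-window algorithms): each image line back-projects through its 3x4 projection matrix to a plane, and the 3D line can be recovered as the intersection of two such planes. The function names and the least-squares point recovery below are assumptions made for the sketch.

```python
import numpy as np

def backprojected_plane(l_px, P):
    # Back-project an image line l (homogeneous: a*x + b*y + c = 0 in pixels)
    # through the 3x4 projection matrix P; pi = P^T l is a plane [n; d] with n.x + d = 0.
    pi = P.T @ l_px
    return pi / np.linalg.norm(pi[:3])          # normalize so n is a unit normal

def triangulate_line(l1, P1, l2, P2):
    # Recover the 3D line (unit direction v, one point p on the line) as the
    # intersection of the two back-projected planes.
    pi1, pi2 = backprojected_plane(l1, P1), backprojected_plane(l2, P2)
    n1, d1 = pi1[:3], pi1[3]
    n2, d2 = pi2[:3], pi2[3]
    v = np.cross(n1, n2)
    v = v / np.linalg.norm(v)
    # Minimum-norm point satisfying both plane constraints n_i . p = -d_i
    p = np.linalg.lstsq(np.vstack([n1, n2]), -np.array([d1, d2]), rcond=None)[0]
    return v, p
```

When the two back-projected planes are nearly parallel, the cross product above approaches zero and the triangulation becomes ill-conditioned, which is the kind of degenerate configuration the abstract's motion analysis is concerned with.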
Article
In this article, we perform a thorough observability analysis for linearized inertial navigation systems (INS) aided by exteroceptive range and/or bearing sensors (such as cameras, LiDAR, and sonars) with different geometric features (points, lines, planes, or their combinations). In particular, by reviewing common representations of geometric feat...
Article
Full-text available
In this letter, we develop a low-cost stereo visual-inertial localization system, which leverages efficient multi-state constraint Kalman filter (MSCKF)-based visual-inertial odometry (VIO) while utilizing an a priori LiDAR map to provide bounded-error three-dimensional navigation. Besides the standard sparse visual feature measurements used in V...
Conference Paper
Full-text available
This paper presents a tightly-coupled aided inertial navigation system (INS) with point and plane features, a general sensor fusion framework applicable to any visual and depth sensor (e.g., RGBD, LiDAR) configuration, in which the camera is used for point feature tracking and the depth sensor for plane extraction. The proposed system exploits geometr...
Preprint
Full-text available
2019 International Conference on Robotics and Automation (ICRA)
Article
In this paper we present a novel method to perform target tracking of a moving rigid body utilizing an inertial measurement unit (IMU) with cameras. A key contribution is the tight coupling of the target motion estimation within a visual-inertial navigation system (VINS), allowing for improved performance of both processes. In particular, we buil...
Article
In this paper we perform in-depth observability analysis for both spatial and temporal calibration parameters of an aided inertial navigation system (INS) with global and/or local sensing modalities. In particular, we analytically show that both spatial and temporal calibration parameters are observable if the sensor platform undergoes random motio...
Conference Paper
Full-text available
This paper presents the formalization of the closest point plane representation and an analysis of its incorporation in 3D indoor simultaneous localization and mapping (SLAM). We present a singularity-free plane factor leveraging the closest point plane representation, and demonstrate its fusion with inertial preintegration measurements in a grap...
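A minimal sketch of the closest point (CP) idea referenced above, under the assumption that a plane with unit normal n at distance d from the origin is stored as the single 3-vector d·n (the point on the plane closest to the origin); the SVD-based fit and the function name are illustrative, not the paper's plane factor.

```python
import numpy as np

def fit_plane_cp(points):
    # Fit a plane to an (N, 3) point cloud and return its closest-point (CP)
    # parameterization d * n: the point on the plane nearest the origin.
    centroid = points.mean(axis=0)
    # The unit normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = n @ centroid            # signed distance of the plane from the origin
    if d < 0:                   # fix the sign convention so that d >= 0
        n, d = -n, -d
    return d * n                # CP vector; degenerates only for planes through the origin
```

This 3-parameter form avoids the over-parameterization of a separate (normal, distance) pair, which is one motivation for describing the resulting factor as singularity-free.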
Conference Paper
Full-text available
In this paper, we perform observability analysis for inertial navigation systems (INS) aided by generic exteroceptive range and/or bearing sensors with different geometric features including points, lines and planes. While the observability of vision-aided INS (VINS, which uses camera as a bearing sensor) with point features has been extensively st...
Preprint
Full-text available
In this paper, we perform a thorough observability analysis for linearized inertial navigation systems (INS) aided by exteroceptive range and/or bearing sensors (such as cameras, LiDAR and sonars) with different geometric features (points, lines and planes). While the observability of vision-aided INS (VINS) with point features has been extensively...
Conference Paper
Full-text available
Due to the increasing proliferation of autonomous vehicles, securing robot navigation against malicious attacks has become a matter of urgent societal interest: attackers can fool these vehicles by manipulating their sensors, exposing us to unprecedented vulnerabilities and ever-increasing possibilities for attack. To address this issue...
Conference Paper
Full-text available
In SLAM, the size of the state vector tends to grow when exploring unknown environments, causing ever-increasing computational complexity. To reduce the computational cost, one needs to continuously marginalize part of the previous states (features and/or poses). This can be achieved by using either the null-space operation or Schur complement base...
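For context on the marginalization routes named in this abstract, here is a generic sketch of Schur complement based marginalization in information (inverse covariance) form; the partitioning convention and the function name are assumptions, not the paper's specific algorithm.

```python
import numpy as np

def marginalize_schur(Lambda, b, m):
    # Marginalize the first m states out of the Gaussian information form
    # (Lambda, b) via the Schur complement of the marginalized block.
    Lmm, Lmr, Lrr = Lambda[:m, :m], Lambda[:m, m:], Lambda[m:, m:]
    Lmm_inv = np.linalg.inv(Lmm)        # assumes the marginalized block is well conditioned
    Lambda_new = Lrr - Lmr.T @ Lmm_inv @ Lmr
    b_new = b[m:] - Lmr.T @ Lmm_inv @ b[:m]
    return Lambda_new, b_new
```

The null-space operation mentioned alongside it instead projects the measurements onto the left null space of the Jacobian of the states being removed, so the two routes incur different computational costs.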
Conference Paper
Full-text available
In this paper, we introduce a novel acoustic-inertial navigation system (AINS) for Autonomous Underwater Vehicles (AUVs). We aim to reduce the cost and latency of current underwater navigation systems, which typically employ high-accuracy and thus high-cost inertial sensors. In particular, the proposed approach efficiently fuses the acoustic o...
Patent
Full-text available
The invention relates to a connecting and supporting structure for mounting a camera and a total station. The total station comprises a horizontal rotating base and a connected telescope system. The connecting and suppo...
Article
Full-text available
For most off-the-shelf cameras, lens distortion is inevitable. An accurate template-based calibration method for the lens distortion of an area-array digital camera is proposed. A checkerboard template is employed as the standard object, and only one image of the template is needed. The mapping relationship in distortion-free imagin...
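Purely as an illustration of the kind of mapping such a template-based calibration estimates (not the paper's method), below is the standard two-coefficient radial distortion model applied to normalized image coordinates; k1 and k2 are assumed distortion coefficients.

```python
import numpy as np

def distort_radial(xy_norm, k1, k2):
    # Apply a two-coefficient radial distortion model to normalized image
    # coordinates of shape (..., 2); k1 and k2 are assumed coefficients.
    x, y = xy_norm[..., 0], xy_norm[..., 1]
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return np.stack([x * scale, y * scale], axis=-1)
```

Calibrating from a single checkerboard image, as the abstract describes, then amounts to estimating the parameters of some such distortion mapping from the known template geometry.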

Projects (2)
Project
Two goals: 1) observability analysis for inertial navigation systems with generic sensors; 2) observability analysis for aided inertial navigation systems with points, lines, planes, or their combinations.