Conference Paper

Rapid Autonomous Navigation of Geosynchronous Satellite Using Novel Infrared Earth Measurements

Article
Infrared Earth sensors are widely used in the attitude-determination and control systems of satellites. The main deficiency of static infrared Earth sensors is their small field of view (FOV): a typical FOV of about 20° to 30° may not be sufficient for low-Earth-orbiting micro-satellites. A novel compact infrared Earth sensor with an FOV of nearly 180° is developed here. The Earth sensor comprises a panoramic annular lens (PAL) and an off-the-shelf camera with an uncooled complementary-metal-oxide-semiconductor (CMOS) infrared sensor. The PAL enlarges the FOV so that a complete infrared image of the Earth can be obtained from low Earth orbit. An algorithm is developed to compensate for the distortion introduced by the PAL and to calculate the Earth vector. The new infrared Earth sensor is compact, with low power consumption and high precision. Simulated images and on-orbit infrared images obtained with the micro-satellite ZDPS-2 are used to assess its performance. Experiments show that the accuracy of the Earth sensor is about 0.032°.
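The Earth-vector step lends itself to a simple illustration. The sketch below is not the authors' PAL algorithm: it assumes an ordinary pinhole camera model and a hypothetical brightness threshold, ignores the panoramic-lens distortion, and only estimates the direction toward the Earth's apparent center from a segmented infrared image (Python).

import numpy as np

def earth_vector(ir_image, fx, fy, cx, cy, threshold=0.5):
    """Unit vector toward the Earth's apparent center in the camera frame."""
    # Segment warm Earth pixels from the cold space background (assumed threshold).
    mask = ir_image > threshold
    if not mask.any():
        raise ValueError("Earth disk not visible in the image")
    # Centroid of the Earth disk in pixel coordinates.
    rows, cols = np.nonzero(mask)
    u_c, v_c = cols.mean(), rows.mean()
    # Back-project through a pinhole model (PAL distortion not modelled here).
    x = (u_c - cx) / fx
    y = (v_c - cy) / fy
    vec = np.array([x, y, 1.0])
    return vec / np.linalg.norm(vec)

# Synthetic 128x128 image with a bright disk standing in for the Earth.
vv, uu = np.mgrid[0:128, 0:128]
img = ((uu - 70.0)**2 + (vv - 60.0)**2 < 30.0**2).astype(float)
print(earth_vector(img, fx=100.0, fy=100.0, cx=64.0, cy=64.0))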
Article
A comparative accuracy analysis is performed for two types of scanning IR Earth horizon sensors developed in recent years. The measurement-error contributions of particular error sources, including instability of hardware parameters and the influence of seasonal and latitudinal variations of the Earth's IR radiance, are estimated by computer simulation. Conclusions are drawn on how the results can be applied to reduce the measurement errors of IR Earth horizon sensors currently operating on board satellites.
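The way independent error contributions combine into an overall sensor error can be illustrated with a toy Monte Carlo run. The snippet below is not the paper's simulation; the listed error sources and their 1-sigma magnitudes are placeholders chosen only to show how such a budget is assembled (Python).

import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo samples

# Hypothetical 1-sigma contributions in degrees (placeholder values).
sources = {
    "hardware parameter instability": 0.020,
    "seasonal radiance variation": 0.030,
    "latitudinal radiance variation": 0.025,
    "detector noise": 0.010,
}

total = np.zeros(n)
for name, sigma in sources.items():
    total += rng.normal(0.0, sigma, n)
    print(f"{name:32s} 1-sigma = {sigma:.3f} deg")

# For independent zero-mean sources this approaches the root-sum-square value.
print(f"{'combined (simulated)':32s} 1-sigma = {total.std():.3f} deg")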
Article
Modern CCD cameras are usually capable of a spatial accuracy greater than 1/50 of the pixel size. However, such accuracy is not easily attained due to various error sources that can affect the image formation process. Current calibration methods typically assume that the observations are unbiased, the only error is the zero-mean independent and identically distributed random noise in the observed image coordinates, and the camera model completely explains the mapping between the 3D coordinates and the image coordinates. In general, these conditions are not met, causing the calibration results to be less accurate than expected. In this paper, a calibration procedure for precise 3D computer vision applications is described. It introduces bias correction for circular control points and a nonrecursive method for reversing the distortion model. The accuracy analysis is presented and the error sources that can reduce the theoretical accuracy are discussed. Tests with synthetic images indicate improvements in the calibration results under limited error conditions. In real images, the suppression of external error sources becomes a prerequisite for successful calibration.
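One element mentioned above, reversing the distortion model without recursion, can be sketched as follows. This is a hedged illustration under assumed conditions, not the paper's procedure: a two-coefficient radial model and a least-squares fit of inverse coefficients on a synthetic grid, so that undistorting observed points needs no iteration at run time (Python).

import numpy as np

def distort(xy, k1, k2):
    """Forward radial model: x_d = x * (1 + k1*r^2 + k2*r^4)."""
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2**2)

def fit_inverse(k1, k2, extent=1.0, n=41):
    """Least-squares fit of inverse coefficients p1, p2 on a synthetic grid."""
    g = np.linspace(-extent, extent, n)
    xy = np.array([(x, y) for x in g for y in g])   # ideal (undistorted) points
    xyd = distort(xy, k1, k2)                        # simulated observations
    r2 = np.sum(xyd**2, axis=1, keepdims=True)
    # Inverse model x = x_d * (1 + p1*r_d^2 + p2*r_d^4), linear in p1 and p2.
    A = np.concatenate([(xyd * r2).reshape(-1, 1),
                        (xyd * r2**2).reshape(-1, 1)], axis=1)
    b = (xy - xyd).reshape(-1)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

p1, p2 = fit_inverse(k1=-0.20, k2=0.05)
pt_d = distort(np.array([[0.40, -0.30]]), -0.20, 0.05)
r2d = np.sum(pt_d**2)
print(pt_d * (1.0 + p1 * r2d + p2 * r2d**2))  # approximately recovers (0.40, -0.30)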
Astronautical Guidance
  • R Battin
State estimation method for spacecraft autonomous navigation: Review
  • D Wang
  • B Hou
  • J Wang
High precision extraction of infrared light spot center in vision measurement
  • W Wang
  • Y Shi
  • D Li