Multisensory Integration - Science topic
Questions related to Multisensory Integration
At the moment I am working with Sentinel-2 images and would like to generate per-pixel angle bands (view_zenith, view_azimuth, sun_zenith and sun_azimuth).
This is my best attempt so far: https://github.com/marujore/sentinel2_angle_bands
Looking at the Sentinel-2 SAFE structure, I found that MTD_TL.xml (the tile metadata) describes the angle bands as 23x23 grids at 5000 m resolution. The solar zenith and azimuth angles are each stored in a single matrix, while the view zenith and azimuth angles are distributed across several matrices, one per band and detector.
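For reference, this is roughly how I am reading the solar grids at the moment (a minimal sketch; the element names Sun_Angles_Grid, Zenith, Azimuth and Values_List are taken from the L1C tile metadata files I have, so they may need adjusting for other processing baselines):

```python
import xml.etree.ElementTree as ET
import numpy as np

def parse_values_list(values_list):
    """Turn a <Values_List> node (rows of space-separated floats) into a 2-D array."""
    rows = [row.text.split() for row in values_list.findall("VALUES")]
    return np.array(rows, dtype=float)  # "NaN" strings become np.nan

def read_sun_angle_grids(mtd_tl_path):
    """Return the solar zenith and azimuth grids (23x23 at 5000 m spacing)."""
    root = ET.parse(mtd_tl_path).getroot()
    sun = root.find(".//Sun_Angles_Grid")
    zenith = parse_values_list(sun.find("Zenith/Values_List"))
    azimuth = parse_values_list(sun.find("Azimuth/Values_List"))
    return zenith, azimuth
```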
Another consideration is that this 23x23 grid (at 5000 m) is larger than the granule image: 23 nodes at 5000 m spacing span 110 km, slightly more than the ~109.8 km tile footprint. From the ESA Sentinel Application Platform (SNAP) forum (https://forum.step.esa.int/t/generate-view-angles-from-metadata-sentinel-2/5598/2) I learned that not all cells of this grid are actually used.
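For the view angles, my current idea is to overlay the per-detector grids of each band, since cells outside a detector's footprint are written as NaN in the XML. This is only a sketch and assumes the MTD_TL.xml element and attribute names Viewing_Incidence_Angles_Grids and bandId, plus a fixed 23x23 grid size:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict
import numpy as np

def read_view_angle_grids(mtd_tl_path, angle="Zenith"):
    """Mosaic the per-detector viewing-angle grids into one 23x23 grid per bandId.
    Detector grids are overlaid by copying only their non-NaN cells; cells that
    no detector covers (including those beyond the granule) stay NaN."""
    root = ET.parse(mtd_tl_path).getroot()
    merged = defaultdict(lambda: np.full((23, 23), np.nan))
    for grid in root.iter("Viewing_Incidence_Angles_Grids"):
        band = grid.get("bandId")
        rows = grid.find(f"{angle}/Values_List").findall("VALUES")
        values = np.array([r.text.split() for r in rows], dtype=float)
        covered = ~np.isnan(values)
        merged[band][covered] = values[covered]
    return dict(merged)
```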
That said, I would like to know whether there is any code or platform that can compute the Sentinel-2 angle bands per pixel. I found that SNAP can do it, but it runs into serious memory issues for me.
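In the meantime, what I have in mind is a plain NumPy/SciPy bilinear interpolation from the 23x23 grid to the pixel grid, processed in row chunks to keep memory down (the problem I ran into with SNAP). This is only a sketch: it assumes a gap-free input grid (the solar grids, or the view grids after detector mosaicking) and that the grid nodes start at the tile's upper-left corner with a 5000 m step:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def resample_angles(angle_grid, out_size=10980, pixel_size=10.0,
                    grid_step=5000.0, chunk_rows=512):
    """Bilinearly interpolate a coarse angle grid (e.g. 23x23 at 5000 m)
    to per-pixel resolution, filling the output in row chunks."""
    rows, cols = angle_grid.shape
    interp = RegularGridInterpolator(
        (np.arange(rows) * grid_step, np.arange(cols) * grid_step),
        angle_grid, method="linear", bounds_error=False, fill_value=None)

    out = np.empty((out_size, out_size), dtype=np.float32)
    x = (np.arange(out_size) + 0.5) * pixel_size   # pixel-centre x offsets
    for r0 in range(0, out_size, chunk_rows):
        r1 = min(r0 + chunk_rows, out_size)
        y = (np.arange(r0, r1) + 0.5) * pixel_size  # pixel-centre y offsets
        yy, xx = np.meshgrid(y, x, indexing="ij")
        out[r0:r1] = interp(np.stack([yy, xx], axis=-1))
    return out
```

So if someone already has a working, lighter-weight tool for this, or sees a problem with the approach above, I would be glad to hear about it.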
Has anyone done further research on, or offered a conceptual clarification of, the "abstract body-model" proposed by Tsakiris, Haggard and Costantini in the attached paper?
Many scientists believe that such information is deleted by attentional processes, but there is no proof of this.