Multisensory Integration - Science topic

Questions related to Multisensory Integration
Question
8 answers
At the moment I am working with Sentinel-2 images and would like to generate per-pixel angle bands (view_zenith, view_azimuth, sun_zenith and sun_azimuth).
Looking at the Sentinel-2 SAFE structure, I found that MTD_TL.xml (the tile metadata) describes the angle bands as 23x23 grids at 5000 m resolution. The solar zenith and azimuth angles are each stored in a single matrix, while the view zenith and azimuth angles are spread across several matrices, one per band and detector.
Another consideration is that this 23x23 grid (at 5000 m) extends beyond the granule image. On the ESA Sentinel Application Platform (SNAP) forum (https://forum.step.esa.int/t/generate-view-angles-from-metadata-sentinel-2/5598/2) I learned that not all cells of the matrix are actually used.
That said, I would like to know whether there is any code or platform for calculating Sentinel-2 angle bands. I found that SNAP can do this, but it has several memory issues.
Relevant answer
I started developing a script to calculate the angle images.
If anyone needs something similar, it can be found at https://github.com/marujore/sentinel_angle_bands . The only parameter is the path to MTD_TL.xml in the Sentinel ".SAFE/GRANULE" folder. However, the sensor azimuth bands are not yet satisfactory, as can be seen in the attached images.
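For anyone who wants to extract the grids directly without SNAP, here is a minimal Python sketch for reading the sun angle matrices out of MTD_TL.xml. It assumes the standard L1C tile metadata layout (inner elements such as Sun_Angles_Grid and VALUES carry no namespace in those files); the function name and error handling are my own, not taken from the script above.

import xml.etree.ElementTree as ET
import numpy as np

def read_sun_angle_grid(mtd_tl_path, angle="Zenith"):
    # Return the sun angle grid (23x23 values at 5000 m spacing) as a
    # 2-D float array; angle is "Zenith" or "Azimuth".
    root = ET.parse(mtd_tl_path).getroot()
    for elem in root.iter():
        # Match on the local tag name so namespace prefixes are ignored.
        if elem.tag.split("}")[-1] == "Sun_Angles_Grid":
            for child in elem:
                if child.tag.split("}")[-1] == angle:
                    rows = [list(map(float, v.text.split()))
                            for v in child.iter("VALUES")]
                    return np.array(rows)
    raise ValueError("Sun_Angles_Grid/%s not found" % angle)

sun_zenith = read_sun_angle_grid("MTD_TL.xml", "Zenith")
sun_azimuth = read_sun_angle_grid("MTD_TL.xml", "Azimuth")

The view angle grids follow the same Values_List layout but sit under Viewing_Incidence_Angles_Grids elements, one per band and detector, so they have to be mosaicked per band before use. From there the 5000 m grid still needs to be interpolated (e.g. bilinearly) to the pixel grid and cropped to the granule extent.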
Question
1 answer
Has anyone done further research on, or a conceptual clarification of, the "abstract body-model" proposed by Tsakiris, Haggard and Costantini in the attached paper?
Relevant answer
Dear Max, I found the following reference in the bibliography of the thesis of a [very good] physio working at my institute [abstract available in PubMed]. I hope this tiny hint may be somehow useful. Mauro C.
Beschin N, Robertson IH. Personal versus extrapersonal neglect: a group study of their dissociation using a reliable clinical test. Cortex 1997;33:379-84.
Question
8 answers
Many scientists believe such information is deleted by attentional processes, but there is no proof of that.
Relevant answer
To come back to your question about data that is not entered: many researchers now believe that attentional processes are mediated by feedback between lower and higher visual areas. In other words, features and representations enter the attentional domain when recurrent activity develops between higher and lower areas for those features. It is possible that this recurrent activity is the mediator of conscious working memory (see Dehaene's model). So visual input that activates neurons up to the prefrontal cortex, e.g. the frontal eye fields, probably involves phasic responses within 100 ms that die out after 200 ms if they do not attract or engage our attentional systems. Attentional modulation, even down as far as V1, operates mostly after 200 ms.
Interestingly, these unconsciously activated features can later be brought into attention (iconic memory) if no new visual input has caused backward masking. The reason is probably that neural activity lingers on and becomes suppressed only after other neurons are activated.
Question
4 answers
An approach to multi-sensor data fusion.
Relevant answer
I would strongly recommend a look at the work on decentralized (sensor) data fusion by Simon Julier and Jeffrey K. Uhlmann. For starters, see http://dsp-book.narod.ru/HMDF/2379ch12.pdf, or better yet the Handbook of Multisensor Data Fusion.
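To give a concrete flavor of that line of work: one of Julier and Uhlmann's central tools for decentralized fusion is covariance intersection, which fuses two estimates whose cross-correlation is unknown. Below is a minimal Python sketch; the function name, the trace criterion for picking the weight, and the toy numbers are my own illustration rather than code from the handbook.

import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(a, A, b, B):
    # Fuse estimates (a, A) and (b, B) without knowing their
    # cross-correlation; the result is consistent for any correlation.
    Ai, Bi = np.linalg.inv(A), np.linalg.inv(B)
    # Pick the weight w in [0, 1] minimizing the trace of the fused
    # covariance (minimizing the determinant is an equally common choice).
    trace = lambda w: np.trace(np.linalg.inv(w * Ai + (1 - w) * Bi))
    w = minimize_scalar(trace, bounds=(0, 1), method="bounded").x
    C = np.linalg.inv(w * Ai + (1 - w) * Bi)   # fused covariance
    c = C @ (w * Ai @ a + (1 - w) * Bi @ b)    # fused mean
    return c, C

# Toy example: two 2-D position estimates from different sensors.
a, A = np.array([1.0, 2.0]), np.diag([1.0, 4.0])
b, B = np.array([1.5, 1.8]), np.diag([3.0, 1.0])
c, C = covariance_intersection(a, A, b, B)

Unlike a naive Kalman-style update, this stays consistent even when the two estimates share information, which is exactly the situation in decentralized sensor networks.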