Science method
Optical Imaging - Science method
Explore the latest questions and answers in Optical Imaging, and find Optical Imaging experts.
Questions related to Optical Imaging
I have a video of the Brownian motion of microbes under a microscope. From this video I need to calculate the angular velocity of a particular prominent cell. My understanding is that I need to identify some points within the cell, track their locations across successive frames, and obtain the velocity by dividing the distance traveled by each point by the elapsed time.
Now my question: given the quality of the video I have, what is the best way to track points in it? The video quality is such that manual identification of points at different time snapshots is not possible.
Attached a particular time snapshot of the video as an image here.
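When frames are too noisy for manual point picking, one robust baseline is normalized cross-correlation (NCC) template tracking: select a distinctive patch inside the cell once, then find its best match within a small search window in each subsequent frame. Below is a minimal NumPy sketch (the function name and the local-search strategy are my own simplifications; libraries such as trackpy or OpenCV's tracking API implement more robust versions):

```python
import numpy as np

def track_template(frames, template, start_xy, search=10):
    """Track a small template patch through frames by normalized
    cross-correlation within a local search window around the last
    known position (a robustness sketch for noisy video; real data
    may need denoising first)."""
    th, tw = template.shape
    y, x = start_xy                    # top-left corner of the patch
    path = [(y, x)]
    for frame in frames[1:]:
        best, best_yx = -np.inf, (y, x)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                yy, xx = y + dy, x + dx
                patch = frame[yy:yy + th, xx:xx + tw]
                if patch.shape != template.shape:
                    continue           # skip windows outside the frame
                a = patch - patch.mean()
                b = template - template.mean()
                denom = np.sqrt((a * a).sum() * (b * b).sum())
                score = (a * b).sum() / denom if denom else -np.inf
                if score > best:
                    best, best_yx = score, (yy, xx)
        y, x = best_yx
        path.append((y, x))
    return np.array(path)
```

Tracking two such points on the same cell gives an orientation angle theta(t) = atan2(y2 - y1, x2 - x1) per frame, and the angular velocity follows as delta(theta) divided by the frame interval.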

2024 4th International Conference on Image Processing and Intelligent Control (IPIC 2024) will be held from May 10 to 12, 2024 in Kuala Lumpur, Malaysia.
Conference Website: https://ais.cn/u/ZBn2Yr
---Call For Papers---
The topics of interest for submission include, but are not limited to:
◕ Image Processing
- Image Enhancement and Recovery
- Target Detection and Tracking
- Image Segmentation and Labeling
- Feature Extraction and Image Recognition
- Image Compression and Coding
......
◕ Intelligent Control
- Sensors in Intelligent Photovoltaic Systems
- Sensors and Laser Control Technology
- Optical Imaging and Image Processing in Intelligent Control
- Fiber Optic Sensing Technology in Intelligent Photoelectric Systems
......
All accepted papers will be published in conference proceedings, and submitted to EI Compendex, Inspec and Scopus for indexing.
Important Dates:
Full Paper Submission Date: April 19, 2024
Registration Deadline: May 3, 2024
Final Paper Submission Date: May 3, 2024
Conference Dates: May 10-12, 2024
For More Details please visit:
Invitation code: AISCONF
*Using the invitation code on the submission/registration system gets you priority review and feedback.

The optical image of the silver nanowires shows both white and blue colored nanowires when viewed under an optical microscope.
Specifications:
1. Reflective mode
2. Bright field.
3. Magnification: 1000x.

This material is AISI 4140 or SCM440.
What is the white etching part in this optical image?


Can MATLAB's imcontour function be used on holograms as well?
The image below is an example of an activated sludge cell under a light microscope and the second figure is an example of another cell.




Hello;
It is well known that when light reaches an optical element, part of it is lost through absorption, diffusion, and back reflection. In the case of mirrors, this value is well characterized and a realistic estimate would be around 4-5% (or less, depending on the material). However, I cannot find similar information on commercial or scientific sites for beam splitters. For example, for one well-known optical products company, if we examine the raw data, the percentages of reflected and transmitted light add up to more than 100% at some points on the curve! Without a doubt this has to do with the measurement methodology.
In scientific articles, some estimate this absorption to be around 2%, assuming the splitter is a block or sheet of a certain material (ignoring ghost images). However, this does not make sense, since it would then be more interesting to use a dichroic beam splitter than a mirror in certain circumstances.
Of course everything will depend on the thickness, the material used, and the AR coating. However, I cannot find a single example and cannot even establish the order of magnitude. Does anyone know of a reference with a realistic estimate of the useful light lost when using a beam splitter of whatever characteristics?
Thanks !
I have a TerraSAR-X spotlight image and a WorldView-2 optical image, and I want to extract soil moisture or soil roughness information from them at regional scale. Many soil moisture models require field measurements to calibrate their parameters; in this case, however, no field measurements are available. I would therefore like to know of any soil moisture model, or any relationship between radar backscatter and soil moisture plus surface roughness, that does not require field measurements.
I'm working with different satellite imagery and I was wondering if there is a way to see what units (DN, radiance, reflectance) the pixel values are in. Currently I'm using ENVI, QGIS and SNAP working with KOMPSAT-3 and WorldView data.
Is there a way to tell if the images have been calibrated or not?
I am planning to use Y-90 and Cu-64 for some radiolabeling experiments. Is it possible to detect the Cerenkov signals from these solutions in a ChemiDoc system?
My purpose is to obtain preliminary results for screening purposes and then we will be sending out samples to another facility that is equipped with IVIS imagers.
Hi everyone, I am using a reflective phase-only SLM as the phase modulation source.
I have read some papers on the influence of the incident angle: as the incident angle grows, the phase modulation shows more distortion, so it is better to keep the incident angle below 10 degrees. Over the past week I have tried the two usual configurations, with and without a beam splitter, and here is my problem.
1. With the beam splitter, the incident angle on the SLM is 0 degrees. However, I find that a second image appears! It seems to come from a reflection path (back along the original path) caused by the beam splitter. When I put the pattern onto the SLM, the two images look similar, but the second one is distorted.
2. So I removed the beam splitter and used a tilted incident angle. The lens after the SLM then blocks the beam when the incident angle is small... but the second image disappears.
So I am stuck here. Could anyone offer suggestions from your experience with SLMs? Thanks in advance!
We are looking for glass-bottom 96-well plates suitable for high content imaging. As these are quite expensive I was looking for alternative suppliers and came across Cellvis who sell them at a more reasonable price. Has anyone used these plates for high content or similar applications and would be able to provide some feedback? Specifically with regards to variations in bottom thickness, bottom flatness, batch-to-batch variation, and suitability for standard TC procedures? Thank you,
Stefan
Mostly, CCD cameras are used for speckle measurements. I am interested in doing these measurements with a CMOS camera; are there any characteristics of CMOS cameras that can affect my measurements?
For example, lower sensitivity due to the smaller fill factor, higher temporal noise, higher pattern noise, higher dark current, and the nonlinear characteristic curve are primarily mentioned.
After recording the speckle pattern, the next step is to measure the speckle contrast quantitatively and accurately. I am not sure how many pixels of an image should be used for this calculation. The speckles were recorded under darkroom conditions.
It would be great if someone could advise from experience or recommend a research paper on this specific issue. I searched but was unable to find any related papers or research work.
Thanks in advance!
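On the window-size question: speckle contrast is usually computed as K = sigma/mean over small sliding windows, with 5x5 or 7x7 being common choices in the laser speckle contrast imaging literature (large enough for a stable statistic, small enough to preserve spatial resolution). A NumPy sketch with the window size as a parameter, so the trade-off can be tested directly (the function name is mine):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast_map(img, win=7):
    """Local speckle contrast K = std/mean over win x win sliding
    windows. Fully developed speckle gives K near 1; motion or
    integration blurring lowers K toward 0."""
    img = np.asarray(img, dtype=np.float64)
    w = sliding_window_view(img, (win, win))   # (H-win+1, W-win+1, win, win)
    mean = w.mean(axis=(-1, -2))
    std = w.std(axis=(-1, -2))
    return std / np.maximum(mean, 1e-12)       # guard against division by 0
```

A uniform image yields K = 0 everywhere, while an intensity field with exponential statistics (fully developed speckle) yields K close to 1, which is a useful sanity check on the chosen window size.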
I need an accurate, real-time algorithm to register optical and SAR images. Can anyone help?
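As a real-time baseline: when the two images differ mainly by a translation, FFT-based phase correlation recovers the shift very quickly. SAR/optical pairs are multimodal, so in practice one applies this to gradient or edge images, or moves to mutual-information or feature-based methods; the sketch below (my own minimal version) shows only the translation-estimation core:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) such that b == np.roll(a, (dy, dx)),
    via phase correlation: the normalized cross-power spectrum of a
    translated pair is a pure phase ramp whose inverse FFT is a peak
    at the shift."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = np.conj(A) * B
    R /= np.maximum(np.abs(R), 1e-12)          # keep only the phase
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = a.shape
    if dy > h // 2:                            # unwrap to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```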
Hi there,
I hope you are doing well.
I am working with imaging systems. I am confused about the effects of a linear polarizer in such systems (I mean: how can a linear polarizer improve the resolution?), and why is working with one polarization better than with two polarizations in image processing systems?
Bests,
Hi everyone. I have run into a troublesome case. I want to acquire the Fourier spectrum of an optical image for post-processing. It is known that most of the energy of an image concentrates in the low-frequency domain, so saturation inevitably occurs because of the strong contrast between the low- and high-frequency regions.
I tried reducing the laser intensity to mitigate this, but the intensity in the high-frequency regions is also reduced, introducing noise and even leaving no light at all in those areas.
Is there any better solution? Could you give me some suggestions? Thanks in advance!
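One common practical workaround is high-dynamic-range acquisition: record the Fourier plane at several laser intensities (or exposure times), then merge, taking each pixel from the strongest capture that is not saturated and dividing by its scale factor. A sketch of the merge step (the function name and the saturation threshold are my assumptions):

```python
import numpy as np

def merge_exposures(captures, scales, sat=0.95):
    """Merge several captures of the same Fourier plane, recorded at
    different intensity scale factors, into one high-dynamic-range
    estimate. Each pixel is taken from the longest (largest-scale)
    capture that is below the saturation level, then rescaled."""
    out = np.zeros_like(captures[0])
    filled = np.zeros(captures[0].shape, dtype=bool)
    order = np.argsort(scales)[::-1]           # longest exposure first
    for i in order:
        ok = (captures[i] < sat) & ~filled
        out[ok] = captures[i][ok] / scales[i]
        filled |= ok
    # pixels saturated even at the shortest exposure: best available guess
    out[~filled] = captures[order[-1]][~filled] / scales[order[-1]]
    return out
```

The strong DC term then stays quantitative while the weak high-frequency lobes are captured above the noise floor.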
I am requesting expert advice on determining camera field of view and on data-processing considerations, for the purpose of proximal Structure-from-Motion (SfM) 3D volumetric reconstruction of 0-2 m tall plants in an agricultural field, imaged from a platform moving at 1.5 mph, using color cameras and the pinhole camera model.
This is because in the Maricopa Phenotyping, Plant Group Phenotyping Team 2019 experimentation under the leadership of Dr. Thompson, we plan to image cotton plants using three Nikon 1 AW1 16 MP action cameras, triggered together at 1 Hz, one camera mounted in nadir view and two oblique on either side of the crop row. We plan to process the images using PhotoScan.
However, as the final (adjustable) camera mounting positions are created via new square-tube arms on Professor PSC, I seek additional input on how to suspend, point, and set the cameras optimally, and so support success in the expected subsequent large-volume SfM processing.
Please see iteration two in the Project update, “A second year Professor – Tenure Track?”
I have seen some researchers use MATLAB in combination with ImageJ for that purpose, while others used scripting languages as well. Does this mean that ImageJ cannot do all the necessary tasks alone with all its free plugins?
While analyzing the trajectories of 40 nm Au nanoparticles diffusing on a surface (dark-field optical imaging), I often end up with trajectories that cross. As a result, the trajectory of a single particle is identified as several discrete trajectories by the algorithm. I use the standard centre-of-mass method for tracking. Is there a better technique that can handle crossed trajectories by taking into account the velocity/directionality or other parameters of a moving particle? Since this is a random walk, however, that may be difficult. The particles are quite faint and the background noise is appreciable even after post-processing, so the method also has to be robust in locating two closely spaced particles.
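One standard upgrade over nearest-neighbour linking is to predict each particle's next position from its recent velocity and link it to the detection nearest that prediction; Kalman-filter and multiple-hypothesis trackers generalize this idea. A minimal greedy sketch (the names and the constant-velocity model are my own simplifications; note that for a pure random walk the velocity prior helps little, and for faint spots radial-symmetry or MLE localization is more robust than centre-of-mass):

```python
import numpy as np

def link_with_velocity(tracks, detections, max_dist=5.0):
    """Extend each track (a list of (y, x) points) by the unused
    detection closest to its constant-velocity prediction. Tracks
    whose prediction has no detection within max_dist are left
    unextended (candidate gap-closing later)."""
    used = set()
    for tr in tracks:
        p = np.asarray(tr[-1], dtype=float)
        v = p - np.asarray(tr[-2], dtype=float) if len(tr) > 1 else np.zeros(2)
        pred = p + v                           # constant-velocity prediction
        best_j, best_d = None, max_dist
        for j, d in enumerate(detections):
            if j in used:
                continue
            dist = np.linalg.norm(np.asarray(d, dtype=float) - pred)
            if dist < best_d:
                best_j, best_d = j, dist
        if best_j is not None:
            used.add(best_j)
            tr.append(tuple(detections[best_j]))
    return tracks
```

At a crossing, the two predictions point to opposite sides, so the identities are kept where position-only linking would swap them.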
In the ideal case, an optically cleared sample exhibits no light scattering, hence polarisation and coherence are preserved. Would it be possible to create interference of the illumination within an optically cleared sample?
Or, in the case of patterned light, how far would the pattern survive inside the sample?
I want to code and process it from scratch, using the available libraries. Which methods are appropriate when one has only a single image?
I am looking for a good program to calculate the penetration depth of light as a function of various parameters. Any suggestions?
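If a closed-form estimate is enough, the diffusion approximation gives an effective penetration depth directly from the absorption and reduced scattering coefficients; dedicated programs (e.g. Monte Carlo codes such as MCML) are needed only when the geometry matters. A sketch, with illustrative (not authoritative) coefficient values of the order found for soft tissue in the near infrared:

```python
import numpy as np

def penetration_depth_mm(mu_a, mu_s_prime):
    """Effective optical penetration depth in the diffusion
    approximation: delta = 1 / sqrt(3 * mu_a * (mu_a + mu_s')),
    with both coefficients in 1/mm. Valid when scattering dominates
    absorption (mu_s' >> mu_a)."""
    return 1.0 / np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))

# illustrative values: mu_a ~ 0.02/mm, mu_s' ~ 1.0/mm gives delta ~ 4 mm
delta = penetration_depth_mm(0.02, 1.0)
```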
I found in a preliminary study that there is a relationship between focal depth and slip in local study in Fiji islands region,
If a subject, such as soft tissue of the human body, is exposed to a certain light, how can we measure the penetration depth of that light in the subject practically, and NOT by calculation?
Hi everyone,
What configuration of lenses is utilized to make light sheets for imaging purposes?
Spectral matting is a method that permits segmentation of the foreground while taking all of its details into consideration.
Sentinel - 1 product types are:
SH (single HH polarisation)
SV (single VV polarisation)
DH (dual HH+HV polarisation)
DV (dual VV+VH polarisation)
Which polarisation is useful for surface deformation studies?
diffuse grey radiative surface
Is there a version of optical arrays that can capture image intensities over a continuous region, rather than the discrete sampling done in modern digital cameras in the form of pixels?
I would like to use annular illumination, with either an LED ring illuminator or an optical fiber bundle, for uniform (or relatively uniform) illumination of a disk-shaped surface a few cm in diameter.
I am working to establish the optical density of muscle fibers (types I, II, IIa & IIx) and am having some difficulty locating the function to do so in ImageJ.
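For background on the computation itself: uncalibrated optical density from transmitted-light intensities is OD = log10(I_blank / I); in ImageJ the Analyze > Calibrate dialog maps gray values to calibrated OD via a step tablet. A sketch of the uncalibrated version (the default blank level of 255 assumes an 8-bit image and is my assumption):

```python
import numpy as np

def optical_density(gray, i_blank=255.0):
    """Uncalibrated optical density from transmitted-light gray
    values: OD = log10(I_blank / I). Intensities are clipped to
    [1, I_blank] to avoid log of zero. Absolute OD requires a
    step-tablet calibration."""
    gray = np.clip(np.asarray(gray, dtype=float), 1.0, i_blank)
    return np.log10(i_blank / gray)
```

Mean OD per fiber is then the mean of this map over each fiber's ROI.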
We're using G.652 optical fiber, which requires a chromatic dispersion compensator, but the latency is increasing. If we use G.655 fiber instead, would the latency improve?
I want to image latex beads under a phase contrast microscope. I have a suspension of latex beads from Sigma, and I would like to know a good procedure for preparing a slide with the beads under a cover slip. I have tried drying the suspension on the slide, adding a drop of PBS, and covering with a cover slip, but I have difficulty getting a good sample this way. Could someone suggest a better preparation procedure?
Regards
Hello everyone,
Does confocal microscopy provide the absorption cross section per unit chlorophyll a? What is the best method for bio-optical measurement?
Does anyone have experience using the Satlantic HyperOCR product?
Thank you,
Uyen Nguyen
I aim to use laser rangefinding instruments to detect the presence and distance of vessels in the water, so that fishermen are aware of their surroundings. I think a laser will help me with this, but I would still like to know whether there are other sources that could serve my purpose even more efficiently.
Suppose I culture cells on a glass-bottom dish for imaging. The cells are adhesive and tend to spread on the glass substrate. I wonder how large the gap can be between the cell membrane and the glass substrate the cell is attached to.
Actually, I had been assuming the gap was negligible, since the cells are literally 'attached' to the glass substrate. However, I was recently told there might be a gap of at least ~200 nm between the membrane and the glass, even when the cell is firmly attached. Since my experiment is sensitive at the 10 nm scale, I would like to be sure about the actual size of the gap, if any.
Of course this depends on the cell shape and may vary locally within a single cell, but I would like an overall idea for the case of the large leading edge of an adhesive cell.
Or, if you can share any idea about measuring the distance, I would appreciate it. Thank you for reading.
Hello, dear researchers. I have a modulation system which can give me the backscattered (reflection) and forward-scattered (transmission) Stokes parameters of the particle under study. According to the Polarization Guide by Edward Collett, the Stokes parameters of elliptically polarized light are given as:
S0 = Ex0^2 + Ey0^2
S1 = Ex0^2 - Ey0^2
S2 = 2*Ex0*Ey0*cos(delta)
S3 = 2*Ex0*Ey0*sin(delta)
where Ex0 and Ey0 are the amplitudes of the orthogonal scattered E-field components, and delta is the optical retardation due to the material (the particle under study).
The absorption Stokes vector can be found from the reflection and transmission Stokes vectors, and from all three we can find the real and imaginary parts of the orthogonal E-field components, with delta (the phase angle between them) obtained from S2 and S3.
Now my question: with the above-mentioned quantities, can I find the scattering cross section? I used wavelengths from 400 nm to 700 nm.
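One easy-to-automate consistency check: for fully polarized light in the convention above, S0^2 = S1^2 + S2^2 + S3^2, i.e. the degree of polarization is 1. The sketch below reproduces the formulas and that check. Note that a scattering cross section additionally requires absolute intensity calibration (e.g. against a reference particle via Mie theory); the Stokes vector alone fixes only the polarization state, not the absolute scattered power:

```python
import numpy as np

def stokes_from_fields(ex0, ey0, delta):
    """Stokes parameters of fully polarized light (Collett's
    convention, matching the formulas in the question)."""
    s0 = ex0 ** 2 + ey0 ** 2
    s1 = ex0 ** 2 - ey0 ** 2
    s2 = 2 * ex0 * ey0 * np.cos(delta)
    s3 = 2 * ex0 * ey0 * np.sin(delta)
    return s0, s1, s2, s3

def degree_of_polarization(s0, s1, s2, s3):
    """DOP = sqrt(S1^2 + S2^2 + S3^2) / S0; equals 1 for fully
    polarized light, less than 1 after depolarizing scattering."""
    return np.sqrt(s1 ** 2 + s2 ** 2 + s3 ** 2) / s0
```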
How do VV, VH, HH and HV polarized waves interact with smooth, rough, vertically extended objects and flat surfaces? Why do we prefer HV for above-ground biomass (AGB) of forest cover? One reason is volume scattering / depolarization; please explain the other reasons as well.
Dear Researchers,
I am new to this imaging field, so I have one question:
(1) Why do people use anti-Stokes emission for imaging?
Generally, anti-Stokes emission is weaker than Stokes emission (it is a 2- or 3-photon process), right?
Then why not use Stokes emission for imaging?
E.g. in the commonly used Er case, excitation at 800 nm gives strong Stokes emission at 980 nm.
As far as I know, this fulfills all the criteria needed for imaging: the optical window of body tissue, detector sensitivity (readily available Si detectors), availability of an excitation source (strong 800 nm lasers), and so on.
Can anybody explain the proper purpose of anti-Stokes emission in imaging?
I want to evaluate biofilm on a plastic surface, possibly with a confocal microscope, but first I need to know whether the laser light can pass through the plastic.
thanks
I have no references for optical microscope images of hydroxyapatite. The optical-studies papers I found on HAp report only PL and SEM, but I want optical microscope images of hydroxyapatite. Can anyone clarify this for me?
Thank you
Suppose the limit of an optical microscopy system is to resolve the finest element in group 7 of a resolution chart, whose bar width is ~2.19 um.
Does this mean the resolution of the system is 2.19 um, or double that (4.38 um)?
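For a USAF-1951 chart the numbers can be computed directly: the spatial frequency of group G, element E is 2^(G + (E-1)/6) line pairs per mm, and a bar width of ~2.19 um corresponds to group 7, element 6, whose full line-pair period is twice the bar width. Whether one quotes 2.19 um (bar width) or 4.38 um (line-pair period) as "the resolution" is a convention choice, which is exactly the source of the confusion. A small sketch:

```python
def usaf_resolution(group, element):
    """USAF-1951 target: spatial frequency in line pairs/mm and the
    corresponding line-pair period and bar width in micrometres."""
    lp_per_mm = 2 ** (group + (element - 1) / 6)
    period_um = 1000.0 / lp_per_mm        # one line pair (bar + space)
    bar_width_um = period_um / 2.0        # width of a single bar
    return lp_per_mm, period_um, bar_width_um
```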
Hi,
We are planning to build a high-finesse (>2000) Fabry-Perot cavity with mirror reflectivity of 99.93%, in order to stabilize an ECDL. This FP cavity will later be used for PDH locking, and we would like to know what AR coating is necessary for the rear side of the mirrors, since we are about to order them.
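For reference, the finesse implied by the quoted reflectivity can be checked directly: F = pi * (r1*r2)^(1/4) / (1 - sqrt(r1*r2)) for intensity reflectivities r1 and r2, which for R = 99.93% on both mirrors gives roughly F ~ 4500, comfortably above the 2000 target (mirror losses and the rear-surface AR quality will reduce this somewhat). A sketch:

```python
import numpy as np

def cavity_finesse(r1, r2):
    """Finesse of a two-mirror Fabry-Perot cavity from the mirror
    intensity reflectivities (lossless approximation):
    F = pi * (r1*r2)**0.25 / (1 - sqrt(r1*r2))."""
    rho = np.sqrt(r1 * r2)
    return np.pi * np.sqrt(rho) / (1.0 - rho)
```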
After taking the copper foil out of the quartz tube, I took optical images, but I cannot even see the copper grains. Can anyone suggest what this is? I am using methane as the carbon source in an Ar and H2 atmosphere. An optical image is attached here.
From sub-pixel correlation of optical imagery, the migration of sand dunes in a river bed can be analyzed. How does this analysis enable me to suggest a suitable site for the construction of a bridge?
Grayscale images, general microscope.
What is the most common way to estimate whether an image is in focus?
Can you recommend an effective algorithm for automatic focusing?
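A very common focus metric is the variance of the Laplacian: defocus suppresses high spatial frequencies, so the metric peaks at best focus, and autofocus reduces to a 1-D search (coarse-to-fine or hill-climbing) over the focus axis. A NumPy sketch using a 4-neighbour Laplacian with periodic boundaries (my simplification; Tenengrad and normalized-variance metrics are common alternatives):

```python
import numpy as np

def laplacian_variance(img):
    """Variance-of-Laplacian focus measure: sharper (in-focus) images
    contain more high-frequency content and score higher."""
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()
```

In use, one acquires frames at a series of focus positions and moves toward the position maximizing this score.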
I am attaching some optical images, one of the base matrix and the other of the reinforced material. Is it possible to judge the increase in porosity from these images?

The distance from the proximal to the distal mirrors will be about 0.5 meters.
I am currently looking for a model that can be used for simulations, and thus a non-computer-graphics algorithm/model, to depict the pathway of light in multi-layered skin and tissue targets.
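The standard non-graphics approach here is Monte Carlo photon transport, as in the widely used MCML code for multi-layered tissue. The toy sketch below captures only the core loop for a single slab with isotropic scattering; a realistic skin model adds multiple layers, the Henyey-Greenstein phase function, refractive-index mismatch at the boundaries, and photon weighting instead of single-event absorption:

```python
import numpy as np

def mc_slab(n_photons, mu_a, mu_s, thickness, seed=0):
    """Toy Monte Carlo photon transport in one slab with isotropic
    scattering. Tracks only the depth coordinate z (sufficient for
    isotropic scattering, where cos(theta) is uniform in [-1, 1]).
    Returns (reflected, transmitted, absorbed) photon fractions.
    Coefficients mu_a, mu_s and thickness share one length unit."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s                       # total interaction coefficient
    counts = np.zeros(3)                     # [reflected, transmitted, absorbed]
    for _ in range(n_photons):
        z, uz = 0.0, 1.0                     # enter at z=0 heading down
        while True:
            z += uz * rng.exponential(1.0 / mu_t)   # free path to next event
            if z < 0.0:
                counts[0] += 1               # escaped back through the surface
                break
            if z > thickness:
                counts[1] += 1               # transmitted through the slab
                break
            if rng.random() < mu_a / mu_t:
                counts[2] += 1               # absorbed at this interaction
                break
            uz = rng.uniform(-1.0, 1.0)      # isotropic scatter: new direction
    return counts / n_photons
```

Recording the absorption depth instead of just counting it yields the depth-resolved fluence that defines an empirical penetration depth.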
Hi all,
I’m planning on performing voltage-sensitive dye imaging experiments in cortical brain slices to study neuronal-astrocytic interactions.
Since blue dyes have longer wavelengths (>600 nm), and consequently reduced light scattering, I was thinking of using RH-1691 from Optical Imaging Ltd, but the VSD signal I observe is very small or absent.
1. Although these dyes are optimized for in vivo recordings, do you think they could also be useful for in situ functional VSD imaging in mouse/rat cortical brain slices? If not, what dye(s) would be more useful?
2. What would be the optimal incubation time and washing period for a cortical brain slice incubated with the dye?
3. Finally, would it be better to use red laser excitation with the stained slices, or just the proper filters?
I would appreciate any advice on how to optimize the technique.
Thank you,
Alba Bellot Saez
From the unwrapped SAR interferogram, the line-of-sight (LOS) displacement component is obtained, and from sub-pixel correlation of optical imagery, the true NS and EW horizontal components are derived. Is it possible to get the true vertical component from these three derived components?
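In principle yes: the LOS measurement is a known linear combination of the east, north, and up displacement components, so with the EW and NS components from optical correlation, the vertical component follows by solving one linear equation per pixel. The sketch below assumes one common convention for a right-looking SAR (sign conventions for the look vector differ between processors, so verify against your software's documentation):

```python
import numpy as np

def vertical_from_los(d_los, d_e, d_n, inc_deg, heading_deg):
    """Recover vertical displacement from one LOS measurement plus
    east (d_e) and north (d_n) components. Assumed unit look vector
    for a right-looking SAR with incidence angle theta and heading
    alpha (one common convention, not universal):
        l = [-sin(theta)cos(alpha), sin(theta)sin(alpha), cos(theta)]
    with d_los = l . (d_e, d_n, d_u); solve for d_u."""
    th, al = np.deg2rad(inc_deg), np.deg2rad(heading_deg)
    le = -np.sin(th) * np.cos(al)
    ln = np.sin(th) * np.sin(al)
    lu = np.cos(th)
    return (d_los - le * d_e - ln * d_n) / lu
```

Because lu = cos(theta) is well away from zero at typical incidence angles, the inversion for the vertical component is numerically stable.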
Hi, I have experimental optical microscopy results of oral tissue affected by cancer. I want to reproduce the images using simulation methods like FDTD or FEM. Is it possible to run simulations that yield images comparable with the experimental imaging results? Which method would be useful, and how do we define the material? I mean, to define a material we have to supply a refractive index at a specific wavelength, so how do we decide what the material is? And if there is more than one material, how do we assign the refractive indices?
I have serially sectioned optical micrographs from which I need to set up a 3D view.
I am planning to register missile-borne SAR images with optical pictures, and want to know the differences between missile-borne and airborne SAR images.
In my samples you can find cracks, small spherical pores, and large irregular pores/voids. I'm trying to understand how to make sure that a large irregular pore/void is actually a pore and not the cross section of a crack.
I have attached two images to demonstrate my question.
Hello,
I have 5 LEDs with different wavelengths, and I want to superpose their outputs to get a broadband spectrum. I do not want to use any lens to collimate the light.
Is there any basic way to collimate the light of the 5 LEDs without using a lens?
I am confused about selecting a digital camera for imaging cotton leaves. Can anybody suggest one, with reasons?
Raman spectroscopy and two-photon microscopy are to be used to image 8um thick Formalin fixed paraffin embedded (FFPE) human breast tissues. The FFPE tissue sections are to be mounted on CaF2 slides. To avoid sample floating off the slide and peeling are there any mounting media / coverslips that can be used to secure the tissue sections onto the CaF2 slide that does not influence or affect the Raman signal during imaging?
Linearly polarized light incident on tissue gives different responses from the surface and from the bulk in highly scattering tissues such as dermis, retina, etc.
The light reflected from the surface keeps the same polarization, while the light reflected back from deeper layers undergoes multiple scattering that depolarizes the incident light. The depolarization ratio may allow discrimination of unhealthy from healthy tissue, leading to diagnosis of disease.
This phenomenon may contribute to the diagnosis of diabetic retinopathy, dermal disorders, and cutaneous and subcutaneous diseases.

Hi.
I plan to do CLSM on biofilm for the first time ever at another institution, but the service provider has no experience viewing biofilm by CLSM, so I hope to get some answers from those who have done it before. How should I prepare my samples to be viewed under CLSM? Can I just grow them on glass slides or 96-well plates and then stain with the LIVE/DEAD BacLight kit? How long can I store them before getting to the institution for viewing?
I'm confused about two things.
1. What is the definition of image compression?
2. Is it okay to say that "compression" has been performed when the size of an image is decreased by optical means?
Apart from the discrete cosine and wavelet transforms, I wonder whether there is another effective optical method for image compression.
Hi everyone,
We're building a small optical setup for reflective (episcopic) imaging using a C-mount camera and an LED light source. I've read that the best illumination scheme is called Köhler illumination, but I don't know what kind of optics we need to achieve it.
We plan to order everything we need from Thorlabs, but it can be a bit overwhelming to try to pick the right parts without expert knowledge of optics.
Looking forward to your help!
Best,
Bjarke
We are interested in constructing an intrinsic optical imaging system to investigate changes in cortical blood flow. The simple versions of this setup look relatively straightforward to put together, but the devil is always in the details.
I am not familiar with pre-processing RADARSAT imagery. Is it very different from pre-processing optical imagery? Is it possible to use ENVI software for this task?
Being transparent or creating an illusion? I may propose a device to transfer the images from one location to display at another location using the optical fibers. Furthermore, invisibility may be re-defined over different IR-visible and UV spectral ranges. What optical materials, nanostructures or hybrid mechanisms can be used to enhance the invisibility process?
A planar phase front and a spherical phase front may shift the focal point of an ideal micro-lens (one with minimal aberration). This effect may be drastically enhanced for an incident distorted wavefront.
Is there a quantum treatment of optical imaging that is comparable in comprehensiveness to the classical treatment in Born & Wolf, Principles of Optics?