Science topic
Interferometry - Science topic
Measurement of distances or movements by means of the phenomena caused by the interference of two rays of light (optical interferometry) or of sound (acoustic interferometry).
Questions related to Interferometry
I need help analyzing enzyme kinetic data.
I have data from the Octet K2 system. In my experiment, I load the sensor with our protein of interest (6XHis tag on my recombinant protein to Ni-NTA sensors) and then expose this sensor to increasing concentrations of the candidate binding protein (five concentrations per experiment and each experiment is replicated four times). Each association step is followed by a dissociation step in buffer. A control sensor is used in each experiment where a sensor is loaded with the protein of interest but only exposed to buffer. (See picture, Part 1)
I have separate data where I loaded smaller recombinant domains of the protein of interest to the sensor and exposed it to the candidate binding protein. I would like to combine this data (the binding of the full-length protein and the binding of the domains) on the same graph.
My problem: In trying to analyze the data with the software provided with the Octet system (HT 11.1), the data misaligns. (See picture, Part 2)
My goal is to determine kinetic constants (KD) of the full-length protein and its separate domains to the protein of interest.
Suggestions for correctly aligning the data in the Octet software HT11.1? (I think the misalignment is because the program is trying to align the y axis to baseline 1 instead of baseline 2, which is the baseline right before the association step. If so, can you change this label after the fact?)
If the glitch with the Octet software cannot be fixed, then is there a manual/tutorial for the enzyme kinetic module for Sigma Plot?
I found I can extract the raw data from the Octet system. I can remove the background from the control sensor and manually assign concentrations. I uploaded this into SigmaPlot 15, which has an enzyme kinetics module. I found the embedded help guide, but I have specific questions. For example:
*My candidate binding protein does not change, but how do you take into account the change in molecular weight (kDa) of the proteins loaded onto the sensor, i.e. the full-length protein vs. the smaller domain proteins? This is handled automatically in the Octet software.
*How do I differentiate between the association and dissociation phases?
I am new to Octet biolayer analysis and the Enzyme Kinetic Module analysis in Sigma Plot.
Any help will be greatly appreciated! I am happy to provide any more information.
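If you do end up fitting the exported, reference-subtracted curves yourself, the sketch below shows one common route for a 1:1 interaction outside the vendor software (synthetic data, hypothetical rate constants, Python/scipy rather than the SigmaPlot module): fit each association trace for its observed rate k_obs, then recover kon, koff and KD from k_obs = kon·C + koff. In this simple model only the analyte concentration in molar units enters the fit, so the molecular weight of the loaded ligand does not appear.

```python
# Minimal sketch (not the Octet HT 11.1 workflow): estimate KD from
# reference-subtracted BLI association curves using the observed-rate
# approach for a 1:1 Langmuir model. All data below are synthetic and the
# rate constants are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def association(t, rmax_eff, kobs):
    # 1:1 association phase: R(t) = Rmax_eff * (1 - exp(-kobs * t))
    return rmax_eff * (1.0 - np.exp(-kobs * t))

concentrations = np.array([12.5e-9, 25e-9, 50e-9, 100e-9, 200e-9])  # analyte, M
times = np.linspace(0, 300, 301)                                     # s

# Synthetic sensorgrams standing in for your exported, reference-subtracted data
rng = np.random.default_rng(0)
true_kon, true_koff, rmax = 1e5, 1e-3, 1.2          # 1/(M s), 1/s, nm
responses = [rmax * c / (c + true_koff / true_kon)
             * (1 - np.exp(-(true_kon * c + true_koff) * times))
             + rng.normal(0, 0.005, times.size)
             for c in concentrations]

# Fit each association trace for its observed rate k_obs
kobs_values = []
for resp in responses:
    popt, _ = curve_fit(association, times, resp, p0=[resp.max(), 0.01])
    kobs_values.append(popt[1])

# For a 1:1 interaction k_obs = kon * C + koff, so a straight-line fit gives
# kon (slope), koff (intercept) and hence KD = koff / kon.
kon, koff = np.polyfit(concentrations, kobs_values, 1)
print(f"kon = {kon:.2e} 1/(M s), koff = {koff:.2e} 1/s, KD = {koff / kon:.2e} M")
```

The dissociation phase can be fitted separately with R(t) = R0·exp(−koff·t) as a cross-check on koff, which is one way to keep the two phases distinct in the analysis.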
Atmospheric imaging, especially ionospheric irregularity imaging, with radar interferometry.
Hi, I received the results for one of my samples (fibres) measured using white light interferometry. However, I do not understand how to interpret the results (see image), specifically the distribution plot, because I am trying to determine whether those results refer to the width or the length of the fibres.
I was reading a paper introducing the delayed self-heterodyne interferometry technique by Okoshi in 1980. It was aimed at measuring the linewidth of lasers, and the main set-up was something like a Mach-Zehnder interferometer, but one of the arms was delayed by a fiber path much longer than the laser's coherence length, and there was an AOM in the other arm (I don't know if that is important). The interference of these two beams was claimed to indicate the linewidth of the laser. What I do not understand is how these uncorrelated beams can interfere at all.
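For reference, a minimal statement of the textbook result (a sketch, assuming a Lorentzian laser lineshape of full width Δν and a fibre delay much longer than the coherence time): the photocurrent spectrum shows a beat centred at the AOM shift f_AOM that is again Lorentzian, but with twice the laser linewidth,

$$ S(f) \propto \frac{\Delta\nu/\pi}{(f - f_{\mathrm{AOM}})^{2} + \Delta\nu^{2}}, \qquad \mathrm{FWHM} = 2\,\Delta\nu. $$

The two fields are mutually incoherent in the sense that their relative phase wanders randomly, so the fringes are not stationary; what the detector and spectrum analyser record is the statistics of that wandering phase, which broadens the beat note to 2Δν rather than washing it out.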
I want to know whether the number of fringes and their shape are an important factor for the accuracy of the phase determination.
Hello everyone!
I hope you all are staying safe and happy!
As a keen learner in optics experimentation, I have been looking for the set of interferometry experiments that I believe are published by Newport, but I was not able to find them. I would like to know if any of you have a user manual (PDF) for these interferometry experiments, so that I can start with the experiments.
Thank you
I am looking forward to hearing from you all soon.
Regards
Ketan
Any recommendations about DInSAR (differential interferometric synthetic aperture radar) applied to forests?
Can someone please provide me with Phase Comparison Direction Finding or Correlative Interferometry MATLAB code?
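As a starting point, here is a minimal sketch of two-element phase-comparison (interferometer) direction finding; all numbers are hypothetical and it is written in Python, but the few lines port directly to MATLAB.

```python
# Minimal sketch of two-element phase-comparison (interferometer) direction
# finding: the phase difference measured between two antennas spaced d apart
# gives the angle of arrival via dphi = 2*pi*d*sin(aoa)/lambda.
import numpy as np

c, f = 3e8, 1e9                          # propagation speed, carrier frequency (Hz)
lam = c / f
d = 0.4 * lam                            # spacing below lambda/2 avoids phase ambiguity
true_aoa = np.deg2rad(25.0)

rng = np.random.default_rng(0)
n = 1000
signal = np.exp(1j * 2 * np.pi * rng.random(n))                  # random-phase samples
noise = 0.05 * (rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n)))
x = np.vstack([signal,
               signal * np.exp(1j * 2 * np.pi * d * np.sin(true_aoa) / lam)]) + noise

dphi = np.angle(np.mean(x[1] * np.conj(x[0])))   # average cross-product, take its angle
aoa_est = np.arcsin(dphi * lam / (2 * np.pi * d))
print("estimated angle of arrival: %.2f deg" % np.rad2deg(aoa_est))
```

Correlative interferometry extends the same idea to many elements by comparing the measured phase-difference vector against a calibrated table of reference vectors over angle and picking the best match.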
It is my understanding so far that in this kind of experiment, like the one measuring the 4π (i.e. 720°, Dirac belt trick) rotation characteristic of spin-1/2 fermions like neutrons, two neutron beams are polarized via a Stern-Gerlach apparatus to the same quantum spin number. The two separated polarized beams are initially in phase, meaning identical in every respect. One of the two beams is then brought out of phase with the other by forcing it into continuous Larmor precession, while the other beam is not forced to precess. The two beams are then combined in superposition and an interference signal is obtained.
I understand that because the neutrons in one beam are wobbling all the time (Larmor precession), the beams are mostly out of phase and do not have all four quantum numbers identical, and therefore the Pauli exclusion principle is not violated. Therefore, most of the time a steady, noise-like interference signal is produced by the two combined neutron beams.
However, as these experiments show, for every 4π of Larmor precession the two beams come momentarily into phase and a maximum in the output signal is generated due to constructive interference:
see first figure attached
My question here is: at the points where the maxima in the interference signal are observed as shown above, meaning the two beams are momentarily in phase, do these events not violate the Pauli exclusion principle?
The best explanation I could find so far in the literature to resolve my confusion is that, mathematically, the wavefunction of the two combined fermions must be antisymmetric (antiparallel spins), which leads to the probability amplitude of the interference wavefunction going to zero at what would be the maxima if the two fermionic beams are in the same phase.
Thus, according to the above interpretation, the output signal would in my opinion look like this:
see second attached figure
But then how can the two beams be in phase and at the same time interfere destructively? And most importantly, if the two combined neutron beams end up with antiparallel spins because of the Pauli exclusion principle, how can these experiments measure the 720° rotation (Dirac belt trick) characteristic of these fermions (i.e. neutrons)?
Would that not totally mess up the experiment?
I'm confused, please help.
What are the main limitations of SPR compared with BLI, especially in terms of buffer and matrix, particularly for liposome-protein interaction assays?
I have studied different papers related to Direction Finding (DF). However, what is really confusing is the terminology of methods, techniques and algorithms. For example, we have amplitude, phase and amplitude-phase comparison methods. Once we look at the techniques, there are several, such as Doppler, Watson-Watt, Correlative Interferometry, etc. Similarly, there are the MUSIC, ESPRIT and MLE algorithms.
Now I am really confused, as some papers compare Watson-Watt with MLE, while others compare MUSIC with Correlative Interferometry. So is there any difference between these techniques and algorithms, or are all of these just alternative names for similar things?
The SAR data shows decorrelation for highly vegetated terrains. Has any of the latest software overcome this shortcoming?
Hello everyone,
Recently, I started studying SAR Multiple Aperture Interferometry (MAI) in order to produce horizontal displacement estimates. I have read some papers, but I cannot find any open-source code that can help me understand the MAI technique in depth.
This is a paper that I am interested in and that is related to my question.
If you have been working on MAI, would you please give me some recommendation?
Thanks in advance,
Kleanthis
I would like to do a project on laser measurement systems. My objective is to eliminate or reduce the cosine error and Abbe error in a laser measurement system. I have a Renishaw XL80 system.
Please tell me the procedure to eliminate these particular errors, and help me to complete my project.
How does one do f-2f interferometry with an octave-spanning supercontinuum with a frequency spacing of 120 MHz? Which would be the most suitable optical filters in this range for selecting the f and 2f components from the supercontinuum, in order to detect the carrier-envelope offset frequency of a mode-locked laser?
Thanks in advance!
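As a quick sketch of the underlying relation (standard comb algebra, not specific to any particular filter choice): each comb line is f_n = n·f_rep + f_CEO, so frequency-doubling a line from the long-wavelength edge of the octave and beating it against the comb line at twice that frequency yields the offset directly,

$$ 2 f_{n} - f_{2n} = 2\,(n f_{\mathrm{rep}} + f_{\mathrm{CEO}}) - (2n f_{\mathrm{rep}} + f_{\mathrm{CEO}}) = f_{\mathrm{CEO}}. $$

The filters therefore only need to pass a band (typically a few nanometres) around the chosen f and 2f wavelengths; with f_rep = 120 MHz the CEO beat then appears in the RF spectrum below f_rep, at f_CEO and at f_rep − f_CEO.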
Dear all, I'd like to open here a sort of forum for understanding how the geodesy community is positioning itself in view of the X-band SAR satellite constellation. The new constellation will offer "free, near real-time SAR data" with the "latest information about any spot on the planet within the hour". This will open completely new horizons for InSAR monitoring of ground deformation, especially for rapid phenomena such as eruptions and seismic crises. The huge amount of such frequent data acquisitions will also create new needs for rapid and automatic processing. My questions are: who knows more? Are you planning routine use of these data? How?
An experiment that I propose contains a beam-splitter for protons. There are plenty of articles in the literature about atom interferometry, but apart from electron interferometry I do not see articles on interferometry with charged particles. What I need is a beam-splitter for protons, or ions - in any case, not neutral atoms. The beam-splitter I need should be imbalanced, i.e. the transmission coefficient should differ from the reflection coefficient.
Can somebody help?
I am stuck with the interferometry of Sentinel-1 data at the coregistration stage. SNAP has been giving a Java heap space error repeatedly for two weeks. I am looking for any freely available software that can be used for interferometric DEM generation. Please help me. Thanks in advance.
The question has been raised to clarify the idea of using electrostatic probes in a high-density, low-pressure plasma. Primarily, the probes would disturb the plasma potential and hence space charges would accumulate on the probe surface; as a result, the sheath dimension would increase. I need some ideas about other kinds of diagnostics that would be feasible for such a plasma (interferometry, spectroscopy or similar). Please advise.
Are there any researchers out there skilled in interferometry who would be interested in a theoretical experiment to test the validity of a certain unverified metric of general relativity? The hope is that it would evolve into an actual land-based experiment; however, I am not an experimentalist, so I would have more questions than answers, i.e., I would like to exchange ideas with such a person to crystallize the experiment.
As recently concluded in a parallel discussion, see reference below, LIGO is unable to exclude that mirror displacements as observed along their interferometer arms in fact result from much larger mirror displacements of similar profile along the vertical.
This is because the mirror suspensions act along the local vertical, which over a distance of 4 km varies by an angle of about 2 arc min (a nautical mile = 1,852 m by definition corresponds to 1 arc min of angular distance at sea level). So every vertical mirror displacement will exhibit a displacement component about three orders of magnitude smaller along the connecting interferometer tube.
As LIGO is unable to directly measure vertical mirror displacements with adequate sensitivity they cannot distinguish whether horizontal displacements such as assigned to gravitational wave interaction are due to horizontal excitation or to vertical excitation at three orders of magnitude larger amplitudes.
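For concreteness, the geometric factor implied above can be written out (a back-of-the-envelope sketch, taking an Earth radius of roughly 6371 km for a 4 km arm):

$$ \theta \approx \frac{L}{R_{\oplus}} = \frac{4\ \mathrm{km}}{6371\ \mathrm{km}} \approx 6.3\times10^{-4}\ \mathrm{rad} \approx 2.2\ \mathrm{arcmin}, \qquad \delta x_{\parallel} \approx \delta z \,\sin(\theta/2) \approx 3\times10^{-4}\,\delta z, $$

i.e. a vertical end-station displacement δz projects onto the arm axis at roughly three orders of magnitude below its own amplitude, which is the coupling referred to above.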
Since holographic interferometry focuses on the formation and interpretation of fringe patterns, it should be noted that any changes experienced by the structural system and/or object over a predefined time range can be detected with wavelength accuracy. It should further be noted that it is the time variable that is of importance in holographic interferometry, which makes the non-destructive testing (NDT) applications possible and interesting.
How much can you, or should you, vary the different load conditions while applying holographic interferometry to structural-component testing, without compromising the wavelength accuracy for the overall structural component and/or system?
A student sent me this image and asked how it was done. My background is in optics, but it has been a number of years since I worked with surface interferometry.
I am wondering if you can identify what type of surface interferometry system was used?
Description provided with the image:
For this image an interferometer was used to measure and photograph the interference patterns made by a group of water striders (family Gerridae) as they walk on the surface of water.
Credit: Perennou Nuridsany/Science Source
What we know so far:
Images and still photographs were taken by a French team (Claude Nuridsany and Marie Perennou). They have not answered any emails about this technique.
Taken on film at least 15 years ago, but probably older.
The color is real (not added by computer).
The technique is described as interferometry
The insects are on water, the light could be transmitted through the container or reflected from the surface. I suspect it might be light reflected from the surface.
We found a movie at:
This question has caused quite a debate over the last few weeks.
Thank you for the help.
The image here was found at ScienceSource, a science stock image site.
When calculating interferometric coherence, why can't you do so on a pixel-by-pixel basis? I know the equation for estimating coherence, γ = |S1∙S2*| / √(S1∙S1* ∙ S2∙S2*), where S1 and S2 are the two single-look complex images. I also know this calculation uses a maximum likelihood estimator, but why do you need to specify an estimation window, and why can't the estimation window size be 1?
Thank you.
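A minimal sketch of the windowed (multilook) estimator may make the role of the window explicit; it assumes two coregistered SLCs as complex NumPy arrays and an arbitrary box size (both synthetic here). With a 1×1 window the numerator and denominator reduce to the same single-pixel magnitude, so the estimate is identically 1 and carries no information, which is why coherence is always estimated over a neighbourhood.

```python
# Minimal sketch of the windowed interferometric coherence estimator.
# `s1` and `s2` stand in for two coregistered single-look complex images
# (here synthetic complex arrays); the box size `win` is arbitrary.
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    # gamma = |<s1 s2*>| / sqrt(<|s1|^2> <|s2|^2>), averaged over a win x win box
    cross = s1 * np.conj(s2)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win)
                  * uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

rng = np.random.default_rng(1)
s1 = rng.normal(size=(200, 200)) + 1j * rng.normal(size=(200, 200))
noise = rng.normal(size=(200, 200)) + 1j * rng.normal(size=(200, 200))
s2 = 0.7 * s1 + 0.3 * noise

print("5x5 window:", coherence(s1, s2, win=5).mean())   # well below 1
print("1x1 window:", coherence(s1, s2, win=1).mean())   # exactly 1 everywhere
```

Larger windows reduce the bias of the estimate at low coherence, at the cost of spatial resolution.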
Mechanical contact stylus techniques are the traditional method of measuring the internal diameter (ID) of shafts; a bore gauge is a good example of this method. Now I am trying to use a non-contact method, and after researching different non-contact techniques and the available equipment, I found the confocal chromatic sensor to be very accurate (as low as 0.5-2 microns). I learned how to use it to find some GD&T features like run-out, but I don't know how to find the bore size (internal diameter). The case study is explained below. Any suggestion is highly appreciated.
Geometry: a cylindrical shaft with an internal diameter of 0.926" and a length of 40".
Question: how to find the size of the ID using the confocal chromatic sensor.
Thank you
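One common route, sketched below under assumptions not stated in the set-up, is to rotate the part (or the sensor) on a spindle, convert each standoff reading plus spindle angle into an (x, y) point on the bore wall, and fit a circle to those points; a simple algebraic (Kåsa) least-squares fit on synthetic points looks like this in Python:

```python
# Minimal sketch: algebraic (Kasa) least-squares circle fit to estimate a bore
# diameter from (x, y) points on the bore wall. The points below are synthetic;
# in practice each point would come from the spindle angle plus the calibrated
# standoff reading of the confocal chromatic sensor.
import numpy as np

def fit_circle(x, y):
    # Solve x^2 + y^2 + D x + E y + F = 0 in the least-squares sense
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    xc, yc = -D / 2.0, -E / 2.0
    radius = np.sqrt(xc ** 2 + yc ** 2 - F)
    return xc, yc, radius

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
true_radius = 0.926 / 2.0                         # inch, from the case study
x = 0.010 + true_radius * np.cos(theta) + rng.normal(0, 5e-5, theta.size)
y = -0.020 + true_radius * np.sin(theta) + rng.normal(0, 5e-5, theta.size)

xc, yc, radius = fit_circle(x, y)
print(f"estimated ID = {2 * radius:.4f} in (centre offset {xc:.3f}, {yc:.3f})")
```

The same point cloud also yields roundness (residuals from the fitted circle), and repeating the scan at several depths gives an indication of taper along the 40" length.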
While turning the titanium alloy Ti64ELI (manufactured by laser sintering) by single-point diamond machining, performed as very precise turning on a Nanoform 200 CNC machine, it was observed that the surface roughness increased with increasing cutting speed and decreased with increasing feed rate, which is in clear contradiction with the conventional trend. The surface roughness was recorded by the coherence correlation interferometry (CCI) method. I am looking for a reasonable argument supporting these findings.
Have you ever had challenges in measuring infinitesimal variations in the environment, including 'in vivo' biological analytes? Microwave technology has offered methodologies over the past few decades towards compact yet sensitive sensors. While their relatively high quality factor helps to resolve a wide range of variations, their sensitivity is limited by the constant presence of the host medium. The following article lays out an interferometry approach to circumvent this issue for extremely sensitive measurements.
Conference Paper Sensitivity Optimization in SRRs Using Interferometry Phase ...
I am using a laser diode centered at 840 nm as the source. The experimental setup is similar to the figure shown below. The beam splitter cube splits the beam into reference and sample arms. The coating on all lenses and on the beam splitter is specified for this wavelength. The objectives in both arms are not coated for 840 nm but have a transmission of 60% at that wavelength.
When I image the sample on the camera, the reflection from the back surface of the beam splitter is prominent. Rotating the beam splitter deflects the beam on the objective but does not necessarily remove the back-reflection (ghost image) from the beam splitter.
How can I get rid of the back reflection?
Which beam splitters are generally used for full-field OCT or Linnik-based interferometry systems? I have gone through a few papers, and they use a beam splitter similar to the one I use - non-polarising cube beamsplitters - yet they do not face the ghosting problem.
I need to study the surface topography of a specific polymer. Is there any method to study the surface properties of polymers using interferometry?
Hello,
I'm studying CGH interferometry as used for testing large-departure aspheres.
I want to make sure: what is the maximum aspheric departure of the test surface that can be measured? What determines the maximum aspheric departure? Is it the CGH machining accuracy?
Does anyone know? We can have a discussion. Thank you!
Hi,
I would like suggestions for a fast test for early detection of microcracks in an organic coating. In the past SEM was used, but it is expensive and only a small area can be observed.
We are looking for a fast (and cheaper) alternative to survey several mm² at a time for early detection of microcracks. The substrate is normally a metal (steel, cast iron) and the coating can be from 20 microns to 100 microns thick.
Someone suggested white light interferometry for this application; does anyone have experience with this technique? What resolution can we expect, and how fast is it? Is the equipment more accessible than SEM?
Could you suggest any other technique for this application?
Thanks in advance ,
Dear Colleagues,
Recently I had an argument with a colleague on whether the energy-transfer equations, and Beer's law specifically, are applicable to smallish distances, specifically below the photon mean free path. Both sides provided arguments but, the discussion being during a coffee break, no citations.
Could you comment? Of special interest would be citations on treatments of the cases when the lengths involved are much shorter than the corresponding mean free paths.
A hundred years ago, when diffusion was a hot topic, this limit had no practical value. Now, when femtosecond pulses and low-coherence interferometry are widely used, this unresolved topic is back.
I suspect that it was treated a long time ago, probably around the time of Einstein and Smoluchowski, but I can't find citations.
Could you help?
I have been using Michelson interferometry to characterise the surface flatness of my glass cell. In the reference arm of my interferometer I am using a mirror with a surface flatness of lambda/10. The wavelength I am using in this measurement is centered around 630 nm.
One of the fringe patterns I captured is shown in the image attached to this question.
I wish to reconstruct the surface of the glass cell from the interference fringes. Are there any existing computational techniques or open-source code to do so?
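If the fringes are first demodulated into a wrapped phase map (for example with a carrier-based Fourier-transform method or phase shifting), the remaining steps are 2-D phase unwrapping and a phase-to-height conversion; below is a minimal sketch with synthetic data, assuming a double-pass (reflection) Michelson geometry at 630 nm and using scikit-image's unwrapper.

```python
# Minimal sketch, assuming the fringes have already been demodulated into a
# wrapped phase map `phi_wrapped` (synthetic here): unwrap in 2-D and convert
# to height for a double-pass (reflection) Michelson measurement at 630 nm.
import numpy as np
from skimage.restoration import unwrap_phase

wavelength = 630e-9                                   # m
yy, xx = np.mgrid[0:256, 0:256] / 256.0
phi_true = 12 * np.pi * (xx ** 2 + 0.5 * yy)          # synthetic smooth surface phase
phi_wrapped = np.angle(np.exp(1j * phi_true))         # wrapped into (-pi, pi]

phi = unwrap_phase(phi_wrapped)                       # 2-D unwrapping (skimage)
height = phi * wavelength / (4 * np.pi)               # reflection doubles the path
print("peak-to-valley:", np.ptp(height), "m")
```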
Hello,
I am using Cygwin. I have imported SNAPHU into Cygwin and it is running. It is showing some parameter variations, like the number of nodes in the network, pivots, and tree size. I have checked the status of the program using the Windows Task Manager. I do not know how long it will run; it has already been running for 48 hours.
My system configuration is an HP Z620, 24 GB RAM, Windows 8.1.
I need to find free SAR images (spaceborne or airborne).
In particular, I need complex SAR images and also stereo SAR images of the same area,
so I can examine the results of both interferometric and radargrammetric algorithms.
Thanks.
This might sound like a simple/stupid question, but working in photoacoustics often requires substantial work in the form of choosing how to acoustically couple my acoustic sensor to my sample of interest.
Assuming I am using a transducer such as those in the link, what is my best bet for acoustic coupling to reduce bubble formation that is simple to set up, inexpensive, and repeatable?
Obviously, a lot of this depends upon the situation, but I'm interested in hearing what you use for this since I'm trying to figure out what would be easiest given my application. I've seen water coupling, gel coupling, and optical coupling (interferometry), but each of these methods has significant drawbacks (water coupling always seems to have bubbles unless I use a vacuum, gel is the same way, interferometry is expensive).
What do you use?
Spectrometer+Si-CCD? Fourier-Transform Interferometry? Something else?
Wavelength range 500-800 nm, very low photon flux << 1 photon/s.
Hi to all,
Consider the following scenario (shown in the attached file):
Two mutually coherent and collimated light beams intersect as shown, creating the depicted 'bright' and 'dark' stationary interference fringes (fig. 'A'). Suppose we insert a very thin (compared to the fringe width) and, ideally, perfectly conducting 'sheet' across, say, the central 'dark' fringe (fig. 'B').
It certainly appears as though we can "cut each of the light beams in two, across an impassable barrier", yet they will persist and continue to freely propagate! This appears to be the case both for 'classical' EM waves as well as quantum-optical wavefunctions. Of course, no infinitely thin and perfectly conducting sheet exists, but it does seem that this effect will remain sufficiently intact under realistic conditions.
Is this possible??
It is well known that SAR (synthetic aperture radar) interferometry is based on SAR technology. How can SAR detect deformation in three dimensions after an earthquake happens? Is it possible to measure the slip rate and fault parameters with SAR technology?
Hi All,
I was wondering what laser source is suitable in the deep-UV region for interferometry-based Raman spectroscopy: a NeCu laser or an argon-ion laser? By the way, I would like to use a laser source with excitation < 250 nm.
Thanks
I am doing research work on wavelength phase-shifting interferometry. When the plate is thin, the parasitic fringes formed by the front and rear surfaces cannot be separated; what is a good way to deal with this?
What is the relation between scratches and diffraction intensity, and what methods can be used to detect the scratch density?
Recently I was reading the article "Lu B, Yang X, Abendroth H, et al. Time-average subtraction method in electronic speckle pattern interferometry. Optics Communications, 1989, 70(3):177-180." I was wondering how we can obtain equation (13) through the subtraction of equations (9) and (10); I have a problem with the cos(2x) term. Unfortunately, because it is an article from 1989, I could not find the author's email. I suspect there may be a mistake in equation (13), and I do not know how to obtain the term I·cos[2(φo − φr)].
We are attempting to estimate snow depth by using remote sensing methods. We found that the methods used for this purpose include altimetry and passive microwave data.
Is it possible to estimate snow depth using SAR data such as Sentinel-1 at high resolution (20 m, for example)? Or is there satellite data or a product from which we can estimate snow depth with high spatial resolution?
Can we estimate snow depth with interferometry (using, for example, different bands or images from two different dates)?
For my master's project I need to extend an existing setup to be able to measure the phase of the recorded intensity field. My supervisor proposed using an off-axis holographic method to obtain the phase information of the field. Now my question is: how do I stabilize the phase difference between the reference wave and the object wave? For clarity, I have added a sketch of the setup.
I'm working on a setup for digital holographic interferometry where I determine phase difference maps of objects in unloaded and loaded states.
I already applied a correction related to the sensitivity vector which originates from the difference in illumination and observation directions. The results are almost but not exactly right. Now I wonder if there's an additional correction necessary which compensates for the beam splitter cube between camera and object in lateral directions?
I'd be very grateful for any advice on this or on how to determine deformation from holographic phase differences in general.
The setup is simple: reference and object waves are combined through a beam splitter cube in front of the camera sensor. No phase shifting, just the phase difference before and after deformation.
Thanks in advance...
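For what it is worth, the relation usually quoted for this geometry correction (a sketch, not necessarily matching your exact set-up) ties the measured phase difference to the displacement vector through the sensitivity vector:

$$ \Delta\varphi = \mathbf{d}\cdot\mathbf{S}, \qquad \mathbf{S} = \frac{2\pi}{\lambda}\left(\hat{k}_{\mathrm{obs}} - \hat{k}_{\mathrm{ill}}\right), $$

so that for near-normal illumination and observation the out-of-plane component reduces to d_z ≈ λ·Δφ/(4π); recovering all three displacement components requires at least three independent sensitivity vectors (different illumination or observation directions).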
Actually, I am working on a project to monitor nonspecific T cell activation.
I have extracted RNA and obtained cDNA using a kit. For the last three months I have been working to optimize a human interferon gamma primer using conventional PCR, but I have failed to get any result.
I have met a problem in using a piezoelectric actuator to actively stabilize a homodyne interferometer, in order to avoid the signal-fading problem and to eliminate low-frequency experimental noise. The mirror is attached to the PZT actuator to reflect the reference beam. A proportional-integral (PI) circuit was designed, together with the PZT and the homodyne interferometer, to constitute the feedback loop. My questions are the following:
1. Do the hysteresis, nonlinearity and creep effects of the PZT influence the stabilization? Should these effects be compensated separately or not, given that the PZT is part of the closed feedback loop?
2. Do the response times of the PZT and the circuit influence the stabilization?
3. I cannot stabilize the homodyne interferometer well; what other factors may contribute to the loss of lock?
I really appreciate your answers.
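As a complement to the questions above, a minimal discrete-time simulation of a PI-locked homodyne phase (all parameters hypothetical, the PZT modelled as a plain first-order actuator with hysteresis and creep ignored) can help separate the effect of loop gains and actuator response time from the other error sources:

```python
# Minimal sketch: discrete-time PI lock of a homodyne interferometer to
# quadrature. The PZT is modelled as a first-order actuator (time constant
# tau_pzt); the disturbance is a slow phase ramp. All numbers are hypothetical.
import numpy as np

dt, n = 1e-4, 200000            # 0.1 ms steps, 20 s of simulated time
kp, ki = 0.5, 50.0              # PI gains (tune for your loop)
tau_pzt = 1e-3                  # actuator response time, s
drift = 2 * np.pi * 0.5 * dt    # 0.5 rad/s slow phase drift per step

phase_env = 0.0                 # environmental phase, rad
pzt_phase = 0.0                 # phase actually applied by the PZT, rad
integrator = 0.0
err_log = np.empty(n)

for k in range(n):
    phase_env += drift                                 # environment drifts the phase
    error = np.sin(phase_env - pzt_phase)              # detector signal ~ sin(residual) near quadrature
    integrator += ki * error * dt
    command = kp * error + integrator                  # PI control law
    pzt_phase += dt / tau_pzt * (command - pzt_phase)  # first-order PZT response
    err_log[k] = phase_env - pzt_phase

print("residual rms phase error over the last second:", err_log[-10000:].std(), "rad")
```

Raising tau_pzt or lowering the gains in this toy model quickly shows the loop losing its ability to track the drift, which is one way to judge whether actuator and circuit bandwidth, rather than hysteresis or creep, are the limiting factor in your case.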
How can I synthesise a sparse antenna array using PSO, GA and DE techniques? I have to optimize the 2D array factor.
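A minimal sketch of the workflow in Python (a linear array for brevity; hypothetical element count, aperture and main-beam exclusion region): define the array factor, define a cost such as the peak sidelobe level, and hand the cost to the optimizer. Here scipy's differential evolution plays the role of DE; a PSO or GA implementation would take the same cost function. For a planar (2-D) array the exponent simply becomes 2π(x·u + y·v).

```python
# Minimal sketch: sparse linear-array synthesis by minimizing the peak sidelobe
# level of the array factor with differential evolution (DE). The same cost
# function can be handed to a PSO or GA routine. Element count, aperture and
# the main-beam exclusion region are hypothetical.
import numpy as np
from scipy.optimize import differential_evolution

n_elem, aperture = 8, 10.0           # element positions in wavelengths
u = np.linspace(-1.0, 1.0, 2001)     # u = sin(theta)
sidelobe_region = np.abs(u) > 0.15   # exclude the broadside main beam

def peak_sidelobe_db(positions):
    # Uniform-amplitude array factor AF(u) = sum_n exp(j 2 pi x_n u)
    af = np.abs(np.exp(2j * np.pi * np.outer(u, positions)).sum(axis=1))
    af_db = 20 * np.log10(af / af.max() + 1e-12)
    return af_db[sidelobe_region].max()

bounds = [(0.0, aperture)] * n_elem
result = differential_evolution(peak_sidelobe_db, bounds, seed=1, maxiter=200)
print("optimized positions (wavelengths):", np.sort(result.x))
print("peak sidelobe level: %.1f dB" % result.fun)
```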
I am using the SNAP processor to generate a DEM from Sentinel-1A SLC data. In the available tutorial (attached herewith), there is a step called topographic phase removal (interferogram flattening). I have a doubt, as this step should be applied only when calculating the displacement phase: removing the topographic phase removes the height information from the interferogram, leaving only the displacement phase. So if I am deriving a DEM, I should skip this step and apply it only when estimating displacement (deformation), which in principle is two-pass differential interferometry.
I would like suggestions as to whether the interpretation I have provided above is correct, and in that case, what the processing step would be for removing the displacement phase while generating the DEM.
I contacted Fisher Scientific about the flatness of their Fisherfinest microscope slides, and they told me that they do not have that information.
I have deposited some nanometer-thick films on these slides as substrates, and I wanted to confirm some suspicions I had about these slides not being very flat, based on how they are made.
Does anyone have this information/an estimate of how flat they might be?
Which of these levels of processing for satellite data (COSMO-SkyMed) will be good for my study? I am ordering STRIPMAP HIMAGE. Mostly, I will be using it for DEM extraction, river cross-section extraction, land use/land cover, etc.
1. Level 0 (RAW)
2. Level 1A (SCS) - Single Look Complex Slant
3. Level 1B (MDG) - Detected Ground Multi Look
4. Level 1C (GEC) - Geocoded Ellipsoid Corrected
5. Level 1D (GTC) - Geocoded Terrain Corrected
In the NEST-DORIS software:
How do I produce three-pass interferometry (DInSAR)? I need more information than is available in the software user manual.
I am using Sentinel-1 TOPS SAR data for a Himalayan glacier. I have used the Sentinel-1 Toolbox for interferogram generation, but there is no step for phase unwrapping. SNAPHU is the only way to unwrap the phase, but I could not understand how to unwrap using SNAPHU.
Please tell me the phase unwrapping method for Sentinel-1 datasets.
I have read the recent paper by Abbott et al. about the recent discovery of gravitational waves by the LIGO experiment. I'm wondering: if gravitational waves are evidence of a change in the spacetime structure, they also affect the measurement devices (I mean everything except the arms of the interferometer). In other words, would the possible impact on these devices modify the measurement results or even create false alarms? Was that taken into account in the data processing?
Hello all,
I've come across a lot of information on several types of interferometers, but most of this information is quite scattered throughout the literature. I've been asking around to find out whether anyone knows of a good reference book on interferometry and the different types of interferometers (such as Fabry-Pérot's, Mach's, Michelson's, Bath's).
Thank you in advance,
Alcides
First of all, I want to apologize for my English; I am not a native English speaker.
This question relates to the Fabry-Perot interferometer, or rather the Fabry-Perot sensor.
In a lot of scientific papers it is mentioned that the analysis of the cavity length of the Fabry-Perot interferometer is accomplished with the help of white light interferometry. This means a broadband light source with a low coherence length is used, so the cavity length is much greater than the coherence length of the light. In my understanding, interference, and hence modulation of the light, is not possible in this case because a constant phase relation, as a premise for interference, is not given. But how is it still possible that interference occurs and the spectrum of the broadband light source is modulated?
I have added a dissertation in which a white-light-based Fabry-Perot sensor is used (page 21).
Thank you in advance!
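A sketch of the usual resolution of this point, in equation form: the interference is observed in the spectral (wavelength) domain rather than the time domain. For a low-finesse cavity of optical length nL, the reflected or transmitted spectrum of the broadband source is modulated as

$$ I(\lambda) \propto 1 + V \cos\!\left(\frac{4\pi n L}{\lambda} + \varphi_{0}\right), $$

and this channelled spectrum exists even when 2nL exceeds the source coherence length, because each narrow spectral component interferes only with itself; what is required is that the spectrometer resolve the fringe period, i.e. a resolution bandwidth finer than roughly c/(2nL).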
When we use a reflective LC SLM for a self-referenced interferogram, what are the parameters that determine the tilt angle of the SLM? I read that the tilt angle also depends on the grating separation.
I want to know whether we can create interferograms using two different SAR satellites. Must the satellites have different orbits and phases?
I am trying to set up heterodyne interferometry.
I am working on SAR interferometry with some TSX Stripmap mode images. I have several choices for "Processor Gain Attenuation", which are 0, 10 and 20 dB, but I have no idea what this value means.
Could you give me some suggestions?
Could you also tell me which software is best for TSX interferometry processing? Thank you!
Xuan
I am looking for PM HNF fibers for an f-2f interferometer to detect the offset beat of a frequency comb oscillator (frep = 80 MHz, 1560 nm). The f-2f interferometer should be built all in fiber. Furthermore, I do not yet know how the dispersion/nonlinearity profile should look in order to achieve peak powers at 1064 nm and 2128 nm within the supercontinuum. I have found companies such as Sumitomo and OFS; however, the MOQ and pricing are quite high. Are there samples or fibers from other manufacturers available which might suit this application? Many thanks! Sebastian
I heard that swept-wavelength interferometry is used for distributed sensing over a ~km dynamic range. What kind of tunable source is used for this kind of application (linewidth, bandwidth, tuning rate, etc.)?
I am trying to find an implementation of the method described in "Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry" by Mitsuo Takeda et al.
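Below is a minimal self-contained sketch of that method in Python (synthetic fringes, scikit-image for the final unwrapping; the carrier frequency and the rectangular spectral window are arbitrary choices for the demo, not values from the paper):

```python
# Minimal sketch of Takeda's Fourier-transform fringe analysis:
# isolate one carrier sideband in the 2-D spectrum, shift it to zero frequency,
# and take the angle of the inverse transform to obtain the wrapped phase.
import numpy as np
from skimage.restoration import unwrap_phase

N, fx0 = 512, 32                                   # image size, carrier (cycles/image)
y, x = np.mgrid[0:N, 0:N] / N
phi_true = 6 * np.pi * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)
fringes = 1 + np.cos(2 * np.pi * fx0 * x + phi_true)

spec = np.fft.fftshift(np.fft.fft2(fringes))
cy, cx = N // 2, N // 2 + fx0                      # location of the +1 sideband
half = fx0 // 2
window = np.zeros_like(spec)
window[cy - half:cy + half, cx - half:cx + half] = 1.0
sideband = np.roll(spec * window, -fx0, axis=1)    # move the carrier to zero frequency

phi_wrapped = np.angle(np.fft.ifft2(np.fft.ifftshift(sideband)))
phi = unwrap_phase(phi_wrapped)
phi -= phi.mean() - phi_true.mean()                # remove the arbitrary phase offset
print("rms reconstruction error:", np.sqrt(np.mean((phi - phi_true) ** 2)), "rad")
```

In practice the hard-edged rectangular window is usually replaced by a smoother one, and the carrier location is found from the spectrum itself rather than assumed.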
Can the refractive index of an aqueous solution change with increasing pressure?
And how might the pressure change affect the behavior of the thermal gradient in the aqueous medium?
laser induced dielectric breakdown
I am looking to eliminate the etalon reflections. I need to know whether there is a way to eliminate or remove the etalon reflections in the frequency domain and obtain a single pulse in the time domain.
optical temperature measurement methods
After forming an interferogram, phase filtering, and phase unwrapping, the interferogram is geocoded. Now, how can I extract displacement information from the produced interferogram?
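Assuming a differential (repeat-pass) interferogram from which the flat-earth and topographic contributions have already been removed, the unwrapped, geocoded phase converts to line-of-sight displacement as

$$ d_{\mathrm{LOS}} = -\frac{\lambda}{4\pi}\,\Delta\varphi_{\mathrm{unw}}, $$

with λ the radar wavelength (e.g. about 5.5 cm at C-band). The sign convention (motion towards or away from the sensor) varies between processors, so it is worth checking against a known deformation feature, and projecting the LOS value onto the vertical requires the local incidence angle.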
When a point source is placed near the surface of a glass plate of width 1.15 mm, I obtained the fringes shown in the attached figure. On rotating the glass slide by an angle of 10° about an axis perpendicular to the laser beam, the central bright fringe becomes dark due to the change in its optical path length. How the central fringe varies with rotation is shown in the attachment. I want to simulate in MATLAB how the fringe pattern varies with the change in the phase difference between the beams. How should I proceed? Please share your references/books/articles.
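A minimal sketch of a two-beam model (written in Python, though the same few lines port directly to MATLAB) in which an extra phase difference, standing in for the tilt of the plate, is swept so the central fringe can be watched switching between bright and dark. The quadratic spatial phase and the 633 nm wavelength are placeholders; you would substitute the exact path-difference expression for your point-source/plate geometry.

```python
# Minimal sketch (not a full model of the glass-plate geometry): two-beam
# interference where an extra phase difference `dphi`, e.g. introduced by
# tilting the plate, is swept so the central fringe can be seen switching
# between bright and dark.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5e-3, 5e-3, 2000)          # screen coordinate, m
wavelength = 633e-9                          # m (assumed HeNe)
spatial_phase = 2 * np.pi / wavelength * (x ** 2) / (2 * 0.5)   # hypothetical path difference

for dphi in [0, np.pi / 2, np.pi]:           # extra phase from the tilted plate
    intensity = 2 * (1 + np.cos(spatial_phase + dphi))
    plt.plot(x * 1e3, intensity, label=f"dphi = {dphi:.2f} rad")

plt.xlabel("x (mm)"); plt.ylabel("intensity (a.u.)"); plt.legend(); plt.show()
```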
In geodetic applications, how can one use differential interferometry for crustal displacement monitoring?
I use the NEST software from ESA (https://earth.esa.int/web/guest/home;jsessionid=9944AA626F7C6F2942DD3DC2027427BC.eodisp-prod4040)
for DInSAR processing, but I encountered some problems in the phase unwrapping of the produced interferograms.
These problems concern installing and compiling SNAPHU (http://web.stanford.edu/group/radar/softwareandlinks/sw/snaphu/)
on Linux Ubuntu.
I know of Michelson interferometry for temporal coherence and coherence length measurements, and of the Shack-Hartmann wavefront sensor for spatial coherence detection. What else do you suggest?
I want to record the interference pattern at the focal plane of the pre-defined field.
Is there any news on the timeframe for a global availability of SRTM-1 (1 Arc-Second) data?
I use a photodiode light in my setup (FTS Spectrometer). I want to align the beam for the best visibility.