Shape and deformation measurements of 3D objects using volume speckle field and phase retrieval
ABSTRACT: Shape and deformation measurement of diffusely reflecting 3D objects is important in many application areas, including quality control, nondestructive testing, and design. When rough objects are illuminated by coherent beams, the scattered light produces speckle fields. A method is described here for measuring the shape and deformation of 3D objects from sequential intensity measurements of the volume speckle field, using phase retrieval based on the angular-spectrum propagation technique. The shape of a convex spherical surface was measured directly from the calculated phase map, and a micrometer-sized deformation induced on a metal sheet was obtained by subtracting the phases corresponding to the unloaded and loaded states. Results from computer simulations confirm the experiments.
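The phase-retrieval approach described here depends on numerically propagating a complex field between axial measurement planes. A minimal angular-spectrum propagation kernel might look like the following NumPy sketch (the function name and grid parameters are illustrative assumptions, not code from the paper):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, dz):
    """Propagate a complex field over distance dz (angular-spectrum method).

    field      -- 2D complex array sampled on an n x n grid
    wavelength -- wavelength of the coherent source (m)
    dx         -- pixel pitch of the sampling grid (m)
    dz         -- propagation distance (m); may be negative
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)               # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg >= 0.0, np.exp(1j * kz * dz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

A quick sanity check on such a kernel: propagating a uniform plane wave by any distance must leave its amplitude unchanged.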
- "The radiation system (1) is the basis of all optical setups. Sources of various spectral ranges (from visible to X-ray), both monochromatic and multispectral, may appear there. The next necessary element of the setup is the investigated object (2). Nowadays it can be very diverse: transmitting or reflective; amplitude with phase noise, pure phase, amplitude-phase, or self-illuminated."
- ABSTRACT: Additional datasets enable wavefront phase retrieval by an iterative procedure. In practice, they can be obtained by varying the parameters of the phase analyzer, radiation, and registration systems of the phase-retrieval setup.
- ABSTRACT: Three-dimensional microscopy allows reconstruction of both the phase and the amplitude of object wavefronts, which in turn reveals the optical path-length profile of the object. Conventionally, 3D microscopy is achieved with two-beam interference techniques, which require careful beam alignment to obtain high-quality interference fringes and are more prone to external vibrations. Here we discuss a single-beam 3D microscopic technique. It works by sampling, at several axial planes, the volume speckle field generated by placing a diffuser in the path of the probe beam passing through the object, and computing the complex amplitude of the object wavefront using the angular-spectrum approach to scalar diffraction theory.
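A multi-plane retrieval of the kind described can be sketched as an iterative amplitude-replacement loop over the recorded intensity planes. The following NumPy sketch is a simplified illustration under assumptions of equally spaced planes and a plain Gerchberg-Saxton-style update; it is not the authors' exact algorithm:

```python
import numpy as np

def propagate(field, wavelength, dx, dz):
    """Angular-spectrum free-space propagation over distance dz."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg >= 0.0, np.exp(1j * kz * dz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_phase_retrieval(intensities, wavelength, dx, dz, iters=20):
    """Recover the complex field at plane 0 from intensities recorded
    at equally spaced axial planes (spacing dz)."""
    field = np.sqrt(intensities[0]).astype(complex)   # zero initial phase
    for _ in range(iters):
        # Forward pass: propagate plane to plane, enforcing each
        # measured amplitude while keeping the computed phase.
        for k in range(1, len(intensities)):
            field = propagate(field, wavelength, dx, dz)
            field = np.sqrt(intensities[k]) * np.exp(1j * np.angle(field))
        # Return to plane 0 and enforce its measured amplitude.
        field = propagate(field, wavelength, dx, -dz * (len(intensities) - 1))
        field = np.sqrt(intensities[0]) * np.exp(1j * np.angle(field))
    return field
```

For a uniform plane wave, every recorded intensity plane is flat and the loop trivially preserves unit amplitude, which serves as a basic consistency check.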
- ABSTRACT: We demonstrate what is, to the best of our knowledge, the first electronically controlled variable focus lens (ECVFL)-based sensor for remote object shape sensing. Using a target-illuminating laser, the axial depths of the shape features on a given object are measured by observing the intensity profile of the optical beam falling on the object surface and tuning the ECVFL focal length to form a minimum beam spot. Using a lens focal-length control calibration table, the object feature depths are computed. Transverse measurement of the dimensions of each object feature is done using a surface-flooding technique that completely illuminates a given feature. Alternately, transverse measurements can also be made by the variable spatial sampling scan technique, where, depending on the feature sizes, the spatial sampling spot beam size is controlled using the ECVFL. A proof-of-concept sensor is demonstrated using an optical beam from a laser source operating at a power of 10 mW and a wavelength of 633 nm. A three-dimensional (3D) test object constructed from LEGO building blocks has three mini-skyscraper structures labeled A, B, and C. The (x, y, z) dimensions for A, B, and C are (8 mm, 8 mm, 124.84 mm), (24.2 mm, 24.2 mm, 38.5 mm), and (15.86 mm, 15.86 mm, 86.74 mm), respectively. The experimentally measured (x, y, z) dimensions for A, B, and C are (7.95 mm, 7.95 mm, 120 mm), (24.1 mm, 24.1 mm, 37 mm), and (15.8 mm, 15.8 mm, 85 mm), respectively. The average shape-sensor transverse measurement percentage errors for A, B, and C are +/-0.625%, +/-0.41%, and +/-0.38%, respectively; the average axial measurement percentage errors are +/-4.03%, +/-3.9%, and +/-2.01%, respectively. Applications for the proposed shape sensor include machine parts inspection, 3D object reconstruction, and animation.
Applied Optics 03/2010; 49(7):1139-50. DOI:10.1364/AO.49.001139