APPLIED OPTICS

2017 © The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. Distributed under a Creative Commons Attribution NonCommercial License 4.0 (CC BY-NC).
3D-printed eagle eye: Compound microlens system for foveated imaging

Simon Thiele,¹* Kathrin Arzenbacher,¹ Timo Gissibl,² Harald Giessen,² Alois M. Herkommer¹
We present a highly miniaturized camera, mimicking the natural vision of predators, by 3D-printing different multi-lens objectives directly onto a complementary metal-oxide semiconductor (CMOS) image sensor. Our system combines four printed doublet lenses with different focal lengths (equivalent to f = 31 to 123 mm for a 35-mm film) in a 2 × 2 arrangement to achieve a full field of view of 70° with an angular resolution that increases toward the center of the image, reaching up to 2 cycles/deg. The footprint of the optics on the chip is below 300 μm × 300 μm, whereas their height is less than 200 μm. Because the four lenses are printed in a single step without the need for any further assembly or alignment, this approach allows for fast design iterations and can lead to a plethora of different miniaturized multiaperture imaging systems with applications in fields such as endoscopy, optical metrology, optical sensing, surveillance drones, or security.
INTRODUCTION
Direct three-dimensional (3D) printing of micro-optical and nano-
optical components by femtosecond direct laser writing has recently rev-
olutionized the field of micro-optics. While the first publications covered
single optical components, such as microlenses (1), diffraction gratings
(2), waveguides (3), free-form surfaces (4,5), phase masks (6), or pho-
tonic crystals (7), the field has considerably developed in terms of com-
plexity. Hybrid refractive-diffractive components (8), hybrid free-form
lenses that are directly printed onto optical fibers (9), or even multi-
component optical systems (10,11) exhibit a huge potential for many
fields of application. The main advantages compared to traditional fab-
rication methods, such as microprecision machining (12), nanoimprint
lithography, or traditional wafer-level approaches [such as liquid phase
lens fabrication (13)], are an almost unrestricted design freedom, one-
step fabrication without the necessity for subsequent assembly and
alignment, and the flexibility to write on arbitrary substrates. Gissibl
et al. (11) were the first to demonstrate multicomponent objective lenses
directly printed onto a complementary metal-oxide semiconductor
(CMOS) imaging sensor in a single step. The aim of this work is to
further unleash the potential of this technology by creating a multiaper-
ture foveated imaging system.
Foveated vision systems are very common in the animal world,
especially among predators (14). Many tasks do not require an equal distribution of spatial resolution over the field of view (FOV); instead, only the central area near the optical axis needs to exhibit the highest resolution. The same holds for humans, where the cones in the eye are concentrated at the so-called fovea, which gives the highest acuity in vision, hence the term “foveated imaging.” In eagles, this physiological effect is particularly pronounced, which is why foveated imaging is also sometimes called “eagle eye vision.” Similarly, technical applications, such
as drone cameras, robotic vision, vision sensors for autonomous cars,
or other movable systems, benefit from a higher resolution at the
center of their FOV. Various publications present foveated imaging
systems for different purposes (15–20).
Multiaperture miniaturized cameras are known for their flat design
and usually use stitching of the FOV to combine the small subimages
created by a specially designed microlens array into one larger image
(21,22). Foveated systems are of particular relevance if their bandwidth
is limited by the size of detector pixels or by the readout time (23). In
these cases, an optimum distribution of object space features on the
limited spatial or temporal bandwidth becomes essential. Because of
the high quality of our microimaging systems, optical performance
and possible miniaturization are mainly limited by the size of the de-
tector pixels. Therefore, we developed a multiaperture design that com-
bines four aberration-corrected air-spaced doublet objectives with
different focal lengths and a common focal plane that is situated on a
CMOS image sensor. Particularly beneficial is the ability to create
aspherical free-form surfaces, which are heavily used in the lens design.
Figure 1 shows a sketch of the setup and how the pixel data are sub-
sequently fused to form the foveated image. Each objective lens creates
an image of the same lateral size on the chip. However, because of the
varying FOVs, the telephoto lens (20° FOV) magnifies a small central
section of the object field of the 70° lens. Appropriate scaling and over-
laying of the images thus lead to a foveated image with increasing object
space resolution toward the center of the image. To resolve features with
a certain spatial frequency in object space, the Nyquist limit requires at
least twice the spatial frequency for the detector pixels in image space.
In the case of the central telephoto lens, the same number of pixels
covers only ~29% of the overall FOV. This means that the Nyquist fre-
quency in the center is increased by a factor of ~3.5, allowing the res-
olution of significantly smaller features. This is indicated by the different
pixel sizes in Fig. 1B.
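The scaling behind these numbers can be made explicit. The following sketch (a minimal illustration in Python; variable names are ours) reproduces the ~29% angular coverage of the telephoto lens and the ~3.5× gain in object-space Nyquist frequency:

```python
# Angular coverage and Nyquist gain of the telephoto lens relative to the
# wide-angle lens. Both lenses project onto image circles with the same
# number of pixels, so the object-space Nyquist frequency scales inversely
# with the covered field of view.
fov_wide_deg = 70.0   # full FOV of the wide-angle lens
fov_tele_deg = 20.0   # full FOV of the central telephoto lens

coverage = fov_tele_deg / fov_wide_deg       # fraction of the overall FOV (linear)
nyquist_gain = fov_wide_deg / fov_tele_deg   # increase of object-space Nyquist frequency

print(f"telephoto covers {coverage:.0%} of the overall FOV")              # ~29%
print(f"object-space Nyquist frequency increased by {nyquist_gain:.1f}x") # ~3.5x
```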
RESULTS AND DISCUSSION
Figure 2 shows the image sensor after 3D-printing multiple groups of
the foveated imaging systems directly on the chip. When compared to
the scanning electron microscope (SEM) micrograph in Fig. 1C, the
surfaces of the lenses appear less smooth and exhibit ridges on the
surfaces. This can be partly explained by the depth of field stitching
of the digital microscope, leading to artifacts. Furthermore, back re-
flections and interference fringes from inner surfaces become visible,
which is not the case for SEM images. The ridges lead to an increased
amount of stray light and reduce the overall contrast of the images.
Further details about the surface quality can be found in Gissibl et al.
(9,11). To assess the optical performance without the influence of the
chip, we characterized the lenses on a glass slide beforehand.
¹Institute of Applied Optics and Research Center SCoPE, University of Stuttgart, 70569 Stuttgart, Germany. ²4th Physics Institute and Research Center SCoPE, University of Stuttgart, 70569 Stuttgart, Germany.
*Corresponding author. Email: thiele@ito.uni-stuttgart.de
Figure 3 compares the normalized on-axis modulation transfer
function (MTF) contrast as a function of angular resolution in object
space for the four different lenses after measurement with the knife
edge method. As expected, systems with longer focal lengths and
smaller FOVs exhibit higher object space contrast at higher resolutions
because of their telephoto zoom factor. Compared to the design MTF,
which is diffraction-limited or close to the diffraction limit, there is a sig-
nificant loss in contrast, which can be explained by imperfect manu-
facturing. The dashed vertical lines indicate the theoretical resolution
limit due to the pixel pitch of the imaging sensor. All systems deliver
more than 10% of contrast at the physical limits of the sensor, which
means that the resolution is limited by pixel pitch. More specifically,
the pixel response function strongly suppresses all spatial frequencies
above the ones indicated by the vertical lines. Because of the varying magnification, aliasing limits the achievable angular resolution of the wide-FOV systems. Because the lenses for the outer zones of the foveated image are used off-axis, less MTF contrast has to be
expected, and imaging might not be pixel-limited in every case.

Fig. 2. 3D-printed four-lens systems on the chip. (A) CMOS image sensor with compound lenses directly printed onto the chip. The change in color on the sensor surface results from scratching off functional layers, such as the lenslet array and the color filters. (B) Detail of one lens group with four different FOVs for foveated imaging forming one camera. The combined footprint is less than 300 μm × 300 μm.

Fig. 1. Working principle of the 3D-printed foveated imaging system. (A) System of four different compound lenses on the same CMOS image sensor, combining different FOVs in one single system. The lenses exhibit 35-mm-equivalent focal lengths of f = 31, 38, 60, and 123 mm. (B) Exemplary fusion of the pixelized object space content to create the foveated image. (C) SEM image of a 3D-printed doublet lens. The individual free-form surfaces with higher-order aspherical corrections are clearly visible. (D) Light microscope image of the 60° FOV compound lens.

However, some field-dependent aberrations, such as field curvature, can be
minimized by readjusting the back focal distance for each of the
doublet lenses individually. Off-axis MTF measurements of very sim-
ilar lenses can be found in Gissibl et al. (11). MTF curves for the systems with FOVs of 40° to 70° show a similar optical performance in object space as a result of their more similar f-numbers (Table 1) and
geometries. Nonetheless, using different lenses is still justified as long
as the resolution is pixel-limited and a redistribution of sensor spatial
bandwidth is desired.
Figure 4 shows the simulated and measured results for four cases.
Figure 4A exhibits a comparison between measured and simulated im-
aging performance of lens 1 with a 70° FOV printed on a glass slide and
imaged through a microscope. Because the simulated results do not include surface imperfections and scattering, the most striking difference is the smaller overall contrast of the measured image. Figure 4B compares the foveated images of
lenses 1 to 4. Both results exhibit visible improvement in resolved
features for the central part. If the pixelation effects of the image sensor
are taken into account, all images will lose resolution. Figure 4C shows
the simulation results of lens 1 if a pixel size of 1.4 μm is assumed and
compares it to the data attained on the chip. Because the chip does not
perfectly record the image, there is a notable difference in quality if it is
compared to the imaging on a glass slide. This effect can be explained
by the chip plane being not perfectly aligned with the focal plane and by
the fact that the microlens array on the chip, which is important for the
imaging performance, was removed before 3D printing. After creating
the foveated images (Fig. 4D), the imaging resolution is considerably
increased toward the center of the images. Although the measured
image does not completely achieve the quality of the simulated one, a
significant improvement is visible. By using more advanced image
fusion algorithms, it will be possible to further improve the image
quality and increase the resolution toward the center in a smoothly
varying fashion. However, the implementation of these approaches
is beyond the scope of this paper.
To further demonstrate the potential of our approach, we used the
test image “Lena” (24) and a Siemens star as targets (Fig. 4, E and F).
In both examples, the center of the original images contains more
details than the outer parts. Using our foveated approach, the total available bandwidth is distributed such that the important parts (center) are transferred with a higher spatial bandwidth (resolution)
compared to the less important outer areas in the pictures. The results
confirm markedly that our multiaperture system delivers superior
images. The advantages of a foveated system (Fig. 4I) are further shown
in a simulated comparison of the four-lens system with a single-lens
reference (Fig. 4H) having the same image footprint and FOV (70°).
While the single-lens system is bulkier and requires more time for fab-
rication, its resolution in the center of the FOV is considerably lower
than that for the foveated image (Fig. 4G). Although the numbers “2” and “3” are readable in the center of Fig. 4D, they are not resolvable in
the nonfoveated case.
CONCLUSION
Our work demonstrates for the first time direct 3D printing of various complex multicomponent imaging systems onto a chip to form a mul-
tiaperture camera. We combine four different air-spaced doublet
lenses to obtain a foveated imaging system with an FOV of 70° and
angular resolutions of >2 cycles/deg in the center of the image. At the
moment, the chip dimensions and pixel size limit the overall system dimensions and the optical performance, whereas future devices can become smaller than 300 μm × 300 μm × 200 μm in volume
and, at the same time, transfer images with higher resolution. The
method thus enables considerably smaller imaging systems as compared
to the state of the art. To the best of our knowledge, there are no fab-
rication methods available that can beat this approach in terms of min-
iaturization, functionality, and imaging quality. Further improvements would include antireflection measures on the lenses, either by coatings or by nanostructuring; the use of triplets or more lens elements for aber-
ration correction; and the inclusion of absorbing aperture stops.
With fabrication times of 1 to 2 hours for one objective lens, cheap
high-volume manufacturing is difficult at the moment. However,
printing only the shell and a lamellar supporting frame, followed by direct ultraviolet curing (8), can reduce the fabrication time. In addition, par-
allelization of the printing process can help to scale up the fabrication
volumes. Finally, in some applications, such as endoscopy, functionality and optical performance matter more than high-throughput manufacturing. Another problem that is especially prevalent
in multiaperture designs is the suppression of light from undesired
angles, reducing overall contrast and potentially leading to ghost images.
Fig. 3. Design and measurement of normalized MTF contrast in object space
as a function of angular resolution. The data do not include the transfer
function of the CMOS image sensor and are obtained by knife edge MTF mea-
surements of the samples printed on a glass slide. The dashed vertical lines indi-
cate the cutoff spatial frequency of the pixel response function above which the
imaging resolution is strongly suppressed. The dashed horizontal line marks the
10% contrast limit, which was used as the criterion for resolvability in this work.
Table 1. Selected parameters of the designed lens systems.

                                              Lens 1   Lens 2   Lens 3   Lens 4
FOV (°)                                           70       60       40       20
Visible object diameter at 1 m distance (m)     2.75     1.73     0.84     0.36
Focal length (μm)                               64.6     78.3    123.9    252.2
f-number                                         0.7      0.8      1.2      2.6
Hyperfocal distance (mm)                         6.4      7.2      8.1      7.3
Fresnel number at λ = 550 nm                      70       58     36.7       18
35-mm equivalent focal length (mm)                31       38       60      123
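The Fresnel numbers listed in Table 1 can be reproduced from the aperture and focal-length values, assuming the common definition N = a²/(λf) with aperture radius a (our assumption; the paper does not state the formula explicitly). A minimal sketch:

```python
# Reproduce the Fresnel numbers of Table 1 from the design parameters.
# Assumes N = a^2 / (lambda * f) with aperture radius a = 50 um
# (aperture diameter 100 um, see Materials and Methods) and lambda = 550 nm.
wavelength_m = 550e-9
aperture_radius_m = 50e-6
focal_lengths_m = {"Lens 1": 64.6e-6, "Lens 2": 78.3e-6,
                   "Lens 3": 123.9e-6, "Lens 4": 252.2e-6}

for name, f in focal_lengths_m.items():
    fresnel = aperture_radius_m**2 / (wavelength_m * f)
    print(f"{name}: N = {fresnel:.1f}")   # ~70, 58, 36.7, 18, as in Table 1
```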
Within the all-transparent material system, special structures could be
designed to guide light into uncritical directions by refraction or total
internal reflection. However, it would be more favorable to use real ab-
sorptive structures. These could be created, for example, by filling predefined cavities with black ink or by directly depositing black metal.
As an example of how this kind of device could be used to outperform a conventional camera of similar size, consider a small microdrone that, similar to an insect, would be capable of transmitting a wide-field overview and, at the same time, offering detailed imaging of a certain region of interest. Similarly, an application
in capsule endoscopy with directed vision or as a movable vision sen-
sor on a robotic arm offers a broad range of opportunities.
MATERIALS AND METHODS
The 3D-printing technology used is almost unrestricted in terms of
fabrication limitations. It offers high degrees of freedom and unique
opportunities for the optical design. However, finding the optimum
system can become more difficult because the parameter space is
much less constrained as compared to many classical design problems.
Because of the mature one-step fabrication process, the development challenges are, in comparison to competing manufacturing methods, shifted from the technology toward the optical design.
To ensure an efficient use of the available space, we designed four
different two-lens systems with full FOVs of 70°, 60°, 40°, and 20°. The
numbers were chosen based on the achievable performance in previ-
ous experiments and such that each lens contributed to the foveated
image with similarly sized sections of the object space. Table 1 shows
an overview of the resulting parameters. Because the lens stacks and
support materials were all fully transparent, it was important to keep
the aperture stop on the front surface during design. Otherwise, light
refracted and reflected by the support structures would negatively in-
fluence the imaging performance. Buried apertures inside the lenses
were not possible until now because absorptive layers could not be
implemented by femtosecond 3D printing. Because of the scaling laws
of optical systems (25,26), small f-numbers could be easily achieved. The aperture diameter was 100 μm for all lenses. As a restriction, the image circle diameter was set to 90 μm.
Before simulation and optimization, it is important to determine
the best-suited method. The Fresnel numbers of all systems indicate
that diffraction does not significantly influence the simulation results
(Table 1). Therefore, geometric optics and standard ray tracing can be
used to design the different lenses. We used the commercial ray tracing
software ZEMAX. Because the fabrication method posed no restric-
tions for the surface shape, aspheric interfaces up to the 10th order were used. As a refractive medium, the photoresist IP-S of the company Nanoscribe GmbH was modeled based on previously measured dispersion data. After local and global optimization, the
resulting designs revealed a diffraction-limited performance (Strehl
ratio >0.8) for most of the lenses and field angles.

Fig. 4. Comparison of simulation and measurement for the foveated imaging systems. (A) Imaging through a single compound lens with a 70° FOV. (B) Foveated images for four different lenses with FOVs of 20°, 40°, 60°, and 70°. The measurements for (A) and (B) were carried out on a glass substrate. (C) Same as (A) but simulated and measured on the CMOS image sensor with a pixel size of 1.4 μm × 1.4 μm. (D) Foveated results from the CMOS image sensor: comparison of the 70° FOV image with its foveated equivalent after 3D-printing on the chip. (E) Measured comparison of the test picture “Lena.” (F) Measured imaging performance for a Siemens star test target. (G) Simulated image for a single-lens reference with an image footprint comparable to the foveated system. (H and I) Geometry of the reference lens and the foveated system at the same scale.

The ray tracing design was finalized polychromatically with direct MTF optimization,
which includes diffraction effects at the apertures. Compared to con-
ventional single-interface microlenses, the close stacking of two elements
offered significant advantages and was crucial for the imaging per-
formance. On the one hand, pupil positions and focal lengths can be
changed independently, which allows for real telephoto and retrofocus
systems. On the other hand, aberrations such as field curvature, astig-
matism, spherical aberration, and distortion can be corrected effectively.
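For reference, the aspheric interfaces mentioned above (up to the 10th order) are commonly described by the even-asphere sag equation; the explicit form below is our assumption of the conventional parameterization, which the paper does not reproduce:

$$ z(r) = \frac{c\,r^{2}}{1+\sqrt{1-(1+k)\,c^{2}r^{2}}} + \alpha_{4}r^{4} + \alpha_{6}r^{6} + \alpha_{8}r^{8} + \alpha_{10}r^{10}, $$

where z is the surface sag, r the radial coordinate, c the vertex curvature, k the conic constant, and α₄ to α₁₀ the higher-order aspheric coefficients up to the 10th order.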
After the optical design, the final results were transferred to computer-
aided design software. In terms of support structure design, it was im-
portant to find a good trade-off between rigidity and later developability
of the inner surfaces. So far, the best results have been achieved with open designs based on pillars, as shown in Fig. 1D. All of the lens fixtures had an outer diameter of 120 μm.
The fabrication process itself was described in detail by Gissibl et al.
(9). Figure 5 shows the different stages of the development process. To
measure the imaging performance, samples had been 3D-printed onto
glass substrates as well as onto a CMOS imaging sensor (Omnivision
5647). This chip offers a pixel pitch of 1.4 mm, which resulted in single
images with ~3240 pixels. Using a state-of-the-art sensor with a 1.12-mm
pixel pitch would increase this number to ~5071 pixels. To improve
the adhesion of the lenses, the color filter and microlens array on the
sensor had to be removed before 3D-printing. Figure 2 shows the sen-
sor with nine groups of the same four objectives.

Fig. 5. Development cycle of different lens systems. FOVs varying between 20° and 70°. The process chain can be separated into optical design, mechanical design, 3D printing, and measurement of the imaging performance using a USAF 1951 test target (top to bottom).

Each group forms its own foveated camera and occupies a surface area of less than 300 μm × 300 μm. The filling factor of ~0.5 still offers room for improvement,
although, in principle, it is possible to design the system such that the
four separate lenses are closely merged into one single object, which is
then 3D-printed in one single step.
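The single-image pixel counts quoted above can be checked with a one-line estimate, assuming they refer to the number of pixels inside the 90-μm image circle (our reading; the text quotes ~3240 and ~5071):

```python
import math

# Number of sensor pixels inside the 90-um image circle for two pixel pitches.
image_circle_diameter_um = 90.0
for pitch_um in (1.4, 1.12):
    n_pixels = math.pi * (image_circle_diameter_um / 2) ** 2 / pitch_um ** 2
    print(f"pitch {pitch_um} um: ~{n_pixels:.0f} pixels")  # ~3246 and ~5071
```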
To characterize the optical performance without pixelation effects,
we printed the four different compound lenses onto glass slides. Be-
cause the lenses were designed for imaging from infinity and their focal lengths were smaller than 260 μm, the hyperfocal distance was about
8 mm (Table 1) and objects from half this distance to infinity remained
focused. To assess the imaging quality, we reimaged the intermediate
image formed by the lenses with an aberration-corrected microscope.
Measurements of the MTF based on imaging a knife edge were per-
formed in the same way as previously described (11).
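The knife-edge evaluation follows the usual chain from edge spread function to line spread function to MTF. The exact processing is described in (11); the following is only a generic sketch (NumPy; the vertical-edge assumption and row averaging are our simplifications):

```python
import numpy as np

def mtf_from_knife_edge(edge_image: np.ndarray) -> np.ndarray:
    """Estimate the MTF from an image of a knife edge.

    The edge is assumed to run vertically, so averaging along axis 0 gives
    the edge spread function (ESF); its derivative is the line spread
    function (LSF), and the magnitude of the Fourier transform of the LSF,
    normalized to its zero-frequency value, is the MTF.
    """
    esf = edge_image.mean(axis=0)    # edge spread function
    lsf = np.gradient(esf)           # line spread function
    mtf = np.abs(np.fft.rfft(lsf))   # one-sided spectrum
    return mtf / mtf[0]              # normalize to unity at zero frequency
```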
The foveated camera performance was evaluated after 3D-printing
on the image chip. The sensor device was placed at a distance of
70 mm from a target, which consisted of different patterns printed
onto white paper. The target was illuminated from the backside with
an incoherent white light source. The image data from the chip were
then read out directly. It has to be noted that the chip and the readout
software automatically performed some operations with the images,
such as color balance or base contrast adjustment. However, there
was no edge enhancement algorithm used that would skew the
displayed results. Because of their different f-numbers, all lenses led
to a different image brightness. To compensate for this effect, we
adjusted the illumination lens such that approximately the same
optical power was transferred to the image for all four lenses.
The images were then separated manually with external software.
For proof-of-principle purposes, it was not necessary to fully automate
the image fusion process and thus stitching of the partial pictures was
performed manually.
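For illustration, the basic scale-and-overlay step behind this fusion can be sketched as follows. This is not the procedure used in this work (which was manual); it assumes a linear angle-to-position mapping, ignores distortion, and uses OpenCV only for resampling, with variable names of our choosing:

```python
import numpy as np
import cv2  # used only for resizing; any resampling routine would work

def fuse_foveated(sub_images: dict) -> np.ndarray:
    """Fuse sub-images of different FOVs into one foveated image.

    sub_images maps full FOV in degrees to an image array of identical size,
    e.g. {70: img70, 60: img60, 40: img40, 20: img20}. Narrower-FOV
    (higher-magnification) images overwrite the center with finer detail.
    """
    fovs = sorted(sub_images, reverse=True)      # widest first
    fov_max, fov_min = fovs[0], fovs[-1]
    h, w = sub_images[fov_max].shape[:2]

    # Canvas sized so the narrowest-FOV image keeps its native resolution.
    ch, cw = int(h * fov_max / fov_min), int(w * fov_max / fov_min)
    canvas = np.zeros((ch, cw) + sub_images[fov_max].shape[2:],
                      dtype=sub_images[fov_max].dtype)

    for fov in fovs:
        tw, th = int(cw * fov / fov_max), int(ch * fov / fov_max)
        scaled = cv2.resize(sub_images[fov], (tw, th),
                            interpolation=cv2.INTER_LINEAR)
        y0, x0 = (ch - th) // 2, (cw - tw) // 2
        canvas[y0:y0 + th, x0:x0 + tw] = scaled   # paste centered
    return canvas
```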
REFERENCES AND NOTES
1. R. Guo, S. Xiao, X. Zhai, J. Li, A. Xia, W. Huang, Microlens fabrication by means of femtosecond two photon photopolymerization. Opt. Express 14, 810–816 (2006).
2. R. Winfield, B. Bhuian, S. C. G. O’Brien, G. M. Crean, Fabrication of grating structures by
simultaneous multi-spot fs laser writing. Appl. Surf. Sci. 253, 8086–8090 (2007).
3. N. Lindenmann, G. Balthasar, D. Hillerkuss, R. Schmogrow, M. Jordan, J. Leuthold,
W. Freude, C. Koos, Photonic wire bonding: A novel concept for chip-scale interconnects.
Opt. Express 20, 17667–17677 (2012).
4. S. Thiele, T. Gissibl, H. Giessen, A. M. Herkommer, Ultra-compact on-chip LED collimation
optics by 3D femtosecond direct laser writing. Opt. Lett. 41, 3029–3032 (2016).
5. C. Liberale, G. Cojoc, P. Candeloro, G. Das, F. Gentile, F. De Angelis, E. Di Fabrizio, Micro-
optics fabrication on top of optical fibers using two-photon lithography. IEEE Photonics
Technol. Lett. 22, 474–476 (2010).
6. T. Gissibl, M. Schmidt, H. Giessen, Spatial beam intensity shaping using phase masks on
single-mode optical fibers fabricated by femtosecond direct laser writing. Optica 3,
448–451 (2016).
7. M. Deubel, G. von Freymann, M. Wegener, S. Pereira, K. Busch, C. M. Soukoulis, Direct laser
writing of three-dimensional photonic-crystal templates for telecommunications. Nat.
Mater. 3, 444–447 (2004).
8. M. Malinauskas, A. Žukauskas, V. Purlys, K. Belazaras, A. Momot, D. Paipulas, R. Gadonas,
A. Piskarskas, H. Gilbergs, A. Gaidukevičiūtė, I. Sakellari, M. Farsari, S. Juodkazis,
Femtosecond laser polymerization of hybrid/integrated micro-optical elements and their
characterization. J. Opt. 10, 124010 (2010).
9. T. Gissibl, S. Thiele, A. M. Herkommer, H. Giessen, Sub-micrometre accurate free-form optics by three-dimensional printing on single-mode fibres. Nat. Commun. 7, 11763 (2016).
10. A. Žukauskas, M. Malinauskas, E. Brasselet, Monolithic generators of pseudo-nondiffracting
optical vortex beams at the microscale. Appl. Phys. Lett. 103, 181122 (2013).
11. T. Gissibl, S. Thiele, A. M. Herkommer, H. Giessen, Two-photon direct laser writing of
ultracompact multi-lens objectives. Nat. Photon. 10, 554–560 (2016).
12. L. Li, A. Y. Yi, Design and fabrication of a freeform microlens array for a compact large-field-of-view compound-eye camera. Appl. Opt. 51, 1843–1852 (2012).
13. W. Mönch, H. Zappe, Fabrication and testing of micro-lens arrays by all-liquid techniques.
J. Opt. 6, 330–337 (2004).
14. R. Navarro, Darwin and the eye. J. Optom. 2, 59 (2009).
15. Y. Qin, H. Hua, M. Nguyen, Multiresolution foveated laparoscope with high resolvability. Opt. Lett. 38, 2191–2193 (2013).
16. H. Hua, S. Liu, Dual-sensor foveated imaging system. Appl. Opt. 47, 317–327
(2008).
17. A. Ude, C. Gaskett, G. Cheng, Foveated vision systems with two cameras per eye, in
Proceedings of the IEEE International Conference on Robotics and Automation (IEEE,
2006), pp. 3457–3462.
18. T. R. Hillman, T. Gutzler, S. A. Alexandrov, D. D. Sampson, High-resolution, wide-field
object reconstruction with synthetic aperture Fourier holographic optical microscopy.
Opt. Express 17, 7873–7892 (2009).
19. G. Carles, S. Chen, N. Bustin, J. Downing, D. McCall, A. Wood, A. R. Harvey, Multi-aperture foveated imaging. Opt. Lett. 41, 1869–1872 (2016).
20. G. Y. Belay, H. Ottevaere, Y. Meuret, M. Vervaeke, J. V. Erps, H. Thienpont,
Demonstration of a multichannel, multiresolution imaging system. Appl. Opt. 52,
6081–6089 (2013).
21. A. Brückner, R. Leitel, A. Oberdörster, P. Dannberg, F. Wippermann, A. Bräuer,
Multi-aperture optics for wafer-level cameras. J. Micro-Nanolith. Mem. 10, 043010
(2011).
22. A. Brückner, A. Oberdörster, J. Dunkel, A. Reinmann, M. Müller, F. Wippermann, Ultra-thin wafer-level camera with 720p resolution using micro-optics. SPIE Proc. 9193, 91930W (2014).
23. W. S. Geisler, J. S. Perry, Real-time foveated multiresolution system for low-bandwidth
video communication. SPIE Proc. 3299, 294 (1998).
24. Signal and Image Processing Institute, University of Southern California [online]; http://
sipi.usc.edu/database/database.php?volume=misc&image=12#top.
25. A. Lohmann, Scaling laws for lens systems. Appl. Opt. 28, 4996–4998 (1989).
26. D. J. Brady, N. Hagen, Multiscale lens design. Opt. Express 17, 10659–10674 (2009).
Acknowledgments: We thank Nanoscribe GmbH for assistance and for supplying
photopolymers. Further thanks go to M. Totzeck from Carl Zeiss AG for fruitful discussions.
Funding: This work was supported by Baden-Württemberg Stiftung, the European Research
Council (Complexplas), Deutsche Forschungsgemeinschaft, Bundesministerium für Bildung
und Forschung (PRINTOPTICS), and Carl Zeiss Stiftung. Author contributions: S.T., H.G.,
and A.M.H. developed the concept and supervised the research. K.A. designed the lenses
and carried out experimental investigations. T.G. fabricated the optical systems and recorded
the SEM images. S.T. wrote the manuscript together with H.G. and performed imaging
measurements. Competing interests: Some of the results presented in the paper have been
filed as a patent (PCT/EP2016/001721). Data and materials availability: All data needed to
evaluate the conclusions in the paper are present in the paper. Additional data related to this paper
may be requested from the authors.
Submitted 27 October 2016
Accepted 9 January 2017
Published 15 February 2017
10.1126/sciadv.1602655
Citation: S. Thiele, K. Arzenbacher, T. Gissibl, H. Giessen, A. M. Herkommer, 3D-printed eagle
eye: Compound microlens system for foveated imaging. Sci. Adv. 3, e1602655 (2017).