COMMENT
OPEN
Meta smart glasses—large language models and the future for
assistive glasses for individuals with vision impairments
Ethan Waisberg 1 ✉, Joshua Ong 2, Mouayad Masalkhi 3, Nasif Zaman 4, Prithul Sarker 4, Andrew G. Lee 5,6,7,8,9,10,11,12 and Alireza Tavakkoli 4
© The Author(s) 2023
Eye; https://doi.org/10.1038/s41433-023-02842-z
INTRODUCTION
In late September 2023, Meta unveiled its second generation of
smart glasses in collaboration with Ray-Ban [1]. These smart
glasses come with several improvements over the previous generation, including enhanced audio and cameras and a lighter design. The glasses are
equipped with an ultra-wide 12 megapixel camera and immersive
audio recording capabilities, allowing users to capture moments
with a high level of detail and depth (Fig. 1) [1, 2]. These smart
glasses are part of Meta’s efforts to develop AR and VR
technologies. In addition, the glasses are equipped with AI-
powered assistants like Meta AI [1].
Ray-Ban Meta smart glasses also represent a promising
development in assistive technology for individuals with visual
impairments and have the potential to significantly enhance their
quality of life. The field of assistive technology has been
advancing rapidly in recent years, particularly due to significant
advances in artificial intelligence [3] and augmented reality [4].
Envision is currently one of the leading smart glasses developers,
and their technology converts visual information into speech for individuals with vision impairments. A recent update added GPT-4 integration [5], allowing users to ask the glasses specific questions, such as requesting a summary of a document or having only the vegan items on a menu read aloud. Future updates will further increase the usefulness of this integration [6].
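To make this kind of interaction concrete, the minimal sketch below shows how text recognized by camera glasses could be combined with a spoken question and sent to a large language model. It is an illustration only, not Envision's actual implementation: it assumes an optical character recognition step has already produced the `menu_text` string, and it uses the OpenAI Python client with the `gpt-4` model as a stand-in for the GPT-4 integration described above.

```python
# Illustrative sketch only: not Envision's actual pipeline.
# Assumes `menu_text` was already extracted from a camera frame by OCR,
# and that an OpenAI API key is configured in the environment.
from openai import OpenAI

client = OpenAI()

def answer_about_scene(ocr_text: str, user_question: str) -> str:
    """Send recognized text plus a spoken question to an LLM and return the reply."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You assist a blind user. Answer briefly, using only the provided text."},
            {"role": "user",
             "content": f"Text seen by the glasses:\n{ocr_text}\n\nQuestion: {user_question}"},
        ],
    )
    return response.choices[0].message.content

# Example: read only the vegan items from a menu captured by the glasses.
menu_text = "Margherita pizza ... Lentil soup (vegan) ... Beef burger ..."
print(answer_about_scene(menu_text, "Which items on this menu are vegan?"))
```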
The Envision smart glasses are built on the Google Glass
Enterprise Edition 2 (now discontinued), and the high price of
the Google smart glasses likely posed a barrier to the adoption of this helpful technology by individuals with vision impairment. Lowering the cost of assistive technologies is essential, as previous research in the UK found a staggeringly low employment rate of 26% among blind and partially sighted working-age individuals [7].
As Meta works to make smart glasses a mainstream technology, their cost is likely to continue to decrease in the coming years. The advanced camera technology can provide real-time image processing, while the built-in AI can recognize objects and convert this visual information into speech [1].
An update planned within the next year is expected to allow users to
ask Meta AI questions about what they are looking at. Users can
potentially interact with these assistants to receive auditory
information about their environment, read text aloud, recognize
faces, or get directions, which can be invaluable for individuals with
visual impairments (Fig. 2). Future incorporation of GPS navigation with audio cues could facilitate independent navigation of unfamiliar environments for individuals with visual impairments. Previous research in the UK showed that nearly 40% of blind and partially sighted individuals are currently unable to complete all of the journeys that they need or wish to make [7]. Better accessibility through the use of smart glasses can lead to greater independence for individuals with vision impairments.
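As a rough illustration of the object-recognition-to-speech pipeline described above, the sketch below detects objects in a single saved camera frame and speaks a short summary aloud. It is not Meta's on-device implementation: it assumes a frame has been saved to a placeholder file, `frame_from_glasses.jpg`, and substitutes an off-the-shelf detector (Ultralytics YOLOv8) and offline text-to-speech (pyttsx3) for the proprietary recognition and voice components.

```python
# Illustrative sketch only: not Meta's on-device pipeline.
# The frame path is a placeholder; detector and TTS engine are generic stand-ins.
import cv2
import pyttsx3
from ultralytics import YOLO

def describe_frame(image_path: str) -> None:
    """Detect objects in a camera frame and speak a short summary aloud."""
    model = YOLO("yolov8n.pt")            # small pretrained detector
    frame = cv2.imread(image_path)
    results = model(frame)[0]

    # Collect the unique object labels found in the frame.
    labels = {model.names[int(cls)] for cls in results.boxes.cls}
    summary = ("I can see " + ", ".join(sorted(labels)) + "."
               if labels else "I do not recognize any objects.")

    # Convert the summary to speech for a user who cannot read the screen.
    engine = pyttsx3.init()
    engine.say(summary)
    engine.runAndWait()

describe_frame("frame_from_glasses.jpg")
```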
Meta hopes to incorporate augmented reality in future versions
of smart glasses, and describes the current stage as a stepping
stone to true augmented reality. Users with vision impairments
would benefit greatly from true augmented reality glasses, with
potential features like magnification, contrast enhancement, and
color correction, enhancing their ability to see and navigate their
surroundings more effectively. Meta’s future augmented reality
work will be compared to the Apple Vision Pro, which is also
looking to make mixed reality devices mainstream [8, 9]. Further
research will also be required to minimize the variability between
different VR/AR devices prior to clinical use [10]. We look
forward to continued advances in augmented reality with AI
integration, and believe this technology can revolutionize how
individuals with vision impairments interact with the world.
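To illustrate the kind of low-vision support such glasses could offer, the sketch below applies two of the features mentioned above, magnification and contrast enhancement, to a single camera frame using OpenCV. It is a conceptual example only: the file names are placeholders, and no current AR headset exposes exactly this processing path.

```python
# Illustrative sketch only: possible low-vision enhancements (magnification and
# contrast boosting) applied to one camera frame. File names are placeholders.
import cv2

def enhance_for_low_vision(image_path: str, zoom: float = 2.0):
    """Return a magnified, contrast-enhanced version of a camera frame."""
    frame = cv2.imread(image_path)

    # Contrast enhancement: CLAHE on the lightness channel in LAB color space.
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    enhanced = cv2.merge((clahe.apply(l), a, b))
    frame = cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)

    # Magnification: upscale the frame so details appear larger in the display.
    h, w = frame.shape[:2]
    return cv2.resize(frame, (int(w * zoom), int(h * zoom)),
                      interpolation=cv2.INTER_CUBIC)

cv2.imwrite("enhanced_frame.jpg", enhance_for_low_vision("frame_from_glasses.jpg"))
```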
1 Department of Ophthalmology, University of Cambridge, Cambridge, UK. 2 Department of Ophthalmology and Visual Sciences, University of Michigan Kellogg Eye Center, Ann Arbor, MI, USA. 3 University College Dublin School of Medicine, Belfield, Dublin, Ireland. 4 Human-Machine Perception Laboratory, Department of Computer Science and Engineering, University of Nevada, Reno, Reno, NV, USA. 5 Center for Space Medicine, Baylor College of Medicine, Houston, TX, USA. 6 Department of Ophthalmology, Blanton Eye Institute, Houston Methodist Hospital, Houston, TX, USA. 7 The Houston Methodist Research Institute, Houston Methodist Hospital, Houston, TX, USA. 8 Departments of Ophthalmology, Neurology, and Neurosurgery, Weill Cornell Medicine, New York, NY, USA. 9 Department of Ophthalmology, University of Texas Medical Branch, Galveston, TX, USA. 10 University of Texas MD Anderson Cancer Center, Houston, TX, USA. 11 Texas A&M College of Medicine, Bryan, TX, USA. 12 Department of Ophthalmology, The University of Iowa Hospitals and Clinics, Iowa City, IA, USA. ✉email: ew690@cam.ac.uk
Received: 17 October 2023 Revised: 1 November 2023 Accepted: 10 November 2023
Fig. 1 Technology components of Ray-Ban Meta smart glasses. Reprinted without changes from Laurent C, Iqbal MZ, Campbell AG. Adopting smart glasses responsibly: potential benefits, ethical, and privacy concerns with Ray-Ban Stories. AI Ethics; used under a Creative Commons Attribution 4.0 International License, http://creativecommons.org/licenses/by/4.0/.
Fig. 2 Diagram of how smart glasses can provide auditory direction guidance for individuals with vision impairments.
REFERENCES
1. Introducing the New Ray-Ban | Meta Smart Glasses. Meta. 2023. https://
about.fb.com/news/2023/09/new-ray-ban-meta-smart-glasses/.
2. Iqbal MZ, Campbell AG. Adopting smart glasses responsibly: potential benefits,
ethical, and privacy concerns with Ray-Ban stories. AI Ethics. 2023;3:325–327.
3. Waisberg E, Ong J, Paladugu P, Kamran SA, Zaman N, Lee AG, et al. Challenges of
artificial intelligence in space medicine. Space Sci Technol. 2022;2022:1–7.
4. Masalkhi M, Waisberg E, Ong J, Zaman N, Sarker P, Lee AG, et al. Apple vision pro
for ophthalmology and medicine. Ann Biomed Eng. 2023;51:2643–2646.
5. Waisberg E, Ong J, Masalkhi M, Kamran SA, Zaman N, Sarker P, et al. GPT-4: a new
era of artificial intelligence in medicine. Ir J Med Sci. 2023. https://doi.org/
10.1007/s11845-023-03377-8.
6. Paladugu PS, Ong J, Nelson N, Kamran SA, Waisberg E, Zaman N, et al. Generative
adversarial networks in medicine: important considerations for this emerging
innovation in artificial intelligence. Ann Biomed Eng. 2023;51:2130–2142.
7. Slade J, Edwards R. My Voice 2015: the views and experiences of blind and
partially sighted people in the UK. Accessed 10 Oct 2023.
8. Waisberg E, Ong J, Masalkhi M, Zaman N, Sarker P, Lee AG, et al. Apple Vision Pro
and why extended reality will revolutionize the future of medicine. Ir J Med Sci.
2023. https://doi.org/10.1007/s11845-023-03437-z.
9. Waisberg E, Ong J, Masalkhi M, Zaman N, Sarker P, Lee AG, et al. The future of
ophthalmology and vision science with the Apple Vision Pro. Eye. 2023. https://
doi.org/10.1038/s41433-023-02688-5.
10. Sarker P, Zaman N, Ong J, Paladugu P, Aldred M, Waisberg E, et al. Test–retest
reliability of virtual reality devices in quantifying for relative afferent pupillary
defect. Trans Vis Sci Technol. 2023;12:2.
AUTHOR CONTRIBUTIONS
EW—Writing. JO—Writing. MM—Writing, figure development. NZ—Review, intel-
lectual support. PS—Review, intellectual support. AGL—Review, intellectual support.
AT—Review, intellectual support.
COMPETING INTERESTS
The authors declare no competing interests.
ADDITIONAL INFORMATION
Correspondence and requests for materials should be addressed to Ethan Waisberg.
Reprints and permission information is available at http://www.nature.com/
reprints
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims
in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons
Attribution 4.0 International License, which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give
appropriate credit to the original author(s) and the source, provide a link to the Creative
Commons licence, and indicate if changes were made. The images or other third party
material in this article are included in the article’s Creative Commons licence, unless
indicated otherwise in a credit line to the material. If material is not included in the
article’s Creative Commons licence and your intended use is not permitted by statutory
regulation or exceeds the permitted use, you will need to obtain permission directly
from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
© The Author(s) 2023