Visual impairment restricts everyday mobility and limits the accessibility of places that the non-visually impaired take for granted. A short walk to a nearby destination, such as a market or a school, becomes an everyday challenge. In this chapter, we present a novel solution to this problem that can evolve into an everyday visual aid for people with limited sight or total blindness. The proposed solution is a digital system, worn like smart glasses and equipped with cameras. An intelligent system module, incorporating efficient deep learning and uncertainty-aware decision-making algorithms, interprets the video scenes, translates them into speech, and describes them to the user through audio. The user can interact with the system almost naturally via a speech-based user interface, which is also capable of understanding the user's emotions. The capabilities of this system are investigated in the context of accessibility and guidance in outdoor environments of cultural interest, such as the historic triangle of Athens. A survey of relevant state-of-the-art systems, technologies, and services is conducted, identifying the critical system components that best fit the system's goals, user needs, and requirements, toward a user-centered architecture design.