Akshay Kolgar Nayak’s research while affiliated with Old Dominion University and other places

Publications (4)


Understanding Low Vision Graphical Perception of Bar Charts
  • Conference Paper

October 2024 · 3 Reads

Yash Prakash · Akshay Kolgar Nayak · [...]


Assessing the Accessibility and Usability of Web Archives for Blind Users

September 2024 · 29 Reads

Web archives play a crucial role in preserving the digital history of the internet, given the inherent volatility of websites that constantly undergo modifications, content updates, and migrations, or even cease to exist altogether. Web archives ensure that present and historical web information will be available in the future for researchers, historians, students, corporations, and the general public. Given their importance, it is essential for web archives to be equally accessible to everyone, including those with visual disabilities. In the absence of a prior in-depth investigation in this regard, this paper examines the status-quo accessibility and usability of five popular web archives for people who are blind. Specifically, we analyzed reports generated by an automated accessibility checker tool and also collected feedback from a user study with 10 blind screen reader users. The analysis of accessibility reports revealed issues that were common across the different archives, including a lack of text alternatives for images and the absence of proper ARIA labels. The user study showed that blind users struggled to perform even basic search tasks to locate desired mementos or snapshots of websites saved in the archives. The participants also explicitly indicated that they found it strenuous to interact with web archives. Informed by these findings, we provide accessibility design suggestions for archives' web developers and assistive technology developers.
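
The two recurring issues the report analysis surfaced, missing text alternatives for images and missing ARIA labels, can be detected with a straightforward DOM scan. The TypeScript sketch below illustrates that kind of check; it is a simplified illustration of the issue classes named in the abstract, not the accessibility checker tool used in the study.

// Minimal sketch of a DOM scan for the two issue classes named above;
// illustrative only, not the checker tool used in the study.
interface AccessibilityIssue {
  element: string;  // truncated HTML of the offending element
  problem: string;  // short description of the issue
}

function scanForBasicIssues(root: Document): AccessibilityIssue[] {
  const issues: AccessibilityIssue[] = [];

  // Images need a non-empty text alternative for screen reader users.
  root.querySelectorAll("img").forEach((img) => {
    const alt = img.getAttribute("alt");
    if (alt === null || alt.trim() === "") {
      issues.push({ element: img.outerHTML.slice(0, 80), problem: "missing alt text" });
    }
  });

  // Interactive controls need an accessible name (visible text or an ARIA label).
  root.querySelectorAll("button, a, input").forEach((el) => {
    const hasName =
      el.hasAttribute("aria-label") ||
      el.hasAttribute("aria-labelledby") ||
      (el.textContent ?? "").trim() !== "";
    if (!hasName) {
      issues.push({ element: el.outerHTML.slice(0, 80), problem: "no accessible name" });
    }
  });

  return issues;
}

// Example: console.table(scanForBasicIssues(document));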


Towards Enhancing Low Vision Usability of Data Charts on Smartphones

September 2024 · 11 Reads

IEEE Transactions on Visualization and Computer Graphics

The importance of data charts is self-evident, given their ability to express complex data in a simple format that facilitates quick and easy comparisons, analysis, and consumption. However, the inherent visual nature of the charts creates barriers for people with visual impairments to reap the associated benefits to the same extent as their sighted peers. While extant research has predominantly focused on understanding and addressing these barriers for blind screen reader users, the needs of low-vision screen magnifier users have been largely overlooked. In an interview study, almost all low-vision participants stated that it was challenging to interact with data charts on small screen devices such as smartphones and tablets, even though they could technically “see” the chart content. They ascribed these challenges mainly to the magnification-induced loss of visual context that connected data points with each other and also with chart annotations, e.g., axis values. In this paper, we present a method that addresses this problem by automatically transforming charts that are typically non-interactive images into personalizable interactive charts which allow selective viewing of desired data points and preserve visual context as much as possible under screen enlargement. We evaluated our method in a usability study with 26 low-vision participants, who all performed a set of representative chart-related tasks under different study conditions. In the study, we observed that our method significantly improved the usability of charts over both the status quo screen magnifier and a state-of-the-art space compaction-based solution.
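
As a rough illustration of the idea described above, the TypeScript sketch below assumes a chart image has already been converted into structured data points and shows how a user's selected points could be presented together with their axis context, so that magnification does not strip that context away. The data shapes and function names are assumptions made for illustration, not the paper's implementation.

// Rough sketch, assuming the chart image has already been parsed into structured data;
// the shapes and names below are illustrative, not the paper's implementation.
interface DataPoint {
  label: string;   // category on the x-axis, e.g. "2021"
  value: number;   // corresponding y-axis value
}

interface Chart {
  title: string;
  xAxisName: string;
  yAxisName: string;
  points: DataPoint[];
}

// Build a compact textual summary of the selected points, carrying the axis names
// along so the connection to the annotations survives when only a small region is magnified.
function contextCard(chart: Chart, selectedLabels: string[]): string {
  const selected = chart.points.filter((p) => selectedLabels.includes(p.label));
  const lines = selected.map(
    (p) => `${chart.xAxisName} ${p.label}: ${p.value} ${chart.yAxisName}`
  );
  return [chart.title, ...lines].join("\n");
}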


All in One Place: Ensuring Usable Access to Online Shopping Items for Blind Users

June 2024 · 15 Reads

Proceedings of the ACM on Human-Computer Interaction

Perusing web data items such as shopping products is a core online user activity. To prevent information overload, the content associated with data items is typically dispersed across multiple webpage sections over multiple web pages. However, such content distribution manifests an unintended side effect of significantly increasing the interaction burden for blind users, since navigating to-and-fro between different sections in different pages is tedious and cumbersome with their screen readers. While existing works have proposed methods for the context of a single webpage, solutions enabling usable access to content distributed across multiple webpages are few and far between. In this paper, we present InstaFetch, a browser extension that dynamically generates an alternative screen reader-friendly user interface in real-time, which blind users can leverage to almost instantly access different item-related information such as description, full specification, and user reviews, all in one place, without having to tediously navigate to different sections in different webpages. Moreover, InstaFetch also supports natural language queries about any item, a feature blind users can exploit to quickly obtain desired information, thereby avoiding manually trudging through reams of text. In a study with 14 blind users, we observed that the participants needed significantly less time to peruse data items with InstaFetch than with a state-of-the-art solution.
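
The general pattern the abstract describes, gathering item content that is spread across several pages into one screen-reader-friendly region, could look roughly like the TypeScript sketch below. The section names, URLs, and selectors are hypothetical placeholders, and this is not InstaFetch's actual code; a real extension would also need host permissions and far more robust content extraction.

// Hypothetical sketch of aggregating dispersed item content into one
// screen-reader-friendly region; not InstaFetch's actual code.
interface ItemSection {
  heading: string;  // e.g. "Description", "Full specification", "User reviews"
  url: string;      // page where this content normally lives (placeholder)
  selector: string; // CSS selector for the relevant fragment (placeholder)
}

async function buildAggregatedView(sections: ItemSection[]): Promise<HTMLElement> {
  const container = document.createElement("section");
  container.setAttribute("role", "region");
  container.setAttribute("aria-label", "All item information in one place");

  for (const section of sections) {
    // Fetch the page the section lives on and pull out only the relevant fragment.
    const html = await (await fetch(section.url)).text();
    const doc = new DOMParser().parseFromString(html, "text/html");
    const fragment = doc.querySelector(section.selector);

    const heading = document.createElement("h2");
    heading.textContent = section.heading;
    container.appendChild(heading);
    container.appendChild(
      fragment
        ? document.importNode(fragment, true)
        : document.createTextNode("Section not found.")
    );
  }
  return container;
}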