Inspection of In-Vehicle Touchscreen
Infotainment Display for Different Screen
Locations, Menu Types, and Positions
Saumil Patel, Yi Liu, Ruobing Zhao, Xinyu Liu, and Yueqing Li
Lamar University, Beaumont, TX 77705, USA
spatel74@lamar.edu
Abstract. There has been a rapid increase in in-vehicle touchscreen usage over the last decade, particularly for information and entertainment functions. Because of its dynamic interface functionality, the touchscreen has become a major attraction for the automobile industry. As a result, most car manufacturers today use touchscreens instead of static push buttons. For display screen placement, there have been two legacy display locations in the car: the head-up display (HUD) and the head-down display (HDD). The head-up display, similar to those in jet planes, is mounted on top of the instrument panel, while the head-down display is located on the car console. However, the HUD position does not work for touchscreen devices, since a touchscreen requires direct touch input from the driver, and the HUD location either blocks the road view or is out of the driver's reach. Besides, there are a number of possible HDD locations that could enhance the driver's interaction and driving experience.
In this research study, we evaluate 8 different possible in-vehicle touchscreen designs (2 HDD locations for the touch screen display: top and middle; 2 main menu layouts: horizontal and vertical; 2 screen mount positions: fixed screen vs. screen tilted according to user preference). The study focused on assessing possibilities for different HDD locations within the driver's proximity. This experiment aims to find the most effective and efficient touchscreen position with the least possible discomfort and distraction. In the experiment, participants were instructed to perform a series of music, A/C, and seat control tasks on the developed in-vehicle infotainment prototype. The experiment tasks were focused on controlled and real-life task-based scenarios.
Keywords: In-vehicle touch screen display · Human machine interface · Automotive user interfaces · Touch screen display locations · Menu types
1 Introduction
1.1 What Is IVIS, and Its History
IVIS stands for In-Vehicle Infotainment System. As the name indicates, it is a computer system that provides live information about vehicle conditions using sensors and cameras and allows drivers to access various entertainment options and vehicle information while
driving. IVIS is a system specially designed for drivers to make the driving experience
safer and more pleasurable.
In recent years, human society has evolved from the “industrial society age” into the “knowledge society age.” The advancement of Human-Machine Interfaces (HMI) is therefore a demanding, interdisciplinary challenge. Beyond the technical aspects, development is also challenged by the need to adhere to cognitive and psychological principles, reflected in the need to pick interaction designs that fit the user's mental model (e.g., seat-belt sign design, window switch design, on-screen tire pressure, entertainment options).
The first automotive IVISs were mechanical. Their primary purpose and implemented functionalities were to inform the driver of relevant vehicle information such as speed, fuel level, and current brake status.
The MG T-type, built in 1937, featured the first manual car information system. As time progressed, designers started developing more functionalities. Later on, displaying only this information was no longer sufficient; drivers also wanted to be entertained while driving. Therefore, entertainment functions like radios were progressively integrated into the car, leading to increased complexity. A system that combines vehicle condition information with entertainment functions is called an infotainment system [3]. However, in 2010 GPRS brought a small screen into cars, opening up a new way of communicating with the system through the dashboard. Before that, functionality was accessed through mechanical buttons rather than digital screens and touchpads.
Moreover, the type of information presented to the driver has also evolved and changed over time. Today, standard HMI functionalities encompass the display of vehicle-related information, advanced driver assistance functions, and entertainment components like the radio and media player.
Due to the increased complexity of the HMI, which consists of various input and output interfaces, its usability has become a significant quality factor [2]. Modern HMIs consist of a graphical user interface, a control unit, speech dialog systems, and gesture-based systems like touch interfaces. Furthermore, the application of up-to-date hardware and software components enables a steadily rising number of use cases. For many decades, the available technology largely determined how easily car HMIs could be implemented. Looking at present and future developments, the primary difference from past developments lies in information processing and entertainment options. The types and complexity of car IVIS have changed rapidly in recent decades, mirroring the evolution of computer systems: from simple command-line interfaces to a wide assortment of graphical UIs, speech dialog systems, and gesture-based systems like touch interfaces.
2 Literature
This thesis aims to evaluate the location and layout design of the In-vehicle infotainment
system. The purpose of the study is to create the most effective and efficient interaction
solution between the In-vehicle infotainment system (machine) and the Driver (human)
with the least possible distractions and discomforts.
The literature section includes some of the previous studies about automotive display
locations and layout designs on driving performance. The first section reviews the general
introduction. The second section focuses on the effect of display location on driving
performance. The third section reviews research on Interface design, mainly on Icons
and display layout. The last section explains the limitations of previous studies and
research questions.
Determinants of Automotive Infotainment Display-Based Applications:
Operating a vehicle is characterized as the primary task. However, driving also includes other necessary eyes-off-road activities such as mirror glances or speed checks. When a secondary task (entertainment system, car information system, navigation, cellphone use) comes into play, attention on the road is substantially reduced. Thus, an ongoing switch between the primary task (driving) and the secondary task increases the safety risk.
The availability of various input devices and interaction methods makes interaction more engaging, practical, and complex at the same time. Several experiments have been conducted to evaluate input devices and input methods [4, 37, 38, 43]. With the rapid development of touchscreen display devices, IVIS dashboard displays have become the most widely chosen interaction device in the automotive industry. Besides, next-generation electric vehicles come with a larger screen display on the dashboard panel. However, a significant disadvantage of IVIS is that drivers must look at the display while accessing it. At the same time, the IVIS display has various application benefits by commanding attention [44], since it lies within the driver's immediate field of view.
2.1 Device Locations
Several studies have been conducted on head-up and head-down device locations, searching for the most effective device location for cars [47, 51, 30]. Prior studies divided device locations into the head-up display (HUD) and the head-down display (HDD).
An automotive HUD, or head-up display, is by definition a transparent display that presents data in an automobile without requiring users to look away from the road. HUDs were originally pioneered for fighter jets in the early 1970s and later for low-flying military helicopter pilots, for whom information overload was a significant issue and for whom changing their view to look at the aircraft's instruments could be a fatal distraction [28].
According to prior research, the head-up display location is the most preferred and practical location for a display screen device [44, 5]. Driving attention relies on ambient/peripheral visual resources, so if the device location is closer to the driver's focal/central vision while driving, it is easier for the driver to keep their eyes on the road. The head-up display's location close to the windshield makes it quite effortless for the driver to keep their eyes on the road compared to any other location [26]. In addition, the HUD's location next to the windshield keeps the glance distance comparatively short, resulting in lower subjective load and faster responses to hazards [22, 12].
However, the HUD cannot thoroughly replace the HDD display types (CID and DCD). Besides, for some applications (e.g., displaying a detailed navigation map or long lists), the HUD is not fully developed from a visualization technology and usability perspective [1].
HDD stands for head-down display. As the name indicates, the HDD location requires the driver to move their head down, away from the driving environment, so the HDD is mostly placed on or attached to the car's console.
Head-down displays (HDD) are divided into two types: the driver-centered display (DCD), for displaying speed and car status information, and the central information display (CID), for entertainment options and car functions [1].
From the first automobiles onward, vehicles have used an HDD, either a driver-centered display (DCD) or a central information display (CID), to present necessary driving information. The HDD is considered the most effective location for displaying detailed information such as maps, driving safety/maintenance alerts, and entertainment options [1]. Nowadays, as modern automotive infotainment systems have become more interactive, the HDD is considered the preferable position [27].
Studies conducted by Normark [41] and Liu [30] found no significant difference between HUD and HDD task performance. A comparison study performed by Ablassmeier et al. [1] using eye-gaze data indicates no secondary task performance difference between the HUD and the DCD (driver-centered display). On the contrary, participants' lane-keeping performance fluctuated more with the HUD than with the HDD. Another study, conducted to measure the performance cost of digital HUD displays, used three different locations: two HUD locations and one HDD location. The first HUD location, called the digital HUD, was located below the horizon and 10° to the left of the center of the front screen; the second HUD location, called the analog HUD, was centered relative to the digital location and subtended 10° both vertically and horizontally. The HDD was located on the car dashboard. According to the experiment data, participants performed worse at maintaining their lane position with the digital HUD compared to the analog HUD and HDD conditions [19], similar to the experiment conducted by Ablassmeier et al. [1]. A study on the clutter effect of HUDs revealed no difference in lane-keeping data between the HUD and the HDD, since driving performance relies on ambient/peripheral visual resources (the surroundings) while secondary tasks rely on focal visual resources [42, 52]. This raises the possibility that the device's angle from central vision matters more than the HUD/HDD distinction. Similarly, as focal vision becomes a priority during hazard detection, the HUD performs better than the HDD because of the HUD's proximity to the driving environment [22].
According to prior research, the HDD's more distant location appeared more restrictive than the HUD [47]. However, not all research studies are in favor of the HUD [1, 19].
Despite the number of comparison studies that have been conducted on display locations [1, 19, 47], there are no conclusive results regarding the suitability of one location over another. Additionally, the majority of prior research emphasized comparisons between the HUD and HDD display locations, ignoring the importance of task type and input interaction.
Prior research also lacks studies on touchscreen device location. The majority of prior studies considered the windshield or similar positions as HUD locations. However, such locations can only work for display-only tasks (navigation, lane change); they do not work for tasks requiring user input. Besides, accessing any information or function that requires some input from the driver causes similar or greater distraction, since accessing any task requires glancing at both the display and the input device [8]. The majority of
functions require some input from the driver, and for input interaction, the touchscreen is considered the most effective and efficient device [27].
None of the prior studies considered the possibility of different HDD positions. Today, however, a number of automotive manufacturers use different HDD locations, and there are numerous possible HDD locations that might be most suitable for secondary tasks requiring input. Moreover, if the device were placed in the HUD location, it would block the driving view area unless the device were transparent or holographic; HUD visual technology is not there yet and certainly cannot completely replace the HDD [1]. In addition, reaching a touchscreen in the HUD location can cause driving discomfort. Finding the most suitable location for a touch screen device with the least discomfort will be a key research finding.
2.2 Navigation Menu
The following section presents prior research on navigation menu design.
User interface design has been a long-term interest in HCI research, and considerable general research has been done on it. However, very few research papers have tried to explore the relation between the task at hand and human performance across different navigation menu designs. The navigation menu is essentially a group of buttons, where the buttons contain graphical or text icons organized and grouped to serve as the accessibility path to the system's functionality. Traditional navigation menu design is mainly concerned with buttons: button location, button design, shape, space between buttons, icon type, and icon design [8, 9, 20, 21]. However, modern navigation menu design also includes touch gesture functionality (tapping, flicking, panning) and visual or haptic feedback [4, 37]. Besides, users today are becoming more aware of visual perception.
Since the driving task requires eyes on the road, an in-vehicle touchscreen is considered a major visual distraction, and the majority of previous work treated the touchscreen display as a greater distraction for driving tasks. Because of visual processing constraints, it is hard to grasp all visual layout elements in a single glance [18, 31, 39]; instead, users focus only on the layout elements that fall within their foveal vision. This means that even if the entire layout is presented in front of the user, the user cannot visually process all of its menu options, buttons, icons, fields, and other elements. As a result, the graphical layout becomes a problem of precise attention deployment [25]: how to present information effectively and efficiently?
Since we cannot ignore the use of IVIS while driving, we focus on designing and evaluating the most effective and efficient position for the navigation menu on the in-vehicle touchscreen.
Navigation Menu visual design can be divided into three groups:
Layout position: Horizontal vs. vertical, top vs. bottom.
Layout element design: buttons, button location, button design, shape, space between
buttons, icon type in the buttons, icon design [8,9,20,21].
Layout shape: linear, pie [7], circular, rectangle, triangle.
Navigation Menu Types. There has been a rapid increase in in-vehicle touchscreens over the last decade, particularly for information and entertainment functions. In comparison, traditional vehicle controls are limited to a static design and cannot be updated once designed. With the touchscreen's dynamic control option, functions and layouts can be altered and added as functionality evolves [32].
Traditional static buttons, by contrast, are located on the car console and lack the touchscreen's dynamic control functionality. As driving is considered a highly visual task in which the driver “monitors a continuous stream of information through which the vehicle is traveling” [34], the touchscreen becomes a major visual distraction. Major limitations of in-vehicle displays include poorly designed navigation menus, deep menu selections, excessive information presented at once, and similar-looking buttons, all of which elevate visual distraction. This results in longer task completion times, extending the frequency and duration of glances away from the road [11]. The more drivers glance away from the road, and the further away they have to look, the more likely it is that safety-critical information related to the driving environment will be missed [22]. As shown in the figure, a number of auto manufacturers use a control panel as a replacement for traditional static functions. As technology evolves, the use of touch screens will gain more popularity. However, none of the prior research explores the effect of navigation menu types on the screen relative to screen position and type of button interaction (Fig. 1).
Fig. 1. Static buttons: (I) push buttons, (II) Toyota A/C rotary controller, (III) Model 3 touch screen display (Images by Saumil (June 2021))
There have been studies investigating all manner of aspects of on-screen navigation menu type [17, 36]. However, the majority of these studies used a website menu as the primary task. Very few studies explore the menu type effect together with driving tasks [6, 13].
Navigation Menu Type in Websites. A study conducted by Burrell and Sodan [6] focused on the position of menus. The study included six different menu types, with menus located at the top and bottom, top, top tabbed, left, and top and right of the screen. In conclusion, there was no significant difference in performance, although users preferred the top tabbed menu the most.
In 2011, Faulkner and Hayton [15] evaluated left- and right-side placed menus for an e-commerce website. Experiment tasks included searching for and purchasing Christmas products. There was no significant difference in the time taken to perform the experiment tasks.
A study performed by Murano and Oenga [36] on menu design included vertical and horizontal menus. The study was performed on an e-commerce website using the left side as the vertical menu and the top as the horizontal menu. There were no major performance differences between the two designs; however, it was later suggested that the reason could have been that the experiment tasks were not demanding enough [36]. Murano and Lomas conducted another experiment on four different menu types (left, right, top, and bottom of the screen), where participants' performance was measured by mouse clicks and the number of errors. Unlike the previous experiments by Murano [36], the tasks were designed to be more demanding and difficult. As a result, the top and left side menus required fewer clicks and produced the fewest errors [35]. According to a UX study performed by the Nielsen Norman Group, web users spend 80% of their time viewing the left side of the page and 20% viewing the right half [16].
Besides, although several studies have been conducted on vertical and horizontal menu types, there are no conclusive results; rather, a number of experimental results have been diluted by the type of experiment task and task complexity [14, 29, 35].
Visual Attention for In-Vehicle Touch Screen Devices. With an in-vehicle infotainment system, the driver has to perform secondary tasks without losing sight of the road. There are two menu elements in a vehicle information system: the navigation menu and the static control functions. The location of the static control functions on the screen plays a vital role in limiting visual distraction.
According to an experiment on visual distraction conducted by Eren, Burnett, and Large, participants required greater attention and more visual glances to perform operations in the middle section of the screen, and fewer glances at the screen's corners [13]. Besides, the screen's top right corner was more in line with the driver's sight (depending on the side of the drive), resulting in fewer glances.
A study conducted by Vilchez on traffic signs and their effect on trajectory movement reveals that the location of a visual distraction can influence lane-keeping performance [49]. In the experiment, a traffic sign was presented on the roadside, and participants were told to acknowledge the sign while conducting a driving task. According to the experiment data, participants' lane-keeping performance was heavily influenced by the side on which the sign appeared: participants' vehicles drifted from the centerline toward the side where the traffic sign was presented as they tried to look and lean toward it. This finding can be applied to the IVIS menu type design and its location on the screen.
Previous research indicated that the driver’s side position and corner of the screen
required the least glance to operate [13]. The majority of applications with a naviga-
tion menu preferred vertical position. However, the vertical menu consumes more space
compared to the horizontal counterpart. As a result, in the era of desktop-based appli-
cations, the horizontal menu has become more popular. As Jakob’s law of the internet
user experience states, ‘users spend most of the time on other sites.’ In other words,
users prefer your site to work the same way as all the other sites they already know [40].
Following the similar navigation approach means following the existing mental model,
allowing users to apply their knowledge from previous experiences.
Placing menu items at different corners increases the distance between menu targets, and, as Fitts's law states, the amount of time required for a person to move to a target area is
a function of the distance to the target divided by the target’s size. Thus, the longer the
distance and the smaller the target’s size, the longer it takes [17].
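For reference, a commonly used form of Fitts's law (the Shannon formulation; the symbols follow the standard convention rather than anything stated in this chapter) expresses the movement time MT to a target of width W at distance D as

MT = a + b · log₂(D/W + 1)

where a and b are empirically fitted constants. Halving a button's size or doubling its distance from the driver's resting hand position therefore increases the predicted selection time, which is why corner placement and small touch targets matter for glance duration.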
Besides, prior research didn’t explore the relationship between different touch screen
locations and menu types on user performance. For example, for the device’s lower car
console position, the top layout position is the nearest location to the driver; however,
the device’s upper console position bottom layout is the nearest location to the driver.
With the complexity of diverse functions and tasks in hand, visual attention deployment
becomes more important.
Despite the opinions and numerous studies around this subject, there are still unan-
swered questions regarding which menu type or design might be optimal in terms of
performance and user preference.
2.3 Limitations of Previous Research
The IVIS (in-vehicle infotainment system) display is the most enchanting invention in the automobile industry of the last decade. It brings the driving experience to the next level by putting controls in drivers' hands and by providing real-time information, such as vehicle speed, entertainment options, map navigation, maintenance warning indicators, and fuel indicators, while driving. However, IVIS creates a visual distraction from the driving task. As part of the solution, a number of elements of the IVIS device location and layout design can be altered and manipulated to increase safety and improve the driving experience.
Considerable research has been done on IVIS display design, much of which has been introduced in the literature review of this thesis. The review covers many studies that have examined and evaluated the effect of head-down and head-up display locations on users across different task types. The majority of research studies have used display-only devices for the IVIS instead of a touch screen; there is a significant lack of research on touch screens as IVIS devices. Besides all this research, there are no conclusive results regarding the suitability of one display location over the other.
Most importantly, most studies compared a head-down display with the windshield as a head-up display; however, such a position will not work for a touch screen display. Previous research did not explore the possibility of different HDD device locations. Besides, several research studies examined display location using IVIS functional tasks; however, most experiments were completed with one specific task design or completely neglected the effect of a real driving scenario.
Research on layout design is mostly limited and subjective. Prior layout research did not explore the relationship between device location and layout location on driving performance. It does provide some insight into button design and driving performance, but it lacks a clear explanation of why one design outperforms another.
The reviewed studies also highlighted issues with simulation software and hardware capabilities in rendering the projected information at the intended screen location and distance, and in compatibility with the IVIS device. Besides, existing research offers comprehensive studies covering the effect of environmental factors on driving performance. At the same time, studies of the secondary operation device (IVIS) have not included all the factors necessary to claim
whether daytime or nighttime driving performed better. Furthermore, how interior lighting, including the screen layout design, affects driving performance and secondary task completion time, including distraction gaze time at night, remains to be studied.
Finally, the research studies in this literature describe independent investigations of the effects of device location, layout, and icons on driving and secondary task performance. However, the research community has not tested all three independent factors combined, and their cross-functional impact, in a single experiment.
3 Methodology
3.1 Participants
Eleven healthy participants, aged between 22 and 35 years, volunteered to participate in the study. An online general demographic questionnaire session was held to gather data on participants' age, gender, education, years of driving experience, years of experience using phones, tablets, or other touch display devices, years of experience with IVIS displays (HUD, HDD), neck discomfort and eye stress while using different displays, comfort using a driving simulator, and comfort using a computer application.
Once participants had completed the general demographic questionnaire, a secondary “In-vehicle infotainment system” survey questionnaire session was held online as a continuation of the first one; the purpose of this survey was to gather participants' opinions on IVIS usage and preference, their understanding of what IVIS can do, and their design preferences.
Participants provided post-experiment information and subjective comments once they finished the IVIS experiment tasks. The post-experiment information included participants' feedback, subjective comments, suggestions, and preferences, if any.
Participants were provided with a written, informed consent form. The study was
reviewed and approved by the Institutional Review Board of Lamar University.
3.2 Material
Driving Simulator. All experiment testing took place using STISIM Drive, a fixed-base driving simulator based at Lamar University. The STISIM Drive simulator can help improve basic driving skills by offering virtual training in diverse driving scenarios such as urban, rural, and metro-city routes. The driving simulator accurately simulates pedestrians crossing at random, vehicles changing lanes unexpectedly, and sudden braking of the lead vehicle on the road.
Three 32-in., high-definition LED monitors provided drivers with a 135-degree field of view. A set of full-size driving controls, including the accelerator, brake pedal, steering wheel, and turn signals, was used to provide real-time feedback for speed control and lane maintenance. Wang [50] used a similar simulator setup for driving simulator experiments in terms of the real-world driving environment (Fig. 2).
Fig. 2. Simulator screens with 135-degree field of view (Image by Saumil (Jul 2021))
Driving Environment: The driving environment, created as a loop, consisted of 4 different scenarios connected to each other, starting from a city area with 2–3 lane roads, continuing on a one-lane bidirectional rural road connecting to an industrial zone, then through a city residential area, and onto the highway.
The city driving scenario was 2 miles long, consisting of two- and three-lane roads. Traffic in the city driving scenario was high. At various points throughout this drive, the participants encountered critical events (instant braking, a pedestrian on the street, the lead car braking instantly) that required an overt response maneuver in order to avoid a collision.
The interstate highway consisted of single lanes with a posted speed limit of 70 mph. The highway scenario was 1 mile long and included one-lane bidirectional roads with large amounts and varying degrees of curvature as well as frequent elevation changes and two crossroads. Traffic on the highway varied from low to high: high from the city onto the interstate highway and low on the industrial roadway connecting to the highway. The interstate highway was connected with the city and the industrial area.
The industrial roadway scenario included one-lane bidirectional rural roads with two crossroads. The industrial roadway was 2 miles long (traveling at the posted 55 mph speed limit). There was limited traffic in the driver's lane; however, there was a high level of traffic in the oncoming lane.
The residential area scenario included a 2-mile-long, one- and two-lane roadway, with a posted speed limit of 20 mph in the school zone and 35 mph in the residential zone. Traffic difficulty in the residential area was greater. The residential driving scenario included critical static lane obstacles such as pedestrian crossing lanes, a truck suddenly changing lanes, and a car passing on a one-way stretch surrounded by parked vehicles.
Lane obstacles: Throughout the drive, 5–6 different types of obstacles were planted across the driving scenarios. At various points during the drive, a static or moving obstacle appeared in the driver's lane, or an oncoming vehicle periodically drifted into the driver's lane. Drivers were allowed approximately 2 s to 3 s to recognize the hazard and initiate a safe response maneuver. Participants were prompted to perform secondary tasks approximately 1 s–3 s before obstacles became visible, because presenting the
secondary task shortly before an obstacle provides an opportunity to measure and analyze its subsequent effect on participants' driving and lateral control performance.
Response times for lane obstacle events were measured from the moment the obstacle became visible until the driver made a measurable response. After the obstacle appeared, the back-to-drive time for these events was recorded until the beginning of the steering wheel deflection.
Braking responses were recorded as soon as the pedal was depressed by 5%.
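As an illustration of this measurement rule, the sketch below (not the actual analysis script used in the study; the log format and names are assumed for the example) extracts a brake response time from simulator samples by finding the first moment after obstacle onset at which the pedal crosses the 5% depression threshold.

# Hypothetical sketch: brake response time from simulator log samples.
# Each sample is assumed to be (time_s, brake_pedal_fraction in 0..1).
def brake_response_time(samples, obstacle_onset_s, threshold=0.05):
    """Seconds from obstacle onset until the brake pedal first exceeds
    the 5% depression threshold; None if the pedal is never pressed."""
    for time_s, brake in samples:
        if time_s >= obstacle_onset_s and brake >= threshold:
            return time_s - obstacle_onset_s
    return None

# Example: obstacle appears at t = 12.1 s; pedal crosses 5% at t = 13.4 s.
log = [(12.0, 0.00), (12.1, 0.00), (13.4, 0.07), (13.6, 0.40)]
print(brake_response_time(log, obstacle_onset_s=12.1))  # ~1.3 s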
Experiment Setup. The experiment was conducted in the daytime at the Human Factors and Ergonomics Lab at Lamar University. An “F-GT Simulator Cockpit” was used for the cockpit setup with a touch screen device stand, surrounded by the three 32-in. LCD screens at a 135-degree angle. The setup was designed to provide a realistic driving experience.
3.3 Independent Variables
Device locations
Menu types
Device positions
Device Locations
1. Top location: In the top location, the touch screen is mounted on the top of the car console, parallel to the steering wheel position. Location: the screen is placed approximately 35° diagonally offset from the center of the horizon line, 25 cm to the right of the steering wheel center, and 72 cm up from the bottom.
2. Middle location: As shown in the figure, the touchscreen is mounted at the middle, lower position of the car console. This position requires the driver to look down every time a task is performed. Location: the screen is placed approximately 35° diagonally offset from the center of the horizon line, 25 cm to the right of the steering wheel center, and 55 cm up from the bottom.
Menu Types
1. Vertical menu: In the vertical menu, the navigation menu is aligned vertically on the interface at the driver's side. The touch screen is mounted at either the top or the middle car console location.
2. Horizontal menu: In the horizontal menu, the navigation menu is aligned horizontally on the interface at the top of the screen (Fig. 3).
Fig. 3. (I) Vertical menu type, (II) Horizontal menu type (Prototype screenshots by Saumil (May 2021))
Device Positions. In geometry, the position, angular position, attitude, or orientation of an object such as a line, plane, or rigid body is part of the description of how it is placed in the space it occupies; it refers to the rotation needed to move the object from a reference placement to its current placement. The same idea applies to the touch-screen position here.
1. Fixed position: The touch screen is mounted at the top or middle car console position. The fixed position is used as the reference point for the screen location (Fig. 4).
Fig. 4. Fixed positions: (I) Top location, (II) Medium location (Image by Saumil (July 2021))
2. User-preferred position: The touch screen is mounted at the top or middle car console. The touch-screen position is angularly tilted towards the driver's side, according to the driver's preference, from the fixed reference position (Fig. 5).
Fig. 5. User preferred positions: (I) Top location, (II) Medium location (Image by Saumil (July
2021))
Experiment Timeline: Number of experiments: 9 = baseline (no task) + 8 different IVIS configurations.
Touch-screen location (top, middle) × menu type (horizontal, vertical) × orientation type (fixed, user-preferred).
Total experiment time = number of experiments × single experiment time = 9 × (8–12 min) + 20 min (break & questionnaires) = 92–128 min.
Single Experiment Timeline: Total distance in a single experiment: 3 miles.
Total time taken to perform a single experiment: 8–10 min (3 miles of driving).
Total driving scenarios: 3 (metro, industrial, residential).
Number of tasks conducted in a single experiment: 6 tasks.
Number of tasks conducted in a single scenario: 2 tasks.
3.4 Procedure
The study was conducted at the Lamar University Human Factors & Ergonomics Lab in the daytime. The participants were allowed to take a break and have some food if they wanted. However, all experiment activities were recorded in order to measure learnability, efficiency, error rate, memorability, and satisfaction. Participants were also instructed to get a good night's sleep just prior to taking part in the study.
Before the start of the experiment, a general description of the experimental procedure and the aim of the study was read aloud to each participant. After that, the participant was asked to fill in a demographic survey questionnaire and a pre-study questionnaire. Prior to the first run, the participant was given time to get acquainted with the driving simulator and the display, including a 5–10 min trial drive.
In total, 9 trials (1 baseline trial with no secondary task, plus 2 HDD locations × 2 navigation menu layouts × 2 screen positions) were conducted in this study.
In every experiment trial, the participants were instructed to perform prescribed IVIS tasks. The prescribed tasks (see Table) were given by the facilitator using recorded audio instructions. The task sets were presented in random order across the HDD location, position, and navigation menu layout sequences. The instructions for each task were played to the participant as recorded audio by the facilitator; if the
participant failed to follow the secondary task instruction, he or she was required to say “repeat” or “replay,” and the facilitator would replay that particular secondary task instruction.
The secondary tasks were video recorded by the facilitator using an iPhone 6s. During all this time, the participant was asked to drive the vehicle normally and obey the traffic rules.
At the end of every trial, the participants were asked to fill in the NASA-TLX and post-survey questionnaires concerning their experience with the simulator and the visual interaction display.
4 Results
In this experiment, a 3 × 6 analysis of variance (ANOVA) was performed to analyze the effect of the touchscreen device's 2 display locations, 2 menu types, and 2 orientation positions on participants' driving and secondary task performance.
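For readers who want to reproduce this kind of analysis, the sketch below shows one way to run a three-factor repeated-measures ANOVA on a per-trial measure such as collision counts using statsmodels; the data file and column names are assumptions for illustration, not the study's actual analysis script.

# Illustrative sketch of a repeated-measures ANOVA (location x layout x orientation).
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One row per participant x condition, e.g. columns:
# participant, location, layout, orientation, collisions
data = pd.read_csv("driving_trials.csv")  # hypothetical file name

model = AnovaRM(
    data,
    depvar="collisions",      # dependent measure for this analysis
    subject="participant",
    within=["location", "layout", "orientation"],
)
print(model.fit())            # F and p values for main effects and interactions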
4.1 Driving Task Performance
The results showed a significant main effect of orientation on the participants' total number of collisions (F(1,10) = 5.08, p = 0.0479). The post hoc test showed that the mean number of collisions was significantly higher in the fixed position (Mean = 1.2045, Std Dev = 1.5638) than in the user-preferred position (Mean = 0.7272, Std Dev = 0.8453). However, the results did not show any significant main effect of location (F(1,10) = 0.01, p = 0.9321) or layout (F(1,10) = 0.31, p = 0.5884) on the number of collisions. No interaction effect between location and layout (F(1,10) = 0.83, p = 0.3828), location and orientation (F(1,10) = 2.73, p = 0.1296), or layout and orientation (F(1,10) = 0.95, p = 0.3531) was found either (Fig. 6).
Fig. 6. Distribution of collisions
The results showed a significant interaction effect of layout and orientation on the participants' total number of centerline crossings (F(1,10) = 18.22, p = 0.0016). When
the orientation changes from fixed to user-preferred, the total number of centerline crossings decreases in the horizontal layout but increases in the vertical layout (Fig. 7).
Fig. 7. Interaction effect of layout and orientation
However, the results didn’t show any significant main effect of location (F1,10 =
1.86, p =0. 2025), layout (F1,10 =0. 71, p =0. 4192), and orientation (F1,10 =0. 22,
p=0. 6527) on number of centerline crossings.
No interaction effect between location and layout (F1,10 =0. 05, p =0. 8292), or
location and orientation (F1,10 =0. 56, p =0. 4705) was found either.
The results showed a significant interaction effect of location and orientation on the total number of stop sign tickets (F(1,10) = 5.20, p = 0.0457). When the orientation changes from fixed to user-preferred for the top location, the total number of stop sign tickets increases. On the contrary, when the orientation changes from fixed to user-preferred for the middle location, the total number of stop sign tickets decreases (Fig. 8).
Fig. 8. Interaction effect of location and orientation
However, the results didn’t show any significant main effect of location (F1,10 =0.
34, p =0. 5737), layout (F1,10 =3.20, p =0.1039), and orientation (F1,10 =1.00, p
=0. 3409) on number of stops sign tickets.
No interaction effect between location and layout (F(1,10) = 0.39, p = 0.5482) or layout and orientation (F(1,10) = 0.48, p = 0.5059) was found either.
4.2 Secondary Task Performance
Secondary task performance was measured using the total time taken to perform the secondary tasks, the total number of crashes that occurred while performing the secondary tasks, and the total number of secondary task errors.
The results showed a significant main effect of layout type on the participants' total secondary task completion time (F(1,10) = 5.39, p = 0.0427). The post hoc test showed that the mean secondary task completion time was significantly higher for the horizontal menu (Mean = 109.2484, Std Dev = 47.784) compared to the vertical menu (Mean = 99.438, Std Dev = 43.345). However, the results did not show any significant main effect of location (F(1,10) = 1.20, p = 0.2985) or orientation (F(1,10) = 3.89, p = 0.0769) on secondary task completion time. No interaction effect between location and layout (F(1,10) = 0.01, p = 0.9313), location and orientation (F(1,10) = 0.30, p = 0.5930), or layout and orientation (F(1,10) = 2.30, p = 0.1600) was found either.
The results showed a significant main effect of orientation on the total number of crashes (F(1,10) = 5.11, p = 0.0473). The post hoc test showed that the mean number of crashes was significantly higher for the fixed position (Mean = 0.8409, Std Dev = 1.098) compared to the user-preferred position (Mean = 0.4090, Std Dev = 0.6927). However, the results did not show any significant main effect of location (F(1,10) = 0.11, p = 0.7499) or layout (F(1,10) = 1.81, p = 0.2077) on the number of crashes. No interaction effect between location and layout (F(1,10) = 0.03, p = 0.8628), location and orientation (F(1,10) = 0.02, p = 0.8943), or layout and orientation (F(1,10) = 0.03, p = 0.8713) was found either.
The results showed a significant main effect of layout type on the participants' number of secondary task errors (F(1,10) = 6.92, p = 0.0251). The post hoc test showed that the mean number of secondary task errors was significantly higher for the horizontal menu type (Mean = 1.250, Std Dev = 1.586) compared to the vertical menu type (Mean = 0.4318, Std Dev = 0.899). However, the results did not show any significant main effect of location (F(1,10) = 0.02, p = 0.8909) or orientation (F(1,10) = 0.19, p = 0.6761) on the number of secondary task errors. No interaction effect between location and layout (F(1,10) = 0.13, p = 0.7290), location and orientation (F(1,10) = 0.05, p = 0.8351), or layout and orientation (F(1,10) = 0.21, p = 0.6595) was found either.
5 Discussion
Several studies have been conducted to identify the best location for an in-vehicle infotainment display [30, 47, 48]. According to previous experimental results by Normark [41] and Liu [30], there is no significant difference between HUD and HDD task performance. Ablassmeier's eye-gaze study likewise indicates no secondary task performance difference between the HUD and the DCD (driver-centered display).
However, operating the vehicle is characterized as the primary task. When attention is not centered on the road, it becomes a distraction and a safety concern; even so, some eyes-off-the-road time is still devoted to essential driving activities, for example, side mirror checks, blind spot checks, mirror glances, or speed/sign checks. Safe vehicle control can be reasonably maintained using peripheral/ambient vision. With an increase in the vehicle's speed, the field of view narrows. Meanwhile, it is unsafe to pay attention to a secondary activity, especially when the task is not in the focal visual field, according to the study conducted by Horrey, Wickens, and Alexander. The top location performs better than the middle location because the top location is closest to the driving environment area [24].
This experiment aimed to find the most suitable location for the in-vehicle infotainment system across different menu types and orientation positions. According to previous research, the HDD location is expected to be a significant distraction while driving. However, of the two HDD positions, the top location is closer to the driving area and therefore becomes the preferable position for the IVIS screen. But when menu type and orientation position are combined with location, the comparison becomes more difficult. For example, with the horizontal menu on the top screen, accessing the menu becomes difficult: according to participants' reviews, ‘accessing the last options on the horizontal menu was more challenging.’ With the horizontal menu in the top position, the options toward the far edge of the screen become hard to reach and more restrictive; however, if the horizontal menu type is applied at the middle location, accessibility between the driver and the screen increases.
From the post-experiment questionnaire data, the majority of participants preferred the top position over the middle position. According to one participant, ‘the top position matches the eye level with the driving area’; another participant described the top position as visually comfortable and easy to reach. This study compared two menu positions for the IVIS. According to the results, menu type had a significant effect on secondary task completion time and the number of secondary task errors, and a significant interaction effect with orientation position on the number of centerline crossings. However, it did not significantly impact the number of crashes, the total number of road edge excursions, or overspeeding. Several studies have been conducted to identify the best menu type for in-vehicle infotainment [15, 35, 36].
With so much data accessible, navigation menus become fundamentally crucial for fast and exact access. Because of visual processing constraints, the driver does not ordinarily see all visible layout components at once in detail [18, 31, 39]; instead, they concentrate only on the design components that fall under the user's foveal vision. This shows that even when the whole layout is presented to the user, the user cannot visually process all of its menu options, buttons, icons, fields, and other components. Consequently, the graphical design becomes a problem of precise attention deployment [25]. According to a study conducted by Burrell and Sodan [6] on six different menu types, the top tabbed menu appeared to be the most preferred. Similarly, a study conducted by Murano and Oenga [36] on horizontal and vertical menu design found no significant difference in performance. Another analysis, conducted by Murano and Lomas on four different menu types (left, right, top, and bottom of the screen), in contrast to the earlier work by Murano [36], used tasks designed to be more demanding
and challenging. As a result, the top and left side menus required fewer clicks and produced the fewest errors [35].
According to a UX study performed by the Nielsen Norman Group, web users spend 80% of their time viewing the left side of the page and 20% viewing the right half [16]. With a left-positioned vertical menu, accessing the navigation menu becomes more effortless, and with a vertical menu, the option to scroll easily through numerous items becomes more valuable. Today, the majority of applications with a navigation menu prefer the vertical position. As Jakob's law of the internet user experience states, ‘users spend most of their time on other sites.’ In other words, users prefer your site to work the same way as all the other sites they already know [40]. Following a similar navigation approach means following the existing mental model, allowing users to apply knowledge from previous experiences.
Menus located at distant corners increase the distance between the user and the buttons. As Fitts's law states, the amount of time required for a person to move to a target area is a function of the distance to the target divided by the target's size; thus, the longer the distance and the smaller the target, the longer it takes [17]. Besides, layout analysis of various IVIS menu screens has been performed with different analytical techniques as their usability has evolved. These analysis methods group similar functions and arrange the layout of these groups according to three factors: 1. frequency, 2. importance, and 3. sequence of use [46]. The most efficient GUI design will be the ideal trade-off between these three factors with respect to the task at hand. A menu with numerous options requires an efficient content organization method. One approach for organizing menu content is a hierarchy that balances depth and breadth, as sketched below.
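The toy example below makes the depth/breadth trade-off concrete; the menu contents and function names are illustrative only and are not taken from the prototype evaluated in this study.

# Minimal sketch of a menu hierarchy and its depth/breadth measures.
menu = {
    "Music": {"Source": {}, "Playlist": {}, "Equalizer": {}},
    "A/C": {"Temperature": {}, "Fan Speed": {}},
    "Seat": {"Heating": {}, "Position": {}},
}

def depth(node):
    """Number of levels a driver must traverse to reach the deepest item."""
    return 1 + max((depth(child) for child in node.values()), default=0)

def breadth(node):
    """Largest number of options shown at any single level."""
    if not node:
        return 0
    return max(len(node), max(breadth(child) for child in node.values()))

print(depth(menu), breadth(menu))  # 3 levels deep, at most 3 options per level

A deeper hierarchy keeps each screen sparse but adds taps and glances per task, while a broader hierarchy shortens the path but crowds the display; the frequency, importance, and sequence-of-use factors above are one way to decide where each function should sit.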
However, the vertical menu consumes a fixed and larger amount of space compared to its horizontal counterpart. As a result, it can sometimes be challenging to view the detail pushed to the right side; this is where the complete application layout and its usability come into focus.
Overall, a left-side navigation menu reduces glance distance for drivers (in countries that drive on the right) compared to any other menu position.
This experiment included eight combined configurations. The majority of previous studies considered each location and menu type individually. However, certain combinations may yield better performance. For example, in most of the data analyses, the top location appeared to perform best compared to the middle location; yet the middle location with a horizontal menu and a user-preferred position is more effective than the top location with a horizontal menu and a fixed position. With the user-preferred orientation, users can adjust the screen according to their needs; as a result, gaze distance decreases, and the screen appears more accessible.
6 Conclusion
This research study investigated the combined effect of 2 locations (top and middle), 2 menu types (horizontal and vertical), and 2 positions (fixed and user-preferred) on driving task and secondary task performance, the number of errors, and the number of collisions using 8 different combinations. However, the results revealed no main effect of location on driving performance.
For THU (top location, horizontal menu, user-preferred position), the fewest crashes were recorded, and for TVF (top location, vertical menu, fixed position) and MVF (middle location, vertical menu, fixed position), the most crashes were recorded. Participants made the fewest task errors in the TVU condition. Top positions are ideal for the screen display because of their proximity to the windshield area. However, according to a few participants, a middle location for screen mounting is preferable over the top: the screen's top location blocks the driving view, and screen light reflection creates visual distraction and discomfort. According to participants' reviews, the vertical menu type is preferable over the horizontal. As per one finding, with the horizontal menu in the top location, the menu is in line with the driving view area, causing the least distraction and neck discomfort. However, performance did not show a significant difference.
Based on the results, the user-preferred (orientation) position is the most preferable and significantly affected secondary task performance. Freedom of screen adjustment provides greater flexibility.
6.1 Research Limitation and Future Work
This section explains some of the limitations of the study. The research was conducted on an F-GT Simulator Cockpit, a patented design modeled on the GT racing position. Racing car seats tend to sit closer to the ground than SUV seats, which is why results for SUV car seats might differ.
In the future, as vehicles become more personalized and loaded with screens, user-preferred screen positions will receive more focus, especially for the driver's position.
References
1. Ablassmeier, M., Poitschke, T., Wallhoff, F., Bengler, K., Rigoll, G.: Eye gaze studies comparing head-up and head-down displays in vehicles. In: IEEE International Conference on Multimedia and Expo, vol. 1, pp. 2250–2252. IEEE, Piscataway (2007)
2. Ariza, M., Zato, J.G., Naranjo, J.E.: HMI design in vehicles based in usability and accessibility
concepts. In: 12th International Workshop on Computer Aided Systems Theory, EUROCAST
2009. Archivo Digital UPM, Madrid (2009)
3. Buhmann, A., Hellmueller, L.: Pervasive entertainment, ubiquitous entertainment. Centre for the Study of Communication and Culture, Santa Clara: A Quarterly Review of Communication Research (2009)
4. Burnett, G., Crossland, A., Large, D.R., Harvey, C.: The impact of interaction mechanisms
with in-vehicle touch screens on task performance. In: Conference: Ergonomics & Human Factors, Stratford-upon-Avon, UK (2019)
5. Burnett, G., Lawson, G., Millen, L., Pickering, C.: Designing touchpad user-interfaces for
vehicles: which tasks are most suitable? Behav. Inf. Technol. 30(3), 403–414 (2011)
6. Burrell, A., Sodan, A.: Web interface navigation design: which style of navigation-link menus
do users prefer? In: International Conference on Data Engineering Workshops, Atlanta, GA,
USA, pp. 3–7. IEEE Computer Society (2006)
7. Callahan, J., Hopkins, D., Weiser, M., Shneiderman, B.: An empirical comparison of pie vs.
linear menus. In: Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems, pp. 95–100. Association for Computing Machinery, Washington, D.C., USA (1988)
8. Card, S.: User perceptual mechanisms in the search of computer command menus. In:
Proceedings of Human Factors in Computer Systems, pp. 190–196. ACM, New York (1982)
9. Card, S.K., Moran, T.P., Newell, A.: The Psychology of Human-Computer Interaction.
Lawrence Erlbaum Associates Publishers, Hillsdale (1983)
10. Chiang, I.: Usability Testing Basics. TechSmith (2015). https://www.techsmith.com/
11. Chisholm, S.L., Caird, J.K., Lockhart, J., Fern, L.: Driving Performance while engaged in
MP-3 player interaction: effects of practice and task difficulty on PRT and eye movements.
In: 4th International Driving Symposium on Human Factors in Driver Assessment, Training,
and Vehicle Design, pp. 238–245. Driving Assessment (2007)
12. Conley, C., Gabbard, J., Smith, M.: Head-up vs. Head-down displays: examining traditional
methods of display assessment while driving. In: Proceedings of the 8th International Con-
ference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 185–192,
New York, NY, USA. Automotive’UI 2016 (2016)
13. Eren, A., Burnett, G., Large, D.R.: Can in-vehicle touchscreens be operated with zero visual
demand? An exploratory driving simulator study. In: International Conference on Driver
Distraction and Inattention, Sydney, New South Wales, Australia (2015)
14. Fang, X., Holsapple, C.W.: An empirical study of web site navigation structures’ impacts on
web site usability. Decis. Supp. Syst. 43(2), 476–491 (2007)
15. Faulkner, X., Hayton, C.: When left might not be right. J. Usability Stud. 6(4), 245–256 (2011)
16. Fessenden, T.: Nelson Norman group.www.nngroup.com.https://www.nngroup.com/art
icles/horizontal-attention-leans-left/. Accessed 22 Oct 2017
17. Fitts, P.M.: The information capacity of the human motor system in controlling the amplitude
of movement. J. Exp. Psychol. 47(6), 381–391 (1954)
18. Green, P.: The 15-second rule for driver information systems. In: America Conference
Proceedings (CD) (standard (J2364)), pp. 1–9 (1999)
19. Hagen, L., Herdman, C., Brown, M.: The Perfomance consts of digital head-up displays. In:
International Symposium on Aviation Psychology, pp. 244–246, Dayton, Ohio (2007)
20. Hemenway, K.: Psychological issues in the use of icons in command menus. In: Proceedings
of the Conference on Human Factors in Computing Systems, pp. 20–23. ACM, New York
(1982)
21. Hodgson, G., Ruth, S.R.: The use of menus in the design of on-line sytems: a retrospective
view. ACM SIGCHI Bull. 17(1), 16–22 (1985)
22. Horrey, W.J., Wickens, C.D., Alexander, A.L.: The effects of head-up display clutter and in-
vehicle display separation on concurrent driving performance. In: Proceedings of the Human
Factors and Ergonomics Society Annual Meeting,vol. 47, no. 16, pp. 1880–1884 (2003)
23. Horrey, W.J., Wickens, C.D., Consalus, K.P.: Modeling drivers’ visual attention allocation
while interacting with in-vehicle technologies. J. Exp. Psychol. Appl. 12(2), 67–78 (2006)
24. Horrey, W., Alexander, A., Wickens, C.D.: Does workload modulate the effects of in-vehicle
display location on concurrent driving and side task performance. In: Proceedings of the
Driving Simulation Conference, vol. 217, pp. 1–20. Dearborn, Michigan (2003)
25. Jokinen, J.P., Wang, Z., Sarcar, S., Oulasvirta, A., Ren, X.: Adaptive feature guidance:
modelling visual search with graphical layouts. Int. J. Hum. Comput. Stud. 136, 1–22 (2020)
26. Jose, R., Lee, G., Billinghurst, M.: A comparative study of simulated augmented real-
ity displays for vehicle navigation. In: Proceedings of the 28th Australian Conference on
Computer-Human Interaction, pp. 40–48 (2016)
27. Large, D.R., Burnett, G., Crundall, E., Lawson, G.: Twist it, touch it, push it, swipe it: evaluat-
ing secondary input devices for use with an automotive touchscreen HMI. In: Automotive’UI
2016 Proceedings of the 8th International Conference on Automotive User Interfaces and
Interactive Vehicular Applications, pp. 161–168. ACM, Ann Arbor (2016)
278 S. Patel et al.
28. Lawrence, J., Risser, M., Prinzel: Head-Up Displays and Attention Capture.NASA,
Man/System Technology and Life Support. NASA Langley Research Center Hampton, VA,
United States. NTRS (2004)
29. Leuthold, S., Schmutz, P., Bargas-Avila, J., Tuch, A.N., Opwis, K.: Vertical versus dynamic
menus on the world wide web: eye tracking study measuring the influence of menu design
and task complexity on user performance and subjective preference. Comput. Hum. Behav.
27, 459–472 (2011)
30. Liu, Y.-C.,Wen, M.-H.: Comparison of head-up display (HUD) vs. head-down display (HDD):
driving performance of commercial vehicle operators in Taiwan. Int. J. Hum.-Comput. Stud.
61(5), 679–697 (2004)
31. Manufacturers, A.: Statement of principles, criteria and verification procedures on driver
interactions with advanced in-vehicle information and communication systems.Draft Version
3.0, Alliance of Automobile Manufactures, Washington, DC, USA (2003)
32. McGookin, D., Brewster, S., Jiang, W.:Investigating touchscreen accessibility for people with
visual impairments. In: Proceedings of the 5th Nordic Conference on Human-Computer Inter-
action: Building Bridges, Lund, Sweden, pp. 298–307. Association for Computing Machinery
(2008)
33. Molich, R., Hornbaek, K., Krug, S., Johnson, J., Scott, J.: Usability and Accessibility (7.4)
(2008). www.techsmith.com
34. Mourant, R.R., Rockwell, T.H.: Mapping eye-movement patterns to the visual scene in
driving: an exploratory study. Hum. Fact. 12(1), 81–87 (1970)
35. Murano, P., Lomas, T.: Menu positioning on web pages. Does it matter? Int. J. Adv. Comput.
Sci. Appl. 6(4) (2015)
36. Murano, P., Oenga, K.: The impact on effectiveness and user satisfaction of menu positioning
on web pages. Int. J. Adv. Comput. Sci. Appl. 3–9 (2012)
37. Louveton, N., McCall, R.: Driving while using a smartphone-based mobility application:
evaluating the impact of three multi-choice user interfaces on visual- manual distraction.
Appl. Ergon. 54, 196–204 (2016)
38. Ng, A., Brewster, S.: An evaluation of touch and pressure-based scrolling and haptic feedback
for in-car touchscreens. Automotive User Interfaces and Interactive Vehicular Applications,
pp. 11–20. Oldenburg, Germany. ACM (2017)
39. NHTSA: Visual-Manual NHTSA Driver Distraction Guidelines for Portable and Aftermarket
Devices.National Highway TrafficSafety Administration (NHTSA), Washington, DC (2016)
40. Nielsen, J.: End of Web Design.Nielsen Norman Group. https://www.nngroup.com/articles/
end-of-web-design. Accessed 22 July 2000
41. Normark , C., Tretten, P., Gärling, A.: Do redundant head-up and head-down display con-
figurations cause distractions? In: Driving Assessment Conference, pp. 398–404. Big Sky,
Montana, USA (2009)
42. Previc, F.: The neuropsychology of 3-D space. Psychol. Bull. 124(2), 123–164 (1998)
43. Purucker, C., Naujoks, F., Prill, A., Neukum, A.: Evaluating distraction of in-vehicle infor-
mation systems while driving by predicting total eyes-off-road times with keystroke level
modeling. Appl. Ergon. (n.d.)
44. Rydstrom, A., Brostrom, R., Bengtsson, P.: A comparison of two contemporary types of in-car
multifunctional interfaces. Appl. Ergon. (43), 507–514 (2012)
45. Santos, J., Merat, N., Mouta, S., Brookhuis, K.W.: The interaction between driving and in-
vehicle information systems: comparison of results from laboratory, simulator and real-world
studies. Transp. Res. Part F: Traffic Psychol. Behav. 8(2), 135–146 (2005)
46. Stanton, N., Hedge, A., Brookhuis, K., Salas, E., Hendrick, H.: Handbook of Human Factors
and Ergonomics Methods. Taylor & Francis (2005)
Inspection of In-Vehicle Touchscreen Infotainment Display 279
47. Topliss, B., Harvey, C., Burnett, G.: How long can a driver look? Exploring time thresholds
to evaluate head-up display imagery. In: 12th International Conference on Automotive User
Interfaces and Interactive Vehicular Applications, pp. 9–18, New York, NY, USA: Association
for Computing Machinery (2020)
48. Tretten, P., Gärling, A., Nilsson, R., Larsson, T.: An on-road study of head-up display: pre-
ferred location and acceptance levels. In: Proceedings of the Human Factors and Ergonomics
Society Annual Meeting,vol. 55, pp. 1914–1918. SAGE Journals (2011)
49. Vilchez, J.: Representativity and univocity of traffic signs and their effect on trajectory
movement in a driving-simulation task: regulatory signs. J. Safety Res. 66, 101–111 (2018)
50. Wang, Y., Mehler, B., Reimer, B., Lammers, V., D’Ambrosio, L.A., Coughlin, J.F.: The valid-
ity of driving simulation for assessing differences between in-vehicle informational interfaces:
a comparison with field testing. Ergonomics 53(3), 404–420 (2010)
51. Weinberg, G., Harsham, B., Forlines, C., Medenica, Z.: Contextual push-to-talk: short-
ening voice dialogs to improve driving performance. In: International Conference on
Human-Computer Interaction with Mobile Devices and Services, pp. 113–122 (2010)
52. Wickens, C.: Multiple resources and performance prediction. Theor. Issues Ergon. Sci. 3(2),
159–177 (2002)