The Role Of End-User Training
In Technology Acceptance
Bryan Marshall, Georgia College and State University
Robert Mills, Utah State University
David Olsen, Utah State University
ABSTRACT
The purpose of this paper is to examine the role end-user training has on performance expectancy
and effort expectancy, two variables associated with technology acceptance. The technology-
based elements of the HIPAA security rules among oral surgeons were used for the study. The
method of the investigation was a cross-sectional correlational study using a self-reported mailed
questionnaire. The survey was created using preexisting scales from the Unified Theory of
Acceptance and Use of Technology. Results suggest end-user training is positively correlated
with both performance expectancy and effort expectancy.
Keywords: End-User Training, IS Survey Research, Technology Acceptance, Unified Theory of Acceptance and
Use of Technology, Health Care, HIPAA
INTRODUCTION
Advances in systems, powerful software suites, and data management have increased the amount of
end-user computing required by accountants (16), and presumably other professionals, particularly
in generating and analyzing reports. Emerging areas such as information security and business
intelligence have also increased the need for end-users to adopt new technologies. End-users often receive training in
an effort to make adoption and implementation of new systems possible. Unfortunately, the relationship between
end-user training and technology acceptance is not well documented.
The purpose of this research is to examine the impact end-user training has on technology acceptance. As
companies continue to rely on end-user training to improve productivity and gain competitive advantage, the impact
of end-user training on the acceptance of new technologies must be examined. Technology acceptance theory has
played a key role in the examination of user behavior toward the acceptance of new technologies. Technology
acceptance theory identifies different models, frameworks, and theories, which have been used to discover the
factors or constructs influencing the acceptance and use of technology. Several research studies have demonstrated
that a variety of factors influence the rate of acceptance of new technologies when new systems are implemented
(5). Prior research also suggests that small organizations encounter unique challenges when attempting to adopt a
new technology (11, 12).
Research studies within the domain of technology acceptance have included end-user training as a
significant contributing factor that influences users’ behavior toward the acceptance of technology (5, 7, 18, 19).
However, the research findings concerning the role, importance, and utility of end-user training are ambiguous. A
better understanding of the determinants of perceived usefulness would enable those responsible for implementing
new systems to design organizational interventions that would increase user acceptance and usage of new systems
(22).
Background On UTAUT
In 2003 Venkatesh, Morris, Davis, and Davis presented a theory which synthesized previous research in
technology acceptance and is known as the Unified Theory of Acceptance and Use of Technology (UTAUT). The
UTAUT model integrated eight prominent user behavior models: the (a) Theory of Reasoned
Action (TRA), (b) Technology Acceptance Model (TAM), (c) Motivational Model (MM), (d) Theory of Planned
Behavior (TPB), (e) Combined TAM and TPB (C-TAM-TPB), (f) Model of PC Utilization (MPCU), (g) Innovation
Diffusion Theory (IDT), and (h) Social Cognitive Theory (SCT).
The development of the UTAUT model included several objectives: (a) to review user acceptance research, (b) to compare the
eight models, (c) to formulate the UTAUT model, and (d) to empirically validate the UTAUT model. Most of the
related UTAUT studies used university students as participants. In addition, the reviewed studies examined
technologies that were voluntarily adopted (23). Venkatesh et al. identified 32 constructs across the eight
models, along with four moderating variables, and concluded that few studies had examined
complex "managerial" information technologies.
The UTAUT model has four main independent variables: performance expectancy, effort expectancy,
facilitating conditions, and social influence (see Appendix 1). The first independent variable, performance
expectancy, is defined as the degree to which an individual believes that using the system will help one to attain
gains in job performance (23). Second, effort expectancy is defined as the “degree of ease associated with the use of
the system” (23). Third, facilitating conditions is defined as the “degree to which an individual believes that an
organizational and technical infrastructure exists to support use of the system” (23). Finally, social influence is
defined as the “degree to which an individual believes that important others believe he or she should use the system”
(23).
Four moderating variables were also defined in the literature: gender, age, experience, and voluntariness of
use (23). Of these, experience is defined as the amount of experience a person has in a specific domain. Previous
research has shown that as experience increases, effort expectancy will decrease; in other words, the system will
seem easier to use (1, 2, 6, 20, 21). The survey also included three other variables addressed by the UTAUT
measure, but these variables were found to be insignificant: (a) attitude toward the technology, (b) anxiety, and (c) self-efficacy.
End-User Training
Companies increasingly use end-user training to help create a more productive and competitive workforce.
Anne Fisher’s 2005 Fortune article identifies the profession of training & development specialists as one of the 20
fastest-growing professional jobs over the next ten years (8). While constructs such as performance expectancy and
effort expectancy are critical factors in actual implementation behavior of technology acceptance, an equally
important question is how an organization can specifically impact the level of performance expectancy and effort
expectancy. Performance expectancy is related to how useful the system is perceived to be while effort expectancy
is associated with how easy the system is to use (23). End-user training programs are often designed to specifically
address issues of usefulness and ease of use.
Malcolm Knowles’ classic Theory of Andragogy (13) includes four principles related to training adult
learners. These principles include:
1. Adults need to be involved in the planning and evaluation of their instruction.
2. Experience (including mistakes) provides the basis for learning activities.
3. Adults are most interested in learning about topics that have immediate relevance to their job or personal
life.
4. Adult learning is problem-centered rather than content-oriented.
Although Knowles does not specifically use the constructs "perceived usefulness" and "perceived ease of use", there
is clearly a logical connection between principles of andragogy and technology acceptance.
Because end-user training programs are often problem-centered, individuals receiving training are likely to
find technology systems easier to use. Additionally, individuals trained under andragogy principles, such as
addressing the immediate relevance of a system to job performance, are more likely to perceive the system as
useful than those who receive no such training. The evaluation of e-learning and end-user training
programs often examines both usefulness and ease of use as part of the assessment process. For instance, Cappel
and Hayen (4) include both ease of use and usefulness as elements of their learning unit
assessment for evaluating e-learning. Based on past experience, the authors posit that end-user training will
positively correlate with both performance expectancy and effort expectancy.
HYPOTHESES
H1: End-user training will positively correlate with performance expectancy.
Figure 1: Hypothesis 1
H2: End-user training will positively correlate with effort expectancy.
Figure 2: Hypothesis 2
METHOD
This section provides a description of the population, sample selection, power analysis, and details related
to the development and administration of the survey instrument. The HIPAA Security Rule was chosen as the
application of the theory for this study. The HIPAA Security Rule came into effect in March of 2005 and requires
medical practices to secure electronic patient data. The HIPAA Security Rule was chosen because of its importance
to the medical profession, specifically to small medical practices.
Research Population
The American Association of Oral and Maxillofacial Surgeons (AAOMS) is the nonprofit professional
organization serving oral surgeons. According to AAOMS, approximately 90 percent of all oral surgeons belong to
its organization. As a service to its members, AAOMS currently posts the names, city, state, and zip code of 5,472
oral surgeons on the AAOMS home page (3). Each state was accessed separately in order to gather the population to
be sampled, and the oral surgeons from each state were copied into an Excel spreadsheet along with a street
address, city, state, zip code, and phone number.
Power Analysis
A power analysis was completed to estimate the required number of respondents in this study. Power
analysis software was used to determine that 200 participants would be sufficient to run a correlation matrix on the
hypothesized models. Assuming a 20 percent survey return rate, a sample of 1,000 randomly selected medical
practices was chosen to achieve the 200 estimated participants.
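The calculation can be reproduced in a few lines of code. The sketch below is a minimal illustration using the Fisher z-transformation; the effect size (r = .2), alpha (.05), and power (.80) are assumed values for illustration, since the paper does not report the exact parameters supplied to the power analysis software.

```python
# Minimal sketch: approximate sample size needed to detect a Pearson
# correlation r with a two-tailed test, via the Fisher z-transformation.
# r = 0.2, alpha = 0.05, and power = 0.80 are illustrative assumptions only.
import math
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)             # critical value, two-tailed test
    z_beta = norm.ppf(power)                      # quantile for the desired power
    fisher_z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z of the effect size
    return math.ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)

print(n_for_correlation(0.2))  # ~194, consistent with the target of 200 participants
```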
Instrument Development
The HIPAA Security Rule contains 37 different policies and procedures, mandating the use of many
different technologies. Due to the nature of technology acceptance research, application of the theory to multiple
technologies within the same study is not practical (5, 14). For this reason, the 37 different policies were broken
down into separate categories of “technical” and “non-technical”, and only technical categories were considered for
this study. The survey was created using preexisting scales from the Unified Theory of Acceptance and Use of
Technology (UTAUT). Rewording the scales in a way to apply the theory to different domains is common practice
in technology acceptance research (5, 15, 1). In this instance, the survey questions were applied to the HIPAA
Security Rule policy on mandatory use of electronic data backup systems in medical practices.
Survey Administration
According to the HIPAA Security Rule, each medical practice must designate a compliance officer to
maintain the medical office records. Each of these compliance officers was contacted and asked to complete the
survey. The office manager or practicing physician was asked to complete the survey if no compliance officer had
been identified.
RESULTS
This section presents the results of the study, providing a description of the respondents and the internal
reliability of the measures, and concludes with a presentation of the correlation matrix.
Description Of Respondents
Of the 1,000 surveys sent out, 208 had been received at the time of this analysis. The overall demographics
of the respondents are presented in Appendix 3. With respect to gender, 106 respondents (51%) were female
and 94 (45%) were male, with 8 not reporting gender (4%). The primary age group of the respondents was
between 41 and 60 (65%). The cover letter sent with the survey asked the doctor
to have the HIPAA compliance officer or office manager fill out the short survey. The largest percentage of
respondents (42%) indicated they had earned a doctorate degree or higher. Only 3 percent of the
medical practices reported not using computers in the office. In contrast, 70 percent of the offices reported using
computers for over 10 years, and 82 percent reported using computers since the year 2000.
The dependent variable "actual implementation behavior" was analyzed using descriptive statistics. The
frequency of usage showed that over 80 percent of all practices back up their patient data at least daily (5 times a
week). Over 92 percent of the practices report backing up their data at least once a week.
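As an illustration of how these frequencies could be tabulated, the following minimal sketch counts hypothetical backup responses; the column name and values are stand-ins, not the study's data.

```python
# Minimal sketch: tabulating backup frequency for the usage variable.
# `backups_per_week` and its values are hypothetical stand-ins.
import pandas as pd

responses = pd.DataFrame({"backups_per_week": [7, 5, 5, 1, 0, 5, 7, 2, 5, 7]})

pct_daily = (responses["backups_per_week"] >= 5).mean() * 100   # at least daily (5+/week)
pct_weekly = (responses["backups_per_week"] >= 1).mean() * 100  # at least once a week
print(f"At least daily: {pct_daily:.0f}%")
print(f"At least weekly: {pct_weekly:.0f}%")
```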
Internal Reliability of the Measures
Cronbach’s alpha was used to assess the reliability of the measures used in this study. Cronbach’s alpha
coefficients range from 0 to 1. Each construct’s items were tested to determine its Cronbach’s alpha. As seen in
Table 1, each construct had an alpha over .65. This reliability was expected because the questions were based on
existing scales (14, 17).
Table 1: Cronbach’s Alpha Reliability Test

Construct                 Number of Items    Cronbach’s Alpha
Training and Resources    3                  0.913
Performance Expectancy    2                  0.697
Effort Expectancy         2                  0.929
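For readers wishing to replicate the reliability analysis, the following is a minimal sketch of the Cronbach's alpha computation; the item responses shown are illustrative, not the study's data.

```python
# Minimal sketch: Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / var(total score)).
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                           # number of items in the construct
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative 5-point responses from five respondents to a three-item construct.
items = np.array([[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 4, 3], [1, 2, 2]])
print(round(cronbach_alpha(items), 3))
```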
Correlation Matrix
A correlation matrix is a table of the different constructs and how they correlate with each other (16). Table
2 presents the descriptive statistics for each variable, including mean and standard deviation, and the correlation
between the variables with the significance level. The significance of each correlation was evaluated using a
two-tailed test. The results of the correlation matrix in Table 2 revealed that the UTAUT factors (training,
performance, and effort) were all strongly correlated.
Table 2: Correlation Matrix

Variables                  Mean    Std. Dev.   Training   Performance   Effort   Usage (Y/N)   Usage (# Times)
Training                   4.270   1.130       1.000
Performance Expectancy     4.150   1.160       0.769*     1.000
Effort Expectancy          4.270   1.160       0.840*     0.776*        1.000
Usage (Yes/No)             0.940   0.240       0.567*     0.569*        0.541*   1.000
Usage (Number of Times)    5.080   2.150       0.372*     0.361*        0.305*   0.618*        1.000
* Significant at the .01 level
Training correlated with performance expectancy (.769) and effort expectancy (.840), both of which were
significant at the .01 level. Performance expectancy correlated with effort expectancy (.776), significant at the .01
level. In this study the two dependent variables (usage yes/no and number of times) were significantly correlated
(.618) at the .01 level. Further supporting previous research (23), each of the constructs from the UTAUT model
(training, performance, and effort) was significantly correlated with actual implementation behavior (usage) at the
.01 level.
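A matrix like Table 2 can be produced with standard statistical libraries. The following minimal sketch computes pairwise Pearson correlations with two-tailed p-values; the column names mirror the study's constructs, but the data are randomly generated placeholders.

```python
# Minimal sketch: pairwise Pearson correlations with two-tailed p-values.
# Column names mirror the study's constructs; the data are placeholders.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(208, 3)), columns=["training", "performance", "effort"])

cols = list(df.columns)
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        r, p = pearsonr(df[a], df[b])  # two-tailed test, as used in the paper
        print(f"{a} vs {b}: r = {r:.3f}, p = {p:.4f}")
```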
DISCUSSION
The purpose of this research was to examine the impact of end-user training on technology acceptance in
order to determine how to improve the implementation and adoption of technologies mandated by the federal
government. Two hypotheses (Appendix 2) were tested and supported in this study using a correlation matrix.
Training was positively correlated with both performance expectancy and effort expectancy. End-user training
appears to be an important and understudied factor in technology acceptance. End-user training programs could be
specifically designed to improve performance expectancy and effort expectancy to aid in the acceptance of new
technologies in organizations.
SUGGESTIONS FOR FUTURE RESEARCH
Future research is needed in several areas related to this study. First, additional research should be
conducted to determine the mediating and moderating effects among elements such as end-user training and
technology acceptance factors. In addition, we believe that specific research to determine which training models,
components, and principles have the greatest impact on performance expectancy and effort expectancy will be
valuable.
Additional research is also necessary to examine which specific instructional design prescriptions used for
creating end-user training programs have the greatest impact on technology acceptance. With the growing
complexity of computer applications and the increasing diversity of end users, researching the impact of specific
end-user training designs on technology acceptance is critical (10). Further, individual characteristics of the
trainees also affect the role of end-user training programs and warrant further investigation. Prior research
suggests anxieties related to communication and computing have a direct influence on an individual’s interaction
with technology tools used in an e-Learning environment (9, 24). More research is necessary to
determine what impact these individual characteristics have on end-user training, and ultimately on technology
acceptance.
Taking advantage of end-user training programs to assist in the adoption of technology appears to be a
valuable but understudied research area. By better understanding the impact training has on technology acceptance,
organizations will be better prepared to design programs specifically geared toward improving both performance
expectancy and effort expectancy, ultimately improving the organization’s acceptance of technology.
REFERENCES
1. Agarwal, R., and Prasad, J. The Role of Innovation Characteristics and Perceived Voluntariness in the
Acceptance of Information Technologies, Decision Sciences, (28:3), 1997, pp. 557-582.
2. Agarwal, R., and Prasad, J. A Conceptual and Operational Definition of Personal Innovativeness in the
Domain of Information Technology, Information Systems Research, (9:2), 1998, pp. 204-215.
3. American Association of Oral and Maxillofacial Surgeons (AAOMS). About AAOMS, Online, retrieved on
4/1/2005 from http://www.aaoms.org/aboutus.cfm.
4. Cappel, J. J. and Hayen, R. L. Evaluating E-Learning: A Case Study, Journal of Computer Information
Systems, (44:4), 2004, pp. 49-56.
5. Davis, F. D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information
Technology, MIS Quarterly, (13:3), 1989, pp. 319-340.
6. Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. User Acceptance of Computer Technology: A
Comparison of Two Theoretical Models, Management Science, (35:8), 1989, pp. 982-1003.
7. Davis, S. A. Training End Users: An Experimental Investigation of Roles of the Computer Interface and
Training Methods, MIS Quarterly, (17:1), 1993, pp. 61-85.
8. Fisher, A. Hot Careers for the Next 10 Years, Fortune, March 21, 2005, p. 131.
9. Fuller, R. M., Vician, C., and Brown, S. A. E-Learning and Individual Characteristics: The Role of
Computer Anxiety and Communication Apprehension, Journal of Computer Information Systems, (46:4),
2006, pp. 103-115.
10. Hasan, B. and Ali, J. M. H. An Empirical Examination of a Model of Computer Learning Performance,
Journal of Computer Information Systems, (44:4), 2004, pp. 27–33.
11. Lee, J. Discriminant Analysis of Technology Adoption Behavior: A Case of Internet Technologies in
Small Businesses, Journal of Computer Information Systems, (44:4), 2004, pp. 57-66.
12. Lee, J. and Runge, J. Adoption of Information Technology in Small Business: Testing Drivers of Adoption
for Entrepreneurs, Journal of Computer Information Systems, (42:1), 2001, pp. 44-57.
13. Knowles, M. The Adult Learner: A Neglected Species (3rd Ed.), Gulf Publishing, Houston, TX, 1984.
14. Ma, Q., and Liu, L. The Technology Acceptance Model: A Meta-Analysis of Empirical Findings, Journal
of Organizational and End User Computing, (16:1), 2004, pp. 59-72.
15. Morris, M. G., and Dillon, A. The Influence of User Perceptions on Software Utilization: Application and
Evaluation of A Theoretical Model of Technology Acceptance, IEEE Software, (14:4), 1997, pp. 58-75.
16. Paquette, L. R. Problem Based Learning In the AIS Course, The Review of Business Information Systems,
(7:2), 2003, pp. 59-72.
17. Pedhazur, E. J., and Schmelkin, L. P. Measurement, Design, and Analysis: An Integrated Approach,
Lawrence Erlbaum Associates, Hillsdale, New Jersey, 1991.
18. Riemenschneider, C. K. and McKinney, V. R. Assessing Belief Differences in Small Business Adopters
and Non-Adopters of Web-Based E-Commerce, Journal of Computer Information Systems, (42:2), 2001,
pp. 101-108.
19. Riemenschneider, C. K. and Hardgrave, B. C. Explaining Software Development Tool Use with the
Technology Acceptance Model, Journal of Computer Information Systems, (41:4), 2001, pp. 1-8.
20. Thompson, R. L., Higgins, C. A., and Howell, J. M. Personal Computing: Toward A Conceptual Model of
Utilization, MIS Quarterly, (15:1), 1991, pp. 125-143.
21. Thompson, R. L., Higgins, C. A., and Howell, J. M. Influence of Experience on Personal Computer
Utilization: Testing A Conceptual Model, Journal of Management Information Systems, (11:1), 1994, pp.
167-187.
22. Venkatesh, V. Creating an Effective Training Environment for Enhancing Telework, International Journal
of Human-Computer Studies, (52), 2000, pp. 991-1005.
23. Venkatesh, V., Morris, M. G., Davis, G. B., and Davis, F. D. User Acceptance of Information Technology:
Toward A Unified View, MIS Quarterly, (27:3), 2003, pp. 425-478.
24. Vician, C., and Davis, L. R. Investigating Computer Anxiety and Communication Apprehension as
Performance Antecedents in a Computing-Intensive Learning Environment, Journal of Computer
Information Systems, (43:2), 2002-2003, pp. 51-57.
APPENDICES
Appendix 1: Unified Theory of Acceptance and Use of Technology.
Appendix 2: Hypothesis description.
Hypothesis #    Description                                                                  Results
H1              End-User Training will positively correlate with Performance Expectancy.    Supported
H2              End-User Training will positively correlate with Effort Expectancy.         Supported
Appendix 3: Description Of Respondents.
Characteristics              Frequency    Percent (%)
Gender
  Male                       94           45.19
  Female                     106          50.96
  Unknown                    8            3.85
Age
  Under 24                   2            0.96
  25-30                      11           5.29
  31-35                      13           6.25
  36-40                      23           11.06
  41-45                      39           18.75
  46-50                      35           16.83
  51-55                      30           14.42
  56-60                      32           15.38
  61-65                      8            3.85
  Over 66                    2            0.96
  Unknown                    13           6.25
Education
  High School                28           13.46
  Associate's Degree         32           15.38
  Bachelor's Degree          26           12.50
  Master's Degree            14           6.73
  Doctorate Degree           87           41.83
  Unknown                    21           10.10
Computer Experience
  Under 1 Year               7            3.37
  1-5 Years                  25           12.02
  6-10 Years                 60           28.85
  11-15 Years                73           35.10
  Over 15 Years              38           18.27
  Unknown                    5            2.40