Developing the information literacy self-efficacy scale
S. Serap Kurbanoglu, Buket Akkoyunlu and Aysun Umay
Hacettepe University, Ankara, Turkey
Journal of Documentation, Vol. 62 No. 6, 2006, pp. 730-743
Abstract
Purpose – The main aim of this paper is to describe the development of a scale designed to measure self-efficacy for information literacy.
Design/methodology/approach – Cronbach's alpha, item analysis and item discrimination indices, principal component analysis, varimax rotation, and discriminant validity were used to measure the reliability and validity of the scale. A 28-item refined version of the scale was found to be highly reliable and of reasonable length.
Findings – Further refinement based on principal component analysis indicated three major components, which allow information literacy skills to be approached according to their complexity levels.
Originality/value – The information literacy self-efficacy scale is recommended for identifying individuals with low self-efficacy beliefs, which may be a significantly limiting factor for them in exploring their information literacy skills.
Keywords Information literacy, Measurement, testing and instruments, Skills, Turkey
Paper type Research paper
Introduction
Today's societies, whose most salient characteristic is continuous change, are often described as information societies. As the amount of information increases and technology gains momentum, the use of technology becomes widespread and societies restructure themselves in response to these changes. It has become obligatory for every member of an information society to have lifelong learning skills in order to keep up with the changes and become acquainted with new developments.
Information literacy
Societies of the information age need confident and independent learners equipped with lifelong learning skills. Self-regulated learning and information literacy are key skills
required not only for lifelong learning but also for success in the information-based
societies. An information literate individual knows how to learn and is capable of
continuing lifelong learning. Information literacy is the term being applied to the skills
of information problem solving (American Library Association, 2000). The use of
information problem solving skills, in other words, information literacy skills is
becoming the necessary intellectual ingredient of any individual’s life.
Information literacy incorporates the abilities to recognize when information is
needed and then to initiate search strategies designed to locate the needed information.
It includes evaluating, synthesizing, and using information appropriately, ethically,
and legally once it is accessed from any media, including electronic or print sources. It
also includes communicating and sharing the results of the information
problem-solving efforts accurately and creatively across the range of information
formats, and evaluating how well the final product resolved the information problem and how appropriate and efficient the steps taken to reach the desired outcome were.
Furthermore, an information literate individual devises strategies for updating
self-generated knowledge and recognizes the principles of intellectual freedom and
equitable access to information (American Association of School Librarians and Association for Educational Communications and Technology, 1998; Association of College and Research Libraries, 2000; see also: Australian and New Zealand Institute for Information Literacy, 2004; Doyle, 1994; Society of College, National and University Libraries, 1999; Spitzer et al., 1998).
Self-efficacy and its importance for information literacy and lifelong learning
According to Bandura (1977) success is not only based on the possession of necessary
skills, it also requires the confidence to use these skills effectively. In other words,
learning certain skills is not enough, individuals should also develop confidence in the
skills that they are learning. Hence, besides possessing information literacy skills
individuals of today’s societies must also feel competent and confident in the use of
these skills. Therefore, attainment of high sense of self-efficacy beliefs is as important
as possessing information literacy skills.
Self-efficacy refers to a belief in one’s ability to successfully perform a particular
behaviour or task (Cassidy and Eachus, 1998). Bandura (1997) defines self-efficacy as a
belief in one’s capabilities to organize and execute the course of action required to
attain a goal. Self-efficacy beliefs provide the foundation for human motivation, well
being, and personal accomplishment. People have little incentive to act, if they believe
that the task in their hands, exceed their capabilities, but they undertake and perform
activities if they believe that their actions can produce the desired outcomes (Bandura,
1977, 1986; Pajares, 2002; Koul and Rubba, 1999; Cassidy and Eachus, 1998). In other
words, people tend to perform tasks and activities in which they feel competent and
confident and avoid those in which they do not (Kear, 2000; Pajares, 2002).
Self-efficacy beliefs determine how long individuals will persevere and how resilient
they will be in the face of difficulties and how much effort they will expend on an
activity. Individuals with a high self-efficacy perception expect to succeed and will
persevere in an activity until it is completed. By contrast, individuals with a low self-efficacy perception anticipate failure and are less likely to persist in challenging activities. The higher the sense of efficacy, the greater the effort, persistence, and resilience (Pajares, 2002; Kear, 2000), all of which are crucial for information problem solving, self-regulated learning, and lifelong learning. Bandura
underlines that individuals who develop a strong sense of self-efficacy are well
equipped to educate themselves when they have to rely on their own initiative
(Bandura, 1986). This is why strong self-efficacy perception for information literacy
becomes a necessity to accomplish lifelong learning.
Self-efficacy influences human functioning. Although the knowledge and skills people possess play critical roles in the choices they make, people's levels of motivation and action are based more on what they believe than on what is objectively true (Bandura, 1997; Kear, 2000; Pajares, 2002). That is one reason why self-efficacy is so important for lifelong learning. If individuals feel competent and confident about their information literacy skills, they will willingly undertake and easily solve
information problems. Otherwise, they are more likely to avoid or hesitate to tackle the information problems at hand.
Because self-efficacy is based on self-perceptions regarding particular behaviours,
the construct is considered to be situation specific or domain sensitive. That is, an
individual may exhibit high levels of self-efficacy within one domain while exhibiting
low levels within another one (Cassidy and Eachus, 1998). Thus, self-efficacy has
generated research in areas as diverse as medicine, business, psychology, education
and computers (Kear, 2000; O’Leary, 1985; Lev, 1997; Schunk, 1985; Koul and Rubba,
1999; Delcourt and Kinzie, 1993; Karsten and Roth, 1998; Compeau and Higgins, 1995;
Geer et al., 1998). However, studies regarding self-efficacy for information literacy are few in number (Akkoyunlu and Kurbanoglu, 2003; Kurbanoglu, 2003; Kurbanoglu and Akkoyunlu, 2003).
Measuring self-efficacy
Perceived self-efficacy refers to an identified level and strength of self-efficacy (Kear,
2000). The strength of self-efficacy is measured by degrees of certainty that one can
perform given tasks (Zimmerman, 1995). Therefore, self-efficacy needs to be measured directly (rather than indirectly) through self-report scales (Cassidy and Eachus, 1998). Preparation of self-efficacy scales requires time and patience. One must
be certain to measure the self-efficacy beliefs relevant to the behavior in question
(Pajares, 2002).
There are a number of scales which have been developed to measure perceived self-efficacy in different contexts, such as computer literacy (see Cassidy and Eachus, 1998; Compeau and Higgins, 1995; Lloyd and Gressard, 1984; Delcourt and Kinzie, 1993) and teaching efficacy (see Tschannen-Moran and Woolfolk Hoy, 2001; Henson et al., 2001; Koul and Rubba, 1999). However, no self-efficacy scale for information literacy was found in the literature.
The necessity for the development of such a scale relates to the impact information
literacy is having on many aspects of life and in particular on lifelong learning.
Increasingly individuals of information societies are expected to be proficient users of
information. Low self-efficacy may be a significantly limiting factor for individuals
exploring information problem-solving skills vital for lifelong learning. The
development of an appropriate measure of self-efficacy for information literacy will
enable individuals “at risk” to be identified.
Method
The aim of the study
The main aim of this study is to describe the development of an information literacy
self-efficacy scale (ILSES) designed to measure self-efficacy for information literacy
and find out how well the instrument measures what it claims to assess.
Participants
Participants included 415 randomly chosen teachers from various branches. The response rate was 90 percent (374 teachers), of whom 62 percent were female and 38 percent male. The participants ranged in age from 20 to 52 years (mean = 34.5, SD = 2.2) and came from five private and 14 public schools; 60.4 percent taught at the primary level and 39.6 percent at the secondary level.
Statistical analysis
The following statistical analyses were carried out. First, item analysis and item discrimination indices were used to address the validity of the items on the scale, that is, the extent to which the items tap the attributes they were intended to assess. Second, principal component analysis (PCA) and varimax rotation were carried out to determine the construct of the scale. Last, discriminant validity was used to determine the validity of the subscales.
Developing the research instrument: phase one
In the first stage, the literature in the domain was reviewed and seven main categories were named[1]: (A) defining the need for information; (B) initiating the search strategy; (C) locating and accessing the resources; (D) assessing and comprehending the information; (E) interpreting, synthesizing, and using the information; (F) communicating the information; and (G) evaluating the product and process.
To cover these categories, 40 statements were developed, such as: I feel confident and competent "to define the information I need", "to identify a variety of potential sources of information", "to locate information sources in the library", "to initiate search strategies by using keywords and Boolean logic", "to evaluate www sources", and "to prepare a bibliography". A seven-point Likert scale, anchored with the notations 7 = almost always true, 6 = usually true, 5 = often true, 4 = occasionally true, 3 = sometimes but infrequently true, 2 = usually not true, 1 = almost never true, was used to design the instrument (see Appendix 1). The instrument[2] was field-tested with 50 teachers. The alpha reliability coefficient (0.78) signified that the scale was reliable.
Following the initial field-testing stage, the participants (374 teachers representing different levels and branches from both public and private schools) were asked to indicate their level of confidence in each statement along the seven-point Likert scale. The internal consistency of the 40-item scale, as calculated by Cronbach's alpha, was quite high (0.84).
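Internal-consistency figures of this kind are straightforward to reproduce. The sketch below is a minimal illustration, not the authors' original analysis; the file name and the column layout (items A1 to G40 holding the 1-7 responses) are hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    items = items.dropna()
    k = items.shape[1]                               # number of items
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: each row is one teacher, each column one scale item.
responses = pd.read_csv("ilses_responses.csv")
print(round(cronbach_alpha(responses), 2))
```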
Developing the research instrument: phase two
In the second stage, item analyses were conducted on the collected data in order to assess item validity. Item discrimination indices were calculated for each item (see Table I). Discrimination indices of the 40 items in the scale ranged from -0.397 to 0.876. After the elimination of the 12 items (C10, C14, C15, D17, D18, D19, D20, D22, E27, E28, F31, F38) whose item validity indices were less than 0.20, the median item validity for the rest of the scale increased to 0.495. The internal consistency of the 28-item scale, as calculated by Cronbach's alpha, was also higher (0.92). This indicates that the refined 28-item instrument measures self-efficacy for information literacy better.
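The paper does not spell out the exact formula used for the discrimination indices; one common choice is the corrected item-total correlation, sketched below under that assumption and reusing the hypothetical response matrix from the previous sketch.

```python
import pandas as pd

def item_discrimination(items: pd.DataFrame) -> pd.Series:
    """Corrected item-total correlation for every item: each item is
    correlated with the sum of all remaining items."""
    total = items.sum(axis=1)
    return pd.Series({col: items[col].corr(total - items[col])
                      for col in items.columns})

# Items whose index falls below 0.20 would be flagged for elimination,
# mirroring the cut-off applied to the 40-item scale.
indices = item_discrimination(responses)   # 'responses' as in the previous sketch
print(indices[indices < 0.20].index.tolist())
```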
Developing the research instrument: phase three
In the third stage, in order to explore the main components and the structure of information literacy, a further principal component analysis, the factor loadings of which are presented in Table II, was run on the refined 28-item scale. Principal component extraction together with varimax rotation (retaining components with eigenvalues over 1.5) indicated the presence of three components and also identified items that loaded poorly on all factors. Of the 28 construct items, 17 loaded well on the three components.
In total, 11 items did not load well. Thus, through a selection process based on factor loadings, the 28-item scale was refined to 17 items, the reliability of which was calculated as 0.82. It is especially noteworthy that the refined 17-item scale, which can be used to determine subjects' self-efficacy levels for information literacy, exhibits high reliability without excessive length.
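An analysis of this kind (principal component extraction followed by varimax rotation) can be sketched with the third-party factor_analyzer package; this is an assumption about tooling, not the software the authors used, and the variable names are hypothetical.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed third-party dependency

# 'responses' as in the earlier sketches; drop the 12 items eliminated in phase two.
eliminated = ["C10", "C14", "C15", "D17", "D18", "D19",
              "D20", "D22", "E27", "E28", "F31", "F38"]
refined = responses.drop(columns=eliminated)

fa = FactorAnalyzer(n_factors=3, rotation="varimax", method="principal")
fa.fit(refined)

eigenvalues, _ = fa.get_eigenvalues()
loadings = pd.DataFrame(fa.loadings_, index=refined.columns,
                        columns=["Comp. 1", "Comp. 2", "Comp. 3"])
print(eigenvalues[:3])     # leading eigenvalues (the paper retained those above 1.5)
print(loadings.round(3))   # compare with Table II
```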
The three components extracted as a result of the principal component analysis were examined and labeled based on Bloom's taxonomy and learning principles. Component 1, which comprised eight items related to defining, selecting, interpreting and communicating information and learning from experience, was labeled intermediate information literacy skills. Component 2 was labeled basic information literacy skills; the five items loaded on this component were related to finding and using information. Component 3 was labeled advanced information literacy skills; it was made up of four items related to synthesizing information and evaluating the information problem-solving process and its products. Undoubtedly, classifying information literacy skills from basic to advanced enables information literacy instructors to address them accordingly in their instruction programs (see Table III).
Developing the research instrument: phase four
Following the scale refinement process, the discriminant validity of the subscales, for both the 28-item and 17-item scales, was assessed by comparing total self-efficacy scores with the subscale scores. The positive correlations of the subscales with the total (see Tables IV and V) suggested that both the 28-item and 17-item scales could be considered to measure the underlying construct of efficacy and that subscale scores as well as a total score could be calculated.
Table I. Item discrimination indices of the 40-item scale

Item no.   Index     Item no.   Index
A1          0.467    D21         0.220
B2          0.630    D22        -0.113
B3          0.438    D23         0.270
B4          0.591    D24         0.315
C5          0.730    D25         0.769
C6          0.496    E26         0.415
C7          0.243    E27         0.010
C8          0.425    E28        -0.004
C9          0.397    E29         0.617
C10         0.156    F30         0.508
C11         0.581    F31         0.122
C12         0.876    F32         0.501
C13         0.494    F33         0.485
C14         0.148    F34         0.499
C15         0.146    F35         0.635
D16         0.713    F36         0.711
D17        -0.128    F37         0.459
D18        -0.264    F38         0.143
D19        -0.397    G39         0.429
D20        -0.142    G40         0.374
Table II. Rotated component matrix (eigenvalues over 1.5)

Item                                                                                              Comp. 1   Comp. 2   Comp. 3
A1   Define the information I need                                                                 0.463     0.060     0.231
B2   Identify a variety of potential sources of information                                        0.484     0.250     0.304
B3   Limit search strategies by subject, language and date                                         0.189     0.379     0.219
B4   Initiate search strategies by using keywords and Boolean logic                                0.156     0.573     0.443
C5   Decide where and how to find the information I need                                           0.511     0.608     0.158
C6   Use different kinds of print sources (i.e. books, periodicals, encyclopedias, etc.)           0.276     0.572     0.024
C7   Use electronic information sources                                                           -0.034     0.535    -0.042
C8   Locate information sources in the library                                                     0.044     0.511     0.329
C9   Use library catalogue                                                                        -0.035     0.627     0.149
C11  Locate resources in the library using the library catalogue                                   0.272     0.481     0.285
C12  Use internet search tools (such as search engines, directories, etc.)                         0.657     0.457     0.444
C13  Use different kinds (types) of libraries                                                      0.311     0.315     0.304
D16  Use many resources at the same time to make a research                                        0.466     0.470     0.341
D21  Determine the authoritativeness, currentness and reliability of the information sources       0.060     0.322    -0.102
D23  Select information most appropriate to the information need                                   0.639    -0.038    -0.335
D24  Identify points of agreement and disagreement among sources                                   0.254     0.135     0.151
D25  Evaluate www sources                                                                          0.671     0.430     0.213
E26  Synthesize newly gathered information with previous information                               0.205     0.084     0.444
E29  Interpret the visual information (i.e. graphs, tables, diagrams)                              0.547     0.171     0.254
F30  Write a research paper                                                                        0.588     0.328    -0.141
F32  Determine the content and form the parts (i.e. introduction, conclusion) of a presentation    0.194     0.086     0.752
F33  Prepare a bibliography                                                                        0.732    -0.115     0.132
F34  Create bibliographic records and organize the bibliography                                    0.059     0.212     0.753
F35  Create bibliographic records for different kinds of materials (i.e. books, articles, etc.)    0.616     0.149     0.338
F36  Make citations and use quotations within the text                                             0.679     0.181     0.339
F37  Choose a format (i.e. written, oral, visual) appropriate to communicate with the audience     0.293     0.399     0.035
G39  Learn from my information problem solving experience and improve my information literacy skill 0.422    0.107     0.185
G40  Criticize the quality of my information seeking process and its products                      0.284    -0.150     0.623
Thus, both the subscale scores and the total score can be used to assess efficacy.
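The subscale comparison summarised in Tables IV and V amounts to correlating each category's subscale score with the total scale score. A minimal sketch, assuming item columns are named with their category prefix (A1, B2, ..., G40) as in Appendix 1 and reusing the hypothetical data from the earlier sketches:

```python
import pandas as pd

def subscale_total_correlations(items: pd.DataFrame) -> pd.Series:
    """Correlation of each category subscale score (A-G prefix) with the total score."""
    total = items.sum(axis=1)
    categories = sorted({col[0] for col in items.columns})
    return pd.Series({
        cat: items[[c for c in items.columns if c.startswith(cat)]]
             .sum(axis=1).corr(total)
        for cat in categories
    })

print(subscale_total_correlations(refined).round(2))  # compare with Table IV
```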
Developing the research instrument: phase five
In this stage, in order to make the ILSES available to English-speaking researchers, an English version of the refined scale was also prepared. Forty-seven students from the Department of English Translation and Interpretation were asked to complete both the Turkish and English versions of the scale. The test-retest correlation for the items in the 28-item and 17-item scales was calculated as 0.91 and 0.81, respectively (see Tables VI and VII). The test-retest correlation coefficients indicated the reliability of the English version of both scales.
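Comparing the two language versions item by item reduces to a per-item Pearson correlation across the 47 bilingual respondents. The sketch below assumes the two response matrices share the same item order and column names; the file names are hypothetical, and the paper's overall figures may have been computed differently (for instance, on total scores).

```python
import pandas as pd

def itemwise_correlation(first: pd.DataFrame, second: pd.DataFrame) -> pd.Series:
    """Pearson correlation between two administrations of the scale, item by item."""
    return pd.Series({item: first[item].corr(second[item]) for item in first.columns})

# Hypothetical files holding the 47 students' answers to each language version.
turkish = pd.read_csv("ilses_turkish.csv")
english = pd.read_csv("ilses_english.csv")

per_item = itemwise_correlation(turkish, english)
overall = turkish.sum(axis=1).corr(english.sum(axis=1))  # one way to get an overall figure
print(per_item.round(2))
print(round(overall, 2))   # compare with the overall values in Tables VI and VII
```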
Table III. The 17-item refined scale (final version)

Intermediate information literacy skills
1   A1   Define the information I need
2   D23  Select information most appropriate to the information need
3   E29  Interpret the visual information (i.e. graphs, tables, diagrams)
4   F30  Write a research paper
5   F33  Prepare a bibliography
6   F35  Create bibliographic records for different kinds of materials (i.e. books, articles, thesis, web pages)
7   F36  Make citations and use quotations within the text
8   G39  Learn from my information problem solving experience and improve my information literacy skill

Basic information literacy skills
9   C6   Use different kinds of print sources (i.e. books, periodicals, encyclopedias, chronologies, etc.)
10  C7   Use electronic information sources
11  C8   Locate information sources in the library
12  C9   Use library catalogue
13  C11  Locate resources in the library using the library catalogue

Advanced information literacy skills
14  E26  Synthesize newly gathered information with previous information
15  F32  Determine the content and form the parts (i.e. introduction, conclusion) of a presentation (written, oral)
16  F34  Create bibliographic records and organize the bibliography
17  G40  Criticize the quality of my information seeking process and its products
Table IV. Discriminant validity of subscales for the 28-item scale

Subscale       A      B      C      D      E      F      G
Correlation    0.47   0.57   0.56   0.46   0.58   0.61   0.43

Table V. Discriminant validity of subscales for the 17-item scale

Subscale       A      C      D      E      F      G
Correlation    0.52   0.72   0.54   0.60   0.89   0.57
Conclusions and suggestions
Although the reliability of the 40-item scale was reasonable (0.84), the item analysis carried out to assess item validity indicated that some items in the 40-item scale either repeated each other or did not measure the related category well enough. Based on the results of this analysis, the scale was refined to 28 items, and use of the 40-item scale is not recommended, even though it appears more comprehensive and some items are missing from the refined versions of the scale (such as "oral presentation" in the communication category). The results indicated that the 28-item scale, with the highest Cronbach's alpha among the three versions (0.92 for the Turkish version and 0.91 for the English version), can be considered highly reliable. It is of reasonable length and should prove to be a useful tool for researchers interested in measuring individuals' self-efficacy levels for information literacy. The 28-item scale is highly recommended for identifying individuals with low self-efficacy beliefs, which may be a significantly limiting factor in exploring their information literacy skills.
Table VII. Correlation coefficients of test-retest for the 17-item scale

Item no.   r       Item no.   r
1          0.82    11         0.84
2          0.82    12         0.79
3          0.85    13         0.74
4          0.86    14         0.85
5          0.84    15         0.79
6          0.83    16         0.82
7          0.89    17         0.78
8          0.83
9          0.81
10         0.78    Overall    0.81
Table VI. Correlation coefficients of test-retest for the 28-item scale

Item no.   r       Item no.   r
1          0.75    16         0.76
2          0.82    17         0.73
3          0.71    18         0.74
4          0.70    19         0.73
5          0.76    20         0.79
6          0.83    21         0.78
7          0.70    22         0.70
8          0.83    23         0.78
9          0.73    24         0.78
10         0.74    25         0.76
11         0.69    26         0.76
12         0.69    27         0.67
13         0.73    28         0.80
14         0.64
15         0.71    Overall    0.91
A further principal component analysis, which indicated that 17 items loaded on three main components, was carried out to explore the components and the construct of information literacy. The main aim of this analysis was to find out whether it was possible to present a different approach to the construction of information literacy skills. The three components indicated by the principal component analysis, which were labelled basic, intermediate and advanced, could provide a guide for information literacy instructors. Information literacy instruction programs could be examined to determine whether and how these components are being addressed, and a conscious effort could be made to address them according to their complexity level. The 17-item scale is recommended for those who wish to approach information literacy skills according to their complexity levels based on learning principles (see Appendices 1-3, Tables AI-AIII).
Notes
1. Previously published definitions and standards for information literacy were carefully considered and compared. The seven categories were labeled based on common points drawn mainly from Doyle's Rubrics for Information Literacy (Doyle, 1994), AASL and AECT's Information Literacy Standards for Student Learning (American Association of School Librarians and Association for Educational Communications and Technology, 1998), ACRL's Information Literacy Competency Standards for Higher Education (Association of College and Research Libraries, 2000), the Big6 Approach to Information Problem Solving (Spitzer et al., 1998), SCONUL's Seven Pillars Information Literacy Model (Society of College, National and University Libraries, 1999) and ANZIL's Information Literacy Standards (Australian and New Zealand Institute for Information Literacy, 2004).
2. The initial version of this scale was developed and used by the researchers (see Akkoyunlu and Kurbanoglu, 2003; Kurbanoglu, 2003). Since then, the instrument has been revised and has gone through a number of changes.
References
Akkoyunlu, B. and Kurbanoglu, S. (2003), "Öğretmen adaylarının bilgi okuryazarlığı ve bilgisayar öz-yeterlik algıları üzerine bir çalışma" ("A study on initial teacher training students' perceived self-efficacy for information literacy and computers"), Hacettepe Üniversitesi Eğitim Fakültesi Dergisi, Vol. 24, pp. 1-10.
American Association of School Librarians and Association for Educational Communications
and Technology (1998), Information Literacy Standards for Student Learning, American
Library Association, Chicago, IL.
American Library Association (2000), Information Literacy: A Position Paper on Information Problem Solving, ALA, available at: www.ala.org/aasl/positions/ps_infolit.html
Association of College and Research Libraries (2000), Information Literacy Competency Standards for Higher Education: Standards, Performance Indicators, and Outcomes, ACRL, available at: www.ala.org/acrl/ilstandardlo.html
Australian and New Zealand Institute for Information Literacy (2004) in Bundy, A. (Ed.),
Australian and New Zealand Information Literacy Framework: Principles, Standards and
Practices, 2nd ed., ANZIL, Adelaide.
Bandura, A. (1977), “Self-efficacy: toward a unifying theory of behavior change”, Psychological
Review, Vol. 84, pp. 191-215.
Bandura, A. (1986), Social Foundations of Thought and Action: a Social Cognitive Theory,
Prentice Hall, Englewood Cliffs, NJ.
Bandura, A. (1997), Self-efficacy: The Exercise of Control, W.H. Freeman and Company,
New York, NY.
Cassidy, S. and Eachus, P. (1998), “Developing the computer self-efficacy (CSE) scale:
investigating the relationship between CSE, gender and experience with computers”,
Computer Self-Efficacy Web Site, available at: www.chssc.salford.ac.uk/healthSci/selfeff/
selfeff.htm
Compeau, D.R. and Higgins, C.A. (1995), “Computer self-efficacy: development of a measure and
initial test”, MIS Quarterly, June, pp. 189-211.
Delcourt, M. and Kinzie, M. (1993), “Computer technologies in teacher education: the
measurement of attitudes and self-efficacy”, Journal of Research and Development in
Education, Vol. 27, pp. 31-7.
Doyle, C. (1994), Information Literacy in an Information Society: A Concept for the Information
Age, ERIC, Syracuse, NY.
Geer, R., White, B. and Barr, A. (1998), The Effect of an Information Literacy Subject on Teacher Education Students' Computing Self-efficacy, available at: www.cegsa.sa.edu.au/conference/acec98/acec98.htm
Henson, R., Kogan, L.R. and Vacha-Haase, T. (2001), “A reliability generalization study of the
teacher efficacy scale and related instruments”, Educational and Psychological
Measurement, Vol. 61, pp. 404-20.
Karsten, R. and Roth, M.R. (1998), “The relationship of computer experience and computer
self-efficacy to performance in introductory computer literacy courses”, Journal of
Research on Technology Education, Vol. 31, pp. 14-24.
Kear, M. (2000), Concept Analysis of Self-efficacy. Graduate Research in Nursing, available at:
http://graduateresearch.com/Kear.htm
Koul, R. and Rubba, P. (1999), “An analysis of the reliability and validity of personal internet
teaching efficacy beliefs scale”, Electronic Journal of Science Education, September,
available at: http://unr.edu/homepage/crowther/ejse/koulrubba.html
Kurbanoglu, S. (2003), “Self-efficacy: a concept closely linked to information literacy and lifelong
learning”, Journal of Documentation, Vol. 59, pp. 635-46.
Kurbanoglu, S. and Akkoyunlu, B. (2003), Information Literacy and Teacher Education: A Study
Applied in Turkey, poster presented at the 69th IFLA General Conference and Council.
Berlin, 1-9 August.
Lev, E.L. (1997), “Bandura’s theory of self-efficacy: applications to oncology”, Scholarly Inquiry
for Nursing Practice, Vol. 11, pp. 21-42.
Lloyd, B.H. and Gressard, C. (1984), “Reliability and factorial validity of computer attitude
scales”, Educational and Psychological Measurement, Vol. 42, pp. 501-5.
O'Leary, A. (1985), "Self-efficacy and health", Behaviour Research and Therapy, Vol. 23, pp. 437-51.
Pajares, F. (2002), Overview of Social Cognitive Theory and of Self-efficacy, available at: www.
emory.edu/EDUCATION/MFP/eff.html
Schunk, D.H. (1985), “Self-efficacy and classroom learning”, Psychology in the Schools, Vol. 22,
pp. 208-23.
Society of College, National and University Libraries (1999), Information Skills in Higher
Education: A SCONUL Position Paper, available at: www.sconul.ac.uk/activities/inf_lit/
papers/Seven_pillars.html
Spitzer, K.L., Eisenberg, M.B. and Love, C.A. (1998), Information Literacy: Essential Skills for the
Information Age, ERIC, Syracuse, NY.
Tschannen-Moran, M. and Woolfolk Hoy, A. (2001), “Teacher efficacy: capturing an elusive
construct”, Teaching and Teacher Education, Vol. 17, pp. 783-805.
Zimmerman, B.J. (1995), "Self-efficacy and educational development", in Bandura, A. (Ed.), Self-efficacy in Changing Societies, Cambridge University Press, New York, NY, pp. 202-31.
Appendix 1

Table AI. Information literacy self-efficacy scale, 40-item initial version (each item rated 1-7)

I feel confident and competent to:

A1   Define the information I need
B2   Identify a variety of potential sources of information
B3   Limit search strategies by subject, language and date
B4   Initiate search strategies by using keywords and Boolean logic
C5   Decide where and how to find the information I need
C6   Use different kinds of print sources (such as books, periodicals, encyclopedias, chronologies, etc.)
C7   Use electronic information sources
C8   Locate information sources in the library
C9   Use library catalogue
C10  Interpret information on the library catalogue
C11  Locate resources in the library using the library catalogue
C12  Use internet search tools (such as search engines, directories, etc.)
C13  Use different kinds (types) of libraries
C14  Use different kinds of library catalogues (i.e. card catalogues, online catalogues)
C15  Use/search indexes and electronic databases
D16  Use many resources at the same time to make a research
D17  Differentiate between fact and opinion
D18  Recognize errors in logic
D19  Classify the information
D20  Recognize interrelationships among concepts
D21  Determine the authoritativeness, currentness and reliability of the information sources
D22  Evaluate information critically
D23  Select information most appropriate to the information need
D24  Identify points of agreement and disagreement among sources
D25  Evaluate www sources
E26  Synthesize newly gathered information with previous information
E27  Synthesize and summarize information gathered from different sources
E28  Paraphrase the information
E29  Interpret the visual information (i.e. graphs, tables, diagrams)
F30  Write a research paper
F31  Make an oral presentation
F32  Determine the content and form the parts (introduction, conclusion) of a presentation (written, oral)
F33  Prepare a bibliography
F34  Create bibliographic records and organize the bibliography
F35  Create bibliographic records for different kinds of materials (i.e. books, articles, web pages)
F36  Make citations and use quotations within the text
F37  Choose a format (i.e. written, oral, visual) appropriate to communicate with the audience
F38  Determine the level appropriate to communicate with the audience
G39  Learn from my information problem solving experience and improve my information literacy skill
G40  Criticize the quality of my information seeking process and its products

Notes: This scale has been prepared to determine your level of efficacy on issues related with information (finding, using and communicating information). Each statement is rated on a seven-point scale: 7 = almost always true, 6 = usually true, 5 = often true, 4 = occasionally true, 3 = sometimes but infrequently true, 2 = usually not true, 1 = almost never true. Please mark the most suitable choice for you. Thank you for your cooperation. A = Defining the need for information; B = Initiating the search strategy; C = Locating and accessing the resources; D = Assessing and comprehending information; E = Interpreting, synthesizing, and using information; F = Communicating information; G = Evaluating the product and process.
Appendix 2

Table AII. Information literacy self-efficacy scale, 28-item version (each item rated 1-7)

I feel confident and competent to:

A1   Define the information I need
B2   Identify a variety of potential sources of information
B3   Limit search strategies by subject, language and date
B4   Initiate search strategies by using keywords and Boolean logic
C5   Decide where and how to find the information I need
C6   Use different kinds of print sources (i.e. books, periodicals, encyclopedias, chronologies, etc.)
C7   Use electronic information sources
C8   Locate information sources in the library
C9   Use library catalogue
C10  Locate resources in the library using the library catalogue
C11  Use internet search tools (such as search engines, directories, etc.)
C12  Use different kinds (types) of libraries
D13  Use many resources at the same time to make a research
D14  Determine the authoritativeness, currentness and reliability of the information sources
D15  Select information most appropriate to the information need
D16  Identify points of agreement and disagreement among sources
D17  Evaluate www sources
E18  Synthesize newly gathered information with previous information
E19  Interpret the visual information (i.e. graphs, tables, diagrams)
F20  Write a research paper
F21  Determine the content and form the parts (introduction, conclusion) of a presentation (written, oral)
F22  Prepare a bibliography
F23  Create bibliographic records and organize the bibliography
F24  Create bibliographic records for different kinds of materials (i.e. books, articles, web pages)
F25  Make citations and use quotations within the text
F26  Choose a format (i.e. written, oral, visual) appropriate to communicate with the audience
G27  Learn from my information problem solving experience and improve my information literacy skill
G28  Criticize the quality of my information seeking process and its products

Notes: This scale has been prepared to determine your level of efficacy on issues related with information (finding, using and communicating information). Each statement is rated on a seven-point scale: 7 = almost always true, 6 = usually true, 5 = often true, 4 = occasionally true, 3 = sometimes but infrequently true, 2 = usually not true, 1 = almost never true. Please mark the most suitable choice for you. Thank you for your cooperation. A = Defining the need for information; B = Initiating the search strategy; C = Locating and accessing the resources; D = Assessing and comprehending information; E = Interpreting, synthesizing, and using information; F = Communicating information; G = Evaluating the product and process.
Appendix 3

Table AIII. Information literacy self-efficacy scale, 17-item version (each item rated 1-7)

I feel confident and competent to:

C1   Use different kinds of print sources (i.e. books, periodicals, encyclopedias, chronologies, etc.)
C2   Use electronic information sources
C3   Locate information sources in the library
C4   Use library catalogue
C5   Locate resources in the library using the library catalogue
A6   Define the information I need
D7   Select information most appropriate to the information need
E8   Interpret the visual information (i.e. graphs, tables, diagrams)
F9   Write a research paper
F10  Prepare a bibliography
F11  Create bibliographic records for different kinds of materials (i.e. books, articles, web pages)
F12  Make citations and use quotations within the text
G13  Learn from my information problem solving experience and improve my information literacy skill
E14  Synthesize newly gathered information with previous information
F15  Determine the content and form the parts (introduction, conclusion) of a presentation (written, oral)
F16  Create bibliographic records and organize the bibliography
G17  Criticize the quality of my information seeking process and its products

Notes: This scale has been prepared to determine your level of efficacy on issues related with information (finding, using and communicating information). Each statement is rated on a seven-point scale: 7 = almost always true, 6 = usually true, 5 = often true, 4 = occasionally true, 3 = sometimes but infrequently true, 2 = usually not true, 1 = almost never true. Please mark the most suitable choice for you. Thank you for your cooperation. A = Defining the need for information; B = Initiating the search strategy; C = Locating and accessing the resources; D = Assessing and comprehending information; E = Interpreting, synthesizing, and using information; F = Communicating information; G = Evaluating the product and process.

Corresponding author
S. Serap Kurbanoglu can be contacted at: serap@hacettepe.edu.tr