
Student analytics as part of the longitudinal evaluation of language programs

Authors: Peter Ecke and Alexander Ganz, University of Arizona

Abstract

This paper provides strategies and examples of how program administrators can monitor a variety of potentially important indicators for the health of a university language program and how they can utilize technological tools and resources for the collection and analysis of program-relevant data. It identifies and addresses four challenges of a language program: (1) the long-term tracking and monitoring of student enrollments and their meaningful interpretation taking into account both internal (departmental and institutional) and external (regional and national) data, (2) the identification of potential issues of the program and subsequent intervention, (3) the establishment of student profiles that could be relevant for program planning and design, and (4) the tracking of individual students’ success and special achievements in their career paths. The paper makes a case for the use of internal student surveys, complemented by internal and external data as essential components of a continuous, internally-driven program evaluation that will also be of importance for periodic academic program reviews.
In: J. Norris & N. Mills (Eds.), AAUSC 2014 Volume – Issues in Language Program Direction:
Innovation and Accountability in Language Program Evaluation (S. Katz Bourns, Series Ed.).
Boston, MA: Cengage Learning (© 2016). ISBN 978-1-305-27509-6.
Chapter 3
Student Analytics and the Longitudinal Evaluation
of Language Programs1
Peter Ecke and Alexander Ganz, University of Arizona
Introduction: Challenges for Internal Program
Evaluation
The best preparation for external language program evaluation (e.g., periodic
departmental reviews) is a continuous internally driven self-evaluation of a pro-
gram. In this chapter, we will make a case for the use of survey, enrollment,
and other longitudinal data to internally evaluate foreign language and culture
programs over time. We will not focus on the assessment of student learning
outcomes, which undoubtedly represents an essential part of program evalua-
tion and which has been relatively well covered elsewhere in the literature (see
Norris, 2006). Instead, we will explore additional aspects considered important
for an ongoing evaluation of language programs including: (a) the continu-
ous monitoring (and meaningful interpretation) of short-term and long-term
enrollment trends, (b) the gathering of data about student profiles (their back-
grounds and prior experiences with languages, their motives for language study,
as well as major and minor areas of study), and (c) the tracking of special achievements in stu-
dents’ career paths. We argue that using enrollment data and student analytics
and involving students as our most important stakeholders in the evaluation
process will help achieve the main objectives of program evaluation: program
improvement, educational effectiveness, and even language program survival
(Norris, 2009).
In addition, we attempt to demonstrate that with current technological stan-
dards and possibilities for student analytics, program directors and administrators
now have the tools to identify program features in need of improvement; to as-
sess students’ interests, motives, and curricular strategies and paths over several
years; and to use these data to make educated decisions about curricular changes
and long-term program planning. Student analytics refers to the collection and
communication of meaningful patterns in data about students with potential rel-
evance for language and culture programs. We would like to stress that there are
1 We would like to acknowledge the assistance of Rachel Walker and Alan Eby with the data analysis of the
program-internal survey and to thank the editors, two anonymous reviewers, and Kelly Schlinder for
helpful suggestions on earlier versions of this chapter.
two components of student analytics: (1) the collection and analysis of relevant
data and (2) the usage of data-based findings to guide informed decision-making.
Objectives of the latter are program improvement and increased student satisfac-
tion. We will explore the use of student analytics to address four challenges to
language and culture programs:
• Challenge 1 involves the long-term tracking and monitoring of student
enrollments and their interpretation, taking into account both internal
(institutional, departmental) and external (regional, national) data.
• Challenge 2 concerns the identification of potential issues of the pro-
gram (e.g., enrollment slumps, course scheduling, unclear course titles,
and contents of under-enrolled courses) as well as subsequent interven-
tion to resolve problems and improve program components.
• Challenge 3 relates to the establishment of student profiles that can be
relevant for program planning and design (e.g., students’ motives and
interests, background languages, and major and minor areas of study).
• Finally, Challenge 4 pertains to keeping track of individual students’
career paths, including main achievements during their course of study
and after completion of the language and culture program.
We will illustrate how such data matter for program evaluation by referring
to sample data from our program and by discussing how these data have (or have
not) helped us respond to particular challenges that our program has faced over
the past decade. The data that we gathered for student analytics come from two
kinds of sources: (1) institutional and external data on student enrollments, lan-
guage program offerings, and student profiles, and (2) internal surveys created
by the department or program to collect data on student attributes relevant to
the program and feedback from students about the program. Let us now turn to
the four challenges that we have attempted to address, in part, through student
analytics.
Challenge 1: Tracking and Interpreting Student
Enrollments
In times of economic crisis and budget cuts, student enrollments are used as an
important, if not the most important, criterion by administrators to make deci-
sions about the continuation, curtailing, or closure of programs. Keeping track of
short-term and long-term enrollment trends and reacting to or preventing nega-
tive trends is an essential aspect of internally driven program evaluation. In order
to reasonably interpret a program’s enrollment data, however, it is important to
take into account enrollment trends at the institutional, regional, and national
levels (see Furman, Goldberg & Lusin, 2010; Levine, 2011), including enrollment
trends in non-postsecondary contexts, such as high schools (see, e.g., ACTFL,
2010; Rhodes & Pufahl, 2010), because they either may have a direct impact on
enrollments at the college level (Andress et al., 2002) or may help put departmen-
tal data in context and interpret them meaningfully.
Let us consider as an example the enrollment data from our basic language
program (BLP) in German at the University of Arizona (UA). Figure 3.1 illustrates
the total number of student enrollments in the spring and fall semesters in the
first four semesters of instruction in German over a period of 13 years from 2000
to 2013.
Overall, the data show an impressive growth of enrollments in the BLP over
the 13-year period, suggesting a healthy and growing program. Enrollments in-
creased by 49% from 222 in the fall 2000 semester to 331 in the fall 2013
semester. However, these data are relatively meaningless if we do not know to
what extent overall student enrollments, or perhaps more specifically undergrad-
uate student enrollments, at the university as a whole increased. In other words,
if we compare program enrollment data with institutional data, we are in a better
position to assess program health. Most institutions, like ours, make enrollment
statistics available to the public on websites called “Fact Book” or “Student Data”
(Office of Institutional Research and Planning Support, 2014a, 2014b) so that they
can be used easily for comparison purposes. In our case, overall student enroll-
ments at the institution have indeed increased, but at a lower rate compared to the
enrollments in the BLP. Overall student enrollments at the University of Arizona
increased by 17.8% from 34,488 in 2000 to 40,621 in 2013, and undergraduate
enrollments increased by 19.9% from 26,404 in 2000 to 31,670 in 2013. Figure 3.2
is a graphic representation of the growth rates of enrollments in the BLP, overall
enrollments at the university, and undergraduate enrollments at the university.
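As an aside for readers who want to reproduce such comparisons, the following minimal sketch computes the growth rates discussed above from the fall 2000 and fall 2013 head counts cited in the text; the helper function is ours and is not part of any institutional reporting tool.

```python
# Sketch: comparing program growth with institutional growth.
# The head counts are the fall 2000 and fall 2013 figures cited in the text.

def growth_rate(start: float, end: float) -> float:
    """Percentage change from a start value to an end value."""
    return (end - start) / start * 100

blp = growth_rate(222, 331)                   # basic German language program: ~49%
university = growth_rate(34_488, 40_621)      # all students: ~17.8%
undergraduates = growth_rate(26_404, 31_670)  # undergraduates: ~19.9%

print(f"BLP: {blp:.1f}%, university: {university:.1f}%, undergraduates: {undergraduates:.1f}%")
```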
The correlation of enrollment numbers from the BLP with overall institu-
tional enrollments and undergraduate enrollments can be quite informative. The
patterns, visually displayed in Figure 3.2, already suggest some degree of correlation
between BLP enrollments, institutional enrollments, and undergraduate enrollments.
[Figure 3.1: Number of students enrolled in the basic German language program in fall
and spring semesters 2000–2013.]
All three scores increase over time, but there is more variation in
enrollments in the BLP compared to the larger institutional data. The Pearson
correlation coefficient for BLP enrollments and undergraduate enrollments is
r = 0.738 (p < .01), which suggests a relatively strong relation. Squaring the cor-
relation coefficient r can help interpret the meaningfulness of this finding: The
squared value of r = .738 is r² = .54. In statistical terms, this means that the varia-
tion in undergraduate enrollments accounts for 54% of the variation in BLP
enrollments. In other words, a substantial part (more than half) of the variation in
BLP enrollments over time is related to variation in undergraduate enrollments.
Although correlation analyses do not allow for clear-cut cause–effect assumptions,
the data do suggest that university undergraduate enrollments strongly influence
BLP enrollments, perhaps more so than actions or activities in the program. A
decrease in undergraduate enrollments would most likely negatively affect BLP
enrollments. The correlation coefficient between BLP enrollments and overall
university enrollments (including graduate students) is .701. This correlation
coefficient is somewhat weaker, but still strong, accounting for about 49% of the
variation in BLP enrollments.
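Assembling the yearly series and computing the correlation is straightforward with standard scientific Python tools. The sketch below is illustrative only: apart from a few endpoints mentioned in the text, the yearly values are placeholders, so the printed coefficients will not reproduce those reported above.

```python
# Sketch: Pearson correlation between BLP enrollments and undergraduate enrollments.
# The yearly values below are placeholders, not the actual 2000-2013 series.

import numpy as np
from scipy.stats import pearsonr

blp = np.array([222, 230, 242, 255, 262, 270, 285, 300, 320, 397, 380, 355, 335, 331])
undergrad = np.array([26404, 26800, 27300, 27900, 28400, 28900, 29300, 29800,
                      30200, 30600, 30900, 31200, 31400, 31670])

r, p = pearsonr(blp, undergrad)   # correlation coefficient and p-value
r_squared = r ** 2                # share of variance in BLP enrollments associated
                                  # with variation in undergraduate enrollments
print(f"r = {r:.3f}, p = {p:.3f}, r^2 = {r_squared:.2f}")
```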
The analysis of enrollment trends for specific course levels can also be in-
formative. First-semester language course enrollments are particularly important
for the strength of the BLP since students in these courses will provide the basis
for enrollments in subsequent courses in the following semesters.
[Figure 3.2: Overall enrollments at the university, undergraduate enrollments at the
university, and enrollments in the basic German language program in fall semesters
2000–2013. German BLP enrollments are totals; university and undergraduate enrollments
are plotted in hundreds so that the trends can be compared.]
In Figure 3.3,
we present the student enrollment numbers in the program’s first-semester
German courses for spring and fall semesters of the years 2000–2013. Although
one could illustrate enrollments for the whole academic year, we have displayed
enrollments separately for the spring and fall semesters for practical reasons:
Keeping track of enrollments for each semester can help with projecting enrollments
and scheduling classes for subsequent semesters. German 101 enrollments in the
fall semesters increased by 72% from 2000 to 2013. Recall that we noticed a simi-
lar (albeit weaker) increase in the overall BLP enrollments in Figure 3.1. The dif-
ference in the rate of enrollment increases is likely a result of the university’s
foreign language requirement. Most of the students in our BLP enroll only in the
first two semesters of German to comply with university requirements for Bach-
elor of Science degree programs. Fewer students continue language study in the
third and fourth semesters, which are required for Bachelor of Arts programs. The
data suggest that we have not taken full advantage of increases in first-semester
enrollments for third- and fourth-semester courses. The challenge is to better
motivate, recruit, and keep students for study in third and fourth semesters. In
recent years, fewer incoming students at our university reported prior experience
with German. A portion of these students previously placed directly into second-
year courses of German. We will return to this issue shortly in the discussion of
internal survey data.
The first-semester enrollments in the BLP, illustrated in Figure 3.3, can also
be submitted to correlation analyses with large-scale institutional enrollment
data. In Figure 3.4, we present the German 101 enrollments in the fall semesters
and the new freshmen enrollments at the university for the period 2000–2013.
(Note that the figures of the latter again have to be multiplied by 100).
[Figure 3.3: Number of students enrolled in first-semester courses in the basic German
language program in fall and spring semesters 2000–2013.]
The correlation coefficient for German 101 enrollments and freshmen enroll-
ments (r = .733) is identical to the correlation coefficient for German 101 en-
rollments and undergraduate enrollments (r = .733), which suggests that both
university data sets can account for much of the variation in BLP enrollments.
Internal data, which we have collected through online surveys, can provide addi-
tional information about the student body. For example, over 41% of the students
who enrolled in first-semester German courses in fall semesters 2011–2013 iden-
tified themselves as freshmen, the largest group of German 101 students. Other
students in German 101 courses included sophomores (26%), juniors (19%),
seniors (9%), graduate students (3%), and other (1%).
Comparison of the program data to enrollment trends in German at the na-
tional and regional levels can also increase the meaningfulness of the data. From
the 2009 MLA survey of Enrollments in languages other than English in United
States institutions of higher education (Furman et al., 2010), we know that total
numbers of students enrolled in German at U.S. institutions of higher education
increased by a modest 5.8% from 91,100 in 2002 to 96,349 in 2009. We also know
that the percentage of German in overall foreign language enrollments decreased
from 2002 to 2009. In 2002, German enrollments represented 6.5% of foreign lan-
guage enrollments compared to 6% in 2006 and 5.7% in 2009 (Furman et al.,
2010). German language programs and programs of other traditionally taught
languages at the national level have faced a number of challenges over the past
decade (Ecke, 2010, 2011; Levine, 2011; McCarthy, 2010; Tatlock, 2010).
If we compare institutional BLP enrollment numbers with national enroll-
ment numbers, we are in a better position to meaningfully interpret our own in-
stitutional data.
[Figure 3.4: German 101 enrollments in the basic German language program and new
freshmen enrollments at the university (in hundreds) in fall semesters 2000–2013.]
Fall semester enrollments in the BLP, presented in Figure 3.1,
increased by a substantial 64% from 242 in 2002 to 397 in 2009, which is a much higher
growth rate than the 5.8% increase at the national level. Although such a comparison and
reference to external data leaves aside other factors that could potentially con-
tribute to BLP enrollments at the institution, it would likely be valuable in an
external academic program review that may be decisive for “disciplinary survival”
in times of economic crisis (Norris & Watanabe, 2013).
Challenge 2: Identifying Potential Issues and
Subsequent Intervention
Thus far, we have focused on interpreting long-term enrollment numbers and
trends. Overall, the BLP enrollments presented in Figure 3.1 show growth for the
period between 2000 and 2013 within the UA German program. However, the fig-
ure also shows considerable variation, including a substantial decline in enroll-
ment numbers from 2010 to 2012. As administrators and program directors, we
are confronted with the challenging task of identifying the causes of such nega-
tive trends and possibly counteracting and reversing the trend. Why did the BLP
enrollment numbers fall in this particular period of time? We do not have one
straightforward answer that could explain the enrollment slump between 2010
and 2012. There are no data yet from the national level that might help interpret our
enrollment data within a larger educational context. However, we did identify a
number of potential factors in our analysis that could have contributed, perhaps
collectively, to the decrease in enrollments. These factors and challenges include:
1. A temporary decline in growth in the number of incoming freshmen in
the fall of 2010 (which we know from institutional enrollment data);
2. A decrease in the number of high schools that offer German in the
region and state (as documented by ACTFL (2010), Rhodes & Pufahl
(2010), and findings from our internally administered student back-
ground questionnaires that show that students report prior knowledge
of German less frequently in recent years);
3. A disproportionate increase in enrollments in Chinese (compared to
other foreign languages, including German) at the institution and in
the region (ACTFL, 2010; Furman et al., 2010; Rhodes & Pufahl, 2010);
4. Increased efforts by colleagues to recruit students into competing lan-
guage programs (e.g. during the orientation week);
5. The comparatively high price ($340) of the instructional package
required for German 101 (and used in the following two semesters),
relative to the instructional packages for other language courses.
In response, we attempted to address factor (4) in 2011 through increased efforts
to advertise the program and recruit students during the orientation week and
through social networking sites such as Facebook. We addressed factor (5) by
negotiating a custom-made instructional package with the textbook publisher
in 2012, which brought down the price of the instructional package from $340
to $120. The package now includes a textbook that has a soft (not hard) cover, is
spiral bound, and includes an online workbook and supporting media instead of a
paperback workbook with CD/DVD programs used in previous years. When the pub-
lisher increased the price again in 2013, we successfully negotiated that the price
be changed back to the earlier negotiated level. With the objective of addressing factor
(2), we continued our efforts at program articulation, for example, within the local
chapter of the AATG and through other outreach events, such as organizing campus
visits of high-school students. Enrollments started to rise again in 2013, but it is
not clear whether any of the measures mentioned earlier have impacted enrollment
growth. In our most recent online survey of student backgrounds, we included
questions that we hope will provide us with some information about the signifi-
cance or insignificance of measures that we have taken to increase enrollments.
In addition to the previously discussed temporary enrollment decrease, we
have faced other challenges, such as deciding whether to offer alternative and potentially
more attractive class schedules, including intensive 7.5-week (half-semester)
courses. To gather relevant information from students, our most
important stakeholders, we developed online surveys and questionnaires, using
tools such as SurveyMonkey and Google Docs. Some of the internally developed
surveys will be discussed in the next section.
Challenge 3: Establishing Student Profiles Relevant
for Program Planning
While institutions provide general statistical information on student enrollments,
there is a clear advantage in maintaining a regularly administered internal sur-
vey for the language program. As pointed out by Sinka and Zachau (2005), sta-
tistical information about the effectiveness, position, and alignment of language
programs within the broader university context may prove essential for adminis-
trators, language program directors, and department heads. Nevertheless, stan-
dardized university surveys, course evaluations, and enrollment statistics provide
only limited information on students as individuals. Language programs therefore
need specific custom-designed tools and procedures to analyze their stu-
dents’ backgrounds, interests, and needs. In particular, the use of longitudinal
data on students’ backgrounds and motives to study the foreign language can
increase administrators’ understanding of what students are looking for in a lan-
guage program. These insights can give them the opportunity to properly align
classes, identify potential problems, track program development over time, and
have valuable data at hand when confronted with academic program reviews.
Program-Internal Surveys
To exemplify the potential for student analytics, we will discuss the development
and usage of a background questionnaire, which has been used in the BLP at
the University of Arizona for the past 10 years. The first author originally imple-
mented a paper-based survey in an attempt to capture program-relevant informa-
tion about students' backgrounds beyond the general and publicly available data
on university students. This survey has been used consistently at the beginning
of each semester since 2003 and was converted into a digital format in the spring
2010 semester. Since its first launch, the survey has been taken by over 4,000 stu-
dents in the BLP. In more recent years, the survey data have been gathered
using an online form developed with the university's
Google Apps system. Online submissions are verified through students'
own university credentials, and each submission is clearly identified by the student's
name and email, as was previously done on the paper forms. Students are asked by
course instructors to fill out the online survey. They are told that the goal of the
questionnaire is to obtain information that may help to better tailor the courses
and program curriculum toward students’ needs.
The current version of the survey elicits basic information about students,
such as course section, instructor, age, major/minor area of study, and class (fresh-
man, sophomore, etc.). The survey includes questions about previous experience
with the target language (in high school, college, and other contexts) and provides
18 items (potential motives for the study of German) that students are asked to
rate on a 4-point Likert scale from 1 (not very important) to 4 (very important).
The questionnaire also asks students to self-rate their proficiency in their first and
(other) foreign languages, and concludes with an open-ended question about their
learning objectives in the particular course. In the fall of 2013, two more ques-
tions were added to inquire about students’ perception of the German program
and to learn whether the aforementioned recruitment efforts made during ori-
entation had any impact on student enrollments. A copy of the questionnaire (fol-
lowing the older paper-based format) can be found in the appendix.
Objectives of the Survey
One main objective of the survey has been to provide course instructors with rel-
evant information about their course participants, which would help instructors
identify students who had not been placed in the appropriate course and enhance
individualization and teaching based on the needs and interests of the students.
With the survey data, instructors are provided with individualized student infor-
mation, which is made available by the end of the first week of classes. The online
survey follows a simple, yet effective, format eliciting valuable information from
students within minutes, with instructors being notified instantly when a student
has submitted the questionnaire responses. Each semester, the existing survey
setup is cloned during orientation week and online links to the surveys are assigned
for each class level. After the initial setup and testing, data are collected quickly and
are instantly available to instructors and the language program director. The digital
survey data can be stored permanently and can be analyzed later in numerous ways
depending on issues of interest, a process that would have taken weeks with the
old paper-based survey. The questionnaire findings have helped some instructors
identify students who enrolled in an inappropriate course (e.g., students who had
substantial instruction in German in high school and registered as false beginners
in German 101). After meeting with the BLP director, these students were then reg-
istered in a (usually higher-level) course that corresponded to their background and
better met their learning needs.
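Such screening can also be done programmatically on the survey export. The sketch below assumes the responses have been downloaded as a CSV file and uses hypothetical column names (course, name, email, hs_german_years); it simply flags first-semester students who report two or more years of high-school German so that the director can follow up on placement.

```python
# Sketch: flagging possible "false beginners" in a survey export.
# The file name and column names are assumptions about the export format.

import pandas as pd

responses = pd.read_csv("background_survey_fall.csv")

flagged = responses[
    (responses["course"] == "GER 101")
    & (responses["hs_german_years"].fillna(0) >= 2)   # two or more years of high-school German
]

print(flagged[["name", "email", "hs_german_years"]].to_string(index=False))
```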
An additional objective of the survey has been the collection of student data
as part of the continuous internal evaluation of the BLP by the program director.
The readily available data search, comparison, and analysis tools in the MS Excel
data files enable administrators to obtain information on students, correlate select
data sets, or compare data from different years in order to detect salient patterns
and trends. We have presented data from these surveys in GAT (graduate assistant/
associate in teaching) orientations and department meetings and retreats to peri-
odically inform instructors in the BLP as well as faculty in the department about
trends and challenges in the BLP that may also affect the department’s upper-level
courses and enrollments. Some of the data were used in the self-report of a recent
academic program review of the department. One specific set of solicited data re-
lates to students’ motives for enrolling in the German language program. This
will be discussed next.
Student Motives
In 2010, we decided to replace an open-ended survey question about students’
reasons for enrolling in the BLP with more elaborate items that would elicit
richer and quantifiable data on student motives for the study of German in the
BLP. In order to be able to compare our data with results reported by others
on students’ motives to study German (e.g., Andress et al., 2002), we adopted
18 items used by Sinka and Zachau (2005). On a four-point Likert scale from 1
(not very important) to 4 (very important), students were asked to rate how important
the 18 different motives were for their enrolling/re-enrolling in basic German
language classes. Items on student motives in the survey included, for exam-
ple, “seems like fun,” “relevance to my major,” or “opportunities to live/work in
Germany” (see the appendix for a complete list of items). Variables such as “rep-
utation of German program” or “good progress in German” may be particularly
interesting for purposes of language program evaluation over time as they may
reveal something about students’ perceptions of the quality of the program and
their perceived learning progress in the program. In Figure 3.5, we illustrate the
mean scores assigned by students for motives to study German in the BLP. For
comparison purposes, we include data from two time periods: fall 2008–spring
2010 and fall 2011–fall 2013.
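Once the responses are stored in a single table, the period comparison underlying Figure 3.5 reduces to a grouped mean. The following sketch assumes one row per student, one column per motive item, and a period column coded as in the figure; the file and column names are ours, not the program's actual field names.

```python
# Sketch: mean Likert rating per motive, compared across two survey periods.
# Column names and the period coding are assumptions, not the program's actual fields.

import pandas as pd

responses = pd.read_csv("motive_ratings.csv")

motive_items = ["seems_like_fun", "interesting_classes", "career_benefits",
                "reputation_of_program", "good_progress"]   # a subset of the 18 items

means = (
    responses.groupby("period")[motive_items]   # e.g. "2008-2010" vs. "2011-2013"
             .mean()
             .round(2)
)
print(means.T)   # motives as rows, one column per period
```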
Motives with the highest ratings in both periods include “(German) seems
like fun,” “interesting classes,” “comfortable German class,” “I like languages,”
and “love of German,” all mostly affective reasons that had also been reported
as the most frequent motives in Sinka and Zachau (2005) for college students
and Andress et al. (2002) for high-school students. Instrumental or pragmatic
reasons, such as “possibility of career benefits” or “relevance to my major” were
rated somewhat less important, but still quite relevant, whereas motives such as
“recommendation of school counselor” and “German-related scholarship (oppor-
tunities)” were rated of little importance. The latter should be considered items
of interest, which might merit attention and action by the department, per-
haps in the form of increased contact and dialogue with student advisors from
other units and intensified advertising of scholarship opportunities for study in
German-speaking countries.
In addition, comparisons of data on motives from various time periods could
reveal potentially interesting changes in student perceptions with implications for
program evaluation, alignment, and articulation. Overall, the data sets from the
two periods in Figure 3.5 display very similar patterns and little change in stu-
dents’ motives for the study of German over time. The slightly higher ratings of
two motives in 2011–2013 compared to 2008–2010 could be used (e.g., in an aca-
demic program review) as an indicator for at least equally or slightly more posi-
tive perceptions of the BLP by students in the more recent period. “Reputation
of German program” and “good progress in German” were both rated somewhat
higher in the more recent period than in the earlier one, but of course, one has to
be careful to not overinterpret relatively small differences.
One can also break down the data and compare the ratings of students from
different class levels. Figure 3.6 presents such an analysis, comparing students’
ratings of “good progress in German” for five course levels, German 101 to
German 300.
In Figure 3.6, one probably first notes the relatively high rating given for
“good progress in German” by the students in German 300, the department’s gate-
way course primarily for students who major or minor in German. These students
can be assumed to be more motivated, more committed, and more successful in
their development of German language proficiency than many of the BLP students
who take German to fulfill the university's foreign language requirement.
[Figure 3.5: Mean scores assigned by BLP students for motives to enroll in German
classes. Note: 1,391 students completed the survey between fall 2011 and fall 2013,
and 591 completed it between fall 2008 and spring 2010.]
A sec-
ond observation relates to the slight decline in perceived progress in second-year
German (German 201 and German 202). Anecdotal reports from teachers reflect
a similar perception: both teachers and students feel that less progress
is made at this level compared to the first two semesters of language instruction.
Data sets such as these can be valuable to the language program director
and may be used as one of various measures to monitor and assess a program’s
effectiveness. The importance of data on students’ perceptions, attitudes, and mo-
tives and their possible change over time should not be underestimated, as it is
students’ choices and decisions that first and foremost affect a program’s success.
Students’ Language Background
Students’ language background is another variable of potential relevance to lan-
guage program directors. In our questionnaires, we asked students about their
first language(s) (L1) and any prior experience they may have had with foreign
languages (L2) including the target German language of our BLP. Not surpris-
ingly for our institution, the large majority (93%) of students enrolled in the BLP
from fall 2008 to spring 2010 reported to have English as their L1. About 7%
of students did not claim English as their native language. The largest groups
of non-English L1 speakers in that period were speakers of Spanish (2.6%) and
Chinese (0.8%), which were relatively small portions within the group of students
who enrolled in German. While these percentages cannot be equated with the percentage
of students with minority status, they do suggest that the percentage of interna-
tional and minority students in our BLP classes is small and below the university
average.
[Figure 3.6: Mean scores of student ratings of "good progress in German" in five
courses of German, fall 2008–spring 2010 (N = 591 students). 101 = first-semester,
102 = second-semester, 201 = third-semester, 202 = fourth-semester German;
300 = fifth-semester intensive gateway course for majors and minors of German.]
In an effort to recruit and retain a more diverse student population, the
department advertised and awarded several small minority scholarships funded
through the Dean’s Office of the College of Humanities in the years 2005–2009.
The scholarship program only had a limited impact on our student recruitment
and had to be terminated with the implementation of changes in the university’s
policy on affirmative action. We are still looking for more effective ways to recruit
from and address the university's student population, which is more diverse than the one that we
have served thus far in our program.
Students’ reports about their background in foreign languages can also
provide us with important information as instructors and administrators. In an
analysis of students’ reports on prior foreign language experience from the fall
semesters of 2004, 2006, 2008, and 2010, we found that many of the students who
enrolled in BLP courses in these semesters reported prior experiences with Ger-
man (32.8%) and Spanish (32.2%), followed by French (9.5%), ESL (4.6%), Latin
(1.8%), Italian (1.7%), and Japanese (1.5%), among experiences with other less
frequently reported languages (4.1%). Some students (11.8%) indicated that they
had no prior experience with a foreign language.
We also noted that the percentages of students who claimed to have some
kind of prior knowledge of the target German language declined after 2006. While
a high 47.5% of students reported having had some kind of prior experience
with German in the fall of 2006, only 29.4% did so in 2008 and 26.1% did so
in 2010. These data seem to corroborate findings from national surveys on K–12
foreign language enrollments (ACTFL, 2010) and foreign language program offer-
ings in elementary and secondary schools (Rhodes & Pufahl, 2010). In particular,
the latter study reported a decrease in the percentage of secondary schools that
offered German language programs in 1997–2008. An implication of this trend
for program planning is that fewer students of German are expected to place into
higher-level courses upon entrance to the university and the BLP (see Ecke, 2011,
p. 76). Our longitudinal survey data on students' background in German appear to
confirm that prediction for our program. While enrollments in first-year courses
of German have grown significantly (see Figure 3.3), enrollments in second- and
third-year courses have stagnated at similar levels or grown at lower rates.
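The decline in reported prior experience with German is easy to monitor from the accumulated survey data. A minimal sketch, again with assumed file and column names, computes the share of respondents reporting prior German for each fall semester.

```python
# Sketch: share of respondents reporting prior experience with German, by fall semester.
# The 'semester' and 'prior_german' (True/False) columns are assumed names.

import pandas as pd

responses = pd.read_csv("background_survey_all_years.csv")

fall = responses[responses["semester"].str.startswith("Fall")]
share = (fall.groupby("semester")["prior_german"]
             .mean()      # proportion of True values
             .mul(100)
             .round(1))
print(share)   # e.g. Fall 2006: 47.5, Fall 2008: 29.4, Fall 2010: 26.1
```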
Students’ Major and Minor Areas of Study
Students’ declarations of major and minor areas of study reveal a great deal about
their interests and career objectives. An open issue for directors of language and
culture programs is the question of the extent to which language programs should
or should not attempt to address students’ interests in other areas. Figure 3.7 il-
lustrates the most frequently reported majors by students in our BLP from fall
2011 through fall 2013.
Overall, BLP students’ major areas of study are very diverse. The reasons for
this diversity of reported majors can be multifaceted. Ninety-nine out of 1,069 stu-
dents enrolled in German language classes have not yet declared a major. Majors
that are most frequently reported by students are German (91), (pre-)business
(86), psychology (75), political science (70), and history (67). However, a substan-
tial number of students also major in STEM fields, such as biology (74), engineer-
ing (55), and math (52).
The information about the most frequently reported majors is certainly worth
consideration when attempting to align outreach efforts for a program. Horizontal
program articulation could specifically target students in areas whose students
have shown particular interest in German language and culture courses in the
past. Students with such majors may also be candidates for double majors/minors
and should be invited to meet and discuss such options with the undergraduate
advisor of the language and culture program. The data may also point to the need
to offer more specialized courses, such as business German. In accordance with
the large number of students in (pre)-business in our BLP, the department started
offering a third-year business German course more frequently (annually instead
of every other year), including an online version taught by the second author.
Enrollment numbers, however, have not met our expectations, perhaps because
the required proficiency level in German was too high for the targeted student
group. We may need to develop and offer a business German course at the second-
year level or offer a course that is taught in English.
The point that we have attempted to make in this section is that language
programs can relatively easily collect and analyze informative data on their stu-
dents and programs, which institutional data by itself cannot provide. The kinds
of internal student analytics that we attempted to exemplify and discuss in this
section can have a variety of uses from strategic alignment toward certain student
groups, such as majors or minors in areas whose students have shown particular interest
in German, to curriculum design based on students' motives to study the
language and their linguistic and cultural background. Data sets, such as the
examples shown earlier, are too valuable to be ignored and should be considered as
part of a continuous internal program evaluation in addition to periodic external
evaluations.
[Figure 3.7: The most frequently reported major and minor areas of study by students
enrolled in the basic German language program, fall 2011–fall 2013 (N = 1,069 students).]
Institutional Databases as “Supplement”
In addition to the internally developed and used survey described earlier, we have
explored and considered other data sets by using tools available to administration
and program management through the university’s Office for Enrollment Manage-
ment to complement the information provided by students through the internal
survey. While basic student data, such as enrollment or major/minor, and specific
information such as GPA, are readily available to administrators through a vari-
ety of university-internal data systems, access to other information is frequently
limited to either course instructors or upper-level administrators. Generic infor-
mation that is collected in our university databases includes degree counts, grad-
uation counts, average credit hours for students, average GPA, average years to
graduation, new enrollee count, number of student withdrawals, average GPA per
term, and average student GPA change, among others. These data can also be com-
puted for specific departments or programs. They certainly go beyond what could
be handled by a language program administrator on a regular basis. Some infor-
mation from the university databases (e.g., graduation counts and years to gradua-
tion) would be of relevance for periodic program reviews, but more so with respect
to (major or minor) degree programs than BLPs. In our most recent academic
program review, the department head did indeed request some of these data.
Challenge 4: Keeping Track of Students’ Career Paths
Academic programs are increasingly being evaluated with respect to students’
success in their careers. Success is understood here not only as the successful
completion of students’ program of study (reflected, e.g., by overall GPA, time
needed until graduation, special achievements, such as awards, study abroad,
internship, or work experiences) but also as success in students’ careers after pro-
gram completion (e.g., job placement as well as job satisfaction and performance).
Interested students, their parents, and external review committees request or in-
quire about evidence that the program is preparing its students well for real-life
challenges and a competitive job market.
What can we, as program administrators, do to track students’ career paths
and their success? How can we demonstrate the impact that our programs (and
our work as directors) have on students’ lives? Of course, tracking the careers of
every single student enrolled in a BLP would neither be possible nor make good
sense. Most students, especially in larger programs, who primarily study a lan-
guage to fulfill a second-language requirement, will major and minor in areas
other than language study. However, program participants who continue language
study, who become majors or minors in language and culture studies, who partici-
pate in departmental study abroad programs, and who prepare for a career in lan-
guage teaching or for graduate study can and should be tracked if possible.
In addition, an important responsibility of many BLP directors is the train-
ing, supervision, and assessment of GATs (Levine, Chavez, Crane, Melin, & Lovik,
2008; Rifkin, 2000; Walz, 1992). Program directors who employ graduate students
as instructors are well advised to monitor and keep track of GATs’ development,
performance, and achievements over time as these can be valuable data to evalu-
ate the program and demonstrate its effectiveness. Keeping a record of aspects of
GATs’ development, accomplishments, and successes is an important and doable
aspect of longitudinal program evaluation, which will be relevant not only for
decision-making in the program (e.g., for the selection of GATs as assistants to
the program director or for teaching positions in summer/winter terms or study
abroad programs, and their nomination for teaching awards), but also for aca-
demic program reviews and reviews of the program director for tenure and pro-
motion. Next, we suggest a number of possibilities for tracking students’ career
paths and accomplishments as part of a BLP’s continuous self-evaluation. Let us
first consider achievements and career paths of undergraduate students before we
explore achievements of BLP instructors and graduate students that are worth
recording (a minimal record-keeping sketch follows the lists below).
1. Tracking undergraduate students enrolled in the BLP who
1.1. become majors and minors
1.2. get accepted into graduate programs
1.3. participate in study abroad programs
1.4. demonstrate special achievements (e.g., scholarship awards, job
placement).
2. Tracking graduate students and adjunct instructors teaching in the BLP
according to
2.1. ratings of overall teaching effectiveness (in course evaluations)
2.2. nomination for and granting of teaching awards
2.3. job placement (in schools, organizations, and businesses)
2.4. acceptance into doctoral programs
2.5. assignment as assistants to the director, as teachers in study abroad
programs, and other special assignments.
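As a minimal record-keeping sketch, the items under (2) could be kept in a simple flat file that is updated each semester; the field names and the sample row below are illustrative assumptions, not a prescribed schema.

```python
# Sketch: a flat record format for tracking instructor (GAT) data over time,
# mirroring items 2.1-2.5 above. Field names and the sample row are illustrative.

import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class InstructorRecord:
    name: str
    semester: str
    course: str
    teaching_eval_mean: float   # overall teaching effectiveness rating
    awards: str                 # nominations for or receipt of teaching awards
    placement: str              # job placement or doctoral program, if known
    special_assignments: str    # e.g. study abroad instructor, assistant to the director

records = [
    InstructorRecord("A. Instructor", "Fall 2013", "GER 102", 4.6,
                     "nominated for GAT teaching award", "", "assistant to the director"),
]

with open("gat_tracking.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(InstructorRecord)])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```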
In reference to the tracking of undergraduate students, we have kept records
of students who were accepted into our graduate programs (1.2) and students who
participated in our study abroad programs (1.3). Because the BLP director also
serves on the graduate program committee and chairs our department’s study
abroad committee, the data are relatively easy to collect. However, we have not
systematically collected data on BLP students who become majors/minors (1.1)
and students with special achievements (1.4), although we have been able to
provide a good number of anecdotal examples of such students in our academic
program reviews. While the department’s undergraduate committee developed
entrance, mid-program, and exit questionnaires for majors and minors of Ger-
man studies, which do elicit this information, the paper-based survey data have
not been analyzed on a regular basis. Converting these questionnaires into online
surveys could help collect, store, and report relevant data more consistently.
Tracking data on BLP instructors is somewhat easier, as the number of in-
structors is much smaller than the number of undergraduate students. Also, the
elicited data are closely related to the BLP director's tasks of training GATs and
assuring the quality of teaching in the BLP, for example, by monitoring GATs' course
evaluations. Keeping track of classroom observation reports and course evalu-
ations has proven important for various reasons. Both observation reports and
especially course evaluations have been used by the BLP as criteria for the nomi-
nation of outstanding GATs for teaching awards and for the selection of GATs for
teaching positions in the department’s summer and study abroad programs (see
Ecke, 2013). In turn, keeping track of the names of teaching award recipients and
GATs who have been given special assignments has been useful in self-reports for
academic program evaluations.
Social media sites, such as Facebook and LinkedIn, have made it easier in
recent years to keep track of former BLP instructors' continued studies and job
placement. Our department, like many others, also maintains a Web page about
alumni. Keeping the page up to date has been a challenge, but it is certainly
worthwhile given the page’s importance, not only for showcasing alumni’s success
and advertising the program but also for periodic reflection about how well our
programs prepare students for real-life challenges and employment opportunities.
The fact that not all alumni report stellar careers and satisfying employment may
be reason enough for an honest and more critical reflection on how we prepare
our students for life beyond university study.
Conclusion
In this chapter, we discussed the use of student analytics and longitudinal data
for the continuous self-evaluation of language and culture programs. We defined
student analytics as the collection and communication of meaningful patterns in
data about students in our programs. Throughout the chapter, we attempted to
illustrate with examples from our program how potentially relevant data can be
collected and how the meaningful interpretation of data may contribute to pro-
gram evaluation and informed decision-making to improve the program. By doing
so, we by no means intended to claim that we have found the perfect solutions for
a continuous internal program evaluation. We continue to refine our assessment
instruments and look for meaningful ways to interpret and exploit these data for
program improvement. While different programs will have different needs for self-
evaluation and data collection, we hope that the four challenges outlined and ex-
emplified in this chapter may help to guide others in their efforts to use student
analytics and collect useful data about their programs over time.
We described the four challenges as follows: Challenge 1 concerns the long-
term tracking and monitoring of student enrollments and their meaningful in-
terpretation, taking into account both internal (departmental and institutional)
and external (regional and national) data. Challenge 2 involves the identifica-
tion of potential issues of the program (e.g., enrollment slumps) and subsequent
intervention. Challenge 3 calls for the establishment of student profiles that could
be relevant for program planning and design (e.g., students’ motives, language
background, and major areas of study). Finally, Challenge 4 pertains to the track-
ing of individual students’ special achievements and success in their career paths.
We made a case for the use of internal student surveys, complemented by
select internal and external data (e.g., on enrollments) as essential components of
a continuous, long-term, internally driven evaluation of a language program, and
suggested that implementation and continuous use of data-gathering procedures
would help make programs more attractive and effective, which in some cases
may contribute to a program’s survival in difficult times. Finally, we hope to have
shown that program administrators now have available the technological tools to
elicit, store, access, and assess valuable data about students who enroll in their
programs. Implementing such procedures can save time in the long term, open
up new opportunities for the data-based analysis and evaluation of programs, and
inform decision-making for the benefit of our students and programs.
References
American Council on the Teaching of Foreign Languages. (2010). Foreign language
enrollments in K-12 public schools: Are students prepared for a global society?
Alexandria, VA: Author.
Andress, R., James, C. J., Jurasek, B., Lalande II, J. F., Lovik, T. A., Lund, D., Stoyak,
D. P., Tatlock, L., & Wipf, J. A. (2002). Maintaining the momentum from high
school to college: Report and recommendations. Die Unterrichtspraxis/Teaching
German, 35, 1–14.
Ecke, P. (2010). Deutsch in den USA. In H.-J. Krumm, C. Fandrych, B. Hufeisen,
& C. Riemer (Eds.), Deutsch als Fremd- und Zweitsprache: Ein internationales
Handbuch (pp. 1833–1839). Berlin/New York: Mouton de Gruyter.
Ecke, P. (2011). The state of German in the United States: A statistical portrait and a
call for teachers. German as a Foreign Language, 12, 55–83.
Ecke, P. (2013). Designing and running a short-term study abroad program in
Germany: Guidance for new program directors. Neues Curriculum, 1–29.
Retrieved from http://www.neues-curriculum.org/papers/Ecke2013.pdf
Furman, N., Goldberg, D., & Lusin, N. (2010). Enrollments in languages other
than English in United States institutions of higher education, fall 2009. The
Modern Language Association of America. Retrieved from http://www.mla.org
/pdf/2009_enrollment_survey.pdf
Levine, G. (2011). Stability, crisis, and other reasons for optimism: University for-
eign language education in the United States. Arts and Humanities in Higher
Education, 10, 131–142.
Levine, G., Chavez, M., Crane, C., Melin, C., & Lovik, T. (2008). The language pro-
gram director in curricular and departmental reform: A response to the MLA Ad
Hoc Report. Profession 2008, 240–254.
McCarthy, J. A. (2010). Rear guard actions against an „impoverished delusion“:
Anmerkungen zur Situation der Auslandsgermanistik in den Americas.
Die Unterrichtspraxis/Teaching German, 43, 31–40.
Norris, J. M. (2006). The why (and how) of student learning outcomes assessment
in college FL education. Modern Language Journal, 90(4), 590–597.
Norris, J. M. (2009). Understanding and improving language education through pro-
gram evaluation: Introduction to the special issue. Language Teaching Research,
13(1), 7–13.
Norris, J. M., & Watanabe, Y. (2013). Program evaluation. In C. A. Chapelle
(Ed.), The encyclopedia of applied linguistics (pp. 1–6). Oxford: Blackwell.
doi:10.1002/9781405198431.wbeal0963
Office of Institutional Research and Planning Support. (2014a). Student data. The
University of Arizona. Retrieved from http://oirps.arizona.edu/StudentDemo-
graphics.asp
Office of Institutional Research and Planning Support. (2014b). The University of
Arizona fact book. Retrieved from http://factbook.arizona.edu/
Rhodes, N. C., & Pufahl, I. (2010). Foreign language teaching in U.S. schools: Results
of a national survey. Washington, DC: Center for Applied Linguistics.
Rifkin, B. (2000). Mentoring foreign language TAs, lecturers, and adjunct faculty.
Boston, MA: Thomson Heinle.
Sinka, M. M., & Zachau, R. (2005). An articulation study of post-secondary German
students: Results, implications, and suggestions. In C. M. Barrette & K. Paesani
(Eds.), Language program articulation: Developing a theoretical foundation
(pp. 94–108). Boston, MA: Thomson Heinle.
Tatlock, L. (2010). USA: German in the changing landscape of postsecondary educa-
tion. Die Unterrichtspraxis / Teaching German, 43, 11–21.
Walz, J. C. (Ed.). (1992). Development and supervision of teaching assistants in for-
eign languages. Boston, MA: Heinle & Heinle.
Appendix
Language Background Questionnaire
Basic Language Program, Department of German Studies
Please complete the following questionnaire. The information will help us in planning and teaching the course.
German course section you are taking: GER ___________________
Your name: ______________ ID: __________ Age: __________
Email: ___________________
Major: ___________________ Minor: ___________________
Please circle: Freshman Sophomore Junior Senior Graduate Other
Prior Knowledge of German:
I have some prior experience with the German language. Y / N
(If yes, please specify below)
I began learning German at age _______
I took German in high school. Y / N
If yes, how many years? (please circle) 1 year 2 years 3 years 4+ years
Name of high school: _________________________
I have taken German language courses at the university. Y / N
If yes, which levels? (please circle) 101 102 201 202 301 302 other
Name of university: __________________________
I have learned some German from German-speaking family members. Y / N
I used to live in a German-speaking country. Y / N
If yes, where? (please circle) Germany Austria Switzerland Liechtenstein
I have studied abroad in a German-speaking country. Y / N
If yes, what type of program? (please circle)
Fulbright Summer study abroad Semester abroad Academic year abroad
Please rate your current German language abilities.
0 = very little to none, 1 = beginner, 2 = intermediate, 3 = advanced, 4 = native
Speaking German 0 1 2 3 4
Writing German 0 1 2 3 4
Listening comprehension 0 1 2 3 4
Reading comprehension 0 1 2 3 4
Other languages:
Please name each language and circle your proficiency: 1 = beginner, 4 = native
Native language: ____________________ 1 2 3 4
Native language: ____________________ 1 2 3 4
Foreign language 1: ____________________ 1 2 3 4
Foreign language 2: ____________________ 1 2 3 4
Foreign language 3: ____________________ 1 2 3 4
Why have you chosen to study German? Please rate each of the following reasons on the scale below.
1 = not very important, 4 = very important
Seems like fun 1 2 3 4
I like languages 1 2 3 4
Previous visits to Germany 1 2 3 4
Possibility of career benefits 1 2 3 4
My parents/grandparents speak German 1 2 3 4
Reputation of German program 1 2 3 4
Friends took German 1 2 3 4
Brother or sister took German 1 2 3 4
Recommendation of school counselor 1 2 3 4
Good progress in German 1 2 3 4
Interesting classes 1 2 3 4
Comfortable German class 1 2 3 4
Love of German 1 2 3 4
Opportunities to live/work in Germany 1 2 3 4
Relevance to my major 1 2 3 4
Relevance to other academic subjects 1 2 3 4
Extracurricular activities in German 1 2 3 4
German-related scholarship 1 2 3 4
Name other motives (if applicable):
__________________________________ 1 2 3 4
__________________________________ 1 2 3 4
__________________________________ 1 2 3 4
What do you hope to learn in the course you are taking now?
___________________________________________________________________
___________________________________________________________________
Thank you for completing this questionnaire.
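The motivation items above lend themselves to the same kind of routine tabulation. The following sketch, again with hypothetical column names (one column per motive, coded on the questionnaire's 1 to 4 scale), ranks a selection of motives by their mean importance across all respondents.

```python
# Minimal sketch: ranking enrollment motives by mean importance rating.
# Column names are hypothetical placeholders for the questionnaire's motive items.
import pandas as pd

motive_columns = [
    "seems_like_fun",
    "like_languages",
    "career_benefits",
    "family_speaks_german",
    "relevance_to_major",
]

responses = pd.read_csv("german_questionnaire_fall.csv")

# Ratings follow the questionnaire's scale: 1 = not very important, 4 = very important.
mean_importance = responses[motive_columns].mean().sort_values(ascending=False)
print(mean_importance.round(2))
```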