Companion Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK’18)
Towards User-Centred Learning Analytics
March 5–9, 2018, Sydney, NSW, Australia
https://lak18.solaresearch.org
Cover photo: Hai Linh Truong, flickr.com
This work is published under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Australia Licence. Under this Licence you are free to:

Share: copy and redistribute the material in any medium or format

The licensor cannot revoke these freedoms as long as you follow the license terms.

Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.

Non-Commercial: You may not use the material for commercial purposes.

NoDerivatives: If you remix, transform, or build upon the material, you may not distribute the modified material.

No additional restrictions: You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
Editors:
Abelardo Pardo, The University of Sydney
Kathryn Bartimote, The University of Sydney
Grace Lynch, Society for Learning Analytics Research
Simon Buckingham Shum, University of Technology Sydney, Australia
Rebecca Ferguson, The Open University, UK
Agathe Merceron, Beuth University of Applied Sciences, Germany
Xavier Ochoa, Escuela Superior Politécnica del Litoral, Ecuador
Cite as:
Pardo, A., Bartimote, K., Lynch, G., Buckingham Shum, S., Ferguson, R., Merceron, A., & Ochoa,
X. (Eds.). (2018). Companion Proceedings of the 8th International Conference on Learning
Analytics and Knowledge. Sydney, Australia: Society for Learning Analytics Research.
Implementation of a Student Learning Analytics Fellows Program
Author(s): George Rehrey, Dennis Groth, Stefano Fiorini, Carol Hostetter and Linda Shepard
Indiana University
grehrey@indiana.edu
ABSTRACT: Post-secondary institutions are rapidly adopting Learning Analytics as a means for enhancing student success, using a variety of implementation strategies such as small-scale initiatives, large-scale initiatives, and vended products. In this paper, we discuss the creation and evolution of our novel Student Learning Analytics Fellows (SLAF) program, comprised of faculty and staff who conduct scholarly research about teaching, learning, and student success. This approach directly addresses known barriers to successful implementation, largely dealing with culture management and sustainability. Specifically, we set the conditions for catalyzed institutional change by engaging faculty in evidence-based inquiry, situated with like-minded scholars and embedded within a broader community of external partners who also support this work. This approach bridges the gap between bottom-up support for faculty concerns about student learning in courses and top-down administrative initiatives of the campus, such as the strategic plan. We describe the foundations of this implementation strategy and the SLAF program itself, summarize the areas of inquiry of our participating Fellows, present initial findings from self-reports from the Fellow community, and consider future directions, including plans for evaluating the LA research and the broader impacts of this implementation strategy.
Keywords: Institutional Learning Analytics; change management; faculty engagement;
communities of transformation; student success; learning analytics fellows program
1 INTRODUCTION
Post-secondary institutions are rapidly adopting Learning Analytics (LA) to enhance student retention
and graduation rates (Treaster, 2017), using a variety of approaches that show differing levels of
success. Institutions are implementing both in-house early warning systems (Lonn et al., 2012; Arnold & Pistilli, 2012) and large-scale vended applications (such as Loud Sight, Educational Advisory Board, or Civitas), and yet they share common barriers to successful implementation: cultural barriers, institutional commitment, policy development, and resistance to change (Bichsel, 2012; Ferguson et al., 2014; Macfadyen et al., 2014). No approach appears exempt from the challenges of adoption, inasmuch as “Institutional implementation of learning analytics calls for thoughtful management of cultural change” (Macfadyen et al., 2017).
1.1 Foundation/Rationale
In this paper, we discuss the creation and evolution of our Student Learning Analytics Fellows (SLAF)
program, comprised of faculty and staff who are using LA to conduct scholarly research about teaching,
learning, and student success. Our main premise is that a change in faculty understanding of their
students through engaged research and participation in LA development can lead to a change in
institutional culture about student success. We also believe that joining a networked community of
like-minded scholars provides a unique opportunity to catalyze institutional change at the course,
curricular, program and institutional levels. The introduction of the SLAF program is supported by our
new Center for Learning Analytics and Student Success (CLASS) and builds upon a larger community
of faculty-driven work at our Center for Innovative Teaching and Learning (CITL). This includes two successful, well-established programs: our Faculty Learning Communities (FLC) and Scholarship of Teaching and Learning (SoTL) programs. Both programs acknowledge that faculty engagement is essential to the adoption of new practices and that, when successful, such engagement can lead to a change in the teaching culture at the departmental level (Austin, 2011). In the past few years, we have also come to understand the positive effects of having faculty collaborate within a larger Community of Transformation (Kezar & Gehrke, 2015), which has proven an effective method for promoting the implementation of new models and, most importantly, sustaining cultural change (Fairweather, 2008; Henderson et al., 2011). Thus, external partnerships are an integral part of our approach as we work with the Bay View Alliance and other partner institutions to invest not only in LA tools, but also in the people and communities that will use them (Bichsel, 2012).
As recommended by Bichsel (2012), our SLAF program aligns well with the strategic plan and
objectives of the campus. Those objectives include (but are not limited to): supporting retention and
graduation of students, developing best practices for recruiting and retaining diverse students,
designing evidence-based curriculum, and engaging faculty in learning analytics research. This
approach bridges a gap between bottom-up support for faculty concerns about student learning in
courses and in the curriculum and the top-down administrative initiatives outlined in the university
strategic plan, facilitating sustained institutional change at the course, program, and institutional levels: change that 1) embraces evidence-based decision-making, 2) demonstrates an increase in faculty participation in inquiry and in the development of resources to support the use of learning analytics, 3) establishes sustainable faculty-led oversight, including implementation of recommended activities, and 4) instills ownership for student success through the curriculum.
We now describe the Student Learning Analytics Fellows (SLAF) program at our institution, provide a
summary of the faculty research to date, and the internal and external supports for this work. We also
describe our collaborations with other institutions that are also adopting this approach (see LAK 2017 workshop, Macfadyen et al., 2017) and the strengths of these emerging communities. We also
summarize some of the initial evidence gathered to evaluate the success of the SLAF program and the
role of these efforts within the emerging field of LA (McCoy & Shih, 2016). Given the goals of the
program around institutional change, we describe future plans for the evaluation and describe the
challenges that remain. Through this reflection on our SLAF program, we hope to understand the broader impact of this work across our campus and enable opportunities for continuous improvement.
2 SLAF PROGRAM
The SLAF program engages faculty in the scholarship of student success. An annual call for proposals
(CFP) and a campus event to explain the goals set the stage for this program. Faculty Fellows submit a proposal outlining their project's goals and intended outcomes, and Fellows with accepted proposals attend a kick-off event prior to meeting with professional Institutional Research (IR) staff to discuss their projects and to develop a research strategy. All aspects of the work are discussed with the IR staff, including the availability of data, how the data will be analyzed, and the skill sets of the researcher.
For some, this initial conversation is the beginning of a close partnership with the IR office, while other
Fellows opt to work independently, only returning to IR with specific questions or data needs.
The data required for Fellows includes individual student data about academic progress (e.g., degrees,
majors, courses, grades), academic preparation (e.g., high school GPA, SAT/ACT scores), student life
(e.g., residential programs, student activities), student financial status, and student demographic
information (e.g., gender, ethnicity, residency). In general, the provisioned longitudinal data sets and accompanying data dictionaries are highly structured and purposeful.
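To make the shape of these data sets concrete, here is a minimal Python sketch of what a single provisioned record could look like. The field names are hypothetical stand-ins mirroring the categories above, not the actual variables in our data dictionaries.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EnrollmentRecord:
    """One row of a hypothetical longitudinal data set: a single
    student-course enrollment joined with the student attributes
    described above. All field names are illustrative only."""
    student_id: str            # de-identified key linking terms together
    term: str                  # e.g., "Fall 2016"
    course_id: str             # catalog identifier for the course
    grade: Optional[float]     # final grade on a 4.0 scale; None if in progress
    major: str                 # declared major at the time of enrollment
    hs_gpa: Optional[float]    # academic preparation: high school GPA
    sat_total: Optional[int]   # academic preparation: SAT composite
    lives_on_campus: bool      # student life: residential program flag
    receives_aid: bool         # student financial status indicator
    gender: str                # demographic fields as provisioned
    ethnicity: str
    residency: str             # e.g., in-state, out-of-state, international
```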
The process for provisioning data has been considered a significant obstacle that higher education
faces in this emerging field (Dede et al., 2016). For this work, our approach was two-pronged: 1)
linking the research proposals to the institutional mission including having administrative support and
2) administratively establishing coordination among the relevant campus compliance offices (IRB and
Data Stewards). All provisioning fell within the standard protocols of each relevant office. At this
scale, the process was manageable; however, with more Fellows, this may need to be revisited. For a
fuller review of the provisioning processes and the implementation of the program, see Rehrey et al.
(2018).
2.1 Summary of SLAF Projects
Now in its third year, the program has engaged 28 faculty in 29 research projects, with 10 of those faculty returning for a second or third year to continue their research. During the first two years alone, 24 participants, representing 11 programs, embarked upon 19 different projects. Collectively, they investigated 3.2 million student enrollment records corresponding to the career progression and characteristics of 150,000 students. The figures below provide an initial analysis of the program to date, describing the distribution of Fellows as categorized by their academic fields (Figure 1) and the student factors that the Fellows investigated as part of their research (Figure 2).
Figure 1: Academic Fields as a Percentage of 29 Fellows’ Projects
A description of whether the research addresses student success at the course, curriculum/program,
or university level is included as well, keeping in mind that a single project may address more than
one level at a time (Figure 2).
In general, faculty projects involved inquiry within four broad categories of factors that influence
student success. In many cases, the individual research projects are studying the effect of multiple
factors, and at multiple levels (course, program and university). For the purpose of this initial analysis
of the Fellows program, the factors were categorized in the following way: 1) Student Demographics:
including student characteristics such as ethnicity, race, and class standing, 2) Student Preparation:
such as transfer credits, prerequisites, curriculum pathways, pre-college courses, and remedial
educational programs, 3) Student Performance: as understood by GPA, persistence, retention,
engagement indicators and graduation rate, 4) Student Choice: as understood by major selection,
inflection points, and pathways toward graduation.
Figure 2: Fellows’ research questions at course, program and institutional levels
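As an illustration of this coding scheme, the sketch below shows one way a project could be tagged against the four factor categories and three levels. The indicator sets and the example project are assumptions for illustration, not the actual instrument used to produce Figures 1 and 2.

```python
# The four factor categories used in this initial analysis, with
# representative indicators drawn from the descriptions above.
FACTOR_CATEGORIES = {
    "Student Demographics": {"ethnicity", "race", "class standing"},
    "Student Preparation": {"transfer credits", "prerequisites",
                            "curriculum pathways", "pre-college courses",
                            "remedial programs"},
    "Student Performance": {"GPA", "persistence", "retention",
                            "engagement", "graduation rate"},
    "Student Choice": {"major selection", "inflection points",
                       "pathways toward graduation"},
}

LEVELS = ("course", "program", "university")

def categorize(project_indicators, project_levels):
    """Return the factor categories and levels a project touches.
    A single project may fall into several categories and levels."""
    cats = {name for name, indicators in FACTOR_CATEGORIES.items()
            if indicators & set(project_indicators)}
    levels = set(project_levels) & set(LEVELS)
    return cats, levels

# Hypothetical example: a project relating prerequisites to retention
# at both the course and program levels would be counted under both
# Student Preparation and Student Performance.
print(categorize({"prerequisites", "retention"}, {"course", "program"}))
```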
The student factors that the Fellows are researching are fairly evenly distributed. Not surprisingly at
this point in the program, faculty research tends to concentrate on the course and program levels.
Over time and as research questions become more complex, this should influence the levels being
addressed, with more projects making connections to data-driven institutional decision making, along
with increased rates of persistence, retention and graduation at our institution.
2.2 Survey of the SLAF Community: How Did the Program Work?
The program is intentionally designed to: 1) encourage faculty to generate research questions that
make actionable and pragmatic use of LA, 2) provide large, robust data sets for SoTL-type research, 3)
increase the general understanding and use of LA data, 4) encourage data-driven decision-making at
the course, curriculum, and program levels and, most importantly, 5) change faculty perceptions
about who is responsible for student success. To understand the effectiveness of the program, Likert-
scale questions with open-ended comments for each question were designed for three audiences:
Faculty Fellows, Sponsors of the Faculty Fellows (those who wrote a letter of support submitted with
their initial proposal) and their Department/School Head. All fifteen of the Faculty Fellows responded
(100% response rate), four out of nine Sponsors responded (given the low number, they are not
reported here) and twelve out of eighteen (67% response rate) school/department heads responded.
The survey questions were approved by the university’s Institutional Review Board. Surveys were sent
by email with a link to Qualtrics, with a total of three requests for completion. The first research
question addressed is:
“Has LA data usage influenced programs and initiatives, and if so, how?”
The majority of the Fellows (73%) responded that working with LA data has increased the chance their
departments would use data to inform decisions and that there were more conversations now about
student success (60%). All Fellows had shared their projects with others in their department but were
not certain that their departments have made or will make administrative decisions on the basis of their own LA projects (only 47% answered in the “somewhat agree” to “strongly agree” range). The
administrators agreed that having had a participant in the LA program made their department more
likely to use data to inform their decisions in the future. The next research question addressed is:
“Has participation in the SLAF program encouraged you to consider student success beyond your individual course or program?”
An interesting aspect of this question is possible selection bias wherein faculty applying for the Fellows
program can be assumed to have an interest in student success. Nearly all Fellows (93%) reported that
before the Fellows program, they saw student success as a part of their role as a faculty member and
all of the Fellows (100%) saw student success as a part of their role as a faculty member after
participating in the program. We next considered whether the Fellows program helped faculty members perceive the importance of student success beyond their own classrooms; 80% agreed or strongly agreed. One Fellow
made the following comment:
“My participation in the Fellows program completely transformed me in this regard and
helped turned me into a bit of a zealot for student success.”
Administrators responded in a similar fashion but comments suggested that they already thought of
student success as important, prior to the SLAF program. The next research question addressed is:
“Does using LA data to conduct a research project help Fellows see the value of big data as a
decision-making tool for academic decisions?”
Again, selection bias is an element to consider, so we asked the before and after questions. The
majority (80%) of Fellows saw the value of using LA data to make academic decisions before the
program and all of them (100%) saw the value after. One Fellow explained:
“As the data analytics becomes more robust and easier to use, I see how data can increase
teaching efficiency by providing detailed and summary feedback on content, assessment, and
student learning.”
Finally, we were interested in the role of community:
“Does having a sense of being a part of a university community that had a mission for student
success matter?”
While the majority (60%) were aware of other departments using data to inform their decisions, we
would like to improve this figure, especially given that the majority (87%) agreed that being a part of
the Fellows program helped them feel a part of a community with a mission of student success, and
were now more interested in engaging in aspects of the campus community that are concerned with
student success. One Fellow shows enthusiasm for the community by reflecting:
“I found the interactions with the other Fellows and the team at (IR) to be extremely
invigorating and exciting. I appreciated knowing that I am doing my work in a community of
like-minded individuals in a variety of disciplines.”
Conversely, another Fellow who chose to work independently remarked:
“I believe that my process and my outcomes would have been greatly improved if I had worked
with a team on this project. The ability to discuss and validate models and techniques with
colleagues who know and understand this course would have been a great advantage.”
Given that the SLAF program is relatively new on our campus, our knowledge of the impact is limited:
“What administrative changes have been made as a result of the program?”
One Fellow did write:
“Our discoveries about the importance of student motivation lead us to implement a few
changes in our class curriculum.”
In the actual SLAF project reports, Fellows identified many suggestions as to what their departments’ administrations could do to improve student success; now that more time has passed, follow-up on these suggestions is needed. For example, one project found that 4th-year students who take a low-level course do not perform as well as the lower-level students in the same course; this suggests that the department could promote the course to lower-level students and provide tutoring services to upper-level students. Findings from another project suggest that students on academic probation who take a study skills course improve their chances for retention and graduation; however, there are not enough sections for all students to take it, so a requirement is not enforced. This finding suggests allocating resources for additional sections.
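The first of these findings rests on a comparison along the following lines. This is a minimal pandas sketch assuming hypothetical column names (course_id, class_standing, grade) in the provisioned data described in Section 2, not the Fellow's actual analysis code.

```python
import pandas as pd

def grade_gap_by_standing(enrollments: pd.DataFrame, course_id: str) -> pd.Series:
    """Mean final grade in one course, grouped by class standing.

    `enrollments` is assumed to have (hypothetical) columns
    'course_id', 'class_standing' (1 through 4), and 'grade'
    (4.0 scale), one row per student-course enrollment.
    """
    in_course = enrollments[enrollments["course_id"] == course_id]
    return in_course.groupby("class_standing")["grade"].mean()

# A gap such as mean(grade | standing == 4) < mean(grade | standing <= 2)
# is the kind of signal behind the fourth-year finding reported above.
```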
Other comments propose that changes may take place in the future. One Fellow says:
“Frankly, I didn't know how to do any of this before the LA grant. We had questions, but did
not know how to get the answers. After the LA work, we still have a ton of questions, but we
have some confidence and relationships with those who can help.”
Another Fellow stated that colleagues are having more informed conversations around student
success.
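For readers curious how the percentages in this section are tallied, the sketch below shows one plausible computation. The five-point coding and the sample responses are illustrative assumptions, not the actual survey instrument or data.

```python
def percent_agree(responses):
    """Share of Likert responses in the 'somewhat agree' to
    'strongly agree' range, assuming responses coded 1 (strongly
    disagree) through 5 (strongly agree)."""
    agree = sum(1 for r in responses if r >= 4)
    return round(100 * agree / len(responses))

# Illustrative only: 11 of 15 Fellows agreeing yields the 73%
# reported for the first research question above.
print(percent_agree([5, 5, 4, 4, 4, 4, 4, 4, 4, 4, 4, 3, 3, 2, 2]))  # -> 73
```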
2.3 SLAF Community of Transformation
The broader impacts of this work are unfolding in purposeful and unanticipated ways. The
connections to external communities are essential components of our implementation strategy. Four
institutions within the Bay View Alliance (BVA) have become partners for this work and are
implementing similar programs at their institutions (Macfadyen et al., 2017). Locally, each campus partner forms a Community of Practice, while the broader cross-institutional communities (Communities of Transformation) are expected to contribute to sustained change on our campus (Kezar & Gehrke,
2015). Since these partnerships are newly formed, the outcomes of these communities are still
unfolding. Another recent development on our campus is the formation of the Center for Learning
Analytics and Student Success (CLASS), charged with furthering the campus' commitment to the
scholarship of teaching and learning in conjunction with cutting-edge research in learning analytics to
support student success. CLASS will bring together communities, including our external communities,
engaged in LA work to consider knowledge gained, innovations and adoption of strategies. An
unanticipated outcome of the work is an organically formed community of Fellows who are requesting
funding for the formation of an Educational Data Science program, a new interdisciplinary field of
study that would advance this type of work on our campus and share knowledge more broadly with
relevant disciplinary communities.
3 CONCLUSION
The complexity of evaluating the impact of communities of practice (within our institution) as well as
communities of transformation (extending to partnerships) is a recognized challenge. Communities are complex systems: they take considerable time (seven years) to mature (Kezar & Gehrke, 2015), and concepts such as sustainability and transformation are ill-defined. Despite these challenges, we will continue to develop strategies for the evaluation of our program and focus attention on continuous improvement. Our institutional data on retention and graduation can provide initial benchmarks of our success. Knowledge gained from self-reports from our Fellows community
will continue to provide valuable feedback and we anticipate that reflections from our broader
community of partners will provide valuable information as well. In addition, we have gained insights
from researchers (McCoy & Shih, 2016) evaluating our work directly, as a case study. As part of this
presentation, we look forward to exploring the topic of evaluation with the community at LAK 2018.
As we consider the rich faculty communities (FLC and SoTL) that provided the foundation for the SLAF program, we recognize that these influences move in two directions. The SLAF program will continue to be influenced by these communities, and these communities will be influenced by LA and the SLAF
community. We anticipate growth for all, with greater capacity to improve teaching, learning, and
student success at our institution and within higher education. This was recognized when the SLAF
community on our campus came together to present final projects and reflect on our first year. Our
esteemed SoTL scholar Craig Nelson stated at the Student Learning Analytics Showcase (November 19, 2015):
“It’s been clear for a long time that every class is an experiment and one for which we
traditionally throw the data away as soon as we gather it. And what’s clear here is there’s also
an immense amount of external data that has not been routinely brought to bear in any
thinking. And that the learning analytics is going to make it easier to look at the external data
and in the process motivate people to look a lot more systematically at the internal data. So,
I am very cheered by all of this.”
The boundaries of LA are expansive, and the potential for LA to enhance student success remains “the most dramatic factor shaping the future of higher education” (Siemens & Long, 2011).
REFERENCES
Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12) (pp. 2–5). New York, NY, USA: ACM.
Austin, A. E. (2011). Promoting evidence-based change in undergraduate science education. Paper presented at the Fourth Committee Meeting on Status, Contributions, and Future Directions of Discipline-Based Education Research.
Bichsel, J. (2012). Analytics in higher education: Benefits, barriers, progress and recommendations. Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://www.educause.edu/ecar
Dalrymple, M. S., L. (2016). Engaging faculty in the scholarship of student success. Paper presented at the Association for Institutional Research Forum, New Orleans, LA.
Dede, C., Ho, A., & Mitros, P. (2016). Big data analysis in higher education: Promises and pitfalls. EDUCAUSE Review, September–October 2016, 23–34.
Fairweather, J. (2008). Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education. Washington, DC: Board on Science Education, National Research Council.
Felten, P. J. (2013). Principles of good practice in SoTL. Teaching and Learning Inquiry: The ISSOTL Journal, 1(1), 121–125.
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: Overcoming the barriers to large-scale adoption. Paper presented at the Fourth International Conference on Learning Analytics and Knowledge, Indianapolis, IN, USA.
Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. doi:10.1002/tea.20439
Hutchings, P., Huber, M. T., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. Wiley.
Kezar, A., & Gehrke, S. (2015). Communities of transformation and their work scaling STEM reform. Los Angeles, CA: Pullias Center for Higher Education, Rossier School of Education, University of Southern California.
Lonn, S., et al. (2012). Bridging the gap from knowledge to action: Putting analytics in the hands of academic advisors. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12) (pp. 3–4). Vancouver, BC.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9(2), 17–28.
Macfadyen, L. P., S., M., Rehrey, G., Shepard, L., Greer, J., Groth, D., & Molinaro, M. (2017). Developing institutional learning analytics “communities of transformation” to support student success. Paper presented at the Learning Analytics and Knowledge Conference, Vancouver, BC. https://doi.org/10.1145/3027385.3029426
McCoy, C., & Shih, P. C. (2016). Teachers as producers of data analytics: A case study of a teacher-focused educational data science program. Journal of Learning Analytics, 3(3), 193–214.
Rehrey, G., Groth, D., Hostetter, C., & Shepard, L. (2018). The scholarship of teaching, learning, and student success: Big data and the landscape of new opportunities. In Conducting & applying the scholarship of teaching and learning beyond the individual classroom level. Bloomington, IN: Indiana University Press.
Rehrey, G., Siering, G., & Hostetter, C. (2014). SoTL principles and program collaboration in the age of integration. International Journal for the Scholarship of Teaching and Learning, 8(1), 1–11. https://doi.org/10.20429/ijsotl.2014.080102
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 31–40.
Treaster, J. B. (2017, February 2). Will you graduate? Ask big data. The New York Times. Retrieved from https://www.nytimes.com/2017/02/02/education/edlife/will-you-graduate-ask-big-data.html?_r=1