To cite this paper: Dwayi, VM (2014). An Evaluation Plan for a Higher Education
Leadership and Management Module on Academic Monitoring and Support
Strategies. A Short Learning Program Project submitted for accreditation by the
Centre for Higher Education Research, Teaching and Learning. Rhodes University.
Grahamstown/Makhanda
An Evaluation Plan for a Higher Education Leadership and Management
Module on Academic Monitoring and Support Strategies
Prepared for:
Centre for Higher Education Research, Teaching and Learning
Rhodes University
Prepared by:
VALINDAWO VALILE M. DWAYI
Walter Sisulu University
August 2014
Table of Contents
Abstract
SECTION 1: INTRODUCTION AND BACKGROUND
1.1. Rationale for the Short Course
1.2. Conceptual Framework
1.3. A Brief Description of Ontological and Epistemological Assumptions
1.4. Monitoring and Evaluation
SECTION 2: THE THEORETICAL PARADIGMS AFFECTING THE EVALUATION PLAN
2.1. The Contestations of the Policing and Developmental Models
2.2. The Realist Approach to Evaluation
2.3. Drawing the Links with the Teaching Philosophy
2.4. Ethical Considerations of the Evaluation Plan
2.5. The Contextual Analysis: National and Institutional Quality Assurance Issues
SECTION 3: DATA GENERATION APPROACH AND ANALYSIS
3.1. Descriptive Analysis of Different Evaluation Approaches and Instruments
3.2. Data from Other Sources
3.3. Dealing with Student Feedback
3.4. The Value of the Peer Review Process
SECTION 4: EVALUATION AS REFLECTIVE PRACTICE
SECTION 5: EVALUATION FOR IMPROVEMENT
5.1. Evaluation and Curriculum Design
SECTION 6: CONCLUSION
APPENDIX A: DESCRIPTION OF ACADEMIC MONITORING AND SUPPORT STRATEGIES
APPENDIX B: CHE Institutional Audit Recommendation to WSU, April 2011
APPENDIX C: Student Feedback on the Evaluation Tool about the Academic Monitoring and Support Strategy about Evaluation of Teaching
REFERENCES
Abstract
This article reports on the main design considerations for a short learning programme/course that focuses on academic monitoring and support (AMS). AMS ought to be the most significant element for those responsible for quality enhancement of the academic project. In this article, I argue that the default position in AMS conceptualisation is to adopt the policing model instead of what could be a developmental approach. The policing model draws on the pragmatic approach and its black-box logic. The alternative, based on the value of the public and common good as espoused in university education, ought to rest on elaborative logic and socially embedded approaches attentive to the context of application. The article suggests that central to a monitoring and evaluation function, and its related dimensions of performance and assessment, ought therefore to be the ability to think through the philosophical statement during course/programme design. The article was prepared as an assessment task to meet the requirements of a short learning programme/course on evaluation of teaching/courses (EoT/C) offered by the Centre for Higher Education Research, Teaching and Learning (CHERTL), Rhodes University, South Africa.
SECTION 1: INTRODUCTION AND BACKGROUND
The planned evaluation of teaching/courses practices will be located in the planned short course, which seeks to build capacity in academic monitoring and support at Walter Sisulu University (WSU). Following the Administration phase at WSU, and as a result of the current discussions about the institutional resource mobilisation and allocation strategy, issues of monitoring, evaluation and reporting take centre stage in ensuring organisational efficiency. The planned capacity-building programme/module, of which this evaluation task/plan is a part, is modelled on similar programmes, for example the Rhodes University CHERTL PGDHET module for academic managers and the HESA Higher Education Leadership and Management programme at the University of the Witwatersrand. Both programmes are designed at NQF Level 8 with 30 credits. The table below is a summary of the planned Higher Education Leadership and Management (HELM) Academic Monitoring and Support (AMS) module, which highlights the monitoring levels as the main elements of an integrated quality management framework, the impact levels at which AMS can be measured, the related strategies, and the required HELM competencies for each level. The listed strategies, as the basic components of the HELM AMS module, are briefly explained in the appendix section in terms of the specific indicators for their implementation as they relate to course delivery, programme management and institutional systems/policies. The recommended procedures/strategies are drawn from my experiences of developing and coordinating academic development programmes and services at WSU.
Table of Academic Monitoring and Support Strategies According to Integrated Quality Management Levels and the Required HELM Competencies

| Monitoring Level | Impact Level | Academic Monitoring and Support Strategy | Required HELM AMS Competencies |
|---|---|---|---|
| Course/Module | Course/Module Delivery (staff competencies and student performance data) | Binding Study Advice; Tracking and Monitoring of Student Performance | Reflective data management; Developmental research |
| Programme | Programme (Design, Implementation, Monitoring and Improvement) | Assessment for Student Learning; Evaluation of Teaching/Courses | Reflective data management; Developmental research |
| Institutional | Integrated Quality Management Systems/Policies | Programme Reviews | Appropriate strategies and mechanisms for integrated academic development |
1.1. Rationale for the Short Course
A typical case from educational practice motivated me to try to understand the problematic nature of HELM roles in, and their practices about, academic development. The case involved how first-year students from the WSU School of Engineering, where the study is located, were excluded from their academic programme. From a cohort of 4,615 students who registered in four engineering programmes in 2012, 328 (7% of the designated group) did not accumulate enough credits to write their first examination in June 2013 and were therefore excluded from their semester examinations. A further 826 students (19% of those who remained eligible to write) could still not make it even after the examinations, bringing the total lost during this semester programme to 1,154 students (25% of the cohort). Further scrutiny of these student cases, through analysis of their possible participation in academic development together with their student appeals records, revealed a lack of evidence with regard to how such students were monitored and supported during their learning (WSU, 2013). This was the case despite HEQC requirements on programme accreditation, with specific reference to academic development for student success, and student retention and throughput rates (HEQC, 2004).
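The cohort figures above can be checked with simple arithmetic. The sketch below assumes, as the figures suggest, that the 7% and 25% are computed against the full cohort while the 19% is computed against the students who remained eligible to write; this denominator is an interpretation, not stated in the source.

```python
# Illustrative check of the WSU School of Engineering cohort figures
# (2012 intake, June 2013 examinations), as reported in the text.
cohort = 4615            # students registered in four engineering programmes
barred = 328             # excluded before the June 2013 examinations
failed_after_exam = 826  # excluded after the examinations

# Assumption: the 19% uses the 4,287 students who remained eligible
# to write as its denominator, not the full cohort of 4,615.
eligible_to_write = cohort - barred      # 4287
total_lost = barred + failed_after_exam  # 1154

print(round(barred / cohort * 100))                        # 7  (% of full cohort)
print(round(failed_after_exam / eligible_to_write * 100))  # 19 (% of those who wrote)
print(round(total_lost / cohort * 100))                    # 25 (% of full cohort)
```

Read this way, the three percentages reported in the case are mutually consistent.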
While there are many factors that affect student success (Scott, Yeld and Hendry, 2007; Berg and Hoffman, 2005; CHE, 2010), social responsiveness implies the capacity of a university to identify and manage those factors that the institution can do something about. Building on Tinto's (2012) assertion that access without support is no opportunity, one could argue, for instance, that admitting students without supporting them constitutes a form of irresponsible judgement, especially for those universities located in developing communities. The WSU case of 1,154 students (25% of the cohort registered in one semester) is an example of how the higher education system may be failing some of its students. The fact that students could reach an exclusion point with no evidence of the kinds of systems that assist them during their learning journey challenges the common-sense view that students are to blame for their performance, the notion that attributes failure to the victim. Furthermore, this case surfaces critical questions about the whole notion of university readiness for students-at-risk in general (Bozalek & Boughey, 2012; Dhunpath & Vital, 2012). It thus brings to the fore the need to reconceptualise academic excellence as part of a social justice project.
1.2. Conceptual Framework
The ideal HELM practices constitute the monitoring phase in a cyclic process of programme
planning, implementation, monitoring and improvement (HEQC, 2012) at different levels of
course/module, programme and department/faculty. Monitoring itself refers to the continuous process of examining the delivery of programme outputs to intended beneficiaries. It is carried out during the execution of a programme with the intention of immediately correcting any deviation from operational objectives, which is the improvement phase. Structures and mechanisms for monitoring include institutional policies, systems, strategies and resources used to monitor, evaluate and act on quality issues, which has implications for the leadership and management role in a university/organisation context. By means of quality enhancement processes, especially the improvement phase, students' learning skills, curriculum gaps and staff competencies become the centre of attention. Such a notion of quality enhancement mechanisms and structures therefore surfaces two important issues that inform the rationale for such a capacity-building programme:
- How HELM capitalises on the available institutional opportunities by taking deliberate efforts for integrated academic development (Volbrecht & Boughey, 2005).
- The kinds of departmental strategies, systems and processes that HELM apply in order to ensure that the current deficiencies of student academic development, curriculum development and staff capacity, as implied by the case of excluded students from the School of Engineering, are addressed.
Listed below are the proposed programme goal and its objectives.

Goal: To develop institutional capacity to manage and share best practices about the systems and processes of academic monitoring and support for improved student academic success.

Objectives:
- To provide university middle management with a deeper understanding of academic monitoring and support as the main feature of integrated quality management systems and processes.
- To empower leadership teams, through reflective data management processes and developmental research, with relevant competencies for managing student entry, progression and completion as part of a university value chain.
- To capacitate university management and leadership to develop appropriate strategies and mechanisms for the effective management of the culture of teaching and learning by means of integrated academic development.
- To equip selected middle managers with high-level skills, responsive knowledge and appropriate attributes as the future leaders of higher education.
The bigger project, of which this programme is a part, focusses on Higher Education Leadership and Management and Academic Monitoring and Support practices in a selected university case in South Africa. The project is motivated by more than ten years of leading developmental work in the case of a historically black university in South Africa, as initially profiled elsewhere (Dwayi, 2003; Dwayi, 2013). The research goal for the doctoral project is therefore to explore how higher education leaders and managers promote and monitor student academic success as reflexive deliberations and transformative praxis in the context of a transforming higher education system. To achieve this goal, the project revolves around the following two objectives:
- Firstly, to specifically explore and analyse how higher education leaders and managers respond to academic development initiatives against the enduring dysfunctional culture of learning and teaching in higher education at one South African university;
- Secondly, to understand and explain the challenges and opportunities that the integration of academic development for improved student academic success poses and presents to higher education leaders and managers at one South African university.
1.3. A Brief Description of Ontological and Epistemological Assumptions
The project adopts critical realism as the ontological view whereby reality is understood as stratified, emergent and fallible. According to this approach, behind what is experienced at the empirical level are the generative mechanisms and structures at the deeper level of the real, which give rise to the levels above. The middle level, the actual, constitutes the events and processes which are experienced and observable at the empirical level. The mechanisms and structures at the level of the real may be expressed at the level of the actual, or may not be actualised, which means not everything at the basic level gives rise to the levels above, hence the reference to objects that may be intransitive (not available to our consciousness). Because systems are open and complex, not everything that takes place at the actual level is causal or may be experienced at the empirical level, hence the objects available for our experience may be transitive or intransitive. Even where such objects may be available to our experience, our interpretation of them at the level of the empirical is also fallible, which means it may not constitute an accurate reflection of reality. Thus our knowledge of reality is fallible: some elements of reality are transitive (knowable) while others may be intransitive (not knowable, because it is in the nature of such objects not to be actualised and so not to be experienced).
According to the realist ontology, the current student exclusions would constitute the empirical view of student learning, with the teaching events and processes, including their relational aspects, an indication of how the generative mechanisms from the level of the real are actualised, with other aspects from that level not amenable to actualisation. The university environment itself constitutes the level of the real, where the structures and mechanisms are actually at play, with mechanisms that may or may not be actualised and may or may not be felt at the upper levels. This takes the view of higher education as an open and complex system, where issues of student performance cannot be reduced to student under-preparedness for learning. Student exclusion thus could be a case of constraining conditions and mechanisms to which our empirical experience is also very limited. The methodology for such a study is the theory of morphogenesis (Archer, 1995), which is the analysis of the interplays of Structure, Culture and Agency.
The planned HELM module therefore begins with the AMS needs/situational analysis, which is the structural conditioning of the HELM levels as Agency interacting with quality assurance Structures and institutional norms and standards (Culture) at a point in time. (This is the phase that has just been completed as part of my studies at Rhodes University.) The further analysis of HELM experiences of AMS would constitute the social interaction phase whereby, it is assumed according to social realism (SR), the interplays between Agency and Structure and between Agency and Culture will, over time, lead to morphogenesis (transformation, change in the status quo) or morphostasis (reproduction, no change in the status quo). The academic part of such a transformation project would be the understanding and explanation of the generative mechanisms and structures behind what is observable at the empirical level: the constraining and enabling conditions for the current AMS practices, which are perpetuating student exclusions, according to the case referred to above. This is done towards the development of knowledge of, and knowledge for, transformation as an important element of higher education research (Lange, 2014). Such would be the education research that understands and explains, from the critical realist and social realist approaches briefly explained above,
- what might be possible to change (knowledge of, and for, transformation, as constraining and enabling factors for HELM AMS practices);
- how it might be possible to change it (analysis of the conditions and mechanisms that make that possible, as the morphogenetic/morphostatic cycles in the context of higher education transformation at sector level);
- with what effects (equity, social responsiveness and quality enhancement as the main impact variables for a developmental university like WSU); and
- for whom (students/society to which WSU should be responsive).
Course/module evaluation therefore will entail the reflective process of assessing, in an unbiased manner, the extent to which this HELM AMS capacity-building course has achieved its stated goals and objectives. For the purpose of this exercise, which is about evaluation of teaching, one can draw from the WSU Evaluation of Teaching/Courses (EoT/C) Policy, whereby evaluation is defined as

the gathering and interpretation of information regarding aspects of teaching and curriculum with the aim of establishing a base for and monitoring of continuous improvement. This information may be obtained from a number of sources, and lecturers are encouraged to employ a wide range of evaluation strategies as a regular part of their professional development.
1.4. Monitoring and Evaluation
While the HELM AMS course would, on its own, be about academic programme monitoring and support, which is essentially descriptive in nature as it records things as they happen or states of affairs as they occur, evaluation of a course, on the other hand, would involve some form of value judgement of the course or its teaching activities (my facilitation skills, knowledge and attributes). Evaluations typically result in various kinds of value judgements, e.g. judgements of the effectiveness, success and efficiency of teaching, inclusive of programme sustainability (an understanding of what works and what does not in teaching and learning, for instance). Monitoring systems, on the other hand, consist of sets of indicators that measure processes, outputs and outcomes relating to student success: how input variables are transformed into the expected outputs as efficiency and impact measures. Value judgements about the worth or merit of academic programmes would only be effective when the monitoring activities lead to academic support, that is, monitoring for improvement, which then constitutes the logic of AMS as the intervention programme for HELM practices. This would be measured as the extent to which HELM AMS practices have improved as the outcome of the intervention mechanisms and structures in the WSU context.
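The distinction between process, output and outcome indicators described above can be made concrete with a small sketch. The indicator names below are hypothetical illustrations, not WSU's actual indicator set.

```python
# An illustrative (hypothetical) indicator set for an AMS monitoring
# system, grouped by the three indicator types the text distinguishes.
indicators = {
    "process": ["tutorial attendance rate", "early-warning referrals issued"],
    "output": ["students receiving binding study advice", "modules with EoT/C reports"],
    "outcome": ["module pass rate", "cohort retention rate"],
}

# A monitoring report would simply record values against each indicator;
# evaluation then passes a value judgement on what those values mean.
for level, items in indicators.items():
    print(f"{level}: {', '.join(items)}")
```

The design point is that monitoring records the indicator values, while evaluation judges their worth, so the same indicator set serves both functions.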
The potential students (research participants/informants) for such a HELM AMS course, who also constitute the social interaction phase of my academic project according to Archer's theory of morphogenesis, are a selected group of Heads of Departments and possibly the affected Directors of Schools and Programme Coordinators: those who occupy these academic leadership positions and all those who aspire to them. In the context of the current rate of students who drop out of the higher education system, against the enabling structures and systems of academic development (a point discussed in subsequent sections), this group of higher education practitioners becomes the target research participants/workshop participants, based on Tinto's observation that

"Student success does not arise by chance. Nor does substantial improvement in institutional rates of student retention and graduation. It requires intentional, structured, and proactive action that is systematic in nature and coordinated in application" (Tinto, 2012).
Tinto’s assertion raises important issues about the politics of teaching and learning, the need
for understanding the power dynamics as conditions and circumstances that give rise to the
current practices, and how to turn the current inefficiencies into the academic project about
social justice. To me, this is why the study of academic leadership and management
matters, especially given the current challenges of transformation in higher education.
Following this introduction and background, the subsequent sections are organised to respond to the evaluation criteria of the RU CHERTL module on evaluation of teaching.
SECTION 2: THE THEORETICAL PARADIGMS AFFECTING THE EVALUATION
PLAN
2.1. The Contestations of the Policing and Developmental Models
The previous section referred to the notions of reflexive deliberations and transformative praxis as basic elements of Archer's (1995) social realism. Such notions concern how the parties involved engage in internal deliberations as a process of resolving the contradictions and dilemmas that they encounter in context. By transformative praxis is meant the ideal of the public good, which is more about human freedoms and flourishing. According to Kemmis (2009), such dialectical and transformative professional practices transcend the dichotomies of objective and subjective, and of individual and social positions, by locating practice in the post-modern paradigm of practices. Critical in this position are the connections and relations that one is able to make in the operational context. Such a paradigm takes the position that our actions are socially and historically constituted, and are, in turn, reconstituted by human agency and social action. It advocates higher education as praxis, following the notion of the university as a public good (Singh, 2014). It is more about the developmental positioning of evaluation practices along the critical humanistic ideals for change, with their concern for the emancipatory context. This stands against the reductionist and technical approach of the evaluation policing model, which is more about compliance with regulative mechanisms and, in the case of teaching and learning, would entail simply checking whether the stated learning outcomes of the modules have been attained or not, for instance. While the policing model, owing to its technicist and reductionist approach, tends to adopt a one-size-fits-all position (and those who do not fit may be eliminated/excluded, for instance), the developmental approach is anchored in human capabilities and also takes care of the context, the mechanisms and the outcomes of evaluation.
In the planned project, evaluation practices would thus entail the assessment of the extent to
which the facilitation skills, the requisite knowledge and correct attributes enable project
participants (higher education leadership and management) to reflect on their challenges
about academic monitoring and support practices, as required by the policy structures of the
university and also as related to their belief systems and values about teaching and learning.
For instance, it would be interesting to listen to the deliberations between the hard-systems thinkers from the engineering field, a discipline with a convergent way of thinking and explanation, on the one hand, and the soft-systems thinkers from the divergent humanities fields, on the other, during conversations about pedagogical and didactic issues in the university context. The convergent way of thinking may not sit well with the divergent ways of the social scientists, something the facilitator would have to understand as a potential resource for enhanced learning. The extent to which this can happen and be effectively achieved by means of discussing the theories and practices of academic monitoring and support, together with a conceptual understanding of how the current constraints come about and of their possible solutions, would thus be a subject of evaluation, both formative and summative, a point to be explained later.
This critique does not completely discount the significance of the policing/technical approach to evaluation, though. One would still need such a technical approach as the ultimate measure of success, but not as both the means and the end. This is what explains the two models as the two ends of the same evaluation continuum. The key would be the practitioner's understanding of this orientation and the appropriate choices to be made in context. In a monitoring project such as this one, it would thus be important to know how the misuse of monitoring, as a HELM practice in teaching and learning, might undermine support: an over-emphasis on monitoring as a technical practice concerned only with the achievement or non-achievement of learning outcomes becomes a reductionist effort if it does not lead to academic support where such outcomes have not been achieved. The developmental approach would move beyond the pragmatic approach of quality assurance, which tends to blame the victim for student failure (the case of the Engineering programme above), by investigating the circumstances and conditions in which such learning outcomes have not been achieved, which then may lend itself to a critique of student learning styles, the nature of the curriculum or staff competencies.
Guba and Lincoln (1989) observe that such developmental evaluation roles need to move beyond the roles of the data analyst (measurement-oriented), the illuminator/historian (description-oriented) and the mediator of the judgemental process (judgement-oriented), which characterise the first three generations of evaluation. The appropriate roles would entail the developmental functions, the constructivist orientations of the fourth-generation evaluator as collaborator, investigator, discoverer and change agent. Such evaluations are negotiated by having all affected stakeholders as co-creators of social reality. In practice, this would mean that a typical HoD does not only gather data about student performance rates, but also analyses such data as part of institutional research within the transformative agenda of equity and social inclusion. For the purposes of my study, and as implied for this EoT/C exercise, students' and staff's concerns, claims and issues (CCIs) about teaching and learning would be central to academic monitoring and support, within the bigger picture of institutional systems and policies as enablers. An effective coordination of teaching and learning would require identifying those stakeholder CCIs and the necessary interventions towards ensuring integrated academic development at the levels of course, programme and institutional development (Volbrecht and Boughey, 2005).
2.2. The Realist Approach to Evaluation
As part of the academic project, which is located in a realist approach (see the Introduction and Background section), the realist paradigm challenges the relativist ontology of evaluation. Building on the results-oriented ways of thinking propagated by the policy discourse for higher education (a point explained later), the actual evaluation practices of the planned HELM module will try, by means of realist ontology, to understand how the outcomes arise from the context and mechanisms of evaluations as social systems. Such a realist paradigm prescribes that the evaluation practices move beyond a sole concern with what works and what does not (in terms of HELM AMS practices), by taking the position that what may work with some people in certain conditions may not work in different conditions or with different people. Kazi and Rostila (2002) explain that the gold standard for the realist evaluator is to recognise that such an explanation at any one time would require further investigation and further explanation, hence the notion of the realist effectiveness cycle. In terms of this paradigm, the evaluation would attend not only to the reflexive deliberations of the HELM group from the engineering fields versus the humanities fields, but also to insight about how certain mechanisms and conditions of the two fields may be conditioning certain AMS practices at different HELM levels, and how such diversity may be further expressed and enhanced in the context of a comprehensive university like WSU.
The realist paradigm will therefore permeate both the module evaluation plan and practices and the subject of my academic project. This will help me to understand and explain, in the face of the academic transformation project, which is about the university as a public good, how HELM practices entail the effective use of resources for creating enabling conditions for teaching and learning, with integrated academic development as the means to an end. Such a project has the potential to illuminate the possibilities of transformative leadership: how the values and systems of teaching and learning may be integrated within leadership and management practices by means of reflexive deliberations and transformative praxis. Discussions of the dilemmas that HELM levels face in dealing with student and staff profiles, and how these may be expressed by means of curriculum innovations, would thus contribute enormously towards ensuring a practice that is steeped in social justice, particularly for a university with WSU's profile. How this might be possible as a teaching practice is addressed in the next section.
2.3. Drawing the Links with the Teaching Philosophy
The realist ontology, as described above, assumes that there would be a particular theory behind the workings of the HELM AMS module or intervention. Such a theory would be a construct involving three basic elements, Context, Mechanism and Outcome (CMO), which explain how the programme causes the measured change, with the context being the most important element, as it constitutes the conditions for change. In order to find the underlying theory, it is necessary to configure a series of CMOs during the intervention cycles (Pawson and Tilley, 1997). The following principles constitute the basic tenets of the realist evaluation methodology:
• The HELM AMS course would be theories incarnate;
• Evaluations thereof would constitute tests of such theories;
• The theories in question would need to comprise Context-Mechanism-Outcome conjectures;
• Mechanisms would refer to the ways in which effects are brought about;
• Contexts would refer to the conditions for the operation of Mechanisms; they are seldom closed;
• Outcomes would refer to the effects of Mechanisms activated in Context;
• Mechanisms generally (though not always) would involve HELM reasoning and AMS resources;
• Such an AMS course would work differently amongst different subgroups (HoDs, Directors of Schools, Programme Coordinators).
For the planned HELM AMS project, the author will adopt the public-good capabilities approach, as advanced by Amartya Sen (1999, 2009) and Martha Nussbaum (2000, 2011), as a teaching philosophy. This will constitute the main educational theory behind the facilitation of events and processes on AMS, which will revolve around advancing HELM capabilities in dealing with the marginalised and disadvantaged students who end up being excluded from the university system. Walker (2012) maintains that South African universities ought to educate public-good professionals to make their 'horizons of vision' wider, so that professionals can contribute to reducing injustices through their actions. Drawing from the
literature about the Professional Capabilities Index (PCI), the professional education of HE
practitioners ought to contribute to the development of capabilities (freedoms or
opportunities) to form and choose valuable beings and doings (‘functionings’ or
achievements or outcomes), which enable us to choose and to live in ways we find meaningful, productive and rewarding, individually and collectively, for the good of society. Such capabilities would, in the context of the planned project, include knowledgeability about the values and practices of academic monitoring and support as the optimum use of the opportunities available for ensuring student success by means of integrated academic development. Walker (2012, p. 2), drawing from Sen (1999), maintains that the
evaluative question for such a professional education and practice would then be to ask
whether the freedoms of people (read student success by means of innovative curriculum
and professional teaching) have been enhanced, while the effectiveness criterion would
attend to how human development depends on enhancing people's agency (read HELM AMS capacity). Such a theoretical orientation to teaching is all the more compelling for universities like WSU, which are immersed in an underdeveloped area like the north-eastern part of the
Eastern Cape Province (ECP), where the triple challenges of poverty, unemployment and
inequality (PUI) do not only converge but also play out in complicated ways of location,
colour, and gender. The fact that WSU offers predominantly undergraduate programmes (97%) means that much work needs to be done to ensure that increased access leads to success, thereby reversing the current revolving-door syndrome alluded to in the introduction section. The story of the students excluded from such a costly field of study
as engineering, and in the context of the ECP with such PUI characteristics, presents a
compelling case to examine and explain the conditions under which HELM may be falling
short of maximising the opportunities that are presented by the planning, quality
enhancement and funding regime for inclusive education (more details about this matter in
the following section). The normative position to be followed in the HELM AMS programme will therefore be about educating, or rather leading and managing, for inclusivity, by means of integrated academic development practices. Such a project would revolve around the basic principles of transformative leadership, namely justice, fairness, democracy, and empathy (Shields, 2009), as the main characteristics of the AMS events and processes of teaching and learning. By means of reflexive deliberations and transformative praxis, the planned AMS module will be anchored in this teaching philosophy, the public-good capabilities approach, towards inculcating the social justice orientations of higher education leadership and management.
How, one may ask, can this be understood and explained by means of the public-good capabilities approach, within the critical realist ontology and the social realist epistemological framework? Appendix C provides a brief summary of HELM responses to an evaluative
question about the implementation of the EoT/C Policy at WSU. The questionnaire asked only two items: the constraining and the enabling factors in evaluation practices. (This data is presented in this section only to make a point about the lacks, tensions and contradictions of HELM practice as the basis for developing HELM capabilities; it is further explained in the section on Evaluation Instruments and Student Feedback as part of this assessment exercise.) Developing public-good capabilities would entail HELMs having
the ability to resolve the dilemmas of university practices, as constrained by certain university structures, towards placing student interests at the centre of events and processes, thus reducing marginalisation, exclusion and disadvantage. HELM professional capabilities would therefore be expressed as internal integration, at the micro level of AMS, whereby the strategies for student, staff and curriculum development are effectively implemented
within the values and belief systems of each discipline. As external adaptation, programme reviews, for instance, would not be managed merely as once-off events, which constitutes the current pragmatic approach to quality enhancement; rather, such management practices would lead to the implementation of corrective action plans, as monitoring for support that takes account of the social context, and thereby to the full realisation of the integrated quality management framework at the macro level. The academic element of such an integrated inquiry and practice entails the analysis of the generative mechanisms that constrain or enable HELM AMS events and processes at the level of structure and culture, and the analysis of the possibilities (the reflexive deliberations) of Agency acting (transformative praxis) on those mechanisms to ensure student success. Drawing from Smith (2001), the development of such capabilities is at the core of better scholarship, through the scholarship of teaching and learning as the quality of knowledge available to be built upon, in the form of research about learning styles and staff competencies, for instance. How one might be wrong in pursuing such a programme
and its evaluative practices surfaces some ethical issues which are then discussed in the
next section.
2.4. Ethical Considerations of the Evaluation Plan
Kemmis (2009) explains three guiding principles of practice which are at the heart of
professional practice, learning and education. The state of aporia tends to characterise the
dilemmas and perplexities that one experiences when engaged in a professional practice
and the choices that one may have to make in carrying out such a practice. The principle of phronesis refers to the application of practical rationality at the centre of resolving such dilemmas. In the process of such resolution, the principle of praxis (working for the public good) applies as the ultimate measure and organising principle. These are then
the principles that will inform my considerations of evaluation ethics in the planned HELM
AMS capacity building course based on the public-good capabilities approach. From a
critical realist point of view, my awareness of the mechanisms and structures that could be giving rise to what is measurable at the empirical level, and of how such observations could become contestable from different perspectives, will in itself constitute an ethical issue. In practice, this means that ethical issues will have to be managed effectively in the planned programme and its evaluation practices along three important and interrelated aspects:
i) my relationship with the research context in general and the research participants in
particular given my position as Director of the Teaching and Learning Centre,
ii) the research process itself where my institutional responsibilities and power
dynamics with my informants may impact on my research data, and
iii) my positionality with regard to the issues at hand.
With regard to my own subjectivity, Trowler (2011) points out that the challenge here will be
to make the familiar strange. The literature on research ethics (Mercer, 2007; Chavez, 2008) maintains that insider/outsider positions in research are not dichotomous, but must rather be understood as a continuum, which requires reflexivity about how one blends into the research environment at different points of the research process. For a social realist researcher and practitioner, the tension of being critical while also complicit in organisational systems and processes will be resolved by an acknowledgement of my own fallibility (Bhaskar, 1989), whereby the claims that I make will be considered fallible, and thus subjected to further critique, as part of ongoing scholarly engagement.
My background in academic development, and the research context as introduced above, will also become a resource in terms of the richness of the internal language of description, which the programme participants may not have. However, since the generation of data may compromise their positions on issues, I will have to depend on both member checking and expert checking. Williams (2009) makes an important point about the role of academic development
researchers in researching their own environments. He refers to the impasse that the AD
practitioner as a researcher may encounter, where a burning urge to conduct research may be challenged by concerns that 'guilty' knowledge about the situation may not only expose our frailties and limitations as interpretive researchers, but may also harm the informants or the context of the research itself, a 'state of aporia'. In order to manage this, Williams recommends the pursuit of trustworthiness, courage, honesty and concern for the other.
Belluigi (2012) notes that perhaps we need to think in terms of representation and
celebration of difference, while also treating difference and sameness in a spirit of
neither/nor as we engage with problems we identify as AD practitioners.
Student academic success, as an important element of the social justice project in South
Africa, is central to the planned HELM AMS programme and its evaluation practices.
Unfortunately, this very issue can challenge the core of our diverse identities and value
systems (Scott, 2009). Bitzer (2009:xi) argues that becoming aware of the values and issues
of higher education is the first step in studying and researching the higher education field.
Such an observation cannot be undervalued for a project that cuts across research and practice in its ethical positions.
For the proposed AMS programme, of which this EoT/C exercise is a part, I have already obtained informed consent from my participants. This was done by asking them to complete the AMS Needs Analysis Questionnaire, in which they were also asked to indicate whether they would be willing to participate in the programme. The next step is to
ask them to complete the right-to-withdraw clause as part of programme registration. I am still grappling with the ethical issue of how the identities of my participants/informants will be concealed; nonetheless, I would emphasise that the main focus will be on the analysis of their experiences and perceptions of academic monitoring and support for student academic success, as these emanate from institutional events and processes arising from their assumed positions in the organisation. Again, reflexivity will be the key disposition here, where my own exercise of agency along power lines will be subjected to scrutiny with the guidance of the experts in the field who are supervising the project and will also serve as mentors during programme implementation.
A further key ethical practice will be maintaining a data management system whereby documents, as data sources, will be kept securely both online and in print, serving as sources of verification and as an audit trail should any dispute ensue. The context for such an ethical practice is explained in the next section.
2.5. The Contextual Analysis: National and Institutional Quality Assurance Issues
The following extract from the WSU Self-Evaluation Report for the CHE Quality Enhancement Project, after the Administration regime, indicates some of the contradictions, contestations, tensions and lacks of the higher education context as a public good:
“The purpose of the institutional governance project is to transform the model of WSU from a centralized system of governance to a divisional governance and management model suitable for a single, comprehensive, multi-campus WSU, that will ensure stability and coherence by centralising accountability for its core functions, policies, systems, processes and procedures, and decentralising accountability for the execution of those functions, processes and procedures across its academically distinct and geographically dispersed campuses, with change management interventions to sustainably embed this transformation.” (WSU, 2014)
From a realist point of view, one can easily identify from this extract and its policy discourse the structures and mechanisms for institutional governance. The pronouncement assumes a results-oriented way of thinking: the false logic of policy formulation and implementation which holds that the availability of structures and mechanisms will lead to the expected outcomes. While the evaluation of teaching is one of the centralised policies and systems for
WSU, Appendix C, on the current situation of EoT/C, whose data was collected as part of the structural and cultural conditioning phase of my academic project, presents a list of constraints on implementing such a policy. This situation therefore presents a case of tensions and lacks in university governance structures. Such lacks will be addressed by the planned module, explained by means of my academic project, and discussed in this section as part of the assessment task.
How the structural arrangements alluded to above derive from other systemic levels has more to do with the whole transformation project of the university sector in South Africa than with WSU alone. As a consequence of White Paper 3 on the Transformation of Higher Education (DoE, 1997), which led to the National Plan for Higher Education (DoE, 2001), government has been applying steering mechanisms, in the form of planning, quality assurance and funding, to transform the higher education system inherited from the previously fragmented dispensation. The advent of RSA democracy unfortunately coincided with
the global shifts, to which the state has to respond, on the one hand, and with the developmental project of the country, which the state has to champion, on the other. This situation presents further tensions for higher education institutions, as they have to subscribe to these transformative mechanisms while also defending the academic project. The most
contested issues revolve around the role and the value of higher education in the face of PUI
factors, as explained in the previous section. The globalisation debate and its skills development discourse seem to exert further pressure on universities as contested spaces that have to account for the use of public funds in the face of abject poverty, escalating unemployment and increasing inequality, all of which, as indicated before, converge in a very strange way in the Eastern Cape Province. It is in this context that WSU and its practices,
including the mandatory evaluation of teaching/courses as a structure, have to respond, as part not only of progressive thinking but also of the reconstruction and development project of the country through critical citizenship.
The high-skills discourse of the post-1994 education system and its associated mechanisms of OBE and the NQF (Kraak, 1999), which has of late been extended, one can argue, by the current discussions and contestations about the notion of graduate attributes (CHE, 2013), seems to put further pressure on universities like WSU, which still suffer from the legacy of underdevelopment resulting from the social engineering project of the apartheid system. These contradictions and tensions can be illustrated by an analysis of the recent
recommendations of the CHE Institutional Audit for WSU, for instance. Appendix B lists four recommendations extracted from the CHE Institutional Audit Report of 2011. The CHE recommendation about the development and implementation of a teaching and learning strategy (Recommendation 19), alongside the recommendation to implement the exclusion policy (Recommendation 22), presents a contradictory situation regarding the management of student success, which therefore questions HELM capacity to manage such
contradictions. The need for an integrated quality management system (Recommendation 17) should be expressed through the actual monitoring and support of student success and graduation rates (Recommendation 4), which then surfaces a capacity challenge. The case of
excluded students without evidence of academic development support for them poses a social justice challenge, which needs to be addressed by critiquing the role and value of knowledgeability as a HELM capability in the university context. The latter would entail asking what forms of skills, knowledge and attributes are appropriate for integrated academic development in the mainstream of teaching and learning, in the face of the current planning, quality and funding regime, but, more significantly, whether and how the academic monitoring and evaluation mechanisms for such integration (Recommendation 4) do, indeed, operate.
EoT/C practices would constitute an important element of such a project, as will be discussed in the subsequent paragraphs of this section.
Although WSU performance has improved over the years, from a target of 78% for 2013 to 80% on student success rates, and from a target of 18% for 2013 to 19%, challenges remain in how to measure these performance rates at the lower levels of the system (faculties, departments, programmes and courses). The case of the teaching development grant, increased from R30 million in 2011/12 to R35 million in 2014/15 and projected at R99 million by 2016/17, needs to be understood and explained in the context of students who continue to be excluded from their academic programmes, perhaps due to what the university could have done better (Tinto's argument). The current performance rates, which have significantly
improved over the last three years, sustain a fault consciousness about student academic performance from a systems point of view, until one interrogates the current HELM practices; effective EoT/C practices therefore hold potential as a significant element of academic monitoring and support. The number of exclusions from academic programmes, following the CHE recommendation for such a policy, calls for analytical priority, while the increasing teaching development grant amounts, set against the exclusion cases, provide a framework for problematising the structure of academic monitoring and support and its relationship to organisational efficiency and impact requirements, one can argue.
The main goal of the Quality Enhancement Project (CHE, 2013) is the enhancement of student learning, which positions the matter of AMS and its evaluation at the centre of the academic discourse. Quality enhancement for the development of graduate attributes that are personally, professionally and socially responsive is the organising principle for the four main areas of the project: professionalisation of teaching, enhancing the status of teaching, student mentorship, and institutional priority programmes for teaching and learning. The
evaluation of teaching practices will have to test the HELM capabilities developed through the AMS module, while the module itself constitutes important elements which articulate with the QEP areas (see the introduction section above). Evaluation of teaching will have to provide evidence of the participants' understanding of integrated academic development as the principle through which the QEP programmes and activities can be realised. This could be
achieved by means of a better conceptual understanding and then effective academic
monitoring and support as the actual transference of the acquired skills, knowledge, and
attributes from the training to the actual management of teaching and learning events and
processes. Improved evaluation of teaching practices has the potential not only to provide data about the efficiencies of teaching and learning, how students learn, and the corrective action that may be required, which would constitute a quality enhancement indicator, but also to show how the current government subsidy can be sustained.
The social interaction phase, as represented by the AMS module, will provide a space for interrogating these issues, especially in working on the recommendations of this exercise about evaluation of teaching as one of the AMS strategies. This will be done as part of developing
the required academic space of knowledge for, and knowledge of, transformation, whereby
HELM should not only understand and implement the necessary change mechanisms and
structures but also understand and argue, from informed positions (their belief systems,
values and norms), the contestations about how to acquire such knowledge in a university
context. It is also assumed that the evaluation of those spaces, per this assessment
task/plan, will surface the need for participants to reflect on their identities, as both managers
and academics with academic development as the platform for the public good. The relevant
evaluation approaches and instruments to achieve this are discussed in the following
section.
SECTION 3: DATA GENERATION APPROACH AND ANALYSIS
3.1. Descriptive Analysis of Different Evaluation Approaches and Instruments
The previous section on teaching philosophy referred to Walker (2012, p. 2), drawing from Sen (1999), who maintains that the evaluative question for such a professional education and practice would be to ask whether the freedoms of people (read student success by means of innovative curriculum and professional teaching) have been enhanced, while the effectiveness criterion would attend to how human development depends on enhancing people's agency (read HELM AMS capacity). This observation will be central to the
different evaluation approaches and instruments which will be selected for the HELM AMS
course. Evaluation will be understood as a means of controlling the process of learning and of ensuring the set outcomes, that is, how HELM acquires the relevant skills, knowledge and attributes (SKA) of public-good capabilities as required for academic
monitoring and for integrated academic development. Evaluation will also serve the purpose of testing how AMS concepts work out in practice, particularly for the HELM group, which will be made up of people from diverse academic backgrounds and possibly at different levels of authority.
The course will adopt a formative evaluation approach, based on the cumulative principle of learning: evaluating how students are able to understand and resolve issues of academic monitoring and support as the course progresses and learning accumulates. In such a formative format, the specific learning areas, outcomes and assessment criteria of student academic performance related to the Binding Study Advice and the Tracking and Monitoring Strategy (see the Introductory Section and Appendix I) will be evaluated as chunks of data, examining how such data, presented as a curriculum about course delivery with specific reference to student academic performance, is understood from the basic levels of comprehension and application up to the analytical and evaluative levels of understanding (according to Bloom's taxonomy).
The interpretation of such data could indicate whether and how the facilitators' skills in assisting students to understand the concepts and processes of AMS are, indeed, effective. In practice (the actual management of teaching and learning events and processes), this approach could also help students themselves to gain a better understanding of how staff facilitation competencies in delivering the programme respond to student academic performance data, and of the possibility of addressing any gap that has been identified. The implication for data gathering will be the extent to which the tools/questionnaires are able to indicate how students learn, more than what they learn (this point is discussed further as part of student feedback).
The summative evaluation approach, on the other hand, could be applied when a particular part of the course has been completed, say, when all the academic monitoring aspects related to staff academic development and the relevant support programmes, which, in turn, could be implemented to ameliorate student attrition (from a course delivery perspective), have been presented. In such a case, students have to evaluate their understanding of the teaching related to that particular section of course delivery, that is, staff competences. For instance, the presentations on the strategies for Assessment for Student Learning and Evaluation of Teaching/Courses will be evaluated summatively at the end of the course, whereby a link will be drawn to how these staff-related strategies can enhance the other, student-based strategies (Binding Study Advice and the Tracking and Monitoring Strategy) towards addressing the second level of academic monitoring and support, that is, programme design, implementation, evaluation and improvement. The summative approach, in that instance,
could also take another form, whereby students are required to comment on the coherence of the course as a whole and on how such coherence was reflected in course delivery. For instance, by means of a summative approach for the academic staff related strategies (Assessment of Teaching and Evaluation of Teaching/Courses), one could be in a position to evaluate further:
• whether the facilitation competencies match the student learning styles (the student academic development related strategies), which should result in improved student learning/capabilities about academic monitoring and support, and
• whether the HELM capabilities about academic monitoring improve from
  o the conceptual level (understanding of the context, and of the constraining and enabling factors, as already gathered from the needs analysis section of this project) to
  o the application level (their ability to apply AMS strategies in their context of managing teaching and learning, thereby reflecting their HELM practices as public goods).
Such a possibility for the improvement of HELM capabilities is what is at the core of this
AMS module and its related academic project. In a nutshell, this is a project that seeks to challenge, by means of academic critique, the fault consciousness about the current student exclusions from the university, which are based on a pragmatic approach to quality. A project about the development of HELM capabilities (freedoms or opportunities) to form and choose academic development programmes as services, as 'valuable beings and doings' (the 'functionings', achievements or outcomes of academic monitoring and support), would constitute the actual praxis, particularly in a university that has to be socially responsive, as the ultimate/summative measure.
The summative approach also takes on formative aspects when the data generated from it is responded to in presenting subsequent sections. In this regard, it is important to note that the distinction between formative and summative is drawn for analytical purposes only; in the open, complex and social systems of presenting the course, an integrated approach may have to apply.
In line with the indicated learning areas (the course is structured according to AMS strategies, Section I), the selection or framing of questions appropriate to evaluation practice will range from those about the facilitation of student learning to those about staff competences, curriculum structuring and the university context (environment). The previous section referred to the data collected from student feedback about evaluation of teaching as an AMS strategy (see Appendix C). During the course delivery stage, the categories of evaluation constraints and enablers will be further developed as test items to which students will respond on a rating scale (of not more than three points, say, satisfactory, neutral, not satisfactory). Such a scale will assist with data analysis, since fewer rating categories (three rather than five, for instance) are easier to interpret. For example, the responses in Appendix C indicate structural and mechanism-related problems in the current culture of evaluation of teaching. To evaluate students' conception of these challenges, a course presentation could be given on EoT/C as a HELM AMS strategy, structured along the following aspects (course design issues are addressed in the last section):
• Evaluation of teaching/courses practices;
• The normative framework for evaluation of teaching;
• Structural arrangements; and
• Ideal operations/activities.
The data from Appendix C could be discussed to identify the causal mechanisms behind all the issues listed, as part of contextualising the current challenges within the framework above. The facilitator could then design test items as an evaluation of such teaching/courses along specific areas that deal with the listed structures and constraints, to assess the extent to which students can evaluate such a course offering, using the 360-degree approach (evaluation by self, students, peers, and supervisor).
It is important to indicate here that, much as evaluation of teaching/courses seems to be about teaching and the teacher, the main concern will be how students learn (Poney, 1998). This would entail evaluating whether they have understood the appropriate concepts about 'Evaluation of Teaching/Courses as an AMS Strategy' and whether the data accumulated through the questionnaires used for evaluation practices during course delivery does, primarily, provide evidence of such conceptual shifts, or allows inferences thereof from the responses. The secondary purpose would be easier to evaluate: whether the facilitation skills/delivery competencies respond to students' level of understanding of the evaluation concept, and whether the processes could 'move' students to the upper levels of abstract thinking about evaluation and knowledge creation (according to Bloom's levels). The integrated project, as part of training assessment, will try to evaluate the extent to which students are able to transfer, through reflective practices, the acquired SKAs to their practices, triangulating these with organisational records that attest to them, namely, documentation of evaluation of teaching practices in the selected academic departments.
24
The latter description is just one example, provided in order to indicate how the data already accumulated during Phase 1 of Structural Conditioning will be translated into an evaluation-of-teaching questionnaire. A similar approach could be followed for the other AMS strategies, for example the strategy on Assessment for Student Learning, with similar measurement tools for SKA achievement, effective facilitation skills, and the transfer of SKAs to the workplace. The most interesting data would concern the effects of the whole exercise on student performance, evaluated as the ability to implement the Binding Study Advice and the Tracking and Monitoring strategies. Triangulation could assist here when one draws connections, or conducts correlational analysis, between student performance and the implementation of the two HELM AMS strategies, in addition to the evidence about facilitation skills. The multiplier effect would be the evidence about the programme as a whole: the impact that HELM agency has on the integrated quality framework, and thus on integrated academic development, which could be measured by HELM's capacity to work with other units institution-wide and thus to have institution-wide impact. Drawing from Hoyt and Pallett (1999), such data would not only be reliable, but would also be representative of the course participants, the learners, and would have face validity, as it would be completed on time by those directly involved in the course.
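As a minimal sketch of the correlational analysis suggested above (the departmental figures below are hypothetical illustrations, not WSU data), one could compute a Pearson correlation between the extent of AMS strategy implementation and student performance per department:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

# Hypothetical departmental data: extent of AMS strategy implementation
# (0-100 audit score) and module pass rate (%), one pair per department.
implementation_score = [35, 50, 62, 70, 78, 85, 90]
pass_rate = [48, 55, 58, 63, 66, 72, 75]

r = pearson_r(implementation_score, pass_rate)
print(f"Correlation between implementation and pass rate: r = {r:.2f}")
```

A strong positive r would support, though not prove, the claimed mechanism; a realist analysis of context and mechanism would still be needed to explain it.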
3.2. Data from other sources
The WSU EoT/C Policy prescribes the use of questionnaires as data collection mechanisms. The previous paragraph tried to describe how the weaknesses of this approach, which tends to be superficial and measurable only at the empirical level, can be circumvented by triangulating one AMS strategy with others within the integrated quality management framework (Felder & Brent, 2004). A critical realist approach to such practice would require that one goes deeper than the empirical level, to the levels of the actual and the real, in order to identify the generative mechanisms and structures which could be giving rise to what is experienced at the surface. In this approach, understanding the fault line of administering only questionnaires as data-generating mechanisms becomes paramount. Focus group interviews therefore constitute one of the data-generating tools for the academic project, whereby one can work with HELM individuals and/or groups. Data can also be generated by means of personal journal entries, and from disciplinary and institutional peers, external examiners, and so on. In addition to these formal ways of data collection, data can also be collected through participant observation, whereby the facilitator gets immersed in the context and thus remains alert to signals that may indicate whether learning is taking place or not. Such observations would also include evaluation as part of the facilitation process, as reflections on the presentation and on the contributions that students make about the learning process itself. Other means of evaluation could be suggestion boxes and the use of WiSeUp (the WSU online virtual learning platform) as an interactive space, which can also contribute towards the development of a teaching portfolio. Assessment tasks, and how students perform in them, also become additional forms of evaluation and serve as validation mechanisms for student ratings: one can assess whether the attainment of learning outcomes, as represented in both test items and assessment criteria, correlates with student ratings. In such cases, assessment tasks therefore play a role in understanding students' trajectories and also inform one's personal reflections.
3.3. Dealing with Student Feedback
Appendix C indicates the responses from students on the use of the WSU EoT/C Policy as one of the AMS strategies. The data is structured according to the enablers of, and constraints on, policy implementation, including recommendations. The previous section described how such feedback can be further structured as the actual evaluation of teaching/courses in the planned AMS module, as part of the structural interaction phase, and how other triangulating mechanisms can be applied to maximise the validity of student ratings. Peer review is described in the next section as a further means of triangulation.
While the WSU EoT/C Policy Procedure prescribes what evaluation ratings would imply for recognition and improvement, for the purposes of the module the following points would be critical in dealing with student feedback.
Student data could provide evidence of how the public-good capabilities approach (the teaching philosophy explained in Section II) plays out in practice. For instance, it could be interesting to hear participants' reflections on the matter of under-prepared students versus the notion of an under-prepared university, and on how the university's limited understanding of the notions of quality seems to perpetuate marginalisation and exclusion. It would be quite illuminating to hear, from students' journal entries, their opinions and positions about the conceptions of excellence as informed by different disciplinary orientations, and how HELM can open up such closed orientations by adopting academic development as a means to an end. I will expand on this point when I discuss curriculum design issues in the next section.
The realist view, as discussed in Section II, would enable one to understand the contexts and mechanisms that give rise to student reflections/feedback as explained above. By means of realist analysis, one could check the contradictions and tensions across different sources of data and then begin to identify the actual enabling and constraining mechanisms behind the current student exclusions: the limitations of the fragmented, individualistic and specialised orientation of the current teaching systems and processes, which derive from the socio-economic system that has come to prevail in the form of neoliberal capitalism (Phillip, 2010; Lotz-Sisitka, 2013). Integrated academic development, as the ability to understand issues of student, curriculum and staff as part of a collective that revolves around the principle of student success, ideally through the scholarship of teaching and learning, would provide the alternative to such neoliberal practices. HELM reflections on these matters, by means of the planned AMS module, could thus be one way in which evaluation of teaching becomes a viable instrument for the academic project.
3.4. The Value of the Peer Review Process
In addition to the section on student evaluation, the WSU EoT/C policy includes a lecturer evaluation section, with specific subsections for data from self, peer and supervisor evaluation, along different aspects of professional knowledge, attitude and behaviour; communication; course organisation; student assessment; and so on. The fact that the data in Appendix C indicates a number of constraints to such policy implementation is a case of missed opportunity, which may be contributing to the current system inefficiencies and which the AMS module seeks to address. Peer review could therefore target those areas in which students may not have competence, including the evaluation of didactic issues. A peer could evaluate knowledge of the subject, along with the preparation, organisation and clarity of the presentation. A colleague at the same level as the facilitator, from the same or a cognate discipline, or a staff developer, could be invited to carry out this evaluative function. The process could entail a five-step observation strategy comprising a pre-observation meeting, the actual observation, data analysis, a post-observation meeting, and the actual report. To achieve this, standardised checklists could be used, whereby the attributes associated with good facilitation are evaluated. Such tools could also include the use of a teaching observation feedback sheet/template, for instance. Assessment tools, learning and teaching materials, and the extent to which all of these make the attainment of learning outcomes possible could also be evaluated through this process. Working with a peer would thus open up spaces for collaborative projects on the scholarship of teaching and learning (Smith, 2001), for instance, beyond the programme, especially when such peers come from other institutions.
SECTION 4: EVALUATION AS REFLECTIVE PRACTICE
In addition to the peer review process, the WSU EoT/C policy includes a section on Self Improvement and Proposed Action Plan, which derives from the Student and Lecturer evaluation sections. The fact that the data in Appendix C indicates a number of constraints to such policy implementation is a case of missed opportunity, which may be contributing to the current system inefficiencies and which the AMS module seeks to address. Reflecting critically on the evaluation data and the process, inclusive of evaluation as the important element of a cyclic reflective practice, one can draw from Guba and Lincoln (1989, p.232), who caution against an academic evaluation project which purports to deal with evaluation measurement on the basis of narrow views about standards/measurement rather than trustworthiness, the hermeneutic dialectic process, and authenticity as effectiveness criteria. A focus only on students' information and the needs of decision makers (as the only stakeholders in the evaluation project) would pose unnatural, unnecessary and unethical limitations, as such an approach would be "disenfranchising and disempowering to the many other stakeholders who are invariably involved" in evaluation. The analysis of the EoT/C data, per Appendix C, for instance, indicates the emerging structures which constrain evaluation practices at WSU. In terms of WSU policy, the facilitator would be able to reflect on his practice by completing the section covering self, peer and supervisor evaluation, including a proposed action plan. These are the conditions that need to be addressed, and the AMS module goal of developing the HELM public-good capabilities is just one way of acting on such data, as the improvement phase of the reflective process, which then becomes about more than student responses or decision-making needs.
Further improvement activities arising from the actual evaluation of teaching in the AMS course, in addition to what was explained in the section on evaluation approaches and instruments, would also imply engaging in the next three steps of reflective practice: new planning, implementation, and monitoring/evaluation, all at the different levels of course, programme and institution, as indicated in the table in Section I. For instance, at course level, there may be a need to use the evaluation data to review whether and how the AMS course enhances transformative learning, that is, the ability of HELM AMS module participants to surface their belief systems and norms about teaching and learning in higher education, and the requisite SKAs for monitoring the extent to which teaching and learning spaces thus become transformative platforms for interrogating knowledge.
The AMS module will therefore cover this as part of the strategies related both to student learning skills and to staff delivery competencies. At programme level, there may be a need to review whether the whole cycle is achieved through programme design, implementation, evaluation and improvement. Such a process could give rise to the need for the review of systems and policies at institutional level. Such a cyclic and geodesic approach to evaluation, the reflective phase in particular, has the potential to situate evaluation practices within the appropriate discourse of the scholarship of teaching and learning, hence the need to understand and explain these issues from the ontological and epistemological positions explained in Section II. A teaching/programme portfolio could thus become a platform for self-evaluation, where both the feedback received from the evaluation process and the personal journal entries could be considered as part of the implementation process.
SECTION 5: EVALUATION FOR IMPROVEMENT
5.1. Evaluation and Curriculum Design
Vorster and Quinn (2012) pose the following hard questions for university practitioners who have responsibility for designing development programmes for academics:
How do we decide what to include in a programme?
How do we make those decisions transparent to participants?
How are activities paced and sequenced?
How do we balance contextual and conceptual coherence?
How can development be assessed?
As explained in Section II, WSU is grappling with the current challenge of student access without success, as exemplified in the case of the engineering programmes. The EoT/C data in Appendix C was explained under the section on dealing with student feedback. A proposal was also made that a course presentation dealing with student feedback could be organised along the following four areas:
Evaluation of teaching/courses practices;
The normative framework for evaluation of teaching;
Structural arrangements; and
Ideal operations/activities.
To address some of the programme design questions as part of the improvement phase of evaluation, and towards a credible action plan resulting from the evaluation data, one can draw from the following four principles for a strong and explicit value-based curriculum, as recommended by Vorster and Quinn (ibid.).
5.1.1. Disruption of everyday conceptions of teaching and learning
Data from the evaluation of the course section on Binding Study Advice (BSA), for instance, could help one understand students' concerns, claims and issues (CCIs) about teaching and learning at WSU. BSA seminars and workshops would provide a space for students to articulate their CCIs as part of contractual arrangements that revolve around the expectations and responsibilities of both the students and the university. For a workable curriculum focussing on HELM capabilities, such academic monitoring and support arrangements would build not only knowledge of self and knowledge of, and for, the transformation of teaching and learning, but also knowledge of "the other". Doing this in practice would entail designing learning areas, with clear learning outcomes and assessment criteria, focussing on HELM's belief systems and values about the ideal of academic development as social justice; focussing on the principles and perspectives of change management, with emphasis on the critical humanistic perspective as the alternative to the current reductionist approaches to quality management, given post-modern ways of understanding reality; and focussing on "the other", the student at risk of academic failure, by challenging the current fragmented, individualistic and specialist orientations towards teaching and learning in favour of integrated, collective and general orientations.
5.1.2. Valuing of disciplinary difference
Multidisciplinary projects through the scholarship of teaching and learning, which could transcend the four module areas of academic monitoring and support described in Section I, and which involve HELM groups from different university disciplines, could deal with the notions of horizontal and vertical articulation of disciplines as knowledge structures. The conceptualisation and valuing of each discipline, in terms of the requisite curriculum and graduate attributes, measurable as assessment criteria for student learning and staff competences in facilitating learning, could be addressed along the relational aspects of developing graduate attributes that are personally, professionally and socially valuable, as required by the CHE QEP, for instance. The latter could be a rallying point for the value of developing critical cross-field outcomes and the importance of partnerships across disciplinary boundaries. An integrated project at the end of the AMS course, given as a form of summative assessment, could be one of the ways to evaluate HELM capabilities for addressing such disciplinary differences through integrated academic development.
5.1.3. Development of management reflexive and reflective practices
The implementation constraints for the EoT/C policy, as discussed in Section III, indicate the challenges of translating quality enhancement requirements from the macro level (institutional policies/systems) to the micro level (teaching and learning systems and processes). Towards unlocking these constraining structures and mechanisms, one could begin by addressing the meso level: the HELM capabilities for translating the policy requirements for the benefit of the lower levels, that is, the coalface of teaching and learning, where students and staff should benefit most from institutional resources. Developing HELM capabilities by means of an effective cyclic and geodesic process of programme management (design, implement, evaluate and improve) could focus on HELM as local actors who transform academic development goals and objectives into local/discipline priorities through reflexive and reflective practices. In practice, and by means of curriculum design, HELM could be capacitated to act as local innovators for institutional policies. They could, in turn, be in a position to shape local modes of activity towards an integrated quality management system (Luckett, 2007, refers to this approach as the 'middling out' of management practices, instead of top-down and bottom-up). The HELM AMS module on student tracking and monitoring (see Section I) seeks to achieve this objective by helping HELM to develop early warning systems for student performance, through data management and institutional research, along the business value chain of student entry, progression, completion and work placement.
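As a minimal sketch of such an early warning system (the records, field names and the 50% threshold are hypothetical illustrations, not WSU policy values), one could flag for referral any student whose accumulated credits fall below a set fraction of the semester requirement:

```python
# Hypothetical early-warning check for the tracking and monitoring strategy:
# flag students whose earned credits fall below a fraction of the requirement.
def flag_at_risk(students, required_credits, threshold=0.5):
    """Return IDs of students earning less than threshold * required_credits,
    i.e. candidates for referral to academic development interventions."""
    return [s["id"] for s in students
            if s["credits_earned"] < threshold * required_credits]

cohort = [
    {"id": "S001", "credits_earned": 60},
    {"id": "S002", "credits_earned": 20},
    {"id": "S003", "credits_earned": 45},
]
print(flag_at_risk(cohort, required_credits=60))  # only S002 is below half
```

In a real system the cohort would come from the institutional student records database, and the threshold would follow the Binding Study Advice contract described in Appendix A.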
5.1.4. Epistemological access
The concept of epistemological access is well documented in higher education studies (Morrow, 1993; Boughey, 2005; Muller, 2014). The higher education literature identifies ways of knowing the subject, that is, the mechanisms whereby the rules for mastering the concepts and principles of a discipline are made explicit for students' internalisation, as being at the core of transforming teaching and learning. For academic development, the notion of epistemological access speaks directly to the issue of practitioner capabilities, which the planned AMS module seeks to develop as HELM's reflexive and reflective practices. It is assumed that where the relevant HELM levels are knowledgeable about the value of epistemological access, they will be able to create enabling conditions and circumstances for the benefit of both staff and students, with a multiplier effect on student retention. The HELM AMS modules on EoT/C and Assessment for Student Learning would thus constitute programme design focussing on staff competencies as the means for promoting epistemological access. Programme content, along different learning areas, learning objectives and assessment criteria, could cover aspects of professional knowledge, attitude and behaviour, materials development, communication skills, classroom practice, and assessment forms and principles as the mechanisms for promoting epistemological access.
SECTION 6: CONCLUSION
Guba and Lincoln (1989, p.233), in their argument that all science, especially social science, is value bound, caution that "the possibility of acting to 'value' a project (program, curriculum, etc.) while acting as though values were unimportant or corrupting to the valuing/evaluation effort should have struck us long ago as bizarre, if not contradictory behaviour". The realist view, though, takes the position that it is important to value social programmes as undeniably, unequivocally and unexceptionally social systems. This refers to a never-ending process of trying not only to describe what works (outcomes) but also to explain what works for whom and under what conditions and circumstances (contexts and mechanisms). The analysis of evaluation data can help one to resolve some of the dynamics of social systems as part of the requisite knowledge of, and knowledge for, epistemological access in higher education. With the advent of the CHE quality enhancement project and its goal of enhanced student learning for personally, socially and professionally valuable graduate attributes, one may have to describe and explain the contexts and mechanisms that might give rise to such an outcome by analysing the requisite knowledgeability of HELM AMS practices.
Engaging in this task helped me to reflect on the evaluation practices for the planned AMS capacity building programme/module, of which this evaluation exercise was a part. Being part of the team helped me to think ahead about how the planned module might be implemented once the structural and cultural conditioning of the current AMS practices has been determined by means of Phase 1 of the theory of morphogenesis. This provided me with thinking about the evaluation aspects of how such a course might be facilitated, as the social interaction phase, towards determining the elaboration of the structures requisite for the effective management of student exclusions. Such forward thinking also afforded me the opportunity to engage in a hermeneutic dialectic process about evaluation concepts and their applications, as was required for this task.
On matters of personal reflexivity, one can draw from McKenna (2012), who argues that it falls to academic developers to bring about disruptions in academic spaces as part of the academic project. According to McKenna (ibid.), AD practitioners need to take their places as deeply committed activists who can provide a theorised space for interrogating what it is to be an academic. This stands in marked contrast with the unattractive alternatives of academic developers as a bunch of insensitive outsiders, lackeys of management, and/or collaborators in the neoliberal project of degrading disciplinary and academic values. One can only hope that a project like this exercise, which tried to demonstrate the use of evaluation concepts from a critical realist orientation by analysing the contexts, mechanisms and outcomes of the evaluation of teaching and courses for effective learning processes, has indeed managed to serve the academic project, and also to demonstrate what it might mean to be an academic.
APPENDIX A: DESCRIPTION OF ACADEMIC MONITORING AND SUPPORT
STRATEGIES
The strategies and their performance indicators are as follows (Academic Monitoring and Support Strategy: Performance Indicator):

Binding Study Advice (BSA): Every WSU student to be contracted towards accumulating the required number of credits towards final assessment, on the basis of which academic advice is provided for learners at risk, within the set semester/year.

Tracking and Monitoring System (TMS): Every WSU student to be monitored and tracked for academic performance per tasks completed and marked per semester/year, and for referral of all at-risk students to on-time academic development interventions.

Assessment for Student Learning (ASL): All assessment tasks to be moderated to ensure the quality of assessment for student learning, with capacity building for academic staff where standards are not met.

Evaluation of Teaching/Courses (EoT/C): Each course/module to be evaluated per institutional policy (i.e. within the semester for semester courses and within a year for year courses) and procedures, with on-time academic development interventions effectively implemented.

Programme Reviews (PR): Each academic programme to be reviewed per institutional policy (that is, within three years of programme offering) and procedures, with corrective action plans implemented timeously.
APPENDIX B: CHE Institutional Audit Recommendation to WSU, April 2011
Recommendation 4: The HEQC recommends that Walter Sisulu University strengthen its reporting and monitoring mechanisms in all areas with special focus on student success, throughput and graduation rates ...; and ensure that the appropriate divisions take responsibility for the implementation of policies and plans.
Recommendation 17: The HEQC recommends that Walter Sisulu University embed a collective understanding
of quality and its monitoring across the institution; review systems to link academic and financial planning and
quality management in a transparent manner; and review administrative support to manage these processes and
systems.
Recommendation 19: The HEQC recommends that Walter Sisulu University develop an appropriate institution-
wide strategy on teaching and learning, and assessment, to ensure the success of students, and which is
consistent with the University’s aspirations to be a developmental university that has specific teaching and
learning goals, and which is linked to academic and pastoral support systems.
Recommendation 22: The HEQC recommends that Walter Sisulu University firmly implement its academic
exclusions policy.
APPENDIX C: Student Feedback on the Evaluation Tool about the Academic Monitoring
and Support Strategy about Evaluation of Teaching
Responses are structured per source code against the two questions asked: "What are the constraints to the practice of EoT/C?" and "What could be the enablers?"

(#001)
Constraints: No adherence to deadlines or time frames set. Staff members are ill-equipped on how to conduct assessment and moderation of the modules. Undermining attitudes towards others.
Enablers: Capacitate the academic staff members on how to assess and moderate modules and courses. Motivate the staff members to meet deadlines.

(#002)
Constraints: In 2013, we did evaluate all courses manually since the online LMS was not up and running. Analysis of these was not practical at all given the volume of work involved.
Enablers: Online evaluation tool.

(ST01)
Constraints: Lack of time to do such a process.
Enablers: (none given)

(MJ02)
Constraints: Workload of lecturers and insufficient knowledge of policies and procedures on EoT/C.
Enablers: Appoint a sufficient number of lecturing staff and secretarial staff; workshops on policies and procedures on EoT/C.

(ZB03)
Constraints: HoDs' and lecturers' non-awareness of its importance. Proper management of the process.
Enablers: QAOs and HoDs to convince departments of its use, relevance and significance to T & L; access to computers for students.

(BA04)
Constraints: There is a disjuncture from when the analysis is submitted by QMD to CLTD to when interventions are undertaken; at this point one hand does not know what the other is doing about the findings. Academic leadership is not very firm in ensuring that academic staff ensure that their students do take the evaluation. Currently capacity is a challenge.
Enablers: We have WiSeUp, which is making things easier but not perfect. The Evaluation of Teaching and Courses short course has helped a lot, especially in terms of taking another look at our policy and practice; I wish many more in leadership could be exposed to this course.

(CN05)
Constraints: Delayed evaluations.
Enablers: For HoDs to see the importance of well-planned EoT/C for purposes of fast intervention.

(MP06)
Constraints: Staff not aware of the evaluation of teaching policy and what is expected of them. Furthermore, it does not seem that departments are enforcing the implementation of this policy.
Enablers: A roadshow orienting staff on existing policies. Some kind of recognition/incentive for departments doing evaluation of teaching.

(NT08)
Constraints: Induction on institutional policies. Policies are not enforced or implemented. Staff members do not have a clear understanding of EoT/C theories and practices.
Enablers: Institution or faculties to make it a requirement for lecturers to obtain relevant training on EoT/C.

(STJ09)
Constraints: Only student evaluation has been done, and there is no scheduled time; staff are told to give students forms for evaluation, to collect them and hand them to the HoD, who hands them to QA. There is no specific time for collecting the forms, and forms stay for weeks. The results from the analysis are received the following year and come in bits and pieces. The Department always questions the use of evaluation when there is no support for teaching and learning infrastructure, and asks what to evaluate if there are no proper tools to conduct what is required. Our Department's teaching and learning equipment and infrastructure has malfunctioned for ages, and for many years the department has asked for intervention, while other institutions are changing with the times and have improved their technology.
Enablers: Equal management of all institutional departments, prioritising teaching and learning so as not to set departments up for failure. A proper workload policy for all levels of lecturing, as the current situation compromises the development of staff in teaching and learning, research and further study. Equal treatment for teaching and learning infrastructure and space for development. A proper workload policy to enable the development of academic staff, and a study policy on how replacement staff will be remunerated and by whom (e.g. for Ms Osner it took three years of requesting before she could go on study leave). QA must be responsible for evaluation, and both students and staff must be properly informed of the goals and priorities of evaluation, which must be done electronically. Results must be analysed and feedback sent out before course reviews. The institution must take into consideration where it is failing the departments on teaching and learning tools.

(N010)
Constraints: The understanding and interpretation of the whole concept of EoT, its purpose and its benefits by academics becomes a challenge, because some tend to use it to achieve other objectives unrelated to the set goals. The issue of capacity at QMD is still a challenge in terms of resources and personnel; if there were a faster and more efficient system to capture and analyse the data, it would be much easier to prepare reports on time and start another cycle, because these tasks consume a great deal of time, especially if done manually; even electronically you have to export the data, etc. The issue of monitoring how the comments raised in the evaluation are corrected and implemented by academics also needs to be looked at, I think by both QMD and CLTD, as does reporting back to the students about the steps to be taken if major issues were raised by them during the evaluation.
Enablers: Availability of an efficient and reliable system like Evasys. Cooperation and participation of departments during the exercise.

(ML12)
Constraints: Lack of knowledge of the policy, fear among academics of being evaluated, and poor communication from the HoDs, as some of the HoDs do not even know the policy.
Enablers: HoDs should spearhead the EoT/C and also make the academic staff understand that the EoT/C is not a killer but a developmental tool for the academic staff. Also, the introduction of a Teaching Portfolio.
REFERENCES
Archer, M. S. (1995). Realist social theory: the morphogenetic approach. Cambridge,
England, Cambridge University Press.
Berg, M.N. & Hofman, W.H.A. (2005). Student success in university education: A multi-
measurement study of the impact of student and faculty factors on study progress. Higher
Education. (50) 413-446.
Bhaskar, R. (1989). The possibility of naturalism. Brighton, Harvester Wheatsheaf.
Boughey, C. (2005). Epistemological access to the university: An alternative perspective.
South African Journal of Higher Education , 19 (3), 638-650
CHE (Council on Higher Education). (2013). Framework for Institutional Quality Enhancement in the Second Period of Quality Assurance. Pretoria.
Dhunpath, R, and Vithal, R, (2012). Alternative Access to Higher Education: Underprepared
Students or Underprepared Institutions? Cape Town: Pearson.
Dwayi, VM. (2013). A Social Realist Account of the Roles of Academic Leadership and
Management for Student Academic Success, Case of Walter Sisulu University, South Africa;
Peer Reviewed Proceedings, 6th International Conference of Education, Research, and
Innovation (ICERI 2013); 18-20 November 2013; ISBN 978-84-616-3847-5; pp 7176-7184.
Dwayi, VM. (2003). Impact Assessment of a Capacity Development Programme in a Historically
Disadvantaged Institution, A Reflective Study. A Dissertation submitted in fulfilment of Master of
Commerce in Organizational and Management Systems, University of Natal. Durban.
Felder, R.M. and Brent, R. (2004). Random thought….How to Evaluate Teaching. Chemical
Engineering Education, 38(3), 200-202 (2004).
Guba, E.G., & Lincoln, Y.S. (1989). Fourth generation evaluation. Newbury Park, CA: Sage.
36
Hoyt, D.P. and Pallett, W.H. 1999. Appraising Teaching Effectiveness: Beyond Student
Ratings. IDEA Center. Manhattan, Kansas.
Kazi, M.A.F & Rostilla, I (2002). The Practice of realist evaluation in two countries. Paper
presented at the European Evaluation Society Conference, Sevilla, Spain, October 10.-12.
2002.
Lange, L. (2014). ‘Rethinking Transformation and Its Knowledge(s): The Case of South
African Higher Education’. Critical Studies in Teaching & Learning. 2(1), pp 1-24.
Loads, D. (2014). Book Review: Re-imagining academic staff development: spaces for
disruption, International Journal for Academic Development, Vol. 19, No. 1, 7072.
Lotz-Sisitka, H. (2013). We must start with our own children”: Reflectively Researching
Intergenerational Leadership for Social Justice, Education and Sustainability.
Luckett, K. (2007). ‘Methodology matters: Methodological possibilities for quality
improvement’, Perspectives in Education, 25 (3), pp.1-11.
McKenna, S. (2012). ‟Interrogating the Academic Development Project”. In Quinn, L. (ed.).
Reimagining Academic Staff Development: Spaces for Disruption. Stellenbosch: Sun Press,
15 26.
Morrow, W. (1993). Epistemological access in the university. AD Issues 1(1), 3-5.
Muller, J. (2014). Every picture tells a story: Epistemological access and knowledge,
Education as Change, DOI:10.1080/16823206.2014.932256.
Singh, M. (2014). ‘Higher Education and the Public Good: Precarious Potential?’ ACTA
ACADEMICA 46(1).
Phillip, K. (2010). Seeds of Neo-Colonialism? Reflections on Ecological Politics in the New
World Order”, Capitalism Nature Socialism, 12, 2, 3-47.
Poney, J. & Hall, S. (000). Closing the Loop. The Impact of Student Feedback on Students’
Subsequent Learning. SCRE Research Report No 90. The Scottish Council for Research in
Education.
Smith, R. (2001). Formative Evaluation and the Scholarship of Teaching and Learning. New
Directions for Teaching and Learning, No. 88.
37
Tinto, V. (2012). Completing College. Rethinking Institutional Action. Chicago: The University
of Chicago Press.
Volbrecht, T. and Boughey, C. (2005) “Curriculum responsiveness from the margins? A
reappraisal of Academic Development in South Africa.” Griesel, H. (ed) Curriculum
Responsiveness: Case studies in higher education. SAUVCA, Pretoria.
Pawson, R, Tilley, N. (1997). Realistic Evaluation. London: Sage Publications.
Quinn, L. (2012). Enabling and constraining conditions for academic staff development. In
Quinn, L. (ed.). Reimagining Academic Staff Development: Spaces for Disruption.
Stellenbosch: Sun Press, 27 50.
Vorster, J. & Quinn, L. (2012). ‘Theorising the Pedagogy of a Formal Programme for
University Lecturers’. In Quinn, L. (ed.). Reimagining Academic Staff Development: Spaces
for Disruption. Stellenbosch: Sun Press, 51 70.
WSU (Walter Sisulu University). (2014). Self-Evaluation Report for the CHE Quality
Enhancement Project, Mthatha.