Dix, K. (2011). KidsMatter evaluation: Producing evidence-based research that matters. In R. H. Shute, P. T. Slee, R. Murray-Harvey, & K. L. Dix (Eds.), Mental health and wellbeing: Educational perspectives (pp. 343-346). Adelaide: Shannon Research Press.
KidsMatter evaluation: Producing evidence-based research that matters
Katherine L. Dix
Flinders University, Australia
If we want more evidence-based practice, we need more practice-based
evidence. (Green & Glasgow, 2006, p. 126)
It is well accepted that education is positively related to health, and that
schools can play a key role in promoting healthy behaviours and
attitudes. Improving our understanding of the relationship between
education and health will help to identify where interventions, such as
KidsMatter, are most appropriate and effective in improving both
individual and population health. KidsMatter Primary is the Australian
mental health promotion, prevention and early intervention initiative.
As an intervention, KidsMatter acts to make this relationship explicit,
by providing an educative framework to promote student mental health
and wellbeing, which further facilitates a cycle of improved educational
and health outcomes for students.
The KidsMatter model, shown in Figure P5.1, involves a whole-school
systemic approach guided by a four-component framework, with the
aims of (a) improving mental health and wellbeing of the students, (b)
reducing mental health problems among students, and (c) achieving
greater support for students experiencing mental health problems (Slee
et al., 2009).
Figure P5.1. The KidsMatter Primary model
The KidsMatter model is designed to be a completely integrated part of
a school's community and operation, and recognises that the positive
development of students' mental health is situated in complex and
dynamic systems. Evaluating a complex framework in complex
environments also makes the evaluation itself unavoidably complex,
particularly if it is to provide a strong evidence-base that informs
national policy and funding decisions. A team comprising Flinders
University's SWAPv members undertook the evaluation of KidsMatter
Primary and was faced with this challenge.
KidsMatter Primary commenced in 2006 as an ambitious trial in 101
primary schools across Australia. It was underpinned by a strong
theoretical foundation, and efficacy trials were implemented with
sufficient infrastructure and funding at federal level. The importance of
what KidsMatter could mean for the future of Australian children, and
the significance of this federally funded project, placed substantial
responsibility on the Flinders University research team – responsibility
to get the evaluation 'right', in terms of providing a comprehensive,
statistically rigorous and defensible report, accessible to an educated
audience. Foundational to achieving this outcome were four factors.
Diverse expertise. At every stage of the complex endeavour, elegant
solutions to problems needed to be found through innovation and
perseverance. Bringing together a diverse skill-set of practical and
theoretical expertise among a dedicated evaluation team ensured the
final products were of the highest standard.
Embracing collaborative complexity. Evaluations, by nature, can put
people on edge, and this can be further compounded by the stigma
attached to mental health. Relationship building and developing a
trusting collaborative partnership among all stakeholders was
paramount at all stages. Managing this complexity served to ensure that
research design and execution were responsive to project requirements
and sensitive to the culture and capacity of participating school
communities. Generating 'good will' through face-to-face, phone,
email, Web 2.0 and written communication optimised cooperation, and
the resulting high participation and engagement facilitated the
translation of analysis into a readable report, delivered on time.
Valuing the time quality takes. Collaborative research in all its forms
takes time – time to meet and to work with participants, time to write
informative progress reports, time to prepare data carefully and develop
appropriate analytical methods, and time to translate the findings in
order to achieve maximum reach.
Rigour versus reach. These ideals are usually at opposite ends of a
continuum. The more specific and statistical a report, the less
accessible to a general audience that report becomes. Hargreaves
(2000) summed it up nicely: "Educational researchers write mainly for
each other in their countless academic journals, which are not to be
found in a school staffroom" (p. 202). Regardless of what the results of
the evaluation were, they had to be translated and understood, first by
the team, and later by government advisors and school principals alike.
The simple solution to this challenge was to produce two reports
developed as two sides of the same coin: a highly accessible 'popular'
Final Report (Slee et al., 2009), and a highly analytical Technical
Report (Dix et al., 2010).
Producing educational research that has national impact is rare. One of
the clearest findings, translated from complex statistical modelling into
meaningful information, was the improvement in mental health of 1 in
20 students. Nearly one child in every classroom was less disruptive,
less confrontational, more settled, and able to learn – a result
attributable to the effectiveness of KidsMatter. On the basis of this and
other positive evidence provided in the comprehensive Final Report,
the Australian Government announced, at the start of 2010, A$18.4
million for the national rollout of KidsMatter Primary to 2100 schools
by 2014, and the trial of KidsMatter Early Childhood. This was a
highly successful outcome, and the outlook for the future mental health
of our children is promising. It was also a successful outcome for the
Flinders evaluation team and a measure of the quality of the research
undertaken – or was it?
Schools of Education in Australian universities are well placed to
produce the type of high-quality educational research increasingly
demanded by government to inform their policy-making decisions.
From this perspective, there is a growing need to produce
practice-based evidence to inform evidence-based practice.
However, running counter to addressing this need is the unprecedented
pressure to publish. Often the research that is born out of community
and industry collaboration, that has the greatest opportunity to inform
policy, is not initially publishable in scholarly journals. Time taken to
nurture collaborative partnerships and generate 'good will' is time
taken away from writing for scholarly publication. The evaluation of
KidsMatter is a lived example.
This tension, between conducting research that matters and advancing
an academic career, is an important issue, particularly for early career
academics. In the current Australian climate, the quantity of articles in
so-called 'A-listed' journals is valued over and above the production of
quality commissioned reports. The higher education sector's focus on
supporting early career researchers seems amiss in the field of education.
The wealth of experience, knowledge and professional development
that comes from engaging in such projects should be career forming
and not conditional on scholarly output as the only measure of success.
Dix, K.L., Keeves, J.P., Slee, P.T., Lawson, M.J., Russell, A., Askell-
Williams, H., … Spears, B. (2010). KidsMatter Primary evaluation
technical report. Adelaide: Shannon Research Press.
Green, L. W., & Glasgow, R. E. (2006). Evaluating the relevance,
generalization, and applicability of research. Evaluation and the Health
Professions, 29(1), 126-153.
Hargreaves, D. H. (2000). Teaching as a research-based profession:
Possibilities and prospects. In B. Moon, J. Butcher, & E. Bird (Eds.),
Leading professional development in education (pp. 200-210). London:
Routledge and Falmer.
Slee, P.T., Lawson, M.J., Russell, A., Askell-Williams, H., Dix, K.L., Owens,
L., ... Spears, B. (2009). KidsMatter Primary evaluation final report.