ORIGINAL RESEARCH PAPER
Improving the Performance of the Health Service
Delivery System? Lessons from the Towards
Unity for Health Projects
OLIVER GROENE1 & LUIS A. BRANDA2
1Technical Officer Quality of Health Systems and Services, WHO Regional Office for Europe, Copenhagen
2Professor Emeritus, McMaster University, Hamilton, Canada
Context: The World Health Organization developed the Towards Unity for Health (TUFH) strategy in 2000 for the improvement of health system performance. Twelve projects worldwide were supported to put this strategy into practice. A standard evaluation and monitoring framework was developed, on the basis of which project coordinators prepared technical progress reports.
Objectives: To review the utility and effectiveness of the evaluation criteria recommended by TUFH and their application in four of the original twelve projects.
Methods: We reviewed status reports provided by European project coordinators and developed a standardized reporting template to extract information using the original TUFH evaluation criteria.
Results: The original TUFH evaluation framework is very comprehensive and was only partly followed by the field projects. The evaluation strategies employed by the projects were insufficient to demonstrate the connections between the intervention and the desired process improvements, and few of the evaluation measures address outcomes.
Discussion: The evaluation strategies employed by the projects are limited in allowing us to associate the intervention with the desired process improvements. Few measures address outcomes. The evaluation of complex community interventions poses many challenges; however, tools are available to assess impact on structures and processes, and selected outcome indicators may be identified to monitor progress in future projects.
Conclusion: Based on the review of the evaluation status of the TUFH projects and the resources available, we recommend moving away from uniform evaluation and towards monitoring of minimal, context-specific performance indicators.
KEYWORDS: Towards Unity for Health framework, evaluation strategies, monitoring
Author for correspondence: Mr Oliver Groene, MA, MPH, Technical Officer Quality of Health
Systems and Services, Marc Aureli 22-36, E-08006 Barcelona. Tel: +34 93 241 8270.
Fax: +34 93 241 8271. E-mail: email@example.com
Education for Health,
Vol. 19, No. 3, November 2006, 298–307
Education for Health ISSN 1357-6283 print/ISSN 1469-5804 online © 2006 Taylor & Francis
Context: Towards Unity for Health
In 2000 the World Health Organization launched the initiative ‘‘Towards Unity
for Health’’ (TUFH) to study and promote efforts to foster unity of service
provision for a population’s health needs (illustrated graphically in Figure 1).
The goal of this initiative was to improve the performance of the health service delivery system and make it more relevant to the needs of the population. This would be achieved by raising awareness, among the various actors in the health services delivery system, of the need to create productive partnerships through the facilitation and coordination of interventions oriented at both the individual and community health needs of a given population. The conceptual model of TUFH is that this strategy of building partnerships among key stakeholders in a specified population – policy-makers, health managers, health professionals, academic institutions and communities – will lead to improved coordination, relevance and performance (TUFH, 2000).
Twelve projects were selected globally to implement the TUFH strategy. A
standard evaluation and monitoring framework was developed and provided to
the project coordinators for use in preparing technical progress reports.

Figure 1. The TUFH framework.

This framework addresses four perspectives or clusters (TUFH, 1999). It is based on
the strong assumption that new models and patterns of services result in better
partnerships among stakeholders, which then result in better integration of
services. This process in turn, then impacts the activities of health professionals
and results in better health systems performance (Table 1). This framework
is developed in greater depth in a working paper prepared by WHO
The first cluster addresses innovative patterns of services for integrating
medicine and public health (definition of reference population and
territoriality, description of organizational model for integration (range,
linkage) and use of comprehensive health information management);
The second cluster refers to the implications for health professionals
(practical impact on roles and rewards, and the degree to which educational
institutions and programmes are socially accountable);
The third cluster addresses partnership functioning (the partners involved
and the quality of partnership);
The fourth cluster relates to the evidence of impact (the effects on quality,
equity, relevance and cost-effectiveness and diffusion of the idea).
This article reviews the extent to which four of the twelve TUFH field projects
followed the suggested evaluation framework.
Our review of existing TUFH projects was based on biannual reports prepared
by the TUFH project coordinators and submitted to WHO. Reports described
the progress made in the projects, difficulties experienced and the results
obtained. The following material was available at the time of the review (Table 2).
This review is limited to the four European TUFH projects carried out in the
Czech Republic, Italy, Spain, and the Netherlands (TUFH, 2001, 2002).
Table 1. Evaluating health system impact: structure, process and outcome

Partnerships                            Health system processes              Health system results
(a) Productivity of partnership ↑       (a) Coordination and integration ↑   (c) Performance and relevance ↑
(b) Sustainability of partnerships ↑    (b) Fragmentation ↓

The more productive and sustainable the partnership is, (a) the better is the coordination
and integration of health care services, (b) the smaller is the fragmentation of health
care services, and (c) the better is the performance and relevance of health care services.
The rationale for this limitation is that there was direct communication between
the WHO Regional Office for Europe and the European project leaders, the latest
reports were available from the European projects, and time and budget did not
allow extending the review to include all TUFH projects. We developed a
standardized template to extract data from the reports for each of the four
clusters from the original TUFH evaluation framework: integrating medicine
and public health, implications for health professionals, partnership functioning
and evidence of impact.
Review of the TUFH projects
TUFH projects from the Czech Republic, the Netherlands, Italy and Spain
were included in the review. Reports were assessed for issues addressed in the
TUFH evaluation and monitoring framework.
The TUFH projects reported here were very diverse, both in territory and
population served, as well as in their focus on diseases, risk factors and
determinants of disease. The review found that the process evaluations of the
projects were based on rather loose criteria, such as meetings planned
and attended. The outcome evaluations drew on repeated measurements,
questionnaires and national statistics. Details of the projects themselves
can be found in the field reports and the paper by Lippeveld and Glasser (2002).
The results of this review of evaluation strategies are summarized in Table 3.
The results demonstrate that not all the issues addressed in the TUFH
evaluation framework were actually described comprehensively in the reports.
For example, beyond identifying the target of the community intervention it
was very difficult to extract from the reports what constitutes an innovative
approach to combining public health and medicine – most projects appeared to
follow classical public health interventions. In addition, there was little
information to address the implications of the intervention for health
professionals. While there was some mention of the effectiveness of social
Table 2. TUFH reports available for the review

Date         TUFH report available for:
July 2001    USA, Canada, Czech Republic, Indonesia, Italy, Kenya, Morocco, Spain
...          Canada, Czech Republic, Netherlands, Spain, USA, Kenya
...          Netherlands, Italy, Spain
Table 3. Overview on evaluation of European TUFH projects

Kladno and Slavkov, Czech Republic (progress report, July ...)
- Integrating medicine and public health: towns and surrounding areas of Kladno and Slavkov, region of 300,000 people; project: 1. integrated care, 2. zoonosis control and 3. health and social care needs; support by and awareness-raising of health & social care ...
- Implications for health professionals: 1. overcoming the mental gap between ...; 2. need to set financial ...; 3. improve use of social ...
- Evidence of impact: 1. planning of a system to evaluate integrated care; 2. monitoring number of ...; 3. production of flow ...

Hartslag Limburg, Netherlands1
- Integrating medicine and public health: project: reduction of ...; 180,000 people, high-risk group of ...
- Implications for health professionals: consider to accept ... in health projects; use social marketing to address health issues.
- Evidence of impact: ... of follow-up based on ...; design: baseline survey, trial (in high-risk ...); results: fat reduction in ...

District of Barcelona, Spain (progress report, July ...)
- Integrating medicine and public health: project: local committee for health planning, objectives (e.g. care for the terminally ill, care for immigrants).
- Implications for health professionals: work with the community is rewarding as work fits better the needs of ...
- Evidence of impact: indicators based on ...; result: control of diabetic and hypertensive patients, positive perception by the team.

1Within the Hartslag Limburg TUFH project a further model has been applied, developed by the department of Health, Organization, Policy & Economics of Maastricht University. The WIZ model is a theoretical model that distinguishes four clusters of factors that influence sustainable collaboration. It is a change management model and does not address the health outcome evaluation of partnerships.
marketing and the rewards of community work for health professionals, there
was no information on how health systems could provide the right incentives
for health professionals to engage more strongly in community interventions
and in the de-fragmentation of service delivery. The documentation available
for this review provided a good overview on the background of the projects and
gave – in some cases in-depth – information on the various evaluation measures
and indicators used, such as health status measures collected for the project or
retrieved from routine health information systems. However, the evaluation
measures did not seem to be systematically derived from the TUFH evaluation
framework provided to the project coordinators. In addition, the projects’
implications for health professionals, as well as the assessments of partnership
functioning, were addressed only marginally in the reports. Only very basic
proxy measures (e.g. number of meetings attended) of partnership functioning
were used in this assessment. Thus, while the reports provide a good overview
on the availability of data from routine health information systems to evaluate
community projects, they fail to deliver the information required to assess the
implications for professionals in terms of changing roles and rewards. The
reports do not contain information on additional training required for
professionals to carry out the activity, changes in responsibilities implied in
delivering the community intervention, or possibilities for financial and non-financial rewards.
The intent of the TUFH projects was to improve health system performance
and increase health service relevance and effectiveness through partnership-
building and broad community interventions. In the process, the projects
attempt to address the underlying factors of performance and community
health. The broad scope of the interventions, however, creates obstacles to the
identification of common features that illustrate the actual impact of the
activities on the health of the population.
The TUFH projects have generated a lot of interest, but were designed to
produce data that were derived only partly from the criteria suggested in the
monitoring and evaluation framework. In addition, the evaluation approaches
utilized do not convincingly generate a body of evidence that allows for
replication of the model in other settings. The link between partnerships, health
system processes and health system outcomes needs to be better understood
and empirically tested before we can draw conclusions on the impact of
partnerships among stakeholders.
The interventions were quite complex, as is characteristic for public health
interventions. A proper evaluation, taking into account the complete TUFH
monitoring and evaluation framework, would have required substantial
resources to undertake and complete. Given that WHO was only able to
provide limited project funding, the rigor of the evaluation may have been
limited by the lack of resources.
In a review of the implementation status of the twelve global TUFH pilot
projects, Lippeveld and Glasser recognized that the framework described was
inadequate for evaluation and stressed the need for further systematic
assessment. Their recommendations are as follows: (1) further evaluating the
effectiveness and sustainability of partnerships for the creation of more unified
health systems and (2) developing a set of indicators to be utilized in the
evaluation of the TUFH projects (Lippeveld & Glasser, 2002):
‘‘The ultimate goal of TUFH is to demonstrate how innovative service
delivery patterns and partnerships can lead to more unified health
systems. [...] The establishment of a monitoring and evaluation plan
with well defined indicators and baselines, [...], is therefore extremely
important for project success. Ideally, the data [...] should be generated
without creating separate data collection systems. Unfortunately, in most
countries no comprehensive routine information systems exists which
can provide these data, and therefore most of the projects have set up
some kind of separate data collection.’’
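The monitoring and evaluation plan with well defined indicators and baselines that Lippeveld and Glasser call for can be sketched as a minimal data structure. The following Python sketch is purely illustrative: the indicator names, baselines and values are invented, not drawn from any TUFH project.

```python
# Minimal sketch of a monitoring plan: each indicator carries a baseline,
# a target and the latest measured value. All figures are hypothetical.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float
    target: float
    latest: float

    def progress(self) -> float:
        """Share of the baseline-to-target distance achieved so far."""
        return (self.latest - self.baseline) / (self.target - self.baseline)

plan = [
    Indicator("smoking prevalence (%)", baseline=32.0, target=28.0, latest=30.0),
    Indicator("patients in integrated care pathway", baseline=0, target=500, latest=350),
]

for ind in plan:
    print(f"{ind.name}: {ind.progress():.0%} of target reached")
```

Each project would define its own short, context-specific list of such indicators rather than a uniform battery, ideally drawing the values from routine information systems.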
Following Lippeveld and Glasser’s recommendations, a WHO workshop was
organized to assess the feasibility of developing specific indicators to monitor
TUFH implementation (WHO, 2002). Participants identified a range of
standardized indicators to assess the TUFH projects. However, since at that
stage the four projects were almost completed, it was not possible to collect
the suggested indicators (Branda & Groene, 2003). The participants of the WHO
workshop also noted that developing uniform evaluation criteria covering
all dimensions of a possible partnership impact, as in the initial monitoring and
evaluation framework, is a challenge considering the scope and heterogeneity of
the TUFH projects. In addition, the application of uniform criteria may not be
necessary, bearing in mind that no attempt to compare the various projects, either
using a qualitative approach or statistical meta-analysis, is planned.
In terms of methodological approach, none of the projects has applied a
controlled-group design for the assessment of interventions. Although these
designs are often considered inappropriate for the evaluation of complex public
health interventions, randomised controlled designs can be applied to the
evaluation of broad social interventions, and methodologies for cluster trials
have been developed that allow for the evaluation of the impact of complex
interventions on communities (Rychetnik et al., 2002). Also, quasi-experi-
mental designs (e.g. interrupted time-series design) or observational study
designs can be applied (Black, 1996) and such designs should be considered in
future project evaluations.
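As an illustration of the quasi-experimental designs mentioned above, the following Python sketch estimates an intervention effect from an interrupted time series by projecting the pre-intervention trend forward as a counterfactual. All data are simulated; no TUFH figures are used.

```python
# Interrupted time-series sketch: fit the pre-intervention trend, project it
# past the intervention point, and compare with the observed values.
# All numbers are simulated for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a simple linear trend."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Monthly prevalence measurements (%); the intervention starts at month 12.
pre = [(t, 30.0 - 0.1 * t) for t in range(12)]             # secular downward trend
post = [(t, 30.0 - 0.1 * t - 2.0) for t in range(12, 24)]  # additional 2-point drop

slope, intercept = fit_line([t for t, _ in pre], [y for _, y in pre])

# Effect = mean gap between observed post-intervention values and the
# counterfactual projected from the pre-intervention trend.
effect = sum(y - (intercept + slope * t) for t, y in post) / len(post)
print(round(effect, 2))  # prints -2.0
```

In practice a segmented regression with appropriate handling of autocorrelation would be preferred; the sketch conveys only the counterfactual logic that distinguishes such designs from a simple before-after comparison.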
Since the assessment of partnership functioning in the four European TUFH
projects is not based on the use of systematic instruments, it remains unclear
whether the success reported on the activity can be associated with the
partnership approach or whether the same results could have been achieved
using a traditional approach to medical and public health interventions. A
number of well-researched tools for the assessment of integration, fragmenta-
tion and partnership are already available in the literature (Groene, 2004) and
include the following: systematic research from Shortell et al. and Gillies et al.
on the integration of health care services (Gillies et al., 1993), the Partnership
Self-Assessment tool developed by the Center for the Advancement of
Collaborative Strategies in Health Care at The New York Academy of
Medicine (The Internet Partnership Assessment Tool; Weiss et al., 2001), the
tool to assess chronic care integration developed by the National Chronic Care
Consortium (NCCC) in the U.S. (National Chronic Care Consortium, 1998),
and the measure to assess human service integration, which focuses on strategic
alliances with autonomous services as one way to achieve comprehensive
health and social services for target populations developed by Browne et al.
(2004). Future TUFH projects should make use of such tools for planning and
evaluation of field projects.
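Instruments of the kind listed above are typically scored by averaging Likert-scale item responses within each dimension of partnership functioning. The sketch below is hypothetical and does not reproduce any of the cited tools; the dimension names and ratings are invented.

```python
# Hypothetical scoring of a partnership self-assessment questionnaire:
# items are rated 1-5 and averaged within each dimension.

responses = {
    "synergy":        [4, 5, 4, 3],
    "leadership":     [3, 3, 4],
    "efficiency":     [2, 3, 3],
    "administration": [4, 4, 5],
}

dimension_scores = {dim: sum(items) / len(items) for dim, items in responses.items()}
overall = sum(dimension_scores.values()) / len(dimension_scores)

for dim, score in sorted(dimension_scores.items()):
    print(f"{dim}: {score:.2f}")
print(f"overall partnership functioning: {overall:.2f}")  # prints 3.58
```

Tracking such dimension scores at baseline and follow-up would provide the systematic assessment of partnership functioning that proxy measures such as the number of meetings attended cannot.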
We reviewed the TUFH monitoring and evaluation framework and assessed
how the European TUFH projects evaluated their interventions. We found that
the initial evaluation framework was very complex and lacked guidance on how to
operationalize partnership and integration. The European TUFH projects put
the recommendations contained in the evaluation framework into practice only
partly, and made no use of standardized methods or tools to assess partnership,
fragmentation or integration. The
evaluation strategies employed by the projects were insufficient to demonstrate
the connections between the intervention and the desired process improve-
ments and few of the evaluation measures address outcomes.
Despite the knowledge available in the literature on assessing partnership,
integration and fragmentation through conceptual models and standardized
tools, neither the initial TUFH monitoring and evaluation framework, nor the
TUFH projects make reference to these resources. We therefore recommend
applying such tools to improve future evaluations of TUFH or other
projects that aim at improving the integration of health care delivery.
BLACK, N. (1996). Why we need observational studies to evaluate the effectiveness of
health care. British Medical Journal, 312, 1215–1218.
BOELEN, C. (2000). Challenges and partnership in health development. Towards Unity
for Health (TUFH) working paper. Geneva: World Health Organization.
BRANDA, L. & GROENE, O. (2003). Evaluation meeting: Towards Unity for Health.
The Network: Towards Unity for Health Newsletter, 22, (01), 25.
BROWNE, G., ROBERTS, J., GAFNI, A. et al. (2004). Conceptualizing and validating the
human service integration measure. International Journal of Integrated Care, 4, 19.
GILLIES, R.R., SHORTELL, S.M., ANDERSON, C.P.A., et al. (1993). Conceptualizing and
measuring integration: Findings from the Health Systems Integration Study. Hospital
and Health Service Administration, 38, 467–489.
GROENE, O. (2004). Approaches towards measuring the integration and continuity in
the provision of health care services. In J. KYRIOPOULUS (Ed.), Health systems in the
world: From evidence to policy (pp. 579–599). Athens: Papazisis.
INTERNET PARTNERSHIP ASSESSMENT TOOL. www.PartnershipTool.net
LIPPEVELD, T. & GLASSER, J. (2002, May). Update and Future Perspectives. Paper
presented at the meeting of The Network: TUFH, Sicily, Italy.
NATIONAL CHRONIC CARE CONSORTIUM (1998). Self-assessment for system integration
RYCHETNIK, L., FROMMER, M., HAWE, P. & SHIELL, A. (2002). Criteria for evaluating
evidence on public health interventions. Journal of Epidemiology and Community
Health, 56, 119–127.
TOWARDS UNITY FOR HEALTH (TUFH) (1999). Criteria for development and selection
of case studies. Ko-Phuket, Thailand. Geneva: World Health Organization.
TOWARDS UNITY FOR HEALTH (TUFH) (2000). Newsletter No. 1, April, p. 2.
TOWARDS UNITY FOR HEALTH (TUFH) (2001, 2002). Project Implementation Reports.
Internal Documents. Barcelona: World Health Organization.
WEISS, E., MILLER, R. & LASKER, R. (2001). Findings from the national study of
partnership functioning: report of the partnerships that participated. New York: The
Center for Advancement of Collaborative Strategies in Health. (www.cacsh.org)
WHO (2002). Development of Evaluation Criteria for Towards Unity For Health
(TUFH) European Projects. Workshop, Barcelona, Spain. Report on a World Health
Organization meeting, Regional Office for Europe, EUR/02/5037869, 2002.