In the wake of the 2009 Gippsland fires: Young adults' perceptions of post-disaster social supports
Monash University Department of Rural and Indigenous Health, Moe, Victoria, Australia.
Australian Journal of Rural Health, 06/2012; 20(3):119-25. DOI: 10.1111/j.1440-1584.2012.01271.x
Objective: To explore rural young adults' (18-27 years) experiences of formal and informal social support networks post-bushfire, and to inform the delivery of social support services for young adults post-bushfire.
Design: Qualitative; semi-structured, face-to-face interviews, with transcripts analysed using thematic content analysis.
Setting: Fire-affected regions of Gippsland, Victoria: the Boolarra and Central Gippsland 2009 Black Saturday fire complexes.
Participants: Ten bushfire-affected young adults (18-27 years): six female and four male.
Results: The central theme was the importance of acknowledging and validating participants' experiences as autonomous individuals. Whether participants experienced social supports and networks as helpful or unhelpful depended on the degree to which those supports enhanced their sense of acknowledgement, entitlement, affiliation, informational links and engagement in the recovery process, and ameliorated displacement in relation to family, friends, community and environment.
Conclusions: Participants believed that how an individual, community or service provider framed loss had a significant impact on their sense of entitlement and on how their needs were met. Importantly, whether society, policy and service providers framed young adults as adults or as adolescents also affected how their needs were met. This study highlights the need to reconsider how loss is viewed, and the need for policy-makers and service providers to address the existing mismatch in nomenclature and framing of loss so that young adults are not excluded from supports essential to recovery.
ABSTRACT: This case study explores the application of the policy cycle to implement two related University of Melbourne governance policy projects. The meta-policy (policy on policy) project represented a continuous quality improvement initiative using an elongated policy
cycle, whereas the delegations project, triggered by an institution-wide policy suite review, was implemented as a policy development initiative using a truncated policy cycle. The case study focuses on the development of key elements of institutional meta-policy (range of policy instruments, classification scheme, application of policy instruments, approval authorities, and policy cycle stages) and delegations documentation, including key elements of the delegations policy (framework, guiding legislative provisions, delegations principles), the attendant schedules (finance, building works, research-related, human resources and other
contract/document delegations) and delegations register. The case study illustrates that institutional meta-policy and delegations policy are inherently interdependent, and may be concurrently improved through implementation of the policy cycle involving extensive policy stakeholder consultation and policy benchmarking.
ABSTRACT: The Australian federal government now requires higher education institutions to evidence effective development, implementation and review of institutional policies; however, little attention has been given to policy implementation evaluation and policy review. This paper presents a case study of the development of a comprehensive policy implementation evaluation framework proposed for the University of Tasmania's new Casual Teaching Staff Policy. The proposed framework reflects concepts used in the policy development process, drawing on research into good practice with respect to university casual teaching staff. The core concepts articulated in the RED Report (Percy et al., 2008) - recognition, enhancement, development - provided the organising constructs for the new Casual Teaching Staff Policy (recruitment and employment, professional development in teaching and learning, evaluation and recognition, integration, communication). Subsequently, the RED Report domains (systemic and sustainable policy and practice, employment and administrative support, induction and academic management, career and professional development, and reward and recognition) and the University of Tasmania Academic Staff Agreement (2010) provisions guided the development of policy content. Collectively, these core concepts, domains and industrial instrument provisions informed the construction of a preliminary 2010 survey instrument and a follow-up 2012 survey instrument to assess the needs of University of Tasmania casual teaching staff. The guiding principles of the Benchmarking Leadership and Advancement of Standards for Sessional Teaching (BLASST) Project's Sessional Staff Standards Framework (Harvey et al., 2012) - quality learning and teaching, sessional staff support, sustainability - provided the analytic lens and meta-themes for the 2012 survey (Brown et al., 2013, forthcoming).
The proposed policy implementation evaluation framework follows Stufflebeam's CIPP model (2003) and incorporates the institutional meta-policy requirements concerning evaluation and institutional policy review. The resulting framework is a six-phase model incorporating Context, Input, Design, Process, Product and Review (CIDPPR). The proposed evaluation instrument is a variant of the BLASST Benchmarking Interactive Tool (B-BIT), which will be deployed to evaluate institutional practices against the Sessional Staff Standards Framework (that is, good practice, minimum standard or unsustainable). This paper discusses the context for casual teaching staff at the University of Tasmania, the development of the Casual Teaching Staff Policy, the perspectives of casual teaching staff, and the aspects of policy evaluation that contributed to the development of the proposed framework, including institutional meta-policy requirements, evaluation definitions, evaluation objectives and evaluation approaches. The paper is presented to encourage discussion within the policy practitioner community regarding institutional policy evaluation.