Land Use Policy 153 (2025) 107526
0264-8377/Crown Copyright © 2025 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/by-nc-nd/4.0/).
Designing social surveys for understanding farming and natural resource
management: A purposeful review of best-practice survey methods
Hanabeth Luke a,b
a Cooperative Research Centre for High Performance Soils, Callaghan, NSW, Australia
b College of Environmental and Life Sciences, Murdoch University, WA, Australia
E-mail address: hanabeth.luke@murdoch.edu.au
https://doi.org/10.1016/j.landusepol.2025.107526
Received 18 June 2024; Received in revised form 14 February 2025; Accepted 27 February 2025
ARTICLE INFO
Keywords:
Social survey methodologies
Social benchmarking methods
Literature review
Land management
Natural resource management
Rural landholders
ABSTRACT
Social survey research is often used to achieve an improved understanding of land and natural resource management. The question for many land agencies and organisations working to support farmers and other rural landholders is how to gain this information in a scientifically rigorous and cost-effective way. This paper summarises findings from a purposeful review of survey methods applied for understanding rural landholders, including how land and natural resource management may be changing around the globe. Social surveys play a crucial role in understanding the complexities of land management, with survey methods evolving over time as they adapt to technological advancements and shifting research paradigms. Key findings of this review underscore the significance of pre-testing, diverse sampling techniques, and tailored survey methods in upholding data integrity and enhancing response rates. Effective survey design, coupled with integration of conceptual models and identity constructs, can enrich insights into land management practices. Embracing mixed methods and leveraging AI for data integration offer promising avenues for future research, albeit with ethical considerations and data-integration challenges. Previous reviews are extended to describe four main eras in social survey research for natural resource management, being: 1) the Invention Era (1930–1960); 2) the Expansion Era (1960–1990); 3) the Integration Era ('Designed Data' + 'Organic Data') (1990s to 2022); and 4) the Brave New Era (2022 to present). Prioritising longitudinal studies and expanding survey research globally can inform evidence-based policymaking, addressing critical gaps in knowledge as land and natural resource management continues to evolve and respond to changes and challenges worldwide.
1. Introduction
Many industrialised countries have a history of managing their
agricultural soils by persuading managers of private rural lands to apply
currently recommended best-practice as part of agricultural extension or
outreach programs (Anderson, 2004; Vanclay, 2004; Ampt et al., 2015;
Norton et al., 2020; Allan et al., 2022). Often these programs seek to
change land managers’ behaviours by encouraging their adoption of
new tools, practices, enterprises or philosophies. Non-regulatory approaches to persuasion include providing information and guidance, normative campaigns, and the use of financial or other incentives, as described by Doremus (2003). Whatever the method, their ultimate success depends on landholders' decisions, which are in part related to practical considerations such as finances, but are also closely linked to a
range of social factors and processes, including values, attitudes and
normative considerations (Axelrod, 1994; Vanclay, 2004; Pannell et al.,
2006; de Groot and Steg, 2007). While private land managers are often
farmers, understanding the management approaches of non-farming
landholders is also important for the management of a landscape and
its natural resources (Primdahl and Kristensen, 2011). Social research
related to the drivers and foundations of land manager decision making
is therefore an important element of understanding land management
outcomes and relevant decisions made by landholders.
Social research is undertaken in the context of land management for
a variety of purposes. It is used to understand public support for the
environmental and social changes associated with proposed and existing
land uses; to understand the drivers of farmer investment and adoption
of agricultural practices; and to explore farmer challenges and aspirations, usually with a goal in mind to support strategic planning and/or
policy design and evolution (Prakash and Bernauer, 2020; Luke, 2014;
Luke, 2017; Stimpson et al., 2019).
Allan et al. (2022, p. 1) highlight that seeking to increase adoption
of a practice without properly understanding its potential role in a local
context can lead to “ignoring or misunderstanding real change and
impacts", which is why rigorous social survey research is widely used to gather data and derive meaningful insights into the decisions of land managers. However, it can be a resource-intensive activity, and may have varying levels of rigour (Avemegah et al., 2021). The question for many societies and/or their land management agencies is how to gain meaningful information about landholder decision-making in a way that is both scientifically rigorous and cost-effective.
The aim of this paper is to inform future survey design and implementation, through undertaking a thorough and updated overview of
the literature on questionnaire-based survey methods that have a
particular focus on understanding agricultural and natural resource
management across rural landscapes. Seeking to develop a best-practice
approach to undertaking social surveys for understanding land and
natural resource management, the key questions this review responds to
are:
− What are the most important characteristics of effective social survey design, implementation and analysis?
− What can be learned from historic phases of evolution in social survey design?
− What are the opportunities, challenges and characteristics of the current phase we find ourselves in?
2. Methods
Research syntheses offer a vital means of consolidating and disseminating research knowledge, guiding further research, practice, and public understanding (Petticrew and Roberts, 2008). Research syntheses can encompass descriptive, informative, evaluative, and connective aspects, often undertaken using a narrative, critical or systematic approach (Popay et al., 2006; Luke et al., 2018). Purposeful sampling involves selecting highly relevant and information-rich cases, extracting and describing how information from an article can be used for the purposes of the study at hand (Duan et al., 2015; Dekkers et al., 2022). Examples include choosing highly cited and frequently used publications to ascertain popularly referred-to literature, selecting varied cases to document diverse adaptations, and selecting outlier, extreme or less common cases to understand a range of different manifestations of an idea (Duan et al., 2015). This method helps to identify common patterns across variations while taking variability and diversity of ideas into account. While a purposeful review can draw broad trends and identify common themes across literature, it cannot be regarded as a systematic review nor a meta-analysis, and as such, does not attempt to claim representative or proportionate findings within the literature (Heyvaert et al., 2017).
The review method undertaken entailed purposeful selection, review, analysis, and synthesis of primary research, through systematically searching published literature based on the key words provided
below. The search was focussed on social survey design and methods,
with a particular focus on those implemented in the context of land
management and natural resource management. This method follows a
purposeful sampling method closely aligned with the method applied by
Allan et al. (2022); Luke et al. (2018) and Clancy et al. (2024).
To undertake a literature search, the search engines Web of Science and Google Scholar were used, inputting a combination of search terms that included 'social survey method/methodologies'; 'social benchmarking'; and 'questionnaire design', combined (using the Boolean term AND) with the phrases 'landscape'; 'environmental'; 'rural'; 'land management'; 'rural landholder' and 'natural resource management'. A reflexive approach was taken to selecting and following up on useful references from relevant papers for inclusion in the review (Johns, 2022), for example on the role of cognitive interviewing as a pre-testing method. A total of 114 pieces of literature were identified as highly relevant and integrated into the various elements of the review (see Appendix).
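As an illustration of how such Boolean combinations can be assembled, the short sketch below generates the full set of paired search strings; the term lists mirror those above, while the function name and any database-specific quoting are illustrative assumptions rather than the exact queries used.

```python
# A minimal sketch (not the actual search script) of generating the Boolean
# combinations of method terms and context terms described above.
from itertools import product

method_terms = ['"social survey method*"', '"social benchmarking"', '"questionnaire design"']
context_terms = ['landscape', 'environmental', 'rural', '"land management"',
                 '"rural landholder"', '"natural resource management"']

def build_queries(methods, contexts):
    """Combine each method term with each context term using the Boolean AND."""
    return [f"{m} AND {c}" for m, c in product(methods, contexts)]

for query in build_queries(method_terms, context_terms):
    print(query)  # e.g. '"social benchmarking" AND "rural landholder"'
```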
The resulting literature was interrogated in order to develop an understanding of the key characteristics of survey design, implementation and analysis. Of particular interest were papers that illustrated differences over time, including several review papers that described various historic phases of evolution in social survey design, implementation and analysis. More recent papers describing the undertaking of social surveys were analysed specifically to identify emerging challenges, opportunities and characteristics of the present-day phase of survey research we find ourselves in.
Further themes were then explored in the literature, for example in relation to pre-testing methods and commonly used theoretical frameworks, informing a narrative synthesis. A narrative synthesis entails creating a unified whole that exceeds the sum of its parts, with the objective to generate novel insights by elucidating connections and outstanding findings among individual study reports. An effective synthesis can draw out information about relevant methodologies, supporting informed researcher decisions about the applicability of synthesised findings to their own research contexts (Petticrew and Roberts, 2008).
The resulting review provides detailed information on survey
methodologies relevant to undertaking data collection about land and
natural resource management. Responding to the research questions,
the review includes an overview of best-practice methods reported in the literature, as well as the evolution of social surveys over time. Key elements of effective survey design, including questionnaire construction, sampling techniques, response formats and data analysis, are synthesised to
provide clear guidance for supporting the development and evolution of
best practice survey design in the context of land management, natural
resource management and environmental change. From this, emerging
challenges and opportunities for survey design and implementation can
be distilled.
3. Results: conducting social surveys in rural areas
Data collection methods employed for understanding drivers of
change and adoption of new innovations span a broad set of approaches,
including workshops, focus groups, interviews and surveys (Creswell,
2013). Epistemological frameworks drive the form, structure and intent
of methods employed; for example, positivist approaches seek to draw
out patterns, create statistical models and make predictions principally
using quantitative data, while constructivist approaches seek to understand, describe and derive meaning, usually from qualitative data
(Pearce, 2012; Vanclay, 2015). An expanding range of analytic tools
enable collected data to be studied and understood in a variety of ways,
from basic percentages and descriptive statistics, through to complex
statistical modelling, graphical associations, thematic analyses and
actor-network mapping (Scott et al., 2020; Scott et al., 2022). While social surveys are often considered a positivist approach, generating quantitative data to identify patterns of land management, open questions within them can also provide qualitative insights into why those patterns may be occurring (Rouder et al., 2021).
Questionnaire design and the processes employed to develop, implement, analyse and report on survey data play a crucial role in ensuring
the usefulness, quality and accuracy of the collected data (Krosnick,
2018). As times change, people use and access information in different
ways. New insights and areas of focus arise while others may become
less salient, inevitably leading to an ongoing evolution of survey
methods. The sections that follow provide an outline of commonly used
social survey approaches and their evolution over time, including processes of project design such as sampling, questionnaire design, analysis techniques and approaches to interpretation.
3.1. The evolution of social surveys
Approaches to the implementation of sociological surveys in rural
areas have evolved over many years. Kalton (2019) comments on the
evolution of survey sampling methods over six decades, starting with
attempts to achieve representative samples in lieu of complete censuses.
The use of probability sampling was predominant in the early years,
which requires drawing a random sample of a defined population, from
which inferences can be made about the broader population (Kalton,
2019). He describes a gradual progression towards a more common use
of non-probability sampling approaches in the current era. Groves
(2011) outlines ‘three eras of survey research’, with a particular focus on
household population surveys. The three eras identified were: invention (1930–1960); expansion (1960–1990); and 'designed data' supplemented by 'organic data' (1990–2011+). The invention era was characterised by sampling units based on areas of land using multi-stage samples. Face-to-face interviewing and mail-out surveys dominated, with extremely high response rates (>90 %) that were largely attributed to small sample sizes and the novel nature of surveys at that time (Groves, 2011).
From the 1960s, the expansion era saw the impacts of technology,
including increased telephone access enabling the scaling-up of sample
sizes and expanded populations of interest (Groves, 2011). This coin-
cided with the development of increased research funding for the social
sciences (Groves, 2011); and new methodologies, including the Tailored
Design Method of posting out multiple surveys and reminder notices
(Dillman, 1978). The final era he describes is characterised by a rise in the availability and use of online self-administered survey instruments, enabling the gathering of large data sets, now commonly referred to as 'Big Data'
(Weersink et al., 2018).
As survey methods evolve, so do the attitudes and goals of researchers. Bauer et al. (2007) consider the evolution of 'public understanding of science' (PUS) survey research over a 25-year span, identifying three main periods, which overlap substantially with Groves' three eras of survey design. 'Science Learning' (1960–1985) is followed by 'Public Understanding' (1985 to the early 1990s), then 'Science and Society' from the 1990s up to the publication of their article in 2007. Bauer et al. (2007) argue that the first period had a focus on 'science education' for the public, underlined by a 'deficit model', which assumed that the public was ignorant and 'insufficiently literate' in scientific knowledge, ultimately impacting the public's involvement in policy decision-making (Bauer et al., 2007, p. 80). In the second period, survey research shifted from scientific 'knowledge' and 'understanding' to public 'attitudes' (Bauer et al., 2007, p. 83). The belief pervaded that the public's lack of scientific knowledge meant they also failed to appreciate risks associated with scientific matters (Bauer et al., 2007). In the final period, while the deficit model remained in place, the critical focus shifted from the public towards scientific experts and institutions, with a belief that experts and institutions held false conceptions of the public's understanding of science and attitudes, leading to a mismatch between science communication and the broader public.
Kalton (2019) notes that while an ongoing focus on using web-based
non-probability samples can achieve large, quick and low-cost samples,
this comes with the issue of low-quality estimates and
non-representative samples. These changes have also led to a continual
fall in response rates (Groves, 2011). He suggests that, over time, falling
response rates have contributed to and coincided with higher data
collection costs – both of which he argues are unsustainable as time goes
on: survey methods need to continue to evolve. In the sections that
follow, insights from the literature are brought together to provide
important considerations when undertaking survey research.
3.2. Effective design of social surveys
First, being clear about survey purpose and the research question/s it
responds to is key (Saris and Gallhofer, 2014; Batterton and Hale, 2017).
Working with local groups and designing the questionnaire through a
participatory process with local partners ensures local relevance (Curtis
et al., 2005).
Survey and section introductions should be included to provide
context, clear instructions, and information about the ethical obligations
and approvals of the project. A logical, coherent ow of questions
enhances respondent understanding and reduces cognitive burden.
Question construction is a fundamental aspect of survey design. It is
important to avoid double-barrelled questions and technical jargon (Parsons and Luke, 2021), while simple, clear and unambiguous questions ensure accurate and reliable responses. While there is always more
that is ‘nice to know’, prioritising essential questions is key to reducing
questionnaire length to ensure a higher response rate and avoid partial
completions (Luke et al., 2014).
The format of question response options has a major influence on the data collected. The vast majority of surveys reported in the literature lean heavily towards the collection of quantitative data. Response formats such as multiple-choice questions, Likert scales, open-ended questions, or semantic differential scales, should be selected and tailored to fit the research objectives and the type of data desired. The use of Likert scales allows respondents to express their level of agreement with a given statement. However, care is required to provide balanced response scales (Batterton et al., 2017), including a neutral midpoint to help respondents express their true feelings and not have to make a false choice (Batterton et al., 2017). An opt-out or 'not applicable' response option is also important for avoiding forced choices.
While closed-ended questions with predetermined response options are useful for quantitative analysis and comparison, open-ended questions are also extremely useful for drawing out themes that may not have been considered by the researcher (Luke, 2017). Providing a space for additional comments ensures that the respondent has an opportunity to make any relevant notes, enabling the emergence of themes that may not have been directly addressed in the closed questions.
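To make these design points concrete, the hypothetical item below sketches a balanced five-point Likert scale with a neutral midpoint, an explicit 'Not applicable' opt-out and a free-text comment field; the question wording and data structure are illustrative only, not drawn from any survey cited here.

```python
# A hypothetical survey item illustrating the design points above:
# a balanced Likert scale with a neutral midpoint, plus an explicit
# 'Not applicable' opt-out and a space for additional comments.
likert_item = {
    "question": "Maintaining ground cover improves the long-term productivity of my property.",
    "response_options": [
        "Strongly disagree",            # negative pole
        "Disagree",
        "Neither agree nor disagree",   # neutral midpoint avoids a forced choice
        "Agree",
        "Strongly agree",               # positive pole, balanced against the negative pole
        "Not applicable",               # opt-out for respondents the statement does not apply to
    ],
    "additional_comments": "",          # open space for themes not anticipated by the researcher
}
```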
3.2.1. Questionnaire components and design techniques
Survey questionnaires bring together a complex set of elements in
order to understand the drivers of decision-making about land and
place. This usually includes demographic factors, with a range of other
key elements, including various theories of change; place attachment
(Masterson et al., 2017); landholder identity (Groth et al., 2017); values
and risk perceptions (Schwartz, 1994; Lockwood, 1999; Slovic, 2000;
Meertens and Lions, 2008); spatial change (Brown, 2004) and other
constructs used to understand the complex dynamics of socio-ecological
stability and change (Masterson et al., 2017; Toman et al., 2019).
Stedman (2002, p. 578) notes that "turning toward clear, empirically specifiable models" is key to effective design and rigour in research about land and resource management.
The theory of planned behaviour has been widely used in survey design to ascertain how social norms, attitudes and beliefs may influence a behaviour (Ajzen, 1991). Landholder values play an important role in decision-making about natural resource management (Boudet et al., 2014; Luke et al., 2014; Miller, 2017), with values often built into questionnaires about natural resource management (NRM) and farming (e.g. Fulton, 1996; Vaske and Donnelly, 1999; Dietz et al., 2005; Brown, 2004; Curtis and Robertson, 2003; McIntyre et al., 2008). Value-Belief-Norm (VBN) theory aims to predict human environmental behaviours, assumed to be driven strongly by intrinsic personal 'held values'. If a person believes that something they hold as important is threatened, and they have a sense of responsibility (personal norm) that their actions can make a difference, then they are more likely to act (Stern et al., 1999; Stern, 2000).
The idea of the values that people assign or attach to places plays an important role in land management, with Stedman (2002) noting that we are more likely to fight for places that are central to our identities. Drawing on value-belief-norm theory, cognitive hierarchy, and the theory of planned behaviour, Seymour et al. (2010) created a conceptual model of 'assigned values' (Fig. 1). This model considers practical factors such as weather and markets, along with the characteristics of natural assets, identifying that these influence landholders' valuing of their property and shape land management decisions.
Attached or assigned values were assumed to be less stable than held values (Fulton, 1996; Seymour et al., 2010), and thought to have greater
utility in predicting environmental behaviour (Pannell et al., 2006).
However, Toman et al. (2019) found in their longitudinal surveys that values did indeed shift over time in relation to external factors such as droughts, and also with in-migration of non-farming landholders. Beliefs and attitudes were found to be more stable. While not focussed on land-management behaviours, a useful framework that harmonises some of the tensions between the above theories is that of Montano and Kasprzyk (2015), who developed the integrated behavioural model (Fig. 2), linking self-efficacy and environmental constraints to behaviour. Values are not included in this model, which was developed in relation to environmental health behaviours, and the relationships between factors remain quite linear. Further work in relation to land management may be able to determine some of the more complex relationships between these elements, including the role of values attached to places and how these may influence behaviours as they change over time and across landscapes (Larson et al., 2015; Toman et al., 2019).
In order to explore the role of identity in landholder decision-making, a collective identity construct (from Ashmore, Deaux and McLaughlin-Volpe, 2004) was applied in a farming context to explore the impact of rural landholders' occupational identity on land management decisions (Groth et al., 2014). Rural landholders were divided into three groups (full-time farmers, part-time farmers and non-farmers), and important differences were found in self-categorisation, behavioural involvement, salience of issues and social embeddedness (Groth et al., 2014). This research was later extended to include hobby farmers as a distinct category (Groth et al., 2017). Whilst acknowledging challenges for Australian agriculture in regards to multifunctional rural landscapes where non-farmer occupational identity is increasing (Luke et al., 2021), a study of Australian macadamia farmers identified a different categorisation of farmers. For the purposes of addressing key industry challenges of lowered productivity and stagnation of new plantings, they found the most useful grouping of growers to be 1) those wishing to sell; 2) those who were struggling to make a profit; and 3) those who were satisfied, active and productive (Stimpson et al., 2019). This enabled the peak industry body to focus on supporting landholders to prepare properties for sale, and to look further into the diverse support requirements of those who were struggling. Notably, more than a third of
orchards changed hands in the two years that followed the study, accompanied by a rise in productivity.
Fig. 1. The Assigned Values Model (Seymour et al., 2010).
Fig. 2. The integrated behaviour model developed by Montano and Kasprzyk (2015).
3.2.2. Approaches to survey implementation
Conducting pre-tests and pilot studies before the full-scale survey
administration helps identify potential issues with question clarity,
response formats and questionnaire length. Pretesting involves administering the survey to a small sample of respondents and gathering
feedback on comprehension, relevance, and overall experience (Collins,
2003; Conrad and Blair, 2004; Fowler, 2013). The testing and evaluation
of questions can also involve cognitive interviews that interrogate
question interpretation, with some researchers using response latency
and surety of response as an indicator of the reliability of answers to
questions posed (Conrad and Blair, 2004). Adjustments to the questionnaire can then be made to improve the survey's effectiveness, increasing the validity and reliability of subsequent data collection.
Presser et al. (2004) note that the use of a framework with reference points for feedback in pre-testing situations is important for improving the effectiveness of the pre-test phase of questionnaire refinement. They note that pre-testing is more useful when participants are clear on the rationale for the study and the context of the pre-test, rather than when the pre-testing is undeclared.
The choice of sampling technique has a significant impact on the generalisability of survey findings. Probability sampling methods, such as simple random sampling or stratified sampling, ensure that each member of the target population has an equal chance of being selected, enabling broader inferences to be made across the population from which the sample has been selected (Cumming, 1990; Uprichard, 2013). The Tailored Design Method (Dillman, Smyth, and Christian, 2014) has been utilised and modified in many different contexts for rural landholder surveys (e.g. Brown, 2004; Wallen et al., 2016; Groth et al., 2017; Toman et al., 2019; Coon et al., 2020). Broadly speaking, this approach involves the systematic posting out or delivering in person of multiple surveys and reminders over several weeks or months. Van der Mol (2017) found that each of three reminders administered provided an important boost to responses, with the first reminder providing the largest increase. They also found noting the number of responses to date to be a helpful inclusion.
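As a minimal sketch of the probability sampling described above, the example below draws a stratified random sample from a hypothetical landholder sampling frame using pandas; the frame, column names and sampling fraction are all assumptions for illustration.

```python
# A minimal sketch of drawing a stratified probability sample from a
# (hypothetical) landholder sampling frame held as a pandas DataFrame.
import pandas as pd

def stratified_sample(frame: pd.DataFrame, stratum_col: str,
                      frac: float, seed: int = 42) -> pd.DataFrame:
    """Sample the same fraction from each stratum, so every member of the
    frame has a known chance of selection within its stratum."""
    return (frame.groupby(stratum_col, group_keys=False)
                 .apply(lambda g: g.sample(frac=frac, random_state=seed)))

# Toy sampling frame: 1000 landholders in three holding-size strata
frame = pd.DataFrame({
    "landholder_id": range(1, 1001),
    "holding_size_class": ["small"] * 600 + ["medium"] * 300 + ["large"] * 100,
})
sample = stratified_sample(frame, "holding_size_class", frac=0.1)
print(sample["holding_size_class"].value_counts())  # ~60 small, 30 medium, 10 large
```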
Non-probability sampling methods, such as convenience sampling or snowball sampling, may be used when probability sampling is impractical, but need to be acknowledged for their limitations. Kalton's (2023) recent review of sampling techniques identified that quota sampling is increasingly favoured for its cost and time efficiency. Quota sampling is a nonprobability sampling method that assumes respondents within a quota group represent the population equally. However, potential biases persist, as nonresponses might be filled by similar respondents, possibly skewing the results. He also describes a number of pseudo-probability sample designs for "hard-to-survey populations", with smaller samples targeted across a number of clusters, for example seven participants in each of thirty villages, a design often known as the 30 × 7 design (Kalton, 2023).
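The 30 × 7 pseudo-probability design can be illustrated in a few lines; the village list, household lists and cluster counts below are invented purely to show the two-stage selection logic.

```python
# A rough illustration of the '30 x 7' design mentioned above: 30 clusters
# (e.g. villages) are selected at random, then 7 respondents drawn in each.
# All names and counts here are hypothetical.
import random

random.seed(1)
villages = [f"village_{i:03d}" for i in range(1, 201)]                     # frame of clusters
households = {v: [f"{v}_hh_{j}" for j in range(1, 81)] for v in villages}  # households per cluster

selected_villages = random.sample(villages, 30)                            # stage 1: 30 clusters
sample = {v: random.sample(households[v], 7) for v in selected_villages}   # stage 2: 7 per cluster

print(sum(len(hh) for hh in sample.values()))  # 210 respondents in total
```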
Other possible sampling approaches include snowballing, using networks, usually to achieve a targeted sample; and location sampling, where surveys are collected at a specific location such as a supermarket. Opt-in panels may also be used, being cohorts of persons who have identified that they are happy to complete various surveys (Kalton, 2023). He warns that online surveys can lead to substantial biases in the sample selection process, particularly as there are still large groups in the population who may not access the survey through these channels. While some authors argue that incentives can be useful for increasing response rates (Huang et al., 2003; Glas et al., 2019), it has also been argued that there is further room for biasing of samples when payment is offered for survey completion (Glas et al., 2019), which can make the representativeness of the sample 'highly questionable' (Kalton, 2023, p. 11).
Logan et al. (2020) report on best-practices identified across three political science face-to-face survey projects that were conducted in over 80 countries. The pre-fieldwork best-practices they identified focus on ensuring data integrity through partner selection, questionnaire design, selecting and training fieldworkers, and ensuring appropriate fieldworker support. The fieldwork stage requires a focus on ensuring 'real-time quality control' of implementation errors and survey fraud. The post-fieldwork stage aims to detect data quality issues by implementing strategies such as recontacting a portion of respondents, collecting para-data (information on how the survey data were collected, e.g., interviewer, geolocation, time, date etc.), and using analysis methods to identify any data collection issues (Logan et al., 2020).
A small number of recent studies have explored best-practice methods for increasing survey response rates among respondents to surveys on agriculture and/or natural resource management. Drop-off/pick-up surveys have been identified as an effective alternative to mail-out or phone surveys (Allred and Ross-Davis, 2011; Trentelman et al., 2016; Jackson-Smith et al., 2016). Allred and Ross-Davis (2011) found that a drop-off/pick-up method received a 70 % response rate, in direct comparison to a mail-out survey that received a 50 % response rate. Part of the higher response rate was achieved by the ease with which the survey team could identify ineligibility when surveys were not picked up. Jackson-Smith et al. (2016) found that the drop-off/pick-up approach worked better in more sparsely populated rural areas than in higher-density urban areas. Medway and Fulton (2012) found that offering a concurrent online option lowered the response rate when compared to a mail-only survey; however, other research has found a combined method of providing paper and email reminders, for either a postal or online survey, to be quite effective, particularly with a younger target population (Patrick et al., 2022).
Offering mixed mail and online options simultaneously can increase
response rates (Wallen et al., 2016; Avemegah et al., 2021). However,
Avemegah et al. (2021) noted that farmers preferred mail surveys. They
also identified that working alongside a government department to help distribute the survey aided an improved response rate. A rural study
involving 3104 households found that 1066 residents participated, with
971 opting for mail surveys and 275 choosing the Web method
(Cantuaria and Blanes-Vidal, 2019). The study concluded that mail and
Web surveys may yield different responses and reach different respon-
dent groups. Mail surveys were associated with higher reporting of
health symptoms and negative attitudes towards environmental
stressors, even after adjusting for demographic factors, which may be
due to differing age of respondents. Other studies have found that
web-based respondents tend towards being younger and wealthier
(Wallen et al., 2016). These findings demonstrate that a mixed implementation method may enable more balanced demographics in survey response.
Luke et al. (2014) took a slightly different approach to surveying local residents in regards to a key natural resource management issue. They pioneered an election-sampling approach that was found to achieve a sound cross-sectional sample of the adult (voting) population, for the purpose of determining community perceptions of land use change. The survey was aligned with an election poll in which 97 % of the voting population participated, finding an 86.9 % opposition rate to an industrial development. At the polling booths, trained volunteers encouraged completion of a one-page exit survey, receiving an 87.1 % opposition rate to a question identical to the poll question, while sampling just 5 % of the total population, thereby firmly demonstrating the representativeness of this method of data collection. It is likely that the training provided to volunteers implementing the survey contributed to the rigour of the results (Logan et al., 2020).
The Rural Wellbeing survey implemented over a decade in New
South Wales, Australia, used a combined sampling approach (Schirmer
et al., 2016; Amorsen et al., 2023). This included re-inviting previous
participants; stratified sampling of residential addresses and farmers; and recruitment through networks, combined with targeted social media
advertising on Facebook and Instagram, with a prize draw offered to
increase survey participation. This method achieved a substantial
sample of 6000 rural residents, including 649 farmers. A challenge of this approach was that estimating a survey response rate was not possible due to the multiple and non-traditional recruitment methods employed.
Representativeness of the sample was instead estimated through demographic comparisons with census data. Researchers should carefully consider the trade-off between response rate increase, demographic mix and potential measurement changes when employing such a broad diversity of mixed-mode implementation methods.
3.2.3. Analysis approaches
Researchers need to carefully consider the focus of their interest
when developing a survey, which includes advance planning of analysis
strategies to ensure that questions achieve what is intended. Quantitative surveys often employ descriptive statistics, such as means, percentages, or correlations, while qualitative surveys may involve
thematic analysis or content analysis, often displayed visually (Rouder
et al., 2021). Commonly employed analysis techniques for quantitative
survey data are descriptive statistics, binary logistic regression, t-tests,
ANOVAs and chi-square tests (Curtis et al., 2005; Wallen et al., 2016).
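A brief, hypothetical example of the commonly reported analyses mentioned above is sketched below: descriptive statistics by group and a chi-square test of association, using invented data and the scipy library.

```python
# A short sketch of two commonly reported analyses: adoption rates by group
# (descriptive statistics) and a chi-square test of association between
# landholder group and practice adoption. The data are invented.
import pandas as pd
from scipy import stats

data = pd.DataFrame({
    "group": ["farmer"] * 120 + ["non-farmer"] * 80,
    "adopted_practice": [1] * 70 + [0] * 50 + [1] * 25 + [0] * 55,
})

# Descriptive statistics: adoption rate by group
print(data.groupby("group")["adopted_practice"].mean())

# Chi-square test of independence on the group x adoption contingency table
table = pd.crosstab(data["group"], data["adopted_practice"])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```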
Very few published landholder studies have been able to undertake analyses over time (longitudinally), which can provide a particularly interesting understanding of the ways in which our farming regions are evolving. For example, Toman et al. (2019) investigated rural landowners' held values, assigned values, beliefs, attitudes and self-assessed knowledge of NRM and specific farming practices, measuring the stability of variables over a 15-year period in one Australian region.
found that across the survey population, knowledge and values were far
more changeable than beliefs and expectations (Toman et al., 2019).
Kartsounidou et al. (2023) found that completion of repeated questions
across instances of longitudinal survey implementation had a positive
effect on response behaviour.
A comparative analysis of survey data can be used to draw out similarities and trends between sample populations or subgroups. For example, a random sample of 2200 rural property owners was surveyed in North Central Victoria (Sanderson and Curtis, 2017), with a comparative analysis undertaken between irrigators and dryland farmers. Factor analyses were used to develop binary indicators to identify differences between the two groups. Logistic regression was used to estimate differences in the probability of each group implementing conservation-related management practices, with this method able to identify that while values were similar, irrigators attached slightly different values to their land, and had a stronger business orientation (Sanderson and Curtis, 2017). Luke (2017) repeated a methodology developed by Luke et al. (2014) to compare views of unconventional gas development between two neighbouring local government areas, with substantially different background demographics and geographies. A combination of data sources including census data was used to complement the analysis and build a picture of the neighbouring localities.
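In outline, the factor-analysis-to-logistic-regression workflow described for the irrigator and dryland comparison could look something like the sketch below; this is not the original authors' code, and all data, variable names and the one-factor structure are hypothetical.

```python
# A highly simplified sketch of deriving factor scores from attitudinal items,
# converting them to a binary indicator, and estimating group differences in
# the probability of practice implementation with logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = pd.DataFrame(rng.normal(size=(300, 5)),
                     columns=[f"value_item_{i}" for i in range(1, 6)])

# One-factor solution summarising the value items into a single score
scores = FactorAnalysis(n_components=1, random_state=0).fit_transform(items)
business_orientation = (scores[:, 0] > 0).astype(int)   # binary indicator

irrigator = rng.integers(0, 2, size=300)                # 1 = irrigator, 0 = dryland
adopted = rng.integers(0, 2, size=300)                  # practice implemented (toy outcome)

X = sm.add_constant(pd.DataFrame({"irrigator": irrigator,
                                  "business_orientation": business_orientation}))
model = sm.Logit(adopted, X).fit(disp=False)
print(model.summary())
```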
As mentioned above, landholder attitudes and motivations in regards
to land and water management are often explored using the Theory of
Planned Behaviour (e.g. Lynne et al., 1995; Fielding et al., 2005). Proponents of this theory often draw on regression and structural equation analytic methods to test relationships within a model, though they emphasise the importance of the elicitation phase in the application of behaviour models, in grounding the relative importance of model elements (see Montano and Kasprzyk, 2015 for further details on statistical design relating to such models).
Assigned values are associated with physical landscape features, and integrating survey data with a range of biophysical data, including salinity discharge sites, remnant vegetation, and topography, allows researchers to make inferences based on actual landscape features, issues and functions. Research frameworks have been developed for integrating landscape values with biophysical and spatial landscape information. This notably includes the work of the late Professor Greg Brown (Brown, 2004; Brown and Donovan, 2014; Brown and Kyttä, 2014; Brown, Reed and Raymond, 2020), who used Geographic Information Systems (GIS) software such as ArcView Spatial Analyst to generate descriptive land-use/values maps, for example to identify whether landholders have a sound understanding of land-degradation issues relevant to their landscape (Curtis et al., 2005). However, a targeted search of literature in this specific field indicates that those undertaking this sort of research are a relatively small group, with this remaining an under-utilised cross-disciplinary intersection.
Ongoing developments in methods for survey analysis include deep stratification, variance estimates and new statistical or qualitative analytic techniques, followed by developments with a 'mixed-mode of inference', which includes techniques for dealing with missing data and small area estimation (Kalton, 2019). For example, Kimberly Brown et al. (2022) developed an analysis approach to provide an understanding of wellbeing as linked to self-efficacy in land management. Exploratory factor analysis was used for clustering of principles related to regenerative agriculture. A regression analysis was performed to determine covariates significantly contributing to outcome variables and entered into Structural Equation Modelling, which then contributed to the construction of mediation models.
Bayesian Network analyses are recognised for their utility in integrating social, ecological, and economic factors to analyse complex issues (Bromley et al., 2005). Bayesian Networks integrate diverse data and knowledge, offering a simplified approach to representing complex systems with variables linked by causal relationships (Hernandez et al., 2024). These networks document system understanding and assumptions, using probability distributions to describe cause-and-effect relationships and quantify their strengths. The process of developing Bayesian models is conducive to cross-disciplinary collaborations and stakeholder involvement, supporting collaborative model development, testing, and utilisation (Bromley et al., 2005). They have been used to support adaptive management processes by facilitating planning, action, monitoring, and review (Smith et al., 2007). Ticehurst et al. (2011) combined conventional statistical analysis with Bayesian Network analysis to examine landholder adoption of practices. Cross-triangulation with conventional statistics was found to be a useful exercise for cross-checking and reinforcing the findings of the Bayesian models, as well as for feeding results back to stakeholders.
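A toy example in the spirit of such Bayesian Network analyses is sketched below, assuming the pgmpy library; the network structure (advice and attitude influencing adoption) and all probabilities are invented for illustration only.

```python
# A toy Bayesian Network, assuming pgmpy. Structure and probabilities are
# invented: advice -> adoption <- attitude.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("advice", "adoption"), ("attitude", "adoption")])

cpd_advice = TabularCPD("advice", 2, [[0.6], [0.4]])      # P(no advice)=0.6, P(advice)=0.4
cpd_attitude = TabularCPD("attitude", 2, [[0.5], [0.5]])
cpd_adoption = TabularCPD(
    "adoption", 2,
    # P(adoption | advice, attitude): columns follow the evidence combinations
    [[0.9, 0.7, 0.6, 0.2],   # adoption = 0
     [0.1, 0.3, 0.4, 0.8]],  # adoption = 1
    evidence=["advice", "attitude"], evidence_card=[2, 2],
)
model.add_cpds(cpd_advice, cpd_attitude, cpd_adoption)
assert model.check_model()

# Query: probability of adoption given the landholder received advice
result = VariableElimination(model).query(["adoption"], evidence={"advice": 1})
print(result)
```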
3.3. Challenges in social survey research
Challenges in survey research include falling response rates and consequent cost increases in administering probability-based sampling methods (Connelly et al., 2003; Rookey et al., 2012; Kalton, 2019). Low survey response rates have been attributed to a range of factors, including a reluctant target population; low topic salience; survey timing; number of mailings; over-complexity of questions; font size; and sampling frame problems (Connelly et al., 2003). Other factors include a lack of familiarity with topics, response burden, participant age and understanding of the topics discussed (Brown, 2004). The year of survey implementation has the strongest correlation with the response rate, found by Connelly et al. (2003) to be declining at an average rate of 0.77 % per year. This was a similar finding to that of Stedman et al. in a 2019 study, who identified an annual decline of 0.76 %, predicting a "dramatic and steady decline in response rate over time" and projecting an average response rate of 21 % for mail-out surveys by the 2030s. Declining response rates combined with rising costs are leading to an increased use of low-cost online surveys and administrative data sources such as panels (Lehdonvirta et al., 2021), with calls for appropriate evaluation metrics for surveys associated with non-probability samples (Kalton, 2019).
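The cited decline rates translate into a simple linear projection, illustrated below; the 2003 starting response rate is an assumption for illustration, and the decline is treated as percentage points per year.

```python
# A quick arithmetic illustration of the decline rates cited above, projecting
# a mail-out response rate forward at ~0.77 percentage points per year.
start_year, start_rate = 2003, 0.45      # assumed 45 % response rate in 2003
annual_decline = 0.0077                  # ~0.77 percentage points per year

for year in (2019, 2030, 2035):
    projected = start_rate - annual_decline * (year - start_year)
    print(f"{year}: projected response rate ~ {projected:.0%}")
# Under these assumptions the projection falls into the low twenties (per cent)
# over the 2030s, broadly in line with the ~21 % figure cited above.
```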
Rybak (2023) conducted a meta-analysis of implementation methodologies and their comparative response bias, finding that mail-out and mixed delivery modes resulted in significantly lower levels of response bias. They also note that the rigour of implementation method and questionnaire design played an important role in minimisation of
response bias. While confirming that nonresponse bias is an increasing issue for mail-out surveys of rural landowners, Coon et al. (2020) identified 'little consistent evidence of nonresponse bias' and recommend using multiple methods to triangulate and assess it. They suggest careful consideration of rural contexts in the survey design, including tailoring to older audiences (increased font size, accessible formats), engaging underrepresented groups, gamification, and considering nonresponse bias as far as possible during survey development. They also argue that surveys with low response rates can yield meaningful data, particularly when nonresponse bias is carefully considered in the design stages (Coon et al., 2020). There are many surveys out there, and landholders can have them flying in from many directions for multiple purposes, often funded from public sources. While the literature shows that an overuse of surveys is detrimental to response rates, it remains clear that response rates can be improved through higher levels of quality in design and implementation (Rybak, 2023).
3.4. Opportunities and innovations in survey methods
Engaging people to complete surveys about natural resource management and farming will remain an ongoing challenge, as the way people perceive surveys, and the way they access information, continues to change. This analysis provides evidence that tailored approaches to delivering paper questionnaires remain an important method for understanding rural landholders, yet different stakeholders may access surveys in different and evolving ways. Younger landholders and residents access and use information in ways that may change quite quickly, from Twitter to Instagram to TikTok, so ensuring that surveys are able to sample these groups successfully requires a consideration of current trends, which often tend towards online or social media sources (Ackoff et al., 2017; Luke et al., 2020). Online tools and apps such as Mentimeter™, which can take advantage of moments when landholders are gathered to collect and share survey data in real time, are increasingly being used in agricultural circles (Aplin et al., 2023; Bull et al., 2024). While there may be limitations to the complexity of data that can be collected, there is a major advantage in the provision of real-time feedback to participants. This method enables an instant capacity for landholders to reflect on their own views in the broader context of those in attendance at the event where the Mentimeter™ survey is undertaken.
Innovations and modifications to survey research may be as simple as a movement away from quantitative methodologies towards the use of open-ended questions, which is far less common in the literature. Wehde and Perreault (2022) used a qualitative survey design with a format of open-ended, interview-type questions to elicit respondent narratives
relating to perceptions of climate change and related policy. Another
major opportunity is broadening the reach of social surveys over space
and time. Prakash and Bernauer (2020) noted that with over 50 % of
environmental opinion surveys being undertaken in the United States,
supporting global understandings of policy and public opinion in a
cross-national context requires more studies around the world. They also
call for longitudinal survey experiments to better understand long-term trends
in environmental management and views of land-use policy over time,
of which there are relatively few detailed studies, aside from broad
census data.
Land management is complex, with people and the things they value at the centre, and some surveys will require collection of sensitive data. Sensitive issues may be as straightforward as the provision of financial information, while others could involve unfriendly or illegal activity. Kianersi et al. (2020) found that, when surveys are conducted face to face, the gender of the researcher could impact the nature of responses given to sensitive questions, raising the importance of including cultural considerations when designing survey implementation strategies. Through a review of peer-reviewed literature relating to natural resource management (NRM), Nuno and St. John (2015) identified seven commonly used techniques aimed at reducing non-response and social desirability bias, which were: randomised response techniques; the nominative technique; the unmatched-count technique; the grouped-answer method; crosswise, triangular, diagonal and hidden sensitivity models; surveys with negative questions; and the 'bean' method. They recommend that NRM researchers consider the incorporation of sensitive questioning techniques, carefully considering the appropriateness of each approach.
Malnar and Ryan (2021) note that limited attention has been paid to
the way big social data and other high-volume IT data are integrated
into research programs, and that further research is required to understand how this can be achieved effectively. While opportunities may arise from these data sources, the use of secondary data also brings challenges, such as
the meaningfulness of responses based on the original questions posed
and the sampling approach taken, as well as ethical considerations about
its use. Often, multiple data sets may need to be brought together to be
able to answer research questions, which opens both challenges and
opportunities to develop new ways to bring them together (Weersink
et al., 2018; Kalton, 2023).
Recent advancements in artificial intelligence (AI), notably with the rise of large language models, prompt a reassessment of artificial general intelligence (AGI) possibilities. Potentially addressing an issue in the third era of social survey design, where our ability to generate 'big data' exceeds our ability to manage, analyse and use data about land management at farm level up to policy level (Weersink et al., 2018), AI is rapidly becoming a central tool in social science research, offering efficiencies across key tasks and opportunities involving data about how to manage land and farms (Ndiema, 2024). For research purposes, it aids in literature searching and review (Grossmann et al., 2023), proposes questions and hypotheses (Park et al., 2024), analyses data (Ziems et al., 2024) and assists with writing (Dergaa et al., 2023). While some of these early researchers on the efficacy of AI in research warn about high error rates, others have suggested that it might sometimes result in higher quality work than that which is human generated, particularly in the analysis of qualitative data (Banker et al., 2023; Xu et al., 2024). As AI advances and integrates further into daily life, the merging of AI and social science grows rapidly more significant, with frameworks required for aiding researchers to step into new research spaces within these emerging realms (Xu et al., 2024).
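As a heavily hedged sketch of how a large language model might assist with qualitative analysis of open-ended responses, the example below assumes the OpenAI Python client; the model name, prompt and theme list are placeholders, and any machine-generated codes would require human checking, consistent with the cautions discussed here.

```python
# A hedged sketch of LLM-assisted thematic coding of open-ended survey
# responses, assuming the OpenAI Python client. Model name, prompt and
# theme list are placeholders; outputs would still need human review.
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

themes = ["soil health", "financial pressure", "succession planning", "water security"]

def code_response(free_text: str) -> str:
    """Ask the model to assign one of the predefined themes to a response."""
    prompt = (
        "Classify the following survey response into exactly one of these themes: "
        f"{', '.join(themes)}.\n\nResponse: {free_text}\n\nTheme:"
    )
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content.strip()

print(code_response("We can't plant more trees until the bank pressure eases off."))
```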
The rapid evolution of AI brings new questions about how social science research practices can be adapted, or possibly reinvented, to harness the power of foundational AI (Grossmann et al., 2023). The rise of AI and the extension of its applications for research is also accompanied by ethical challenges around transparency and replicability of method (Grossmann et al., 2023; Dergaa et al., 2023). In relation to land management, further opportunities arise with the integration of data sets. To minimise the risk of further environmental degradation and social displacement caused by AI-generated data or decisions, Ndiema (2024) urges that sustainability be considered a core value within the ethical frameworks being established for AI. More broadly, caution and consideration are urged in relation to the potential impact of AI on the credibility and authenticity of academic work, with Dergaa et al. (2023, p. 615) highlighting the importance of:
“Comprehensive discussions on the potential use, threats, and limi-
tations of these tools, emphasizing the importance of ethical and
academic principles, with human intelligence and critical thinking at
the forefront of the research process.”
4. Discussion
This review finds that the evolution of sociological survey implementation spans decades, transitioning from probability sampling to an increased use of non-probability sampling approaches, driven by technological advancements and societal trends, and it continues to evolve. Social surveys remain a key tool for understanding drivers of change and the adoption of innovations relating to land management, with Dillman's (1978) implementation method remaining a useful backbone of rigorous data collection in rural contexts, particularly where data needs
to be representative.
The literature reviewed in this paper strongly emphasises the value of employing best practices in survey implementation to maintain data integrity. Effective survey design involves researchers developing and communicating a strong understanding of the survey's purpose, developed collaboratively with relevant stakeholders to ensure engagement and relevance (Curtis et al., 2005; Bromley et al., 2005; Smith et al., 2007; Smith et al., 2013; Schmidt et al., 2020). Sections and questions should be introduced thoughtfully, prioritising essential topics, minimising length and employing simple language to enhance respondent comprehension and increase completion rates. Effective question construction and response formats, including Likert scales and open-ended questions, facilitate accurate data collection and allow for nuanced analysis, enabling both patterns of response and the meaning behind those responses to be present in the data (Parsons and Luke, 2021). A key consideration in analysis approaches for social surveys includes clear alignment of questions with intended analyses using appropriate quantitative or qualitative questions and response options (Wallen et al., 2016). However, the use of mixed-method surveys that integrate both qualitative and quantitative questioning within the questionnaire design remains less common.
Conducting pre-tests and pilot studies to refine survey instruments and understanding how different sampling techniques may or may not ensure generalisability is key to ensuring that surveys achieve their desired outcomes (Conrad and Blair, 2004). Utilising varied implementation methods, such as drop-off/pick-up, mail-out or mixed-mode approaches, can enhance response rates (Huang et al., 2003) and yield
more diverse respondent groups, while leveraging multiple recruitment
strategies (Amorsen et al., 2023). While such approaches can boost
sample sizes, challenges such as ensuring sample representativeness and
capacity to estimate response rates are important to keep front of mind
(Connelly et al., 2003).
Fig. 3 shows a best-practice approach for considering effective design and implementation of social surveys for understanding farming and natural resource management.
Analytical approaches can add much value to survey data, and
having a broad knowledge of the available analytical tools can support
the best possible utilisation of a suite of techniques, such as descriptive
statistics, network analysis, regression analysis, and factor analyses for
exploring trends and associations. Longitudinal analyses provide valuable insights into evolving farming regions (Kartsounidou et al., 2023),
while the relatively rare practice of integrating spatial data enhances
understanding of landscape values and land management practices
(Brown, 2004). Ongoing developments in analysis methods aim to
address challenges such as missing data (Kalton, 2019). Small area
estimation and sensitive questioning techniques can allow for more
comprehensive insights into landholder attitudes and behaviours (Nuno
and St. John, 2015).
An integral element of questionnaires aimed at understanding land and resource management is the integration of conceptual models, such as the theory of planned behaviour, and identity constructs into survey design (Seymour et al., 2010). Many of the land management surveys identified in this review used held and assigned values related to specific natural assets and relevant influencing factors (Ajzen, 1991; Fulton, 1996; Stern, 2000; Seymour et al., 2010; Montano and Kasprzyk, 2015). Differences in self-categorisation, behavioural involvement, and social embeddedness among various categories of rural landholders can aid in
the understandings of natural resource management within local and
national contexts (Groth et al., 2017). A lack of harmony or cohesion
among models in this space indicates a need to continue to develop and
deepen understandings of how the complex attributes of landholders
and their priorities drive decision-making about land.
The future of social survey research in land management and natural
resource management presents both challenges and opportunities for
methodological advancements. Falling response rates in probability-
based samples and the rise of low-cost online surveys pose challenges,
compounded by concerns about non-response bias and increasing survey
saturation. However, strategies such as well-tailored survey design,
engagement of underrepresented groups, and development of surveys
with close consideration of rural contexts offer avenues for improving
response rates and data quality. Embracing mixed methods approaches,
newly developed or refined conceptual frameworks and sensitive questioning techniques can provide deeper insights into complex land management issues. Adapting survey methods to cater to changing information consumption patterns, particularly among younger demographics through online and social media platforms, can enhance sample representation and data relevance.
Groves (2011, p. 870) states that 'survey research is not dying; it is
changing’. He argues that there will always be a need for large-scale
population surveys, given the rich information gleaned through survey
data, including 'thoughts, aspirations, and behaviours of large populations'. He argues that the real challenge moving forward is not to implement mixed-mode research designs but to effectively integrate 'designed data with organic data', allowing for survey data to be supplemented with other data sets.
Fig. 3. Best-practice approach for undertaking social surveys in the context of farming and natural resource management.
Expanding survey research globally and prioritising well-funded, longitudinal studies can advance understanding of environmental
management and support evidence-based policymaking, addressing key
gaps in current practical knowledge and research (Toman et al., 2019;
Prakash and Bernauer, 2020; Kartsounidou et al., 2023). As we move into our brave new world of social survey design for environmental management, integration of extensive social, environmental, and economic datasets, possibly facilitated by AI, has the potential to substantially enhance survey research efficacy, including applications for large data sets and in bringing together different data sources. It also has the potential to reduce the need for excessive surveying of rural populations. However, challenges such as data integration, ethical concerns, and permissions or access barriers remain key considerations. While artificial intelligence offers promise for transforming social science research, ensuring ongoing rigour remains crucial.
While the literature on effective social survey design is extensive, and this purposeful review does not claim to be exhaustive, a number of insights have been provided regarding how surveys for natural resource management can be effectively developed, implemented and analysed in the context of changing lands and a changing world. Shifts in survey design and implementation have been documented over different eras by Groves (2011) and Kalton (2019), reflecting a move towards online self-administered surveys and hybrid methods while raising concerns about declining response rates and data quality. More recent publications explore the opportunities and challenges that accompany the integration of AI with the social sciences (Grossmann et al., 2023; Dergaa et al., 2023) and how this may impact land and the environment (Ndiema, 2024). The evolution of survey research can thus be summarised into four main eras, with a Brave New Era having emerged with the advent of readily available AI:
Invention Era (1930–1960):
• Sampling units based on geographic areas.
• Methods primarily included face-to-face and mail-out surveys, with high response rates due to smaller sample sizes and novelty.

Expansion Era (1960–1990):
• Introduction of technology (e.g., telephones) enabling larger sample sizes and broader population coverage.
• Increased funding and adoption of new methodologies such as the Tailored Design Method for survey distribution.

Integration Era (‘Designed Data’ + ‘Organic Data’) (1990s to 2022):
• Emergence of online self-administered surveys leading to large-scale data collection.
• Shift towards integrating traditional survey methods with existing data sets (e.g., census data), enabling richer and more complex analyses.

Brave New Era (2022 to present):
• Emergence of readily available artificial intelligence software for analysis, programming, writing and more.
• Rapidly expanding tools for data-set integration, with vast opportunities accompanied by substantial concerns about the potential challenges they may bring.
• This expansion is occurring in the context of a society that is continually changing the way people interact with and share information; fast data collection and instantly displayed results are becoming more popular.
5. Conclusions
Social surveys continue to play an important role in understanding
the dynamics of land and farm management, evolving over decades to
adapt to technological advancements within changing societal and
research landscapes. This paper and the best-practice guidelines illustrated in Fig. 3 provide a useful updated review of the literature
on social survey design for the purposes of research students, staff, and
natural resource management organisations seeking to undertake their
own surveys of rural landholders.
Key learnings from those implementing rural landholder surveys
around the world emphasise the importance of pre-tests, diverse sam-
pling techniques, and tailored survey methods to maintain data integrity
while enhancing response rates. Through choosing suitable response formats, communicating clearly, and constructing questions that are well linked with data analysis methods, researchers can maximise the quality and accuracy of survey data. Integration of conceptual models and identity constructs enriches understanding of land management practices. Ongoing work is needed to fully integrate some of the key drivers of land management behaviour in different contexts, including drawing out the complex relationships among attached values, beliefs, social norms, practical barriers or enablers of behaviour, and place-based features (Brown et al., 2020). There is a rich suite of theoretical, qualitative, quantitative, mixed-methods and AI-based opportunities for achieving this.
Adhering to these principles of best-practice social survey design for land and natural resource management enhances the validity, reliability and therefore usefulness of survey findings, and also enables researchers to draw meaningful conclusions. Future opportunities lie in embracing mixed methods, adapting to changing information consumption patterns, and integrating extensive datasets with AI, although challenges such as data integration and ethical considerations must remain front of mind. Expanding survey research globally, including studies across space and over time, can advance environmental management understanding and evidence-based policymaking, bridging critical gaps in knowledge as management of land and natural resources evolves around the globe.
Submission declaration and verification
Submission of this article implies that the work described has not
been published previously (except in the form of an abstract or a pub-
lished lecture), that it is not under consideration for publication else-
where, that its publication is approved by all authors, and that, if
accepted, it will not be published elsewhere in the same form, in English
or in any other language, including electronically without the written
consent of the copyright-holder. I recognise that to verify compliance,
the article may be checked by Crossref Similarity Check and other
originality or duplicate checking software.
CRediT authorship contribution statement
Hanabeth Luke: Writing – review & editing, Writing – original draft,
Validation, Supervision, Resources, Project administration, Methodol-
ogy, Investigation, Funding acquisition, Formal analysis, Data curation,
Conceptualization.
Declaration of Generative AI and AI-assisted technologies in the
writing process
During the preparation of this work and following the literature re-
view, the author made very minimal use of ChatGPT 3.5 in order to
provide an overview of themes arising from the review, which provided
a little guidance on initial structuring. After using this tool, the author
wrote and heavily edited the rest of the paper, and takes full re-
sponsibility for the content of the publication.
Declaration of Competing Interest
As corresponding and sole author, I understand that I must disclose
any financial and personal relationships with other people or organizations that could inappropriately influence (bias) my work. This work has been supported by the Soil CRC, as acknowledged, and I declare that there are no conflicts of interest or other competing interests relevant to this paper and its publication. It is entirely my own work, other than minimal assistance from ChatGPT as declared above.
Appendix: Literature Reviewed
Ackoff, S., A. Bahrenburg, and L. L. Shute. 2017. Building a future with
farmers II: Results and recommendations from the National Young Farmer
Survey. https://www.youngfarmers.org/wp-content/uploads/2018/
02/NYFC-Report-2017.pdf
Allan, C., Cooke, P., Higgins, V., Leith, P., Bryant, M., & Cockfield, G.
(2022). Adoption; a relevant concept for agricultural land management
in the 21 century?. Outlook on Agriculture, 51(4), 375–383. https://do
i-org.ezproxy.scu.edu.au/10.1177/0030727022112654
Ajzen, I. (1991). The theory of planned behavior. Organizational
behavior and human decision processes, 50(2), 179–211. https://doi.org/1
0.1016/0749–5978(91)90020-T
Amorsen, G., Mylek, M., and Schirmer, J. (2023). NSW farmers’
exposure to adverse events. Available from: https://www.droughthub.ns
w.gov.au/__data/assets/pdf_le/0010/1531909/202223-Regional-We
llbeing-Survey-Report-NSW-farmers-exposure-to-adverse-events.pdf
Aplin, K., Morgans, L., Palczynski, L., Main, D., Debbaut, C., Hep-
worth, L., & Reed, J. (2023). Calf health veterinary services: Making
them work for calves, farmers and veterinarians. Veterinary Record, 193
(8). https://doi.org/10.1002/vetr.3051
Ashmore, R. D., Deaux, K., and McLaughlin-Volpe, T. (2004). An
organizing framework for collective identity: articulation and significance of multidimensionality. Psychological bulletin, 130(1), 80.
https://doi.org/10.1037/0033–2909.130.1.80
Avemegah, E., Gu, W., Abulbasher, A., Koci, K., Ogunyiola, A.,
Eduful, J., Li, S., Barington, K., Wang, T., Kolady, D., Perkins, L., Leffler, A. J., Kovács, P., Clark, J. D., Clay, D. E., and Ulrich-Schad, J. D. (2021).
An Examination of Best Practices for Survey Research with Agricultural
Producers. Society and Natural Resources, 34(4), 538–549. https://doi.
org/10.1080/08941920.2020.1804651
Banker, S., Chatterjee, P., Mishra, H., and Mishra, A. (2023).
Machine-assisted social psychology hypothesis generation. PsyArXiv.
February, 25. https://doi.org.ezproxy.scu.edu.au/10.1037/
amp0001222
Batterton, K. A., and Hale, K. N. (2017). The Likert scale what it is
and how to use it. Phalanx, 50(2), 32–39. https://www.jstor.org/stable/
26296382
Bauer, M. W., Allum, N., and Miller, S. (2007). What can we learn
from 25 years of PUS survey research? Liberating and expanding the
agenda. Public understanding of science, 16(1), 79–95. https://doi.org/1
0.1177/0963662506071287
Bromley, J., Jackson, N.A., Clymer, O.J., Giacomello, A.M., Jensen,
F.V., 2005. The use of Hugin to develop Bayesian Networks as an aid to
integrated water resource planning. Environmental Modelling and Soft-
ware 20, 242e251. https://doi.org/10.1016/j.envsoft.2003.12.021
Brown, G. (2004). Mapping spatial attributes in survey research for
natural resource management: Methods and applications. Society and
Natural Resources, 18(1), 17–39. https://doi.org/10.1080/08941920
590881853
Brown, G., & Donovan, S. (2014). Measuring change in place values
for environmental and natural resource planning using public partici-
pation GIS (PPGIS): results and challenges for longitudinal research.
Society & Natural Resources, 27(1), 36–54. https://doi.org/10.1080
/08941920.2013.840023
Brown, G., & Kyttä, M. (2014). Key issues and research priorities for
public participation GIS (PPGIS): A synthesis based on empirical
research. Applied geography, 46, 122–136. https://doi.org/10.1016/j.
apgeog.2013.11.004
Brown, G., Reed, P., & Raymond, C. M. (2020). Mapping place
values: 10 lessons from two decades of public participation GIS empir-
ical research. Applied Geography, 116, 102156. https://doi.org/10.10
16/j.apgeog.2020.102156
Brown, K., Schirmer, J., and Upton, P. (2022). Can regenerative
agriculture support successful adaptation to climate change and
improved landscape health through building farmer self-efcacy and
wellbeing? Current Research in Environmental Sustainability, 4, 100170.
https://doi.org/10.1016/j.crsust.2022.100170
Boudet, H., Clarke, C., Bugden, D., Maibach, E., Roser-Renouf, C., &
Leiserowitz, A. (2014). “Fracking” controversy and communication:
Using national survey data to understand public perceptions of hy-
draulic fracturing. Energy Policy, 65, 57–67. https://doi.org/10.1016/j.
enpol.2013.10.017
Bull, E. M., van der Cruyssen, L., Vágó, S., Király, G., Arbour, T., &
van Dijk, L. (2024). Designing for agricultural digital knowledge ex-
change: applying a user-centred design approach to understand the
needs of users. The Journal of Agricultural Education and Extension, 30(1),
43–68. https://doi.org/10.1080/1389224X.2022.2150663
Cantuaria, M. L., and Blanes-Vidal, V. (2019). Self-reported data in
environmental health studies: mail vs. web-based surveys. BMC medical
research methodology, 19, 1–13. https://doi.org/10.1186/s12874-0
19–0882-x
Clancy, A., Hovden, J. T., & Laholt, H. (2024). A descriptive and
interpretive theory of ethical responsibility in public health nursing.
Nursing Ethics. https://doi.org/10.1177/09697330241291161
Collins, D. (2003). Pretesting survey instruments: an overview of
cognitive methods. Quality of life research, 12, 229–238. https://doi.
org/10.1023/A:1023254226592
Connelly, N. A., Brown, T. L., and Decker, D. J. (2003). Factors
affecting response rates to natural resource-focused mail surveys:
empirical evidence of declining rates over time. Society and Natural
Resources, 16(6), 541–549. https://doi.org/10.1080/08941920309152
Conrad, F. G., and Blair, J. (2004). Data quality in cognitive in-
terviews: the case of verbal reports. Methods for testing and evaluating
survey questionnaires, 67–87.
Coon, J. J., van Riper, C. J., Morton, L. W., and Miller, J. R. (2020).
Evaluating nonresponse bias in survey research conducted in the rural
Midwest. Society and Natural Resources, 33(8), 968–986. https://doi.
org/10.1080/08941920.2019.1705950
Creswell, J. W. (2013). Research Design: Qualitative, Quantitative, and
Mixed Methods Approaches. SAGE Publications.
Cumming, R. G. (1990). Is probability sampling always better? A
comparison of results from a quota and a probability sample survey.
Australian and New Zealand Journal of Public Health, 14(2), 132–137.
https://doi.org/10.1111/j.1753–6405.1990.tb00033.x
Curtis, A., Byron, I., and MacKay, J. (2005). Integrating socio-
economic and biophysical data to underpin collaborative watershed
management. Journal of the American Water Resources Association, 41(3),
549–563. https://doi.org/10.1111/j.1752-1688.2005.tb03754.x
Dekkers, R., Carey, L., & Langhorne, P. (2022). Objectives and
positioning of literature reviews. In Making Literature Reviews Work: A
Multidisciplinary Guide to Systematic Approaches (pp. 25–56). Cham:
Springer International Publishing. https://doi.org/10.1007/
978–3-030–90025-0
Dergaa, I., Chamari, K., Zmijewski, P., and Saad, H. B. (2023). From
human writing to artificial intelligence generated text: examining the
prospects and potential threats of ChatGPT in academic writing. Biology
of sport, 40(2), 615–622. https://doi.org/10.5114/biolsport.20
23.125623
Dietz, T., Fitzgerald, A. and Shwom, R. (2005). ‘Environmental
values’, Annual Review of Environmental Resources, vol. 30, pp.335-372.
Dillman, D. A. (1978). Mail and telephone surveys. USA: John Wiley
and Sons.
Dillman, D. A., Smyth, J. D., and Christian, L. M. (2014). Internet,
Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John
Wiley and Sons.
Duan, N., Bhaumik, D. K., Palinkas, L. A., and Hoagwood, K. (2015).
Optimal design and purposeful sampling: Complementary methodolo-
gies for implementation research. Administration and Policy in Mental
Health and Mental Health Services Research, 42, 524–532. https://doi.
org/10.1007/s10488-014–0596-7
Fielding, K.S., Terry, D.J., Masser, B.M., Bordia, P., Hogg, M.A.,
2005. Explaining landholders’ decisions about riparian zone manage-
ment: the role of behavioural, normative and control beliefs. Journal of
Environmental Management 77, 12e21. https://doi.org/10.1016/j.
jenvman.2005.03.002
Fowler Jr, F. J. (2013). Survey Research Methods. SAGE Publications.
Fulton, D. C., Manfredo, M. J., and Lipscomb, J. (1996). Wildlife
value orientations: A conceptual and measurement approach. Human
dimensions of wildlife, 1(2), 24–47. https://doi.org/10.1080/10871
209609359060.
Glas, Z. E., J. M. Getson, Y. Gao, A. S. Singh, F. R.
Eanes, L. A. Esman, B. R. Bulla, and L. S. Prokopy. 2019. Effect of
monetary incentives on mail survey response rates for midwestern
farmers. Society and Natural Resources 32 (2):229–37. https://doi.org/1
0.1080/08941920.2018.1530815
Grossmann, I., Feinberg, M., Parker, D. C., Christakis, N. A., Tetlock,
P. E., and Cunningham, W. A. (2023). AI and the transformation of social
science research. Science, 380(6650), 1108–1109. DOI: 10.1126/sci-
ence.adi1778
Groth, T. M., Curtis, A., Mendham, E., and Toman, E. (2014). Farmer
Identity in Multifunctional Landscapes: using a collective identity
construct to explore the nature and impact of occupational identity.
Australian Geographer, 45(1), 71–86. doi: 10.1080/
00049182.2014.869297
Groth, T. M., Curtis, A., Mendham, E., and Toman, E. (2017).
Examining the agricultural producer identity: utilising the collective
occupational identity construct to create a typology and profile of rural
landholders in Victoria, Australia. Journal of environmental planning and
management, 60(4), 628–646. https://doi.org/10.1080/09640568.201
6.1165189
Groves, R. M., Fowler Jr, F. J., Couper, M. P., Lepkowski, J. M.,
Singer, E., and Tourangeau, R. (2011). Survey Methodology (2nd ed.).
John Wiley and Sons.
Hernandez, S., Luke, H., & Alexanderson, M. S. (2024). Is human
activity driving climate change? Perspectives from Australian land-
holders. Frontiers in Sustainable Food Systems, 8, 1392746. https://doi.
org/10.3389/fsufs.2024.1392746
Heyvaert M, Hannes K, Onghena P. (2017). Searching for relevant
studies. In: Using Mixed Methods Research Synthesis for Literature Reviews.
Los Angeles: Sage Publications, 69–112.
Huang, J. Y., Hubbard, S. M., and Mulvey, K. P. (2003). Obtaining
valid response rates: considerations beyond the tailored design method.
Evaluation and Program Planning, 26(1), 91–97. https://doi.org
/10.1016/S0149-7189(02)00091–5
https://doi.org/10.3390/systems2020119
Jackson-Smith, D., Flint, C. G., Dolan, M., Trentelman, C. K., Holy-
oak, G., Thomas, B., and Ma, G. (2016). Effectiveness of the drop-off/
pick-up survey methodology in different neighborhood types. Journal
of Rural Social Sciences, 31(3), 3. https://egrove.olemiss.edu/jrss/vol3
1/iss3/3/
Johns, C. (2022). Reflexive Narrative: Self-Inquiry Toward Self-
Realization and Its Performance In Sage Research Methods. Sage
Publications.
Kalton, G. (2019). Developments in survey research over the past 60
years: A personal perspective. International Statistical Review, 87, S10-
S30. https://doi.org/10.1111/insr.12287
Kalton, G. (2023). Probability vs. nonprobability sampling: from the
birth of survey sampling to the present day. Statistics in Transition. New
Series, 24(3), 1–22. https://www.ceeol.com/search/article-detail?
id=1127833
Kartsounidou, E., Kluge, R., Silber, H., and Gummer, T. (2023).
Survey experience and its positive impact on response behavior in lon-
gitudinal surveys: Evidence from the probability-based GESIS Panel.
International Journal of Social Research Methodology, 1–14. https://doi.
org/10.1080/13645579.2022.2163104
Kianersi, S., Luetke, M., Jules, R., and Rosenberg, M. (2020). The
association between interviewer gender and responses to sensitive sur-
vey questions in a sample of Haitian women. International Journal of
Social Research Methodology, 23(2), 229–239. https://doi.org/10.1
080/13645579.2019.1661248
Krosnick, J. A. (2018). Questionnaire design. Handbook of Survey
Research, Second Edition, 263–313.
Larson, L. R., Stedman, R. C., Cooper, C. B., & Decker, D. J. (2015).
Understanding the multi-dimensional structure of pro-environmental
behavior. Journal of environmental psychology, 43, 112–124. https://
doi.org/10.1016/j.jenvp.2015.06.004
Lehdonvirta, V., Oksanen, A., Räsänen, P., & Blank, G. (2021). Social
media, web, and panel surveys: using non-probability samples in social
and policy research. Policy & internet, 13(1), 134–155. https://doi.
org/10.1002/poi3.238
Lockwood, M. (1999). Humans Valuing Nature: Synthesising Insights
from Philosophy, Psychology and Economics. Environmental Values, 8
(3), 381–401. https://journals.sagepub.com/doi/pdf/10.3197/096327
199129341888
Logan, C., Parás, P., Robbins, M., and Zechmeister, E. J. (2020).
Improving data quality in face-to-face survey research. PS: Political Sci-
ence and Politics, 53(1), 46–50. https://doi.org/10.1017/S104909651
9001161
Luke, H. (2017). Social resistance to coal seam gas development in
the Northern Rivers region of Eastern Australia: Proposing a diamond
model of social license to operate. Land use policy, 69, 266–280.
https://doi.org/10.1016/j.landusepol.2017.09.006
Luke, H., Baker, C., Allan, C. and McDonald, S. (2020). Agriculture on
the Eyre Peninsula: Rural Landholder Social Benchmarking Report 2020.
Southern Cross University, NSW, 2480. ISBN 978–0-6450707–0-5.
Available from: https://soilcrc.com.au/technical-reports/
Luke, H., Baker, C., Allan, C., McDonald, S., and Alexanderson, M.
(2021). Agriculture in The Northern Wheatbelt: Rural Landholder Social
Benchmarking Report 2021. Southern Cross University, NSW, 2480. ISBN
978–0-6450707–1-2. Available from: https://soilcrc.com.au/technical
-reports/
Luke, H., Brueckner, M., & Emmanouil, N. (2018). Unconventional
gas development in Australia: A critical review of its social license. The
Extractive Industries and Society, 5(4), 648–662. https://doi.org/10.1016
/j.exis.2018.10.006
Luke, H., Lloyd, D., Boyd, W., and den Exter, K. (2014). Unconven-
tional gas development: why a regional community said no: a report of
findings from the 2012 Lismore City council election poll and exit poll
survey (New South Wales). Geographical Research, 52(3), 263–279.
https://doi.org/10.1111/1745–5871.12071
https://search.informit.org/doi/epdf/10.3316/ielapa.794922
375366543
Lynne, G.D., Casey, C.F., Hodges, A., Rahmani, M., 1995. Conser-
vation technology adoption decisions and the theory of planned
behaviour. Journal of Economic Psychology 16, 581e598. https://doi.
org/10.1016/0167–4870(95)00031–6
Malnar, B., and Ryan, L. (2021). Improving Knowledge Production in
Comparative Survey Research: Cross-Using Data from Four International
Survey Programmes. Czech Sociological Review, 57(6). https://www.ceeo
l.com/search/article-detail?id=1007851
Masterson, V. A., Stedman, R. C., Enqvist, J., Tengö, M., Giusti, M.,
Wahl, D., and Svedin, U. (2017). The contribution of sense of place to
social-ecological systems research: a review and research agenda. Ecol-
ogy and Society, 22(1). https://www.jstor.org/stable/26270120
Medway, R. L., & Fulton, J. (2012). When more gets you less: A meta-
analysis of the effect of concurrent web options on mail survey response
rates. Public opinion quarterly, 76(4), 733–746. https://doi.org
/10.1093/poq/nfs047
Meertens, R. M., and Lion, R. (2008). Measuring an Individual’s
Tendency to Take Risks: The Risk Propensity Scale. Journal of Applied
Social Psychology, 38(6). https://doi.org/10.1111/j.1559–1816.2008.00
357.x
McIntyre, N., Moore, J., and Yuan, M. (2008). A place-based, values
centred approach to managing recreation on Canadian crown lands.
Society and Natural Resources, 21, 657–670. https://doi.org/10.1080/08
941920802022297
Miller, Z. D. (2017). The enduring use of the theory of planned
behavior. Human Dimensions of Wildlife, 22(6), 583–590. https://doi.
org/10.1080/10871209.2017.1347967
Montano, D. E., and Kasprzyk, D. (2015). Theory of reasoned action,
theory of planned behavior, and the integrated behavioral model. Health
behavior: Theory, research and practice, 70(4), 231. Available from: htt
ps://www.researchgate.net/prole/Daniel-Montano-2/publication/
235360939_The_theory_of_reasoned_action_and_the_theory_of_planned_
behavior/links/65cbe04b1e1ec12eff8e4c5c/The-theory-of-reasoned
-action-and-the-theory-of-planned-behavior.pdf
Ndiema, K. W. (2024). Implications of Artificial Intelligence (AI) in
land management. Journal of Computer Science and Technology (JCST), 2
(1), 1–9. https://doi.org/10.51317/jcst.v2i1.490
Nuno, A., and St. John, F. A. S. (2015). How to ask sensitive questions
in conservation: A review of specialized questioning techniques. Bio-
logical Conservation, 189, 5–15. https://doi.org/10.1016/j.biocon.20
14.09.047
Norton, G. W., and Alwang, J. (2020). Changes in Agricultural
Extension and Implications for Farmer Adoption of New Practices.
Applied Economic Perspectives and Policy, 42(1), 8–20. https://doi.org
/10.1002/aepp.13008
Park, Y. J., Kaplan, D., Ren, Z., Hsu, C. W., Li, C., Xu, H., … and Li, J.
(2024). Can ChatGPT be used to generate scientific hypotheses? Journal
of Materiomics, 10(3), 578–584. https://doi.org/10.1016/j.jmat.2023.0
8.007
Pannell, D. J., Marshall, G. R., Barr, N., Curtis, A., Vanclay, F., and
Wilkinson, R. (2006). Understanding and promoting adoption of con-
servation technologies by rural landholders. Journal of Experimental
Agriculture, 46, 1407–1424. https://doi.org/10.1071/EA05037
Parsons, R., and Luke, H. (2021). Comparing reflexive and assertive
approaches to social licence and social impact assessment. The Extractive
Industries and Society, 8(2), 100765. https://doi.org/10.1016/j.exis.20
20.06.022
Patrick, M. E., Couper, M. P., Jang, B. J., Laetz, V., Schulenberg, J. E.,
O’Malley, P. M., … and Johnston, L. D. (2022). Building on a sequential
mixed-mode research design in the monitoring the future study. Journal
of Survey Statistics and Methodology, 10(1), 149–160.
Pearce, L. D. (2012). Mixed methods inquiry in sociology. American
Behavioral Scientist, 56(6), 829–848. https://doi.org/10.1177/0002764
211433798
Petticrew, M., and Roberts, H. (2008). Systematic reviews in the social
sciences: A practical guide. John Wiley and Sons.
Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers,
M., … and Duffy, S. (2006). Guidance on the conduct of narrative syn-
thesis in systematic reviews. A product from the ESRC methods programme
Version, 1(1), b92. https://citeseerx.ist.psu.edu/document?repid=rep
1&type=pdf&doi=ed8b23836338f6fdea0cc55e161b0fc5805f9e27
Prakash, A., and Bernauer, T. (2020). Survey research in environ-
mental politics: why it is important and what the challenges are. Envi-
ronmental Politics, 29(7), 1127–1134. https://doi.org/10.1080/09
644016.2020.1789337
Presser, S., Couper, M. P., Lessler, J. T., Martin, E., Martin, J.,
Rothgeb, J. M., and Singer, E. (2004). Methods for Testing and Evalu-
ating Survey Questions. The Public Opinion Quarterly, 68(1), 109–130.
http://www.jstor.org/stable/3521540
Rookey, B. D., Le, L., Littlejohn, M., and Dillman, D. A. (2012). Un-
derstanding the resilience of mail-back survey methods: An analysis of
20 years of change in response rates to national park surveys. Social
Science Research, 41(6), 1404–1414. https://doi.org/10.1016/j.ssr
esearch.2012.06.004.
Rouder, J., Olivia S., Rachel K., and Matt J. 2021. “What to Do With
All Those Open-Ended Responses? Data Visualization Techniques for
Survey Researchers.” Survey Practice, August. https://doi.org/10.29115
/SP-2021–0008
Rybak, A. (2023). Survey mode and nonresponse bias: A meta-
analysis based on the data from the international social survey pro-
gramme waves 1996–2018 and the European social survey rounds 1–9.
PLoS One, 18(3) https://doi.org/10.1371/journal.pone.0283092
Sanderson, M. R., and Curtis, A. L. (2017). Are irrigators different
from dryland operators? Insights from a comparative study in Australia.
JAWRA Journal of the American Water Resources Association, 53(6),
1453–1466. https://doi.org/10.1111/1752–1688.12584
Saris, W. E., and Gallhofer, I. N. (2014). Design, Evaluation, and
Analysis of Questionnaires for Survey Research. John Wiley and Sons.
Schmidt, L., Falk, T., Siegmund-Schultze, M., and Spangenberg, J. H.
(2020). The Objectives of Stakeholder Involvement in Transdisciplinary
Research. A Conceptual Framework for a Reflective and Reflexive
Practise. Ecological Economics, 176. https://doi.org/10.1016/j.
ecolecon.2020.106751
Scott, A., Woolcott, G., Keast, R., and Chamberlain, D. (2020). Sus-
tainability of collaborative networks in higher education research pro-
jects: why complexity? Why now? In Complexity Theory in Public
Administration. Routledge. https://doi.org/https://doi.org/10.1080/
14719037.2017.1364410
Seymour, E., Curtis, A., Pannell, D., Allan, C., and Roberts, A. (2010).
Understanding the role of assigned values in natural resource manage-
ment. Australasian Journal of Environmental Management, 17, 142–153.
https://doi.org/10.1080/14486563.2010.9725261
Schirmer, J., Yabsley, B., Mylek, M., and Peel, D. (2016). Wellbeing,
resilience and liveability in rural and regional Australia: The 2015 Regional
Wellbeing Survey. Available from: https://apo.org.au/node/64962
Scott, A., Keast, R., Woolcott, G., Chamberlain, D., and Che, D.
(2022). Project sustainability and complex environments: the role of
relationship networks and collaborative agency. In Governing complexity
in times of turbulence (pp. 102–126). Edward Elgar Publishing.
Smith, C., Felderhor, L., Bosch, O.J.H., 2007. Adaptive management:
making it happen through participatory systems analysis. Systems
Research and Behavioural Science 24, 567e587 https://doi.org/10.1002
/sres.835
Smith, J. W., Leahy, J. E., Anderson, D. H., and Davenport, M. A.
(2013). Community/Agency Trust and Public Involvement in Resource
Planning. Society and Natural Resources, 26(4), 452–471. https://doi.org
/10.1080/08941920.2012.678465
Slovic, P. (2000). The Perception of Risk. London: Earthscan.
Stedman, R. C. (2002). Toward a social psychology of place: Pre-
dicting behavior from place-based cognitions, attitude, and identity.
Environment and behavior, 34(5), 561–581. https://doi-org.ezproxy.
scu.edu.au/10.1177/00139165020340050
Stedman, R. C., Connelly, N. A., Heberlein, T. A., Decker, D. J., and
Allred, S. B. (2019). The End of the (Research) World As We Know It?
Understanding and Coping With Declining Response Rates to Mail Sur-
veys. Society and Natural Resources, 32(10), 1139–1154. https://doi.
org/10.1080/08941920.2019.1587127.
Stern, P.C., Dietz, T., Abel, T., Guagnano, G.A., Kalof, L., 1999. A
value-belief-norm theory of support for social movements: the case of
environmentalism. Research in Human Ecology 6, 81e97. https://www.
jstor.org/stable/24707060
Stern, P. C. (2000). Toward a coherent theory of environmentally
significant behaviour. Journal of Social Issues, 56(3), 407–424. https://
doi.org/10.1111/0022–4537.00175
Stimpson, K., Luke, H., and Lloyd, D. (2019). Understanding grower
demographics, motivations and management practices to improve
engagement, extension and industry resilience: a case study of the
macadamia industry in the Northern Rivers, Australia. Australian Geog-
rapher, 50(1), 69–90. https://doi.org/10.1080/00049182.2018.1
463832
Schwartz, S. (1994). Are there universal aspects in the structure and
content of human values? Journal of Social Issues, 50, 19–45. https://doi.
org/10.1111/j.1540–4560.1994.tb01196.x
Ticehurst, J. L., Curtis, A., and Merritt, W. S. (2011). Using Bayesian
Networks to complement conventional analyses to explore landholder
management of native vegetation. Environmental Modelling and Software,
26(1), 52–65. https://doi.org/10.1016/j.envsoft.2010.03.032
Toman, E., Curtis, A. L., and Mendham, E. (2019). Same as it ever
was? Stability and change over 15 years in a rural district in south-
eastern Australia. Society and Natural Resources, 32(1), 113–132.
https://doi.org/10.1080/08941920.2018.1505014
Trentelman, C. K., Petersen, K. A., Irwin, J., Ruiz, N., and Szalay, C. S.
(2016). The case for personal interaction: drop-off/pick-up methodol-
ogy for survey research. Journal of Rural Social Sciences, 31(3), 68. https:
//egrove.olemiss.edu/jrss/vol31/iss3/4/
Uprichard, E. (2013). Sampling: Bridging probability and non-
probability designs. International Journal of Social Research Methodol-
ogy, 16(1), 1–11. https://doi.org/10.1080/13645579.2011.633391
Vanclay, F. (2004). Social Principles for Agricultural Extension to
Assist in the Promotion of Natural Resource Management. Australian
Journal of Experimental Agriculture, 44(3), 213–222. https://doi.org/10.
1071/EA02139
Vanclay, F. (2015). Qualitative methods in regional program evalu-
ation: An examination of the story-based approach. In Handbook of
Research Methods and Applications in Economic Geography (pp. 544–570).
Edward Elgar Publishing.
Van Mol, C. (2017). Improving web survey efficiency: the impact
of an extra reminder and reminder content on web survey response.
International Journal of Social Research Methodology, 20(4), 317–327.
https://doi.org/10.1080/13645579.2016.1185255.
Vaske, J.J., Donnelly, M.P., 1999. A value-attitude-behaviour model
predicting wildland preservation voting intentions. Society and Natural
Resources 12, 523e537. https://doi.org/10.1080/089419299279425
Wallen, K. E., Landon, A. C., Kyle, G. T., Schuett, M. A., Leitz, J., and
Kurzawski, K. (2016). Mode effect and response rate issues in mixed-
mode survey research: Implications for recreational fisheries manage-
ment. North American Journal of Fisheries Management, 36(4),
852–863. https://doi.org/10.1080/02755947.2016.1165764
Weersink, A., Fraser, E., Pannell, D., Duncan, E., and Rotz, S. (2018).
Opportunities and challenges for big data in agricultural and environ-
mental analysis. Annual Review of Resource Economics, 10, 19–37.
https://doi.org/10.1146/annurev-resource-100516–053654
Xu, R., Sun, Y., Ren, M., Guo, S., Pan, R., Lin, H., … and Han, X.
(2024). AI for social science and social science of AI: A survey. Infor-
mation Processing and Management, 61(3), 103665. https://doi.
org/10.1016/j.ipm.2024.103665
Ziems, C., Held, W., Shaikh, O., Chen, J., Zhang, Z., and Yang, D.
(2024). Can large language models transform computational social sci-
ence?. Computational Linguistics, 1–55. https://doi.org/10.1162/col
i_a_00502
Data availability
Data will be made available on request.
References
Ackoff, S., Bahrenburg, A., Shute, L.L., 2017. Building a future with farmers II: Results and recommendations from the National Young Farmer Survey. https://www.youngfarmers.org/wp-content/uploads/2018/02/NYFC-Report-2017.pdf.
Ajzen, I., 1991. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 50
(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T.
Allan, C., Cooke, P., Higgins, V., Leith, P., Bryant, M., Cockfield, G., 2022. Adoption; a
relevant concept for agricultural land management in the 21 century? Outlook Agric.
51 (4), 375–383 https://doi-org.ezproxy.scu.edu.au/10.1177/0030727022112654.
Allred, S.B., Ross-Davis, A., 2011. The drop-off and pick-up method: An approach to
reduce nonresponse bias in natural resource surveys. Small-Scale For. 10 (3),
305–318. https://doi.org/10.1007/s11842-010-9150-y.
Amorsen, G., Mylek, M., and Schirmer, J. 2023. NSW farmers’ exposure to adverse
events. Available from: https://www.droughthub.nsw.gov.au/__data/assets/pdf_
le/0010/1531909/202223-Regional-Wellbeing-Survey-Report-NSW-farmers-
exposure-to-adverse-events.pdf.
Ampt, P., Cross, R., Ross, H., Howie, B., 2015. The case for retaining, redening and
reinvigorating extension in agricultural innovation systems. Rural Ext. Innov. Syst. J.
11 (1), 157–164. https://doi.org/10.3316/informit.661157849570442.
Anderson, J.R., 2004. Agricultural extension: good intentions and hard realities. World
Bank Res. Obs. 19 (1), 41–60. https://doi.org/10.1093/wbro/lkh013.
Aplin, K., Morgans, L., Palczynski, L., Main, D., Debbaut, C., Hepworth, L., Reed, J.,
2023. Calf health veterinary services: making them work for calves, farmers and
veterinarians. Vet. Rec. 193 (8). https://doi.org/10.1002/vetr.3051.
Ashmore, R.D., Deaux, K., McLaughlin-Volpe, T., 2004. An organizing framework for
collective identity: articulation and significance of multidimensionality. Psychol.
Bull. 130 (1), 80. https://doi.org/10.1037/0033-2909.130.1.80.
Avemegah, E., Gu, W., Abulbasher, A., Koci, K., Ogunyiola, A., Eduful, J., Li, S.,
Barington, K., Wang, T., Kolady, D., Perkins, L., Leffler, A.J., Kovács, P., Clark, J.D.,
Clay, D.E., and Ulrich-Schad, J.D. 2021. An Examination of Best Practices for Survey
Research with Agricultural Producers. Society and Natural Resources, 34(4), 538-
549. https://doi.org/10.1080/08941920.2020.1804651.
Axelrod, L.J., 1994. Balancing personal needs with environmental preservation:
identifying the values that guide decisions in ecological dilemmas. J. Soc. Issues 50
(3), 85–104. https://doi.org/10.1111/j.1540-4560.1994.tb02421.x.
Banker, S., Chatterjee, P., Mishra, H., and Mishra, A. 2023. Machine-assisted social
psychology hypothesis generation. PsyArXiv. February, 25. https://doi.org.ezproxy.
scu.edu.au/10.1037/amp0001222.
Batterton, K.A., and Hale, K.N. 2017. The Likert scale what it is and how to use it.
Phalanx, 50(2), 32-39. https://www.jstor.org/stable/26296382.
Bauer, M.W., Allum, N., Miller, S., 2007. What can we learn from 25 years of PUS survey
research? Liberating and expanding the agenda. Public Underst. Sci. 16 (1), 79–95.
https://doi.org/10.1177/0963662506071287.
Boudet, H., Clarke, C., Bugden, D., Maibach, E., Roser-Renouf, C., Leiserowitz, A., 2014.
Fracking controversy and communication: using national survey data to understand
public perceptions of hydraulic fracturing. Energy Policy 65, 57–67. https://doi.org/
10.1016/j.enpol.2013.10.017.
Bromley, J., Jackson, N.A., Clymer, O.J., Giacomello, A.M., Jensen, F.V., 2005. The use
of Hugin to develop Bayesian Networks as an aid to integrated water resource
planning, 242e251 Environ. Model. Softw. 20. https://doi.org/10.1016/j.
envsoft.2003.12.021.
Brown, G., 2004. Mapping spatial attributes in survey research for natural resource
management: methods and applications. Soc. Nat. Resour. 18 (1), 17–39. https://
doi.org/10.1080/08941920590881853.
Brown, G., Donovan, S., 2014. Measuring change in place values for environmental and
natural resource planning using public participation GIS (PPGIS): results and
challenges for longitudinal research. Soc. Nat. Resour. 27 (1), 36–54. https://doi.
org/10.1080/08941920.2013.840023.
Brown, G., Kyttä, M., 2014. Key issues and research priorities for public participation GIS
(PPGIS): a synthesis based on empirical research. Appl. Geogr. 46, 122–136. https://
doi.org/10.1016/j.apgeog.2013.11.004.
Brown, G., Reed, P., Raymond, C.M., 2020. Mapping place values: 10 lessons from two
decades of public participation GIS empirical research. Appl. Geogr. 116, 102156.
https://doi.org/10.1016/j.apgeog.2020.102156.
Brown, K., Schirmer, J., Upton, P., 2022. Can regenerative agriculture support successful
adaptation to climate change and improved landscape health through building
farmer self-efcacy and wellbeing? Curr. Res. Environ. Sustain. 4, 100170. https://
doi.org/10.1016/j.crsust.2022.100170.
Bull, E.M., van der Cruyssen, L., Vágó, S., Király, G., Arbour, T., van Dijk, L., 2024.
Designing for agricultural digital knowledge exchange: applying a user-centred
design approach to understand the needs of users. J. Agric. Educ. Ext. 30 (1), 43–68.
https://doi.org/10.1080/1389224X.2022.2150663.
Cantuaria, M.L., Blanes-Vidal, V., 2019. Self-reported data in environmental health
studies: mail vs. web-based surveys. BMC Med. Res. Methodol. 19, 1–13. https://doi.
org/10.1186/s12874-019-0882-x.
Clancy, A., Hovden, J.T., Laholt, H., 2024. A descriptive and interpretive theory of
ethical responsibility in public health nursing. Nurs. Ethics. https://doi.org/
10.1177/09697330241291161.
Collins, D., 2003. Pretesting survey instruments: an overview of cognitive methods. Qual.
life Res. 12, 229–238. https://doi.org/10.1023/A:1023254226592.
Connelly, N.A., Brown, T.L., Decker, D.J., 2003. Factors affecting response rates to
natural resource-focused mail surveys: empirical evidence of declining rates over
time. Soc. Nat. Resour. 16 (6), 541–549. https://doi.org/10.1080/08941920309152.
Conrad, F.G., Blair, J., 2004. Data quality in cognitive interviews: the case of verbal
reports. Methods Test. Eval. Surv. Quest. 67–87.
Coon, J.J., van Riper, C.J., Morton, L.W., Miller, J.R., 2020. Evaluating nonresponse bias
in survey research conducted in the rural Midwest. Soc. Nat. Resour. 33 (8),
968–986. https://doi.org/10.1080/08941920.2019.1705950.
Creswell, J.W., 2013. Research Design: Qualitative, Quantitative, and Mixed Methods
Approaches. SAGE Publications.
Cumming, R.G., 1990. Is probability sampling always better? A comparison of results
from a quota and a probability sample survey. Aust. N. Z. J. Public Health 14 (2),
132–137. https://doi.org/10.1111/j.1753-6405.1990.tb00033.x.
Curtis, A., Byron, I., MacKay, J., 2005. Integrating socio-economic and biophysical data
to underpin collaborative watershed management. J. Am. Water Resour. Assoc. 41
(3), 549–563. https://doi.org/10.1111/j.1752-1688.2005.tb03754.x.
Curtis, A., Robertson, A., 2003. Understanding management of river frontages: the
Goulburn Broken. Ecol. Manag. Restor. 4 (1), 45–54. https://doi.org/10.1046/
j.1442-8903.2003.t01-1-00137.x.
de Groot, R., Steg, L., 2007. Value orientations and environmental beliefs in five
countries - validity of an instrument to measure egoistic, altruistic and biospheric
value orientations. J. Cross-Cult. Psychol. 38 (3), 318–332. https://doi.org/
10.1177/0022022107300278.
Dekkers, R., Carey, L., Langhorne, P., 2022. Objectives and positioning of literature
reviews. Making Literature Reviews Work: A Multidisciplinary Guide to Systematic
Approaches. Springer International Publishing, Cham, pp. 25–56. https://doi.org/
10.1007/978-3-030-90025-0.
Dergaa, I., Chamari, K., Zmijewski, P., Saad, H.B., 2023. From human writing to artificial
intelligence generated text: examining the prospects and potential threats of
ChatGPT in academic writing. Biol. Sport 40 (2), 615–622. https://doi.org/10.5114/
biolsport.2023.125623.
Dietz, T., Fitzgerald, A., Shwom, R., 2005. Environmental values. Annu. Rev. Environ.
Resour. 30, 335–372.
Dillman, D.A., 1978. Mail and telephone surveys. John Wiley and Sons, USA.
Dillman, D.A., Smyth, J.D., and Christian, L.M. (2014). Internet, Phone, Mail, and Mixed-
Mode Surveys: The Tailored Design Method. John Wiley and Sons.
Doremus, H., 2003. A policy portfolio approach to biodiversity protection on private
lands. Environ. Sci. Policy 6, 217–232. https://doi.org/10.1016/S1462-9011(03)
00036-4.
Duan, N., Bhaumik, D.K., Palinkas, L.A., Hoagwood, K., 2015. Optimal design and
purposeful sampling: complementary methodologies for implementation research.
Adm. Policy Ment. Health Ment. Health Serv. Res. 42, 524–532. https://doi.org/
10.1007/s10488-014-0596-7.
Fielding, K.S., Terry, D.J., Masser, B.M., Bordia, P., Hogg, M.A., 2005. Explaining
landholders’ decisions about riparian zone management: the role of behavioural,
normative and control beliefs, 12e21 J. Environ. Manag. 77. https://doi.org/
10.1016/j.jenvman.2005.03.002.
Fowler Jr, F.J., 2013. Survey research methods. SAGE Publications.
Fulton, D.C., Manfredo, M.J., Lipscomb, J., 1996. Wildlife value orientations: a
conceptual and measurement approach. Hum. Dimens. Wildl. 1 (2), 24–47. https://
doi.org/10.1080/10871209609359060.
Glas, Z.E., Getson, J.M., Gao, Y., Singh, A.S., Eanes, F.R., Esman, L.A., Bulla, B.R.,
Prokopy, L.S., 2019. Effect of monetary incentives on mail survey response rates for
midwestern farmers, 229–37 Soc. Nat. Resour. 32 (2). https://doi.org/10.1080/
08941920.2018.1530815.
Grossmann, I., Feinberg, M., Parker, D.C., Christakis, N.A., Tetlock, P.E.,
Cunningham, W.A., 2023. AI and the transformation of social science research.
Science 380 (6650), 1108–1109. https://doi.org/10.1126/science.adi1778.
Groth, T.M., Curtis, A., Mendham, E., Toman, E., 2014. Farmer identity in
multifunctional landscapes: using a collective identity construct to explore the
nature and impact of occupational identity. Aust. Geogr. 45 (1), 71–86. https://doi.
org/10.1080/00049182.2014.869297.
Groth, T.M., Curtis, A., Mendham, E., Toman, E., 2017. Examining the agricultural
producer identity: utilising the collective occupational identity construct to create a
typology and profile of rural landholders in Victoria, Australia. J. Environ. Plan.
Manag. 60 (4), 628–646. https://doi.org/10.1080/09640568.2016.1165189.
Groves, R.M., Fowler Jr, F.J., Couper, M.P., Lepkowski, J.M., Singer, E., Tourangeau, R.,
2011. Survey Methodology, second ed. John Wiley and Sons.
Hernandez, S., Luke, H., Alexanderson, M.S., 2024. Is human activity driving climate
change? Perspectives from Australian landholders. Front. Sustain. Food Syst. 8,
1392746. https://doi.org/10.3389/fsufs.2024.1392746.
Heyvaert, M., Hannes, K., Onghena, P., 2017. Searching for relevant studies. In: Using
Mixed Methods Research Synthesis for Literature Reviews. Sage Publications, Los
Angeles, pp. 69–112.
Huang, J.Y., Hubbard, S.M., Mulvey, K.P., 2003. Obtaining valid response rates:
considerations beyond the tailored design method. Eval. Program Plan. 26 (1),
91–97. https://doi.org/10.1016/S0149-7189(02)00091-5.
Jackson-Smith, D., Flint, C.G., Dolan, M., Trentelman, C.K., Holyoak, G., Thomas, B.,
Ma, G., 2016. Effectiveness of the drop-off/pick-up survey methodology in different
neighborhood types. J. Rural Soc. Sci. 31 (3), 3. 〈https://egrove.olemiss.edu/jrss/
vol31/iss3/3/〉.
Johns, C., 2022. Reflexive narrative: self-inquiry toward self-realization and its
performance in sage research methods. Sage Publications.
Kalton, G., 2019. Developments in survey research over the past 60 years: a personal
perspective. Int. Stat. Rev. 87, S10–S30. https://doi.org/10.1111/insr.12287.
Kalton, G., 2023. Probability vs. nonprobability sampling: from the birth of survey
sampling to the present day. Stat. Transit. N. Ser. 24 (3), 1–22. 〈https://www.ceeol.
com/search/article-detail?id=1127833〉.
Kartsounidou, E., Kluge, R., Silber, H., Gummer, T., 2023. Survey experience and its
positive impact on response behavior in longitudinal surveys: evidence from the
probability-based GESIS Panel. Int. J. Soc. Res. Methodol. 1–14. https://doi.org/
10.1080/13645579.2022.2163104.
Kianersi, S., Luetke, M., Jules, R., Rosenberg, M., 2020. The association between
interviewer gender and responses to sensitive survey questions in a sample of Haitian
women. Int. J. Soc. Res. Methodol. 23 (2), 229–239. https://doi.org/10.1080/
13645579.2019.1661248.
Krosnick, J.A. 2018. Questionnaire design. Handbook of Survey Research, Second
Edition, 263-313.
Larson, L.R., Stedman, R.C., Cooper, C.B., Decker, D.J., 2015. Understanding the multi-
dimensional structure of pro-environmental behavior. J. Environ. Psychol. 43,
112–124. https://doi.org/10.1016/j.jenvp.2015.06.004.
Lehdonvirta, V., Oksanen, A., Räsänen, P., Blank, G., 2021. Social media, web, and panel
surveys: using non-probability samples in social and policy research. Policy Internet
13 (1), 134–155. https://doi.org/10.1002/poi3.238.
Lockwood, M., 1999. Humans valuing nature: synthesising insights from philosophy,
psychology and economics. Environ. Values 8 (3), 381–401 https://journals.
sagepub.com/doi/pdf/10.3197/096327199129341888.
Logan, C., Parás, P., Robbins, M., Zechmeister, E.J., 2020. Improving data quality in face-
to-face survey research. PS: Political Sci. Polit. 53 (1), 46–50. https://doi.org/
10.1017/S1049096519001161.
Luke, H., 2017. Social resistance to coal seam gas development in the Northern Rivers
region of Eastern Australia: proposing a diamond model of social license to operate.
Land Use Policy 69, 266–280. https://doi.org/10.1016/j.landusepol.2017.09.006.
Luke, H., Baker, C., Allan, C. and McDonald, S. 2020. Agriculture on the Eyre Peninsula:
Rural Landholder Social Benchmarking Report 2020. Southern Cross University,
NSW, 2480. ISBN 978-0-6450707-0-5. Available from: 〈https://soilcrc.com.au/techn
ical-reports/〉.
Luke, H., Baker, C., Allan, C., McDonald, S., Alexanderson, M., 2021. Agriculture in The
Northern Wheatbelt: Rural Landholder Social Benchmarking Report 2021. Southern
Cross University, NSW. : 〈https://soilcrc.com.au/technical-reports/〉.
Luke, H., Brueckner, M., Emmanouil, N., 2018. Unconventional gas development in
Australia: a critical review of its social license. Extr. Ind. Soc. 5 (4), 648–662.
https://doi.org/10.1016/j.exis.2018.10.006.
Luke, H., Lloyd, D., Boyd, W., den Exter, K., 2014. Unconventional gas development: why
a regional community said no: a report of findings from the 2012 Lismore City
council election poll and exit poll survey (New South Wales). Geogr. Res. 52 (3),
263–279. https://doi.org/10.1111/1745-5871.12071.
Lynne, G.D., Casey, C.F., Hodges, A., Rahmani, M., 1995. Conservation technology
adoption decisions and the theory of planned behaviour. J. Econ. Psychol. 16.
https://doi.org/10.1016/0167-4870(95)00031-6.
Malnar, B., Ryan, L., 2021. Improving knowledge production in comparative survey
research: cross-using data from four international survey programmes. Czech Sociol.
Rev. 57 (6). 〈https://www.ceeol.com/search/article-detail?id=1007851〉.
Masterson, V.A., Stedman, R.C., Enqvist, J., Tengö, M., Giusti, M., Wahl, D., Svedin, U.,
2017. The contribution of sense of place to social-ecological systems research: a
review and research agenda. Ecol. Soc. 22 (1). 〈https://www.jstor.org/stable/262
70120〉.
McIntyre, N., Moore, J., Yuan, M., 2008. A place-based, values centred approach to
managing recreation on Canadian crown lands. Soc. Nat. Resour. 21, 657–670.
https://doi.org/10.1080/08941920802022297.
Medway, R.L., Fulton, J., 2012. When more gets you less: a meta-analysis of the effect of
concurrent web options on mail survey response rates. Public Opin. Q. 76 (4),
733–746. https://doi.org/10.1093/poq/nfs047.
Meertens, R.M., Lion, R., 2008. Measuring an individual’s tendency to take risks: the risk
propensity scale. J. Appl. Soc. Psychol. 38 (6). https://doi.org/10.1111/j.1559-
1816.2008.00357.x.
Miller, Z.D., 2017. The enduring use of the theory of planned behavior. Hum. Dimens.
Wildl. 22 (6), 583–590. https://doi.org/10.1080/10871209.2017.1347967.
Montano, D.E., Kasprzyk, D., 2015. Theory of reasoned action, theory of planned
behavior, and the integrated behavioral model. Health Behav.: Theory, Res. Pract. 70
(4), 231. 〈https://www.researchgate.net/prole/Daniel-Montano-2/publication/2
35360939_The_theory_of_reasoned_action_and_the_theory_of_planned_behavior/li
nks/65cbe04b1e1ec12eff8e4c5c/The-theory-of-reasoned-action-and-the-theory-of
-planned-behavior.pdf〉(Available from).
Ndiema, K.W., 2024. Implications of Artificial Intelligence (AI) in land management.
J. Comp. Sci. Technol. 2 (1), 1–9. https://doi.org/10.51317/jcst.v2i1.490.
Norton, G.W., Alwang, J., 2020. Changes in agricultural extension and implications for
farmer adoption of new practices. Appl. Econ. Perspect. Policy 42 (1), 8–20. https://
doi.org/10.1002/aepp.13008.
Nuno, A., John, F.A.S., 2015. How to ask sensitive questions in conservation: a review of
specialized questioning techniques. Biol. Conserv. 189, 5–15. https://doi.org/
10.1016/j.biocon.2014.09.047.
Pannell, D.J., Marshall, G.R., Barr, N., Curtis, A., Vanclay, F., Wilkinson, R., 2006.
Understanding and promoting adoption of conservation technologies by rural
landholders. J. Exp. Agric. 46, 1407–1424. https://doi.org/10.1071/EA05037.
Park, Y.J., Kaplan, D., Ren, Z., Hsu, C.W., Li, C., Xu, H., Li, J., 2024. Can ChatGPT be used
to generate scientific hypotheses? J. Mater. 10 (3), 578–584. https://doi.org/
10.1016/j.jmat.2023.08.007.
Parsons, R., Luke, H., 2021. Comparing reflexive and assertive approaches to social
licence and social impact assessment. Extr. Ind. Soc. 8 (2), 100765. https://doi.org/
10.1016/j.exis.2020.06.022.
Patrick, M.E., Couper, M.P., Jang, B.J., Laetz, V., Schulenberg, J.E., O’Malley, P.M.,
Johnston, L.D., 2022. Building on a sequential mixed-mode research design in the
monitoring the future study. J. Surv. Stat. Methodol. 10 (1), 149–160.
Pearce, L.D., 2012. Mixed methods inquiry in sociology. Am. Behav. Sci. 56 (6),
829–848. https://doi.org/10.1177/0002764211433798.
Petticrew, M., Roberts, H., 2008. Systematic reviews in the social sciences: A practical
guide. John Wiley and Sons.
Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers, M., … and Duffy, S.
(2006). Guidance on the conduct of narrative synthesis in systematic reviews. A
product from the ESRC methods programme Version, 1(1), b92. https://citeseerx.ist.
psu.edu/document?
repid=rep1&type=pdf&doi=ed8b23836338f6fdea0cc55e161b0fc5805f9e27.
Prakash, A., Bernauer, T., 2020. Survey research in environmental politics: why it is
important and what the challenges are. Environ. Polit. 29 (7), 1127–1134. https://
doi.org/10.1080/09644016.2020.1789337.
Presser, S., Couper, M.P., Lessler, J.T., Martin, E., Martin, J., Rothgeb, J.M., Singer, E.,
2004. Methods for testing and evaluating survey questions. Public Opin. Q. 68 (1),
109–130. 〈http://www.jstor.org/stable/3521540〉.
Primdahl, J., Kristensen, L.S., 2011. The farmer as a landscape manager: management
roles and change patterns in a Danish region. Geogr. Tidsskr. -Dan. J. Geogr. 111 (2),
107–116. https://doi.org/10.1080/00167223.2011.10669527.
Rookey, B.D., Le, L., Littlejohn, M., Dillman, D.A., 2012. Understanding the resilience of
mail-back survey methods: an analysis of 20 years of change in response rates to
national park surveys. Soc. Sci. Res. 41 (6), 1404–1414. https://doi.org/10.1016/j.
ssresearch.2012.06.004.
Rouder, J., Olivia, S., Rachel, K., Matt, J., 2021. What to do with all those open-ended
responses? Data visualization techniques for survey researchers (August). Surv.
Pract.. https://doi.org/10.29115/SP-2021-0008.
Rybak, A., 2023. Survey mode and nonresponse bias: a meta-analysis based on the data
from the international social survey programme waves 1996–2018 and the European
social survey rounds 1 to 9. PLoS One 18 (3). https://doi.org/10.1371/journal.
pone.0283092.
Sanderson, M.R., Curtis, A.L., 2017. Are irrigators different from dryland operators?
Insights from a comparative study in Australia. JAWRA J. Am. Water Resour. Assoc.
53 (6), 1453–1466. https://doi.org/10.1111/1752-1688.12584.
Saris, W.E., and Gallhofer, I.N. (2014). Design, Evaluation, and Analysis of
Questionnaires for Survey Research. John Wiley and Sons.
Schirmer, J., Yabsley, B., Mylek, M., and Peel, D. (2016). Wellbeing, resilience and
liveability in rural and regional Australia: The 2015 Regional Wellbeing Survey.
Available from: https://apo.org.au/node/64962.
Schmidt, L., Falk, T., Siegmund-Schultze, M., and Spangenberg, J.H. 2020. The
Objectives of Stakeholder Involvement in Transdisciplinary Research. A Conceptual
Framework for a Reflective and Reflexive Practise. Ecological Economics, 176.
https://doi.org/10.1016/j.ecolecon.2020.106751.
Schwartz, S., 1994. Are there universal aspects in the structure and content of human
values? J. Soc. Issues 50, 19–45. https://doi.org/10.1111/j.1540-4560.1994.
tb01196.x.
Scott, A., Keast, R., Woolcott, G., Chamberlain, D., Che, D., 2022. Project sustainability
and complex environments: The role of relationship networks and collaborative
agency. In Governing complexity in times of turbulence. Edward Elgar Publishing,
pp. 102–126.
Scott, A., Woolcott, G., Keast, R., and Chamberlain, D. 2020. Sustainability of
collaborative networks in higher education research projects: why complexity? Why
now? In Complexity Theory in Public Administration. Routledge. https://doi.org/
https://doi.org/10.1080/14719037.2017.1364410.
Seymour, E., Curtis, A., Pannell, D., Allan, C., and Roberts, A. 2010. Understanding the
role of assigned values in natural resource management. Australasian Journal of
Environmental Management, 17, 142-153. https://doi.org/10.1080/
14486563.2010.9725261.
Slovic, P., 2000. The Perception of Risk. Earthscan, London.
Smith, C., Felderhor, L., Bosch, O.J.H., 2007. Adaptive management: making it happen
through participatory systems analysis, 567e587 Syst. Res. Behav. Sci. 24. https://
doi.org/10.1002/sres.835.
Smith, J.W., Leahy, J.E., Anderson, D.H., Davenport, M.A., 2013. Community/Agency
Trust and Public Involvement in Resource Planning. Soc. Nat. Resour. 26 (4),
452–471. https://doi.org/10.1080/08941920.2012.678465.
Stedman, R.C., 2002. Toward a social psychology of place: predicting behavior from
place-based cognitions, attitude, and identity. Environ. Behav. 34 (5), 561–581. 〈htt
ps://doi-org.ezproxy.scu.edu.au/10.1177/00139165020340050〉.
Stedman, R.C., Connelly, N.A., Heberlein, T.A., Decker, D.J., Allred, S.B., 2019. The End
of the (Research) World As We Know It? Understanding and coping with declining
response rates to mail surveys. Soc. Nat. Resour. 32 (10), 1139–1154. https://doi.
org/10.1080/08941920.2019.1587127.
Stern, P.C., 2000. Toward a coherent theory of environmentally significant behaviour.
J. Soc. Issues 56 (3), 407–424. https://doi.org/10.1111/0022-4537.00175.
Stern, P.C., Dietz, T., Abel, T., Guagnano, G.A., Kalof, L., 1999. A value-belief-norm
theory of support for social movements: the case of environmentalism. Res. Hum.
Ecol. 6. 〈https://www.jstor.org/stable/24707060〉, 81e97.
Stimpson, K., Luke, H., Lloyd, D., 2019. Understanding grower demographics,
motivations and management practices to improve engagement, extension and
industry resilience: a case study of the macadamia industry in the Northern Rivers,
Australia. Aust. Geogr. 50 (1), 69–90. https://doi.org/10.1080/
00049182.2018.1463832.
Ticehurst, J.L., Curtis, A., Merritt, W.S., 2011. Using Bayesian networks to complement
conventional analyses to explore landholder management of native vegetation.
Environ. Model. Softw. 26 (1), 52–65. https://doi.org/10.1016/j.
envsoft.2010.03.032.
Toman, E., Curtis, A.L., Mendham, E., 2019. Same as it ever was? Stability and change
over 15 years in a rural district in southeastern Australia. Soc. Nat. Resour. 32 (1),
113–132. https://doi.org/10.1080/08941920.2018.1505014.
Trentelman, C.K., Petersen, K.A., Irwin, J., Ruiz, N., Szalay, C.S., 2016. The case for
personal interaction: drop-off/pick-up methodology for survey research. J. Rural
Soc. Sci. 31 (3), 68. 〈https://egrove.olemiss.edu/jrss/vol31/iss3/4/〉.
Uprichard, E., 2013. Sampling: bridging probability and non-probability designs. Int. J.
Soc. Res. Methodol. 16 (1), 1–11. https://doi.org/10.1080/13645579.2011.633391.
Van Mol, C., 2017. Improving web survey efficiency: the impact of an extra reminder and
reminder content on web survey response. Int. J. Soc. res. methodol. 20 (4),
317–327. https://doi.org/10.1080/13645579.2016.1185255.
Vanclay, F., 2004. Social principles for agricultural extension to assist in the promotion
of natural resource management. Aust. J. Exp. Agric. 44 (3), 213–222. https://doi.
org/10.1071/EA02139.
Vanclay, F., 2015. Qualitative methods in regional program evaluation: An examination
of the story-based approach. In Handbook of Research Methods and Applications in
Economic Geography. Edward Elgar Publishing, pp. 544–570.
Vaske, J.J., Donnelly, M.P., 1999. A value-attitude-behaviour model predicting wildland
preservation voting intentions, 523e537 Soc. Nat. Resour. 12. https://doi.org/
10.1080/089419299279425.
Wallen, K.E., Landon, A.C., Kyle, G.T., Schuett, M.A., Leitz, J., Kurzawski, K., 2016. Mode
effect and response rate issues in mixed-mode survey research: implications for
recreational fisheries management. North Am. J. Fish. Manag. 36 (4), 852–863.
https://doi.org/10.1080/02755947.2016.1165764.
Weersink, A., Fraser, E., Pannell, D., Duncan, E., Rotz, S., 2018. Opportunities and
challenges for big data in agricultural and environmental analysis. Annu. Rev.
Resour. Econ. 10, 19–37. https://doi.org/10.1146/annurev-resource-100516-
053654.
Wehde, W., Perreault, M., 2022. Developing Survey Methods for Collecting Individual
Policy Narratives: A case study of climate change narratives using an engaged
convenience sample. Int. Rev. Public Policy 4 (4:1), 37–54. https://doi.org/
10.4000/irpp.2480.
Xu, R., Sun, Y., Ren, M., Guo, S., Pan, R., Lin, H., Han, X., 2024. AI for social science and
social science of AI: a survey. Inf. Process. Manag. 61 (3), 103665. https://doi.org/
10.1016/j.ipm.2024.103665.
Ziems, C., Held, W., Shaikh, O., Chen, J., Zhang, Z., Yang, D., 2024. Can large language
models transform computational social science? Comput. Linguist. 1–55. https://doi.
org/10.1162/coli_a_00502.